Generative AI is becoming more accurate and relevant to our academic and personal lives at breakneck pace. Just as we cannot bury our heads in the sand and ignore its existence, we cannot deny its usefulness, either. But how far is too far?
Students and professors alike are searching for sound and ethical means of using AI in the classroom. Ethical usage looks different from person to person and from field to field. The use of this technology, in forms like ChatGPT, Gemini and Azure, presents a particular challenge in interdisciplinary humanities classes.
The subjectivity and sensibility that characterize the humanities are not a computer's strong suit. The studies of English, philosophy, religion and ethnic diaspora all require not mere objective truth, but subjective opinion and interpretation.
I’ve seen classmates wield AI cleverly in their reading-intensive classes to draw up condensed, tidy summaries of a David Foster Wallace article or Edmund Spenser’s "The Faerie Queene." James Joyce’s "Ulysses" could be AI-ed into a mere two paragraphs, a colossus of written language broken down into a blurb. This is all well and good, until it’s not. No generated summary can distill dense works of literature without losing something essential. My issue with the use of generative AI to produce legible summaries is twofold.
One, condensing large works into mere paragraphs only hinders our understanding. We lose important context, detail and meaning. We misattribute authority and accuracy to technology, without knowing how wrong it can be.
Two, the act of reading is more than worthwhile. Reading is not a waste of time; neither is writing. The skill not just of analysis but of synthesis, finding truths that build toward a larger meaning on one's own, is becoming a rare commodity.
The use of AI to summarize and save time only serves to enfeeble our imaginations and give dangerous authority to an algorithm. Even equipped with millions of data points, AI’s crude mechanization of language poses an intimate threat to what it means to be human, because there is no spirit in those words. Like any other man-made thing, AI technology is flawed. The neat and tidy summary we receive merely skims the surface, drilling only knee-deep to extract the crudest, shallowest understanding.
We are then left with minimal understandings of works that have shaped humanity itself.
If we have no sense of knowing or experiencing beyond what we gather from a screen, we lose our humanity. There is something intrinsically human in the act of storytelling, in passing on stories from which we take individual meaning. What we fail to understand is that the act of reading itself is worthwhile.