This past Thanksgiving, my dad asked me if my peers were using ChatGPT to write their papers. I told him that I don't know for certain what other students do, but I certainly don't use it. Even if I wanted to cheat, ChatGPT as it exists today is simply too rudimentary to write college-level analysis.
That isn't to say that AI hasn't transformed my learning over the past few years. For example, I'll sometimes use AI to summarize readings. To be clear, I never skip actually reading the text. Instead, the summary allows me to get a grasp on the main ideas so the task of reading is less daunting. I would never use it to skip the readings; understanding academic writing is an essential part of my college education.
Obviously, other people don't feel the same way. Countless students are forgoing reading in favor of these summaries. And although analytic papers might not be written by AI, I have a suspicion that some of my classmates' formulaic discussion posts were. I've even seen presentations that made me question to what extent humans were involved in their creation (a level of audacity I could never hope to attain).
Needless to say, the proliferation of AI-generated content has presented a real problem for teachers. How are they supposed to ensure that students' work is their own? The current tools claiming to detect AI-generated writing are neither accurate nor reliable (Weber-Wulff et al., 2023), and I predict that detection will only become more difficult as generative AI improves.
In response to this problem, some have suggested that we need to stop assigning work that AI has now made “obsolete.” This suggestion does raise a fair question: if a computer can complete some task for us, is it really valuable? But this misses the point of assigning work in the first place.
This current debate about AI reminds me of our elementary-school debates about calculators. Why were we bothering to learn how to multiply and divide if we would soon have phones that could do that for us? I forget whether my teachers gave reasonable answers to this question (probably not), but I have one now. It's difficult to do high-level math if you haven't mastered low-level math first. More abstractly, actually working through these tasks builds number sense, which allows you to better understand how numbers work together (and spend less time pulling out tools).
The same principle holds in our current age of generative AI and writing. How are students supposed to learn to communicate large ideas if they can't communicate small ideas? Hopefully you agree with me that some simple tasks can be pedagogically valuable, even if not economically valuable in themselves.
But at the end of the day, it doesn't matter that these assignments are supposedly important for one's education. Students will still use AI, because students have generally internalized that education is not the goal of school. For good reason too—students are rewarded for the products of their labor, not for the labor itself. We shouldn't be surprised when students trend towards the most efficient manner of production.
This discrepancy between what is educationally important and what is rewarded has been a problem for a lot longer than ChatGPT has been around. Using AI to write papers is the spiritual successor of cramming before a test, or reading SparkNotes instead of the book. Though these practices provide little educational benefit, they are much quicker and easier ways to get good grades. It's worth remembering that grades are what materially matter to students.
If we don't want students to use generative AI because it's counterproductive to education, we need to redesign our schools so that education is actually the goal. This doesn't mean just a rhetorical shift; this requires that our schools reward the end product less and focus on the process more. After all, the process is where the learning happens.
For example, a writing class shouldn't just assign papers. Instead, teachers should break the papers down so that students go through the process with the class. They should work through outlines, drafts, and revisions—some of which can be completed collaboratively with their teachers or peers. My writing improved the most when I was required to actually talk with my teachers about my thoughts and my progress at every stage.
We have made our schools mirror the economy, and we now pay the price. The object of examination should not be the writing; it should be the writer. In such a class, using AI would be nonsensical.