“I Quit Teaching Because of ChatGPT”

What happens when students use calculators to solve math problems? They save time on mechanical operations and arrive at a singular correct solution, so why not use AI tools like ChatGPT for writing?

Comparing ChatGPT-assisted writing to calculator-assisted math is a flawed analogy, especially in the humanities. A calculator is a problem-solving tool: it speeds up mechanical operations that students have already been taught, and it yields a single correct answer. The aim of humanities education, by contrast, is to “shape people” by “giving them the ability to think about things that they wouldn’t naturally be prompted to think about.”

Experienced scholars understand that writing is a process intertwined with thinking, as historian Lynn Hunt has noted. For many educators, writing is how thoughts and ideas get developed, and that development takes considerable effort and time. The ease of using AI, however, has led many students to avoid the discomfort of that intellectual labor. A seasoned professor with nearly 20 years’ experience teaching academic writing to doctoral students at a technical college learned this the hard way. Many of his students, computer scientists well versed in how generative AI works, relied heavily on tools like ChatGPT to draft their work. Several admitted to using AI to turn their notes into full articles, despite recognizing its limitations and ethical problems, and despite assignments structured to discourage over-reliance on it. Many failed, because it became clear they lacked writing skills of their own.

These tools did not reliably prevent plagiarism, and they let students bypass the effort required to truly understand their work. Outsourcing writing to AI deprived them of the opportunity to think deeply about their research. Generative AI, while democratizing in some respects, such as correcting grammar for non-native English speakers, often altered the meaning of texts. Students lacked the skills to recognize and correct these changes, and so failed to develop their own voices as research writers.

Ultimately, the educator found himself spending more time giving feedback on AI-generated text than on students’ original work, which prompted his decision to leave teaching.

Some educators will adapt to AI, shifting their focus from mechanical tasks to fostering critical thinking. But this requires students to embrace the discomfort of not knowing and to trust their own cognitive abilities, a willingness that, in his experience, was largely absent.
