It’s late at night, and you have three assignments due the next day and four more due later this week. Exhausted, you decide to quickly finish your English essay on Shakespeare. You head over to ChatGPT and type in “write me an essay on the works of Shakespeare.”
Using artificial intelligence (AI) as a cheap shortcut for assignments circumvents the learning process and undermines the brain's effort to develop. While AI can seem like a useful tool, it often ends up doing more harm than good.
A study on generative AI usage among people ages 14-22 shows that 51% of students have utilized language models at some point: 12% of students use AI once or twice a month, 11% once or twice a week and 4% nearly every day.
While these numbers may seem small, it’s likely that they will grow significantly in the near future. Since the release of ChatGPT in November of 2022, AI has taken the world by storm, and its popularity has only grown every year. With more and more students being exposed to artificial language models such as ChatGPT, the dangers of artificial intelligence have become all the more clear.
“AI doesn’t know anything, it just knows what is likely,” English department head Peter Galalis said.
Over half of people who use artificial language models use them to gather information and brainstorm. While these seem to be completely innocent uses for artificial intelligence, in reality this type of usage could be very harmful for students' learning habits. AI can only repeat the information that is given to it. If that information is biased, then AI will repeat the biased information as truth. Historical documents are riddled with the biases of their time, and if AI presents them without the nuances with which they were written, the documents lose their historical value.
For the same reason, the information recited by AI models is not necessarily accurate. Artificial intelligence is unable to distinguish between jokes and misinformation, which leads it to form nonsensical answers that can even harm readers, such as when Google's AI recommended eating one small rock a day.
AI also tends to come up with bland and unoriginal perspectives. When presented with a choice, such as a brainstorming prompt, AI repeats the most common trends and ideas found on the internet. This leads it to copy patterns found online and produce boring answers without offering any new views or information. This repetition reflects poorly on students' true creativity and robs them of a chance to use their brains effectively.
“AI is very much an efficiency tool meant to make certain tasks and work more efficient for people,” Galalis said. “Learning happens as a result of friction, intellectual friction – people struggling with ideas. AI circumvents all that effort.”
The goal of education should be to train students' brains to accurately take in and process information. AI actively prevents this from happening. While the problem of students avoiding and disliking learning runs much deeper, into the education system as a whole, it is undeniable that AI is not the solution and does not help a student learn.