
“AI in the Classroom: A Shortcut, a Tool, or Something to Worry About?”

  • molloycommunicatio
  • May 30

Written By: Sofia Alvarez


It used to be that when students were stuck on an assignment, they turned to Google, a classmate, or maybe an old textbook. Now, more and more students are asking for help from something entirely different: artificial intelligence.

Websites like ChatGPT, Grammarly, and QuillBot have become common in college, sitting in tabs alongside Canvas, Google Docs, and Spotify. Some students use them to develop ideas or correct grammar. Others go a step further, using them to write early drafts or even entire discussion posts. Either way, artificial intelligence is here, and it is changing the way we do school. According to a 2024 survey by the Digital Education Council, 86% of students worldwide use AI for academic purposes, with more than half using it frequently. That number is hardly surprising given how many tasks, deadlines, and distractions college students face.

Samantha L., a 21-year-old senior majoring in Criminal Justice, says she began using ChatGPT last year when she became overwhelmed during finals week. "I wasn't trying to get it to write my whole paper," she clarified. "But I had a topic, and I had no idea how to begin. So I asked it to help me in creating an outline."

She went on to say that it felt like she had a study partner who never slept. She still wrote everything herself, using the tool just to double-check her explanations. AI has not replaced Samantha's work; rather, it has increased her confidence in it. She also said she doesn’t hide the fact that she uses AI. “It’s just another tool,” she said. “I treat it like Grammarly or spellcheck.”

But not every student sees it the same way. Kevin M., a 21-year-old junior majoring in biology, admits to using AI in ways that clearly cross the line. "There were times where I used it to write a full paragraph for a discussion post," he told me. "I'd edit it to make it sound more like me, but it wasn't really my own writing." Kevin explained that those instances usually occurred when he was swamped with assignments and simply trying to keep up. "It's not something I'm proud of," he said. "But sometimes it seemed like the only option. I was working and had other classes, so it was either that or falling behind."

That tension between using AI as assistance and using it as a shortcut is one that many students are now dealing with. There are few clear rules, and the pressure to stay on top of everything is overwhelming.

In a typical week, students balance lectures, readings, quizzes, work shifts, and personal life. When you're exhausted or falling behind, AI can help you catch up quickly. Even students who try to use it appropriately can find themselves slipping into "just this once" territory. Personally, I’ve used ChatGPT to help me understand an assignment better or to write an outline when I was stuck. I’ve never turned in something it wrote completely, but I’ve seen students do that without hesitation. Sometimes it’s obvious when a post is written by AI: the tone is overly formal, or the structure is just too perfect.

What concerns me isn't just the cheating; it's what we're losing. If we rely too heavily on AI, are we still learning how to convey our own ideas? Are we giving up the ability to struggle through a first draft? At the same time, I get it. College can be exhausting. When you're overloaded with work and AI can save you a few hours, it is tempting. That is why I do not believe an outright ban is the best solution. Both Samantha and Kevin said they’d be open to learning how to use AI the right way if schools actually taught it. “Right now it’s like we’re all guessing,” Kevin said. “No one really tells you what’s okay or not okay. You hear different things from different professors.” Samantha agreed. “If someone showed me how to use it responsibly, I’d follow it. But no one’s explaining that. They just say ‘don’t cheat’ and move on.”

A majority of professors mention AI in class, and many syllabi now include university policies, but that doesn’t always mean students understand what counts as acceptable use. A 2024 survey by Inside Higher Ed found that only 16% of students felt they knew when they were allowed to use AI, even when their college had published a policy on appropriate use for coursework. There is a significant difference between using AI to clean up a sentence and relying on it to do all of the work. Most students already understand this. The problem isn't that we don't understand the line; it's that, under pressure, the line begins to blur, especially when you're exhausted and just want to finish before the deadline. I believe the solution is somewhere in the middle: let students use AI with structure, expectations, and transparency. Allow them to use it to brainstorm or refine their work, but teach them how to keep their own voice.

Artificial intelligence will be a part of our future. That is no longer a debate. The real challenge is making sure the technology helps us learn rather than simply doing tasks faster. We still need to know how to write, how to build an argument, and how to communicate clearly. These are skills that AI cannot replace. Until schools catch up, though, students will continue to figure it out on their own—sometimes using AI to get ahead, sometimes to stay afloat, and sometimes simply hoping no one notices. So, where should colleges go from here? The first step is to listen. Instead of focusing only on rules and punishments, schools should make time for genuine discussions—about how students are already using AI, where the confusion lies, and what responsible use looks like. Clear examples would go much further than general warnings. Students who feel they can ask questions without fear of being judged or accused are more likely to use AI in thoughtful, honest ways. Schools do not have to prohibit it; instead, they can help guide it.
