Entrance to Green Library. Heidi Cuevas, PantherNOW

Student perspective: AI for writing is a cautionary tale


Sonia Stolar | Contributing Writer

The normalized presence of artificial intelligence in classrooms should be questioned and debated, not welcomed with open arms.

Chatbots such as ChatGPT and Copilot crept into the education system at sonic speed, changing academics irreversibly.

I have been skeptical of AI from the very start. Seeing it as a downward spiral, it appalled me (and still does) how many succumb to the convenience of chatbots.

My stance against AI has created the sense that I am tackling challenges in my workload that some students don’t. When a deadline is approaching, loads of work remain to be done and nerves are on edge, my mindset is still: “I’d rather fail than have a bot do it for me.”

Then again, that is how it was before the introduction of chatbots: slouching over learning material and actually doing the work.

Regardless of modern luxuries, hard work, not shortcuts, should be taught.

At first, AI’s takeover of student papers was rooted in the “forbidden fruit tastes the sweetest” narrative. Instructors’ complete prohibition of it almost enabled students to indulge in the comforts its accessibility provided.

Now, with colleges growing more accustomed to the inevitability of AI infiltrating educational establishments, it has suddenly become a part of the curriculum.

Seeing a writing professor use “AI Guidelines” for the first time was refreshing to me. It is valuable to show students how to apply a technology that is not going anywhere and only evolving, as a tool and not a replacement.

On one hand, if no one shows students how to benefit from AI and only forbids it, they will keep using it aimlessly. It is easy to copy and paste assignment directions. It is more complicated to come up with prompts that could enhance a student’s original writing.

The guidelines existed specifically for that purpose: a pathway to stronger writing.

On the other hand, in other classes I started seeing a trend of conversations with chatbots becoming part of the curriculum.

So it seems some students are being taught that they must integrate AI into their coursework.

The dilemma is that prohibition unknowingly enables, while incorporating AI into education as a requirement forces students’ hands to indulge in it.

It is difficult to find the golden mean in a world where being tech savvy is an essential quality on almost every resume.

Still, I don’t think making AI a part of assignments solves the problem of leaving students either unskilled at utilizing it or completely reliant on it.

The moment ChatGPT appears on an assignment’s rubric, a student is handed the idea that the assignment could not be completed without it.

It is a practice used by educators to make sure students know how to use the technology that is already present in workspaces.

Nonetheless, even with its presence all around us, technology remains unreliable.

On Oct. 20, Canvas shut down and WhatsApp group chats flooded with panic over approaching deadlines and contacting professors. Who says that ChatGPT or Copilot won’t backfire one day, and no matter how many times one refreshes the tab, it won’t resuscitate?

Learning that AI has to be a part of our lives means unlearning the critical thinking, creativity and independence that everyone needs, regardless of their major or career path.

One decent solution could be to redirect focus toward “AI Guidelines.”

If the goal is for students to be ready for the inevitable integration of AI in the workforce, then we should be taught how to use it as a tool.

What I’ve seen in other classes is a professor assigning students an AI-written story to assess. AI hallucinations—a term coined to avoid calling them what they are: mistakes—happen more often than one would think.

Newer AI models produce inaccuracies, such as made-up false information, at rates as high as 79 percent, according to The New York Times.

It was exhilarating to see how a lesson on recognizing the patterns of false information was simultaneously a lesson in how a chatbot is, more often than not, incapable of producing high-quality work.

The professor created a cautionary tale for students—such “hallucinations” could turn up in their work.

Instilling an understanding that AI will not effectively do a student’s work for them should be a given in classes where chatbots are directly involved in the rubric.

Beyond that, making the use of AI completely optional is another way to dispel the illusion of its necessity.

The bottom line is that college should not be a ground for future lawyers, engineers, artists, teachers and so on to acquire a belief that they cannot function without chatbots.

Students are in college to familiarize themselves with how their own minds can solve problems and how to utilize tools that can enhance their skills, not to be graded on dialogues with AI.
