Opinion | ChatGPT can create academic dependency

Don’t use ChatGPT to plagiarize; instead, learn how it can enhance your own writing.

Luke Krchak, Opinions Contributor


ChatGPT has risen to stardom this past year, leading the artificial intelligence version of the Industrial Revolution.

ChatGPT is a closed AI, meaning it still requires direct user input and responds with information drawn from the internet text it was trained on. Future AI could create new information rather than splicing together old information.

Part of college is completing the assignments your classes offer to best prepare you for life after graduation. ChatGPT should be used the way a calculator is used in math: as a tool to support your work, not to complete the assignment for you. The University of Iowa should limit ChatGPT to prevent plagiarism.

ChatGPT can quickly show examples of what a writing sample should look like. Using it to complete the assignment itself, however, skips the skills the assignment is meant to build.

This is one of the main reasons ChatGPT should not be used in the classroom — it lacks original thinking.

Assignments big and small throughout the semester often ask students to bring a personal perspective, applying the material to their own lives and interests. It is important to develop not only hard skills like writing and mathematics, but also the ability to think on your own.

Using ChatGPT to plagiarize is just another way to ensure students learn less and are less prepared to contribute to the next generation of society. The future will use AI, so ChatGPT does have a place in the classroom and in daily life. But it is still wise to build skills with limited AI use.

In my testing of ChatGPT, I found it often breaks the conventions taught for writing papers. It also lacks what matters most to me: a heart, a soul. It reads as if it does not care about the topic, and to me, writing should show off your passion for your research or for something you created.

The writing answers the question in the most literal, logical sense. When asked to write about the American Civil War, it states that the war was brutal for civilians. It does not add how people felt at the time or how the writer feels about it.

Another thing to note from these tests is that ChatGPT can write about something that does not exist and treat it as if it does.

I asked it to write a research paper on “Gligs Bosani,” a fake particle. It wrote that this particle was discovered in the ‘90s and that the Large Hadron Collider discovered it.

The Higgs boson, the real particle that gives other particles mass, was proposed in 1964 and discovered in 2012 at the LHC. ChatGPT spliced facts about the Higgs boson with fabricated details about the “Gligs Bosani.”

ChatGPT has a way to go before it can match human writing, and even then, what is the point? If I ask it to write a play, it can meet the parameters of the prompt but adds nothing more. There is no personality, no hidden meaning, and no sense of individuality.

The UI trains far more than the next generation of authors, but writing is still a personal endeavor. It is a skill worth training, even if you don’t want to be the next Shakespeare.

The UI should find ways to stop ChatGPT plagiarism and encourage writing in all disciplines.


Columns reflect the opinions of the authors and are not necessarily those of the Editorial Board, The Daily Iowan, or other organizations in which the author may be involved.