How ChatGPT robs students of the motivation to write and think for themselves


When OpenAI launched its new artificial intelligence program, ChatGPT, in late 2022, educators began to worry. ChatGPT could generate text that looked like it was written by a human. How could teachers detect whether students were using language generated by an AI chatbot to cheat on a writing assignment?

As a linguist who studies the effects of technology on the way people read, write, and think, I believe there are other equally pressing concerns besides cheating. These include whether AI, in general, threatens students’ writing abilities, the value of writing as a process, and the importance of viewing writing as a vehicle for thinking.

As part of the research for my new book on the effects of artificial intelligence on human writing, I surveyed young adults in the US and Europe on a number of questions related to those effects. They reported a litany of concerns about how AI tools can undermine what they do as writers. However, as I point out in my book, these concerns have been brewing for a long time.

Users see negative effects

Tools like ChatGPT are just the latest in a progression of AI programs for editing or generating text. In fact, the potential for AI to undermine both writing skills and the motivation to write your own composition has been decades in the making.

Spell check and now sophisticated grammar and style programs like Grammarly and Microsoft Editor are among the best-known AI-powered editing tools. In addition to correcting spelling and punctuation, they identify grammatical problems and offer alternative wording.

AI text generation developments have included autocomplete for online searches and predictive text messaging. Enter “Was Rome” into a Google search and you’ll get a list of options like “Was Rome built in one day?”. Type “ple” into a text and you’ll be offered “please” and “much.” These tools inject themselves into our writing efforts uninvited, incessantly asking us to follow their suggestions.

The young adults in my surveys appreciated AI’s assistance with spelling and word completion, but also spoke of the negative effects. One survey participant said that “At some point, if you depend on a predictive text [program] you’re going to lose your spelling skills.” Another observed that “Spell check and AI software…can…be used by people who want to take an easier way out.”

One respondent mentioned being lazy when relying on predictive text messages: “It’s okay when I’m feeling particularly lazy.”

Diminished self-expression

AI tools can also affect a person’s writing voice. One person in my survey said that with predictive text messages, “I don’t feel like I typed it.”

A secondary school student in Britain echoed the same concern about individual writing style when describing Grammarly: “Grammarly can remove the artistic voice of students… Instead of using their own unique writing style, Grammarly can take that away from students by suggesting severe changes in their work.”

Similarly, Evan Selinger, a philosopher, worried that predictive texting would reduce the power of writing as a form of mental activity and personal expression.

“By encouraging us not to overthink our words, predictive technology can subtly change the way we interact with each other,” Selinger wrote. “We give others more of the algorithm and less of ourselves… [A]utomation…can stop us from thinking.”

In literate societies, writing has long been recognized as a way to help people think. Many people have quoted author Flannery O’Connor’s comment that “I write because I don’t know what I think until I read what I say.” Many other accomplished writers, from William Faulkner to Joan Didion, have also expressed this sentiment. If AI text generation writes for us, we diminish the opportunities to think through problems ourselves.

One scary consequence of using programs like ChatGPT to generate language is that the text is grammatically perfect: a finished product. It turns out that the lack of errors is a sign that the AI, not a human, probably wrote the words, since even accomplished writers and editors make mistakes. Human writing is a process. We question what we originally wrote, rewrite it, or sometimes start over entirely.

Challenges in schools

Ideally, writing assignments involve an ongoing dialogue between teacher and student: discussing what the student wants to write about, sharing and commenting on initial drafts, and then having the student reconsider and revise. But this practice often does not happen. Most teachers do not have time to play a collaborative editorial and educational role. In addition, they may lack the interest or the necessary skills, or both.

Conscientious students sometimes undertake aspects of the process themselves, as professional authors often do. But the temptation to lean on text generation and editing tools like Grammarly and ChatGPT makes it all too easy for people to substitute out-of-the-box technological outputs for opportunities to think and learn.

Educators are brainstorming how to make good use of AI writing technology. Some point to the potential of AI to spur thinking or serve as a collaborator. Before ChatGPT came along, commercial companies like Sudowrite licensed an earlier version of the same underlying program, GPT-3. Users can enter a phrase or sentence and then ask the software to continue the text, potentially stimulating the human writer’s creativity.

A fading sense of ownership

However, there is a slippery slope between collaboration and usurpation. Writer Jennifer Lepp admits that as she relied more and more on Sudowrite, the resulting text “didn’t feel like mine anymore. It was very uncomfortable to look back on what I wrote and not really feel connected to the words or the ideas.”

Students are even less likely than seasoned writers to recognize where to draw the line between a writing aid and letting an AI text generator take care of their content and style.

As technology becomes more powerful and pervasive, I expect schools to make an effort to teach students about the ins and outs of generative AI. However, the lure of efficiency can make it hard to resist trusting AI to polish a writing task or do much of the writing for you. Spell checking, grammar checking and autocomplete programs have already paved the way.

Writing as a human process

I asked ChatGPT if it was a threat to the motivation of humans to write. The bot’s response:

“There will always be a demand for creative and original content that requires the unique perspective and insight of a human writer.”

It continued: “[W]riting serves many purposes beyond content creation, such as self-expression, communication, and personal growth, which can continue to motivate people to write, even if certain types of writing can be automated.”

I was heartened to find that the program seemed to recognize its own limitations.

My hope is that educators and students will too. The purpose of doing writing assignments should be more than submitting work for a grade. Developing a written work should be a journey, not just a destination.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: How ChatGPT robs students of the motivation to write and think for themselves (2023, Jan 19) Retrieved Jan 19, 2023 from https://phys.org/news/2023-01-chatgpt-students.html

This document is subject to copyright. Apart from any fair dealing for private study or research purposes, no part may be reproduced without written permission. The content is provided for informational purposes only.
