Reconsidering Writing Pedagogy in the Era of ChatGPT
Lee-Ann Kastman Breuch, Kathleen Bolander, Alison Obright, Asmita Ghimire, Stuart Deets, and Jessica Remcheck
Discussion
Our sessions with undergraduate students yielded many rich insights. To help make sense of our findings, we return to our overall research questions, which included the following:
- How are undergraduate students understanding ChatGPT as an academic writing tool?
- To what extent are students incorporating ChatGPT into their writing product(s)?
- How are students thinking about ChatGPT in their writing process?
How are undergraduate students understanding ChatGPT as an academic writing tool?
Our findings suggest that students in our study saw both the pros and the cons of ChatGPT as an academic writing tool, and they had several questions. In addition, as our filter analysis demonstrated, students had a multidimensional response to ChatGPT. Regarding pros, students in this study consistently rated ChatGPT texts on the high end of the scale when considering expectations and satisfaction. Students also rated ChatGPT highly in terms of “relevance” of information provided in ChatGPT texts. In addition, students used overwhelmingly positive words to describe their experience with ChatGPT texts, mostly in terms of ChatGPT functionality (noted as “function”), citing words such as “time-saving,” “fast,” “convenient,” and “efficient.” Similarly, qualitative student comments coded in the filter of “low order concerns” demonstrated the ways students might view ChatGPT texts uncritically, noting its ability to produce texts quickly, with relevant content, correct grammar and mechanics, and overall clear writing. Perhaps the biggest benefit students noted was the way ChatGPT texts might help them at various points in the writing process, whether in getting started with writing assignments, organizing ideas, or even editing content. Overall, students were impressed with the initial capabilities of ChatGPT in multiple areas: error-free prose, logical organization, relevant content and ideas, and fast production. Said differently, students’ first impressions of ChatGPT texts might reflect an overly positive response to prose that is quickly and clearly produced.
However, these initial reactions to ChatGPT changed as students read ChatGPT texts more closely. Student comments pointed out several concerns and questions about ChatGPT texts. Through qualitative coding, our study categorized these concerns across eight filters, which suggested that students had a multidimensional response to ChatGPT as they considered the tool within larger rhetorical contexts of academic writing. Students most frequently noted a concern about “information literacy,” which was expressed through critiques about information included in the texts: whether it was credible (or even real), where the information came from, and the presence or absence of citations. Students also noted the basic nature of ChatGPT output and its lack of depth or idea development, which we coded through the filter of “logic and organization.” Several students questioned whether ChatGPT texts actually reflected quality college-level writing. Students were also concerned about the ethics of using ChatGPT texts, which they expressed through questions or assertions about plagiarism and the ways in which ChatGPT displaced the voices of students as authors. In addition, regarding ethics, some students expressed concerns about the ways ChatGPT might stifle the learning process, especially if writing is seen as a learning activity in the academy. As one student asked, “what does [ChatGPT] mean for research?” and “what does ChatGPT do to the future of education?” These ethical concerns were connected to student comments coded in the “self and experience” filter, in which students expressed ways they would want to revise or change ChatGPT texts to include more of their individual ideas and thinking. Many students said they would not use ChatGPT-produced texts for academic assignments simply because the texts were not their own work. They emphasized the importance of ownership in their work and even said it would be easier to do the work themselves than to edit ChatGPT-produced texts. As our findings showed, students had a number of questions about ChatGPT, including how it worked and whether it was acceptable to use.
To what extent are students incorporating ChatGPT into their writing product(s)?
Our study could not adequately answer this question about using ChatGPT for homework because the sample prompts it presented were hypothetical and not placed in the context of real classes. We had hoped to learn from students whether they would be inclined to integrate ChatGPT texts into their academic homework; because most of the prompts were not connected to any actual class contexts, students could not provide an answer to this question. One task came closer to academic homework: it asked students to write a prompt addressing a writing assignment they might complete for their major. While we did not require students to draw on an actual class for this task, many of them wrote prompts reflecting work they had done for a class in their major. We did ask students to rate the likelihood that they would use ChatGPT texts “unaltered” as their academic homework; average ratings from students for each of the five ChatGPT texts ranged between 1.5 and 2.5 on a scale of 1 to 5, with 1 being not very likely and 5 being very likely to use texts unaltered. These were the lowest ratings our study recorded, suggesting that students were not overly enthusiastic about using ChatGPT texts for their own homework, or at the very least had significant questions about doing so. While these ratings reflect student caution, again, we cannot be sure of these results due to the overall hypothetical nature of the usability test. We also note that students may have reacted negatively to this question because all research team members were faculty and graduate students from the Writing Studies Department, a reality of the usability sessions that may have influenced student answers.
How are students thinking about ChatGPT in their writing process?
Of all the research questions, our study yielded the most information on this question about the writing process. Through ratings of ChatGPT texts, we learned that students highly rated the likelihood that they would use ChatGPT to generate ideas as part of their academic writing process. In addition, these results were supplemented by our coding of qualitative comments in the “process as filter” category, which was the most frequently coded category in our data set.
In response to many of the prompts, students articulated ways that the ChatGPT texts were useful as starting points, with student comments such as: “I would definitely take a look at those sources and maybe use those sources as a jumping point” (Participant G). Similarly, another student responded to the literacy narrative text produced by ChatGPT by saying, “seeing this as an overall like as a story kind of gave me ideas of ... what I would need to do and then I could apply that to my own my own life for a different story that I wanted to tell” (Participant L). Students also commented on ways that ChatGPT would be helpful for creating an outline or initial organization of ideas: “If I was like really struggling, I can see how it'd be helpful just to get an idea of like an outline” (Participant QQ). Students were also aware of the line between idea generation and copying ChatGPT: “I think this is like a good way to generate an idea for like a paper or something. But not necessarily copying and pasting” (Participant WW). Students were impressed by the ways ChatGPT generated ideas, included relevant content, provided initial outlines of organization, and provided a foundation for further writing.
The findings of this study are important because they go against the depiction of ChatGPT as a technology that promotes the written product over the writing process. As Sid Dobrin (2023) noted, ChatGPT naturally brings up the oft-cited binary in writing pedagogy of “product versus process” (p. 22). This binary means that writing can be described both in terms of a finished written product, such as a report, a poster, a tweet, or a memo, and in terms of the writing process, which includes activities such as prewriting, writing, and rewriting. ChatGPT raises this binary because the technology produces a product, literally in seconds, in response to a specific prompt. This function immediately raises the question of whether students would use ChatGPT as a substitute for the writing process, much as calculators have been said to replace the thinking processes involved in working through complex calculations. It is tempting to think of ChatGPT as a “writing calculator.” As we have conducted this study, we have often heard the questions “has ChatGPT replaced writing?” and “is writing over as we know it?” The results of our study, though exploratory and limited to a small sample, support the ideas that ChatGPT does not replace writing, that it is not a writing calculator, and that it could even be useful in one’s writing process. Student responses noted the usefulness of seeing a sample text from ChatGPT as a way to generate ideas and to see an initial organizational structure for those ideas.