Thinking About ChatGPT

If the news about ChatGPT had broken at any time other than less than a month before the end of the fall academic term, the handwringing about it would have been more obvious. There have been a few articles and blog posts describing how this will upend everything and make education and certain jobs obsolete. My response to all of those is “sure…sure.” 

I think this is an unsurprising technological development. Yes, it is going to change things. Yes, we need to be careful about how we use it. No, it isn’t going to be catastrophic for human intelligence. 

Before considering the applications and implications of this and similar tools, let me describe it. ChatGPT is a computer program that answers questions, responds to prompts, and otherwise composes text. Further, it can evaluate text. Ask ChatGPT to compose an essay on Paul Revere, and it will blink for a moment, then display the composition. Take that essay, enter it back into ChatGPT, and ask it to comment on the essay, and it will describe it as a clear and concise summary. 

Using ChatGPT, one gets a sense of participating in a Turing Test in which the domain is high school essays. These essays also read much like those that would have earned students top marks when I taught middle school science. That, of course, is the cause of much of the concern: students no longer need to write essays, and teachers no longer need to grade them. It is true that ChatGPT does mimic, replicate, and probably even make obsolete much that is done in the name of writing instruction and practice. If we dig a little deeper into some of the issues related to teaching, we can see things are not as simple as they might appear. (Some will reject my contention by suggesting that the examples and illustrations I am using are too simplistic. I do not disagree. My goal with this post is to encourage readers to think about and explore ChatGPT in greater detail.) 

Why do we write? It only takes reviewing a few examples of essays written with ChatGPT to see that the essays are very superficial. These are excellent restatements of what can be found in other places, and they largely comprise statements that no one would dispute but that offer little clarity either. Most writers agree that this is not really why we write. Engaging stories, challenges to authority, contradictory explanations, and reconciliations of disparities are all reasons writers write. It is interesting that writing that accomplishes any of these must be clear and organized (which ChatGPT’s writing is, exceedingly so), but that it must also be something else (which ChatGPT’s writing is not). We will be able to have a healthy relationship with ChatGPT if we recognize our best writing is something beyond clear and concise. 
 
It is a tool. I came of age at a time when electronic calculators were first becoming available. Until then, “pencil and paper” was how we calculated, and even though we slogged through and found answers, we didn’t really understand math, and I am confident in saying few of my classmates really used math. (The wacky ideas about averaging I see among my teaching colleagues, even the mathematical ones, confirm my speculations.) Over time, we adopted calculators and adapted to them so that it is now acceptable, even preferable, that one use them for efficient and accurate calculations. Perhaps AI-aided writing will be the tool we expect users to leverage to compose clear writing quickly. 

There are some specific implications for teachers as well: 

Composition is a solitary task. Most writers, even those whose preferred writing location is a coffee shop or a busy room in the library, are solitary when writing. They often brainstorm and edit in groups, but the best composition is done alone. Writers have always used their tools when composing: dictionaries, thesauri, encyclopedias (for clarification or plagiarism). As educators, we want our evaluation to be based on students’ abilities rather than their access to other resources. The best writing teachers (at least the ones I encountered over 35 years in the field and another 17 as a student) understood the multifarious nature of writing and the many tasks that make one a good writer, so their grades were based on more than the final composition. This is as true as ever. 

Outcomes are not the outcomes. The last generation of educators has been trained that the “outcomes” matter: the measurable outcome that will be the indicator of learning is defined, and producing it is all that matters. This is a behaviorist approach to the work (a framework that has been rejected by the learning sciences for decades). A healthy relationship with ChatGPT will necessitate that educators (and students, parents, policy makers, and politicians) recognize outcomes are not evidence of learning. Learning lies in the messy thinking, questioning, drafting, and improving that all learners experience. If the purpose of the course can be replicated with a ChatGPT essay, then the problem is deeper than the AI. 

It’s time to rethink motivation. Learning requires engagement; students must think about the new material. Teachers often differentiate internal and external motivation. In traditional conceptualizations of motivation, teachers have little influence on internal motivation; they can use external motivators, but those are known to be less effective, and even detrimental, to student learning. Motivation is more sophisticated than this, however. Consider situational motivation. It is external in that it is created by the teacher through the problems, prompts, and questions they use to frame activities. It is internal in that the students decide to engage as they find the situation motivating. If we are seeing our students motivated to use ChatGPT rather than write themselves, maybe it is time to rethink motivation. 

There is one thing that is clear. As we watch the negotiations over ChatGPT and similar tools, those who take extreme positions, for example:

  • It will destroy everything 
  • It must be banned and never used 
  • It cannot be avoided, thus we must accept it 

are folks we can safely ignore.