Current Events

Lesson Plan: Teaching and Learning in the Era of ChatGPT

First, students learn about and share their thoughts on the issues A.I. chatbots raise for schools. Then, we invite them to help design both ethical guidelines and curriculum projects that use the tool for learning.

Related Opinion Essay: https://www.nytimes.cf/2022/12/15/opinion/chatgpt-education-ai-technology.html
Credit...Illustration by The New York Times; photographs by kyoshino and Khuruchon Chanthanyakorn, via Getty Images

Featured Article: “Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach” by Kalley Huang

Late last year, a major breakthrough in the world of artificial intelligence put extraordinary powers in the hands of anyone with access to the internet. Kevin Roose, the Times technology columnist, calls ChatGPT “the best artificial intelligence chatbot ever released to the general public”:

It can write jokes (some of which are actually funny), working computer code and college-level essays. It can also guess at medical diagnoses, create text-based Harry Potter games and explain scientific concepts at multiple levels of difficulty.

He also notes that the technology has “already sent many educators into a panic” since, of course, students can use it to do their work for them, passing off A.I.-generated essays and problem sets as their own.

In this lesson, we invite students to be part of the conversation about the issues these chatbots raise for teaching and learning. Via free links to some of the many recent Times articles and Opinion essays on the topic, we also invite them to help design ethical guidelines and curriculum projects that envision how schools may use A.I. chatbots safely and productively for learning.

In addition, we have posted a related Student Opinion forum that asks, “How Should Schools Respond to ChatGPT?” When your classes are ready, we invite them to share their thoughts.

Finally, like many playing with the chatbot, we couldn’t resist asking it to help us write this lesson, and we’ve included one of its ideas in our Going Further section. If your students try any of the suggestions below, we’d love to hear about the results. Write to us at LNFeedback@nytimes.com, or post a comment here.


Farhad Manjoo

Is a hot dog a sandwich?

ChatGPT

Whether or not a hot dog is considered a sandwich is a matter of debate. Some people consider a hot dog to be a type of sandwich, while others do not. Ultimately, the definition of a sandwich is a matter of personal interpretation, so whether or not you consider a hot dog to be a sandwich is up to you.

Ask your students, What do you know or have you heard about ChatGPT? Have you experimented with it yourself? What did you think of it? What questions do you have?

Depending on their responses, they may need a broad introduction to the tool. If so, we recommend an episode of The Daily called “Did Artificial Intelligence Just Get Too Smart?” (Students can follow along via the transcript.)

If your students have heard about ChatGPT and want to see it in action but your school is among those that have blocked it, you can look at the examples in the column “ChatGPT Has a Devastating Sense of Humor.” A part of that conversation, in which the Times columnist Farhad Manjoo asks the chatbot “Is a hot dog a sandwich?,” is embedded above.

But if possible, try playing with the tool itself by putting in a prompt and seeing what happens. Your prompt might be based on a previous or current assignment, or it could be something your class develops on the spot. As the chatbot responds, ask students to pose additional suggestions to refine the output.

Finally, discuss questions like these as a class before reading our featured article about the tool’s implications for schools.

  • What did you notice? What did you wonder?

  • What does ChatGPT seem to do well? What does it seem to do poorly? Why do you think that is?

  • Do you think it is the game-changer that many seem to believe it is?

  • Why do you think many educators are worried about this new technology? Do you agree with any of those concerns?

  • Before we read a related article that looks closely at the role of this tool in education, what do you think? How should schools respond to ChatGPT? Why?

Image
The University of Florida campus in Gainesville. Colleges and universities have been reluctant to ban the new chatbot because administrators doubt the move would be effective.
Credit...Todd Anderson for The New York Times

Read the article, “Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach,” then answer the following questions:

1. This article opens with an anecdote. What does it tell you? Why do you think the writer started that way? What is your reaction to the story?

2. Across the country, educators are “starting to overhaul classrooms in response to ChatGPT, prompting a potentially huge shift in teaching and learning.” Have you seen that in your school or district or in any of your classes? How have your teachers addressed ChatGPT?

3. Generative A.I. like ChatGPT is in its early days. What related tools and technologies are on the horizon, according to the article? Are you excited about the possibilities?

4. Much of this piece is devoted to how colleges and professors are responding to the tool by rethinking policies and curriculum. What are some things they are doing? Which of those things seem most sensible or useful to you? Are there any you think your school should adopt? Are there any you think your school should avoid? Why?

5. The last few paragraphs look at student use of ChatGPT. Do you agree that it can be a useful tool for learning? If so, how? When do you think using ChatGPT for assignments might cross the line from learning to cheating? Give specific examples.

6. There are more than 3,500 reader comments on this article. Why do you think this topic has prompted so much discussion, worry and argument? Scanning the comments, do you see any that express something you think, too? Which and why?

Image
Credit...Nata Metlukh

Now that you have read about how schools are thinking about ChatGPT, you’re ready to weigh in yourself.

Our related Student Opinion prompt asks, “How Should Schools Respond to ChatGPT?” and lists a number of questions based on a Times piece by Mr. Roose, the technology columnist, headlined “Don’t Ban ChatGPT in Schools. Teach With It.”

What do you think? Post a comment and join the conversation.

Image
Credit...Larry Buchanan/The New York Times

A quiz from The Times’s Upshot team asks, “Did a Fourth Grader Write This? Or the New Chatbot?” Here is how they introduce it:

We used real essay prompts from the National Assessment of Educational Progress (the standardized test from the Department of Education, known as the nation’s report card). We asked the bot to produce essays based on those prompts — sometimes with a little coaching, and always telling it to write like a student of the appropriate age. We put what it wrote side by side with sample answers written by real children.

We asked some experts on children’s writing to take our variation on the Turing test, live on a call with us. They were a fourth-grade teacher; a professional writing tutor; a Stanford education professor; and Judy Blume, the beloved children’s author. None of them could tell every time whether a child or a bot wrote the essay. See how you do.

Can you do better than the children’s author Judy Blume? Take the quiz and see.

As mentioned in our introduction, we asked ChatGPT to help us compose this piece. Our prompt: “Write a lesson plan that introduces ChatGPT to high school students and explores its implications for teaching and learning.”

The results were bare-bones: nothing that almost anyone who has spent time in a classroom couldn’t have quickly developed on their own. But we’ve chosen what we thought was the most imaginative idea and expanded it in this section. Here is what ChatGPT suggested:

Activity 2 (20 minutes):

  • Divide students into small groups and have them brainstorm ideas for potential projects or applications using ChatGPT.

  • Have each group present their ideas to the class and discuss the feasibility and potential impact of each project.

Though we imagine it will take more than 20 minutes, we invite your students to try. Ask them to recast a school assignment — perhaps a real one they will be tackling this semester — or come up with an entirely new project. Whatever they design, the students should be prepared to defend its use of ChatGPT as both ethical and productive, and show how it enhances their learning.

If they would like help imagining what such an assignment could look like beyond the ideas shared in the featured article or the related Student Opinion forum, here are three additional resources:

  • In “What Would Plato Say About ChatGPT?” the Opinion columnist Zeynep Tufekci offers a philosophical argument for using the tool in schools, as well as some practical tips for how to do so.

  • In the Times podcast Hard Fork, the hosts Casey Newton and Mr. Roose speak to Cherie Shields, a high school English teacher in Sandy, Ore., about the ways she is weaving the tool into her teaching. You can find a transcript here.

  • A piece from outside The Times, from a site called Ditch That Textbook, is also full of helpful context and ideas.

Finally, here is a tip that is also a plug for The Learning Network and other educational resources that focus on current events and literature. As Mr. Roose points out, unlike Google, ChatGPT doesn’t crawl through the web for information on current events, and its knowledge is restricted to things it learned before 2021. Therefore, incorporating current issues, events, ideas and literature into an assignment is one easy way to keep students from being over-reliant on the tool.

Image
Credit...Derek Abella

In his column “How to Use ChatGPT and Still Be a Good Person,” Brian X. Chen points out that “We have reached a turning point with artificial intelligence, and now is a good time to pause and assess: How can we use these tools ethically and safely?”

Mr. Chen mentions not just ChatGPT, but also tools like Lensa AI. He writes, “Like smartphones and social networks when they first emerged, A.I. feels fun and exciting. Yet (and I’m sorry to be a buzzkill), as is always the case with new technology, there will be drawbacks, painful lessons and unintended consequences.”

What, in your students’ opinions, might be the drawbacks, lessons and unintended consequences? You might invite them to brainstorm a list in small groups.

At the end of the article, Mr. Chen suggests three rules of thumb for navigating this new world ethically and safely. We challenge your students — perhaps in their small groups — to do the same. What rules would they put in place for using this kind of technology in school? In their personal lives? Why?

After they have come up with some guidelines, invite your students to compare them with what other small groups have devised, and then read what Mr. Chen recommends.

They might then pool and refine their suggestions in future classes, and test them by using them to guide their work this semester. Finally, if the guidelines they have developed work well, the students might even present them to their school or district administration, along with some samples of the work that results.

Image
Credit...Diana Ejaita

In September, we asked students, “Are A.I.-Generated Pictures Art?” We posed follow-up questions like “Should pictures created with artificial intelligence be considered art — equal to what an artist might create with a pen, a brush or a lump of clay?” and “What is the value of visual art in our world? What might we gain from A.I.-created art? What might we lose?” Over 300 teenagers weighed in.

Now we’re posing similar questions about writing. Someone recently brought the issue to the musician Nick Cave:

I asked ChatGPT to write a song in the style of Nick Cave and this is what it produced. What do you think?

You can read the A.I.-created lyrics and Mr. Cave’s full response at his blog, The Red Hand Files, but here is some of what he said:

ChatGPT may be able to write a speech or an essay or a sermon or an obituary but it cannot create a genuine song. It could perhaps in time create a song that is, on the surface, indistinguishable from an original, but it will always be a replication, a kind of burlesque.

Songs arise out of suffering, by which I mean they are predicated upon the complex, internal human struggle of creation and, well, as far as I know, algorithms don’t feel. Data doesn’t suffer. ChatGPT has no inner being, it has been nowhere, it has endured nothing, it has not had the audacity to reach beyond its limitations, and hence it doesn’t have the capacity for a shared transcendent experience, as it has no limitations from which to transcend. ChatGPT’s melancholy role is that it is destined to imitate and can never have an authentic human experience, no matter how devalued and inconsequential the human experience may in time become.

In a recent edition of her newsletter, Tressie McMillan Cottom, the Times Opinion columnist, says something similar:

A.I. writes prose the way horror movies play with dolls. Chucky, Megan, the original Frankenstein’s monster. The monster dolls appear human and can even tell stories. But they cannot make stories. Isn’t that why they are monsters? They can only reflect humanity’s vanities back at humans. They don’t make new people or chart new horizons or map new experiences. They are carbon copies of an echo of the human experience.

She continues:

Even when the essays are a good synthesis of other essays, written by humans, they are not human. Frankly, they creep me out precisely because they are so competent and yet so very empty. ChatGPT impersonates sentiment with sophisticated word choice but still there’s no élan. The essay does not invoke curiosity or any other emotion. There is a voice, but it is mechanical. It does not incite, offend or seduce. That’s because real voice is more than grammatical patternmaking.

If, as Ms. McMillan Cottom says, this is “a great time to think about the line between human and machine,” what is that line? How do you see the “humanness” in the art you love? What makes it different from something created by a machine? Will A.I. ever be capable of creating real art? Will it replace human artists?

And finally, how do these questions apply to learning and the role of schooling? If a machine can spit out a decent five-paragraph essay examining the causes of World War I or looking at symbolism in “The Bluest Eye,” why read and write and study the humanities? How might schools respond to this new technology not by doubling down on practices that can be easily gamed, but instead by helping students reach their full “humanness” through encouraging their original thinking and unique voices?