That lead you just heard was written by ChatGPT, the chatbot that took the digital world by storm after it was released in November 2022.
[“Also sprach Zarathustra” by Richard Strauss]
Here’s ChatGPT describing what it does: “ChatGPT is a generative AI model developed by OpenAI that generates human-like text based on the input it receives, capable of engaging in conversation, providing information, or generating creative content across various topics and contexts.”
I used ChatGPT to write parts of my introduction, but we can use ChatGPT for many other purposes. Here’s what some Northwestern students have said about ChatGPT.
HENRY MICHAELSON: I use it for a lot of essay-based assignments.
ULYSSES FERNANDEZ: It is very helpful for homeworks, especially if I’m having a hard time getting started on a homework, I could put the code into ChatGPT and it’ll explain it. Or if I get stuck, like an error that I can’t find, I can put my code into ChatGPT and it will find where the error is and recommend some things that could help fix it.
EMILY CHI: I think ChatGPT is a really useful tool when it comes to brainstorming and creating concepts and ideas, especially for essays. Just having a basis, an outline for what I want to write before. I think it’s helpful.
YIYAO DU: I feel like I value my own creativity when it comes to homeworks and writings and things that I feel like I should express my own thoughts. But I do use ChatGPT when I have to get a grasp of ideas, of concepts that was taught in class.
SAM RADINSKY: I usually use it more for recreational purposes, like for recipes or traveling.
According to separate fall 2023 surveys from both the Associated Student Government and Northwestern IT, the majority of students use AI at least occasionally or once a month.
Some faculty members became concerned about the frequency with which students use generative AI.
VICTORIA GETIS: We saw a lot of worry on the part of faculty that new tools that were being introduced and rolled out without a huge amount of discrimination as to how the tools should be used made faculty worry that students would use these tools to basically do their thinking for them. And then shortly after the tools were rolled out and the press got ahold of them and we started seeing story after story about ChatGPT and OpenAI, the stories about how the large language models very often would hallucinate and create language predictions that didn’t actually have any basis in reality made people worry even more.
That’s Victoria Getis, director of teaching and learning technologies at Northwestern IT and chair of the Provost’s Generative AI Advisory Committee.
Large language models, including ChatGPT, may generate content that is not only inaccurate but also reflective of societal biases.
GETIS: The tools are trained on the internet and all the data that are out there, and unfortunately, the internet has a lot of biases built into it, if you think about the content that you see. And so it’s not too surprising that something trained on a biased corpus of material would also display bias.
How did the committee respond to the emergence of ChatGPT? Here’s another of its members: Elizabeth Lenaghan, director of the Cook Family Writing Program and assistant director of the Writing Place.
ELIZABETH LENAGHAN: Writing is a means of thinking. It’s a means of figuring out your thoughts on issues and figuring out what your identity is, all of those things are so integral to the writing process and have the potential to be sort of stripped away if we’re relying on generative AI too much.
But according to both Getis and Lenaghan, that doesn’t mean faculty members should immediately condemn or ignore generative AI.
LENAGHAN: I feel that faculty who teach writing and teach lots of different things have some sort of obligation to acknowledge the fact that AI is a part of the world now and that it is going to be a part of the workplaces and things that many of our students engage in beyond the classroom.
Last August, the committee hosted a series of faculty workshops on generative AI tools and possible ways to incorporate them into classrooms. About half of the students surveyed through Canvas last fall said their professors allowed AI to be used within limits.
Some professors created videos about assignments incorporating AI. These videos are available on the university’s website. In one video, Ken Alder, a history professor, talked about an instance when students ran an assignment prompt through ChatGPT and marked up its response for errors.
KEN ALDER: It basically was the most banal Wikipedia version of their project. And so in a weird way, it was a great lesson for them about what constituted research in depth.
In another video, Ignacio Cruz, an assistant professor of communication, discussed using AI tools to evaluate students’ résumés.
IGNACIO CRUZ: We can unpack these systems to an extent to be able to show them the role that they play in their own experiences in the job market and beyond.
And Chin-Hung Chang, an associate professor of Chinese, shared her experience with an essay-writing assignment that incorporated ChatGPT throughout the process.
CHIN-HUNG CHANG: They could learn the language structures from the writing. For example, you can say “and,” you can say “in addition,” you can say “as well as.” So, it was a good tool for them to immerse or expose themselves to the language outside the classroom.
What do the students themselves think of ChatGPT and other models built on generative AI?
RADINSKY: I think it’s a useful tool, but oftentimes the human brain can do better.
MICHAELSON: I think it’s helpful if you use it in the right way. It’s a good tool to get feedback on essays and help edit, but it’s not great at writing them on its own. So it’s really just like, you get out what you put in.
For WNUR News, I’m Edward Simon Cruz.
[“Also sprach Zarathustra” by Richard Strauss]
Music used: “Also sprach Zarathustra” by Richard Strauss