Ten Suggestions on Teaching With/About Generative Artificial Intelligence (AI) in the Writing Classroom

Anthropomorphic cat wearing suit and tie, and standing in front of a chalkboard. Image created with Stable Diffusion.

I’ve been spending a lot of time studying and using generative AI technologies and thinking about their pedagogical implications. Over the summer, I invested more energy in intensive online classes on generative AI through LinkedIn Learning, which I wrote about here and here. The suggestions below distill some of the important ideas that I have learned and plan to implement after my sabbatical concludes this year. Readings associated with these points can be found in my extensive generative AI pedagogy bibliography. Maybe you will find some of these helpful for your own classes as we make our way into the science fictional future together!

  1. Build ethical and legal issues of generative AI into every discussion and assignment. Of course, a separate module or a whole course can focus on these topics, but students need to see how ethical and legal issues are tightly woven into how these technologies are developed, the challenges they present, and how to avoid, mitigate, or resolve those challenges. Weaving ethical and legal issues into the quotidian helps students think critically about them throughout the learning process and avoids the impression that ethics and legal concerns are just an afterthought.
  2. Show students how bias in generative AI is real. Since generative AI is trained on datasets of work created by people, the AI systems will reflect the biases inherent in the content of the dataset and in the ways different people are represented in the dataset (e.g., more books by white male authors and fewer by writers of color or women writers). Bias is unfortunately baked in. Help students explore how these biases reveal themselves insidiously, how they might be discovered through prompting, and how to mitigate them (if possible) in the way they use generative AI as part of their workflow.
  3. Help students become responsible generative AI users. Students need to be taught how to document, cite, and acknowledge the use of AI in their work at school and later in the workplace. This can counter earlier uses of ChatGPT and similar sites that fueled what some might consider plagiarism or cheating. Helping students see that it’s okay to use these tools when allowed and properly documented frames them as support for their work rather than a way to avoid it.
  4. Reveal how generative AI technologies are designed, developed, and operated. By learning how generative AI is built and deployed, students get to see how the sausage is made. They will learn that generative AI isn’t magical or all-knowing or perfect. Instead, they will realize that years of research and development in mathematics and computer science led to the current state of the art with these technologies, which is still lacking. They will discover the limitations of what these technologies offer (e.g., text-generating AI primarily performs sentence completion and has no understanding of what it is doing, and its training data has gaps, deficiencies, biases, etc. that directly affect the text generated). This can be paired with lessons on how large language models are trained, how they are a black box in terms of how they work, and initiatives to build explainable artificial intelligence (XAI).
  5. Approach generative AI as another layer for students’ digital literacy development. Given AI’s biases, falsehoods, so-called hallucinations, and off-topic responses, pairing generative AI with instruction on vetting information, using research tools (online and off), and applying one’s own skepticism counters uncritical assumptions about AI’s trustworthiness, expertise, and authoritativeness. It also gives students another source to compare, contrast, and verify when checking facts and establishing the reliability of various sources of information.
  6. Introduce generative AI as a new tool for students to add to writing and creative workflows. Some students might like to think that generative AI is a one-stop shop, but we can reveal to them how it can support different elements within a larger creative framework that depends on their cognition, imagination, and effort to produce deliverables. It can aid with ideation, brainstorming, planning, and outlining, as well as handling less important writing tasks, such as replying to an email or DM. An important corollary is that prompt engineering is a skill unto itself that students have to learn and develop. In some cases, figuring out the best prompt might require more time, energy, and collaboration with others than it would have taken the students to produce the writing themselves.
  7. Refocus on editing, revision, and the writing process to incorporate generative AI text into student work. One way to accomplish this is teaching students higher-level editing and revision tasks using AI-generated text as the material for editing. Another way is to teach students how to use editing tools, such as those built into Microsoft Word, Google Docs, and LibreOffice, to work with the text generated by AI.
  8. Harness generative AI as a learning tool to support student experimentation and discovery by example. Students can ask the generative AI to summarize their writing, rewrite their writing for different audiences, turn outlines into paragraphs, etc. However, for students to gain some benefit from this, there needs to be a reflective writing exercise that gives students an opportunity to dissect what the AI did to their original composition; then, based on what they learn in reflection, they attempt a new composition of their own with the same goal given to the generative AI. The AI’s output can be combined with the student’s reflection and composition for evaluation by peers or the instructor, depending on how you are providing feedback to students on their work.
  9. Recognize writing students as technical communicators because they use generative AI technology in their writing processes. I am thinking of part of the Society for Technical Communication’s definition of tech comm: “Communicating by using technology, such as web pages, help files, or social media sites.” Using AI to create outputs or as part of the writing process means that students are using technology to communicate in a deeper way than we might have previously considered. Acknowledging this with students might make more of them aware of technical communication as a career path or of how they might leverage their communication skills as they transition into the workplace.
  10. Warn students about the possible jeopardy they face when providing their writing, prompts, questions, and personally identifying information to online generative AI tools like ChatGPT. Anything you type into the system is saved and associated with you. This means that your inputs might be used to train and fine-tune future versions of the generative AI system, and the data collected about you, based on what you type and how you use the system, might be utilized by the system provider or sold to third parties (e.g., for advertising, adjusting insurance rates, making loan decisions, etc.). This can be connected to a larger discussion of how to protect oneself online, follow privacy best practices, employ obfuscation techniques, etc. Consider teaching students how to run their own locally hosted LLMs, such as Meta’s LLaMA and its derivatives. This gives them more control over their data, and it gives them the option to fine-tune their local model to better fit their needs.
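On the obfuscation point in the last suggestion, here is a minimal sketch of what masking obvious personally identifying details before a prompt leaves the student’s machine could look like. The function name, the two patterns, and the placeholder tokens are my own illustrations (not from any particular privacy tool), and real PII detection would need far more than a couple of regular expressions:

```python
import re

def scrub_prompt(text: str) -> str:
    """Mask common personally identifying patterns before sending
    text to an online generative AI service.

    A hypothetical sketch for classroom discussion: real PII
    detection requires much more than two regexes.
    """
    # Mask email addresses like jane.doe@example.edu
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]", text)
    # Mask US-style phone numbers like 555-867-5309
    text = re.sub(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b", "[PHONE]", text)
    return text

print(scrub_prompt("Contact me at jane.doe@example.edu or 555-867-5309."))
```

Even a toy exercise like this can spark useful conversation about what counts as identifying information and why placeholder substitution is only a first step.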