Questions Around the Ethical Use of AI in the Classroom

AI Exchange

Authors: Rachel Azima, Writing Center Director and Associate Professor of Practice, English, UNL, and Amy Ort, Senior Instructional Designer, Center for Transformative Teaching, UNL

The emergence of widely available generative AI technologies such as ChatGPT has prompted a wide range of responses from academics. Some fully embrace these technologies as the future of the workforce, while others fear the ways they could be used to replace human labor and negatively affect student behaviors. Before deciding on specific course policies, we encourage instructors to think carefully about how these technologies function and who is most affected by their use.

Incorporating AI into the Classroom

For anyone considering asking students to use generative AI as a required part of a course, it is vital to consider what such experimentation actually means for students and for the companies that offer access to these technologies. At a special plenary session on ChatGPT at the 2023 Conference on College Composition and Communication, Charles Woods asked the audience how many people had read OpenAI’s privacy policy (which protects users) and terms of service (which protect the company). Few had, and even fewer had assigned students to do so. Woods pointed out how accustomed we’ve all grown to skimming or automatically agreeing to these policies, but that does not mean this is behavior we should encourage in an educational setting. If we are asking students to give up a modicum of privacy by using online tools, we should seriously consider assigning the privacy policy as required reading so that students understand what they are agreeing to.

It is also well worth considering that every time we interact with these tools, we are teaching them and making them better at mimicking what we write. These are not altruistic services; they are tools provided in order to generate profit. When companies are all too eager to replace human writers and artists with AI tools in many industries, it behooves us as educators to consider seriously what role we want to play in engaging with, promoting, or discouraging their use.

Discouraging Student Use of AI

At present, the most common concern for faculty is how to ensure that students are completing assigned work rather than outsourcing it to AI technology. The growing presence of AI tools that students can use to generate writing for their courses merely brings into relief what has always been true: there is no substitute for authentic student engagement with course content, and writing to learn represents one such engagement. As the Association for Writing Across the Curriculum puts it, “Writing to learn is an intellectual activity that is crucial to the cognitive and social development of learners and writers. This vital activity cannot be replaced by AI language generators.”

As a first step, we highly encourage you to think about key questions related to your course goals: Why are students writing in your class? What purpose does it serve? What kinds of thinking are you hoping to promote, and how does writing support this thinking? Is the writing simply meant to demonstrate that students have absorbed key principles or elements of course content? Or do you want students to demonstrate their ability to think through problems or questions in particular ways?

The answers to these questions may help you better understand what limits you want to set around AI use and what adjustments you may need to make to your existing assignments. As you work through this process, also consider student motivation: What gets students excited to complete work on their own? And what might drive them to take shortcuts such as AI-generated text or other forms of plagiarism? One useful motivator is building in opportunities for students to connect with the topic on a personal level. Students are more likely to invest their own time and energy in completing written assignments when those assignments tap into their interests and career goals in some form. Are there ways the topic of your course intersects with your students’ lives and concerns? Consider the key learning goals you have for your students. Can they be accomplished in a way that lets students bring their own interests to bear?

Scaffolding assignments into smaller pieces represents another useful strategy for ensuring students complete their own work. Allowing students to tackle individual parts of larger assignments makes the writing process feel more manageable while also demonstrating to you that they are, in fact, doing their own thinking and writing. Similarly, having students produce multiple rough drafts gives them the opportunity to make mistakes without affecting their course grade, which also makes it more likely they will do their own writing. There are numerous resources available online for instructors looking to scaffold their assignments (e.g., UMN). The Writing Center is also happy to consult with instructors looking to scaffold writing in their courses.

Once you develop assignment structures and course policies around AI use, it is well worth taking class time to have candid conversations about the pros and cons of AI tools and to help students understand how they work: they are entirely probabilistic, which means that they do not actually think but simply determine which word is statistically most likely to come next. If developing critical thinking skills is important for success in the class, talking with students about what they will not be learning or gaining through injudicious use of AI tools seems in order. As part of the conversation, explain your course policies related to AI use and talk about how they’re designed to encourage the kinds of thinking you expect students to do in your class.

Avoiding Overcorrection

Just a few years ago, Covid shifted the educational landscape in a huge way, and now it may feel like the same thing is happening again. It may be tempting to go back to the ‘old school’ method of education, where all assignments are completed with pen and paper or delivered as oral presentations in class. There are situations in which this may be a reasonable response, but we caution against implementing it across the board.

For many students with disabilities, the ability to use computers to write papers and complete exams has made the process far more manageable and customizable, allowing them to do their best work. By asking all students to do work by hand, you put those students in the position of either having to work sub-optimally or having to request an accommodation and be visibly singled out from the rest of the class.

Computers have also significantly improved our ability to write well, offering built-in grammar and spelling help and making it easy to delete, edit, and reorganize text once writing has taken place. As an instructor, would you want to go back to writing longhand and give up all of these features? If you ask students to write in class, what they produce will be shorter, contain more mistakes, and likely have less depth than what they could produce in the same amount of time on a computer. If you truly want to see the best work students can produce, having them write by hand under time constraints is unlikely to fulfill your goal.

AI, Equity, and Social Justice

The ways in which AI tools intersect with questions of equity and social justice are varied and complicated, but we want to stress a couple of points here. One is that, while it may be tempting to use AI detectors to push students to do their own writing, we encourage you to avoid them. Detectors are highly unreliable, turning up frequent false positives. OpenAI, the company that developed ChatGPT, has shut down its own AI detector because it does not work well enough to be useful. Detectors also tend to be biased against non-native speakers of English, disproportionately flagging their writing as machine-generated. AI technologies are evolving rapidly, so detectors will always be several steps behind.

Maintaining academic integrity is essential, but it can be more useful to take the approach of encouraging honesty rather than trying to catch every cheater. Focusing on the latter puts you in an adversarial relationship with students and can erode the trust that is so critical to academic success. Instead, we encourage you to use this moment to connect with students about why you ask them to write. The writing generated by AI bots such as ChatGPT (at least at present) is highly normative and lacking in creativity. This may offer a potent opportunity for connecting with students: while not every field prioritizes the idea of individual voice, many do. Talking candidly with students about how chatbot-generated writing is emptied of individuality may help you tap into their desire to express themselves, thereby increasing their incentive to produce their own work.

Take Home Message

Artificial Intelligence is changing the way we write and teach. These technologies are not going away, and they will only become more sophisticated as time goes on. As instructors, we need to think carefully about what this means for our classrooms, our disciplines, and the ways that we prepare our students for the future. Part of the conversation around AI use needs to include the ethical questions of why we use AI, who benefits from it, and what we can do to ensure that our students are engaging in authentic learning experiences that best prepare them for their future careers.
