Academic Integrity in the Age of AI

There is nothing new about students seeking shortcuts and other ways to reduce their workload. From copying other students' work to purchasing study guides or summaries of assigned readings, challenges to academic integrity existed long before the internet. The digital age made certain forms of plagiarism especially tempting, given how easy it became to copy and paste from internet sources. With each new technological era, instructors and academic institutions have adjusted their strategies for preventing and spotting plagiarism, from how assignments are designed to the use of detection tools to the threat of adverse consequences for students who are caught. The arrival of ChatGPT and other natural-language-processing forms of artificial intelligence (AI) presents the latest challenge to academic integrity.

When ChatGPT and other generative AI tools exploded onto the scene in the middle of the 2022–2023 school year, the immediate reaction of many schools and colleges was to ban their use. However, many schools have since rescinded these bans as they've seen how widespread and inevitable the use of generative AI has become. And while many "AI detection tools" have sprung up, they have not proven consistently reliable. Instead, many schools are now focusing their energies not on trying to eliminate generative AI altogether, but on designing assignments and teaching students to use AI in ways that still allow them to build writing, research, and critical-thinking skills. Here are some tips on how to encourage a culture of academic integrity in the age of AI.

Adjust Writing Assignments 

Depending on your learning outcomes, there may be instances where you need to be certain that students are not using AI in their writing. In these cases, you may want to assign in-class writing assignments and assessments where students do not have access to AI technology. However, bear in mind that this strategy has drawbacks of its own: some students do not show their best thinking in the limited time frame of an in-class essay. Longer-term assignments like research projects will necessarily require work outside of class. And of course, if you are teaching virtually, "in-class" writing assignments are still done using technology. For all these reasons, in-class writing and assignment completion isn't a complete "solution" to the academic integrity challenges of AI. Think of it as one option to deploy when possible, while keeping other strategies in place that allow you to evaluate students' understanding and skills even when they do have access to AI.

Break Down Assignments 

Assignments that are evaluated solely on a final, finished product are particularly vulnerable to being completed largely or entirely with AI. Instead, many instructors are breaking assignments down into the individual components of a research process, which essentially requires students to "show their work." Even setting AI aside, this process-focused assignment design benefits students, as it encourages them to be intentional and methodical in their research.

Add Personal Reflections 

While AI can easily summarize a topic, it’s much more difficult for it to describe a personal experience from a student’s life. Consider asking students to describe their research process, mistakes or changes they made along the way, or how they plan to structure the next phase of their research. These types of questions build structure and accountability into the process, and also help students who are just developing their research skills articulate what they are doing and why.

Teach AI as a Tool 

It may seem counterintuitive to teach students how to use generative AI if you would rather they avoid it altogether. But students will start experimenting with it on their own, and some of them already have. Help students think of AI as a tool, one that can be used productively or irresponsibly, and give them guidance on how to use it effectively while still building their own skills. Talk with your students about the ways AI can help them in their research and understanding, as well as its limitations. This will probably require you to do your own research and experimentation with generative AI to see how it can be used appropriately in your discipline or in conjunction with your specific learning outcomes. Remember to verify what AI use your institution allows, and keep in mind the privacy and confidentiality concerns these models raise.

Teach Students about the Shortcomings of AI 

Part of teaching students to use AI as a tool is making its limitations and dangers clear. It's especially important for students to understand, for example, that ChatGPT and tools like it may produce sources and citations that are entirely made up. As of this writing, ChatGPT is built on a dataset that only goes up to 2021, so the model's information about recent events is out of date. And because many of the major AI tools in popular use, such as Bard and ChatGPT, give the user no transparency about the sources behind the responses they generate, students need to know that any response they receive from generative AI may be inaccurate or biased and should be verified against known, authoritative sources.

Determine Your AI Policies and Communicate with Students 

Your institution may have specific policies about using AI, or may ask you to use AI detection tools. Make sure your students know what these policies are and the class or institutional consequences for not following them. Remember, though, that detection tools can be unreliable and that some AI use by your students is likely to occur. This doesn't have to be a bad thing if you structure assignments so that students use AI as a tool that helps them in their research and learning rather than one that completes assignments for them. But it will take some effort to revise assignments and re-clarify learning outcomes.

How We Can Help 

The multimedia in Infobase's Information Literacy – Core has recently been updated to incorporate an exploration of AI in several of our most popular and relevant tutorials for students, including Evaluating Information, Why Visual Literacy Matters, Academic Integrity, Choosing and Using Keywords, and Developing a Research Focus. Additionally, Information Literacy – Core and Credo Source subscribers now have access to a suite of AI literacy resources for educators and librarians, available in the Help sites for both products.

Not a subscriber to Information Literacy – Core or Credo Source? Why not take a FREE trial?
