In higher education, we tend to privilege the outcomes of learning – usually performance demonstrated through the production of a particular kind of artefact – over the process of learning. We have relied heavily on these artefacts to infer learning because we can’t observe learning directly (and it’s far easier than developing a deep understanding of our students and their progress over time). GenAI tools, which can simply produce the artefact on demand, neatly circumvent that association between product and process. This is problematic for a number of reasons.
Friction “challenges the linear, totalized, and technological solutionist narratives of ‘clean interfaces and tightly controlled interactions’” – Peter Krapp (2011), Noise Channels: Glitch and Error in Digital Culture
Technology design aims to reduce user “friction”, striving for a scenario in which users encounter no barriers to achieving their goals. It prioritizes ease of use, convenience, and minimal cognitive load. This runs counter to what we actually want in education: friction in the learning process. As Julie Dirksen describes, “If you don’t want the material to flow smoothly past (or around) your learners, then you want to provide a little friction—something that requires learners to chew on the material, cognitively speaking” (p. 166). Cognitive friction introduces challenges that spark critical thinking and problem-solving, support knowledge transfer and retention, cultivate resilience and perseverance, and encourage reflection and metacognition.
Leslie Allison and Tiffany DeRewal highlight this tension using the research process as an example.
| Technology user design values… | The research process values… |
| --- | --- |
| Minimizing the number of clicks to complete a task | Lots of clicks: finding multiple types of information |
| Reducing mental effort in interacting with the product | A high level of cognition: embracing ambiguity and complexity |
| Exclusively positive emotional experiences of interacting with the product | Moments of productive frustration and negative emotions that can help one grow and learn |
| Increasing profit | Increasing knowledge |
| Speed | Deliberation |
Ultimately, LLMs are problematic for student learning because they collapse the varied tasks of a recursive research process, prioritize speed over deliberation, discourage ambiguity and uncertainty, and give a false sense of confidence.
So, what do we do?
Allison and DeRewal propose that to create friction in the learning process, we should be strategic and critical about our use of GenAI. They recommend using GenAI tools for no more than one stage of the research process per project.
This recommendation underscores the importance of deliberate decision-making about whether and how to incorporate GenAI tools into the learning process. Doing so also requires a shift in what we mean by academic integrity.
Rethinking “plagiarism” and “cheating”
Matt Miller suggests that we’re going to have to update our definitions of “plagiarism” and “cheating” to reflect different uses of AI if we want our education to be relevant to our students’ future. The question is: where will we draw the line?
Miller provides several possible uses of AI to reflect on which you would consider “cheating” or “plagiarism”. For example, would you consider a student using AI to brainstorm ideas to be cheating or plagiarizing? What about using a spelling or grammar checker?
Within the spectrum of these practices, what are the ethical thresholds? At what point, in what contexts, or with what technologies do we cross into cheating? – Paul Fyfe (2022)
It’s likely your reaction to these questions reflects how our current education system operates or how you’ve taught in the past. Paul Fyfe suggests that educational institutions continue to uphold originality as an ideal, yet these technologies have blurred the boundaries of independent work. Some scholars advocate for a perspective that emphasizes honesty about the process of producing work, while others stress the importance of maintaining distinct boundaries around individual effort. There are even calls to adopt a new framework altogether, such as Sarah Eaton’s concept of ‘post-plagiarism’ and its standard of hybrid human-AI writing.
At McMaster, undergraduate and graduate course outlines should include a statement on the acceptable and unacceptable use of generative artificial intelligence in the course. If no syllabus statement is included, students should ask the educator for clarification on expectations, and if generative AI use is permitted, receive written confirmation before using generative AI in the course.
Undeclared and/or unauthorized use of AI tools to produce coursework is considered a form of academic misconduct. Instructors who incorporate GenAI into courses should explain to students how generative AI material should be acknowledged or cited.
Rethinking assessments
Many traditional assessments are vulnerable to inappropriate use of GenAI. In the short term, instructors are being advised either to adapt assessments so that using GenAI becomes difficult or onerous, or to have students engage with GenAI tools to develop digital literacy skills. In the long term, we may see a shift towards more authentic assessments that students are intrinsically motivated to complete.
The chapter Designing Assessments in the Age of Generative AI offers a series of shorter-term, “quick fix” strategies for counteracting or embracing easy access to generative AI, as well as a workbook to guide you through redesigning an assessment. A growing number of resource directories compiling examples of redesigned assignments can also be a useful starting point for ideas.
Looking for ideas? Check out these resource directories:
- 100+ Creative Ideas to Use AI in Education
- TextGenEd: An Introduction to Teaching with Text Generation Technologies
Reflecting on impacts in your classroom
- How might GenAI tools change the way assessments are conducted in your course? Are there certain types of assignments or exams that could be vulnerable to inappropriate use of GenAI?
- What ethical thresholds and guidelines need to be established regarding the use of GenAI in your classroom?
- In the long term, how might the widespread use of GenAI tools impact the skills students need and the nature of future assessments? Are there opportunities to use GenAI tools to complement, rather than replace, traditional learning and teaching methods?
References
Allison, L., & DeRewal, T. (n.d.). Critical AI: Situating Student Research Practices in the Era of LLMs. Rowan University.
Dirksen, J. (2016). Design for How People Learn (2nd ed.). New Riders.
Eaton, S. E. (2022). The Academic Integrity Technological Arms Race and its Impact on Learning, Teaching, and Assessment. Canadian Journal of Learning and Technology, 48(2), Article 2. https://doi.org/10.21432/cjlt28388
Fyfe, P. (2023). How to cheat on your final paper: Assigning AI for student writing. AI & SOCIETY, 38(4), 1395–1405. https://doi.org/10.1007/s00146-022-01397-z
Keegin, J. M. (2023, May 23). ChatGPT Is a Plagiarism Machine. The Chronicle of Higher Education. https://www.chronicle.com/article/chatgpt-is-a-plagiarism-machine
Krapp, P. (2011). Noise Channels: Glitch and Error in Digital Culture. University of Minnesota Press.
Laquintano, T., Schnitzler, C., & Vee, A. (Eds.). (2023). TextGenEd: An Introduction to Teaching with Text Generation Technologies. The WAC Clearinghouse. https://doi.org/10.37514/TWR-J.2023.1.1.02
Miller, M. (2022). It’s time to rethink “plagiarism” and “cheating.” https://ditchthattextbook.com/ai/
Nerantzi, C., Abegglen, S., Karatsiori, M., & Martinez-Arboleda, A. (2023). 101 creative ideas to use AI in education: A crowdsourced collection. Zenodo. https://doi.org/10.5281/zenodo.8355454
Paul R. MacPherson Institute for Leadership, Innovation and Excellence in Teaching. (2023). Designing Assessments in the Age of Generative AI. https://ecampusontario.pressbooks.pub/mcmasterteachgenerativeai/part/designing-assessments-in-the-age-of-generative-ai/
Perkovic, I. (2023, October 23). How Do I Cite Generative AI? McMaster LibGuides. https://libguides.mcmaster.ca/cite-gen-ai
Queen’s University. (n.d.). Designing Authentic Assessments. Teaching and Learning in Higher Education. Retrieved October 30, 2023, from https://www.queensu.ca/teachingandlearning/modules/assessments/20_s2_12_designing_authentic_assessments.html
Surovell, E. (2023, February 8). ChatGPT Has Everyone Freaking Out About Cheating. It’s Not the First Time. The Chronicle of Higher Education. https://www.chronicle.com/article/chatgpt-has-everyone-freaking-out-about-cheating-its-not-the-first-time
Task Force on Generative AI in Teaching and Learning. (2023, June). Provisional Guidelines on the Use of Generative AI in Teaching and Learning. Academic Excellence – Office of the Provost. https://provost.mcmaster.ca/office-of-the-provost-2/generative-artificial-intelligence/task-force-on-generative-ai-in-teaching-and-learning/provisional-guidelines-on-the-use-of-generative-ai-in-teaching-and-learning/
Terry, O. K. (2023, May 12). I’m a Student. You Have No Idea How Much We’re Using ChatGPT. The Chronicle of Higher Education. https://www.chronicle.com/article/im-a-student-you-have-no-idea-how-much-were-using-chatgpt
University of North Dakota. (2023). AI Assignment Library. University of North Dakota Scholarly Commons. https://commons.und.edu/ai-assignment-library/