Overview
Generative AI is a type of artificial intelligence that uses machine learning to generate new content by analyzing and processing vast amounts of data from diverse sources. Generative AI tools can generate text, images, video, sound and code.
Different tools are trained on different datasets and with different training methods. The generated responses of these tools are probabilistic, which can result in errors in responses.
Large language models (LLMs), for instance, specialize in analyzing and processing text and generating new text. Different LLMs are trained on distinct datasets and employ different training methods. GPT-3 and GPT-4 are examples of LLMs. OpenAI’s ChatGPT is a chatbot built on GPT-3 or GPT-4, depending on the version used.
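To make the “probabilistic” point above concrete, the short sketch below shows how a chatbot built on an LLM is typically called through an API. It is a minimal sketch only, assuming the pre-v1 OpenAI Python client (openai < 1.0) that was current in 2023; the API key, model name, prompt, and temperature value are placeholders, and the same prompt can return different text on each call.

```python
# Minimal sketch: calling a chat-style LLM through an API.
# Assumes the pre-v1 OpenAI Python client (openai < 1.0); key, model and prompt are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize photosynthesis in two sentences."}],
    temperature=0.7,  # sampling randomness: the same prompt can yield different answers
)

# The text is generated token by token from a probability distribution, so rerunning
# this call may produce a different (and occasionally inaccurate) response.
print(response.choices[0].message.content)
```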
- A useful glossary of AI terms can be found here and here.
While generative AI is not new, OpenAI’s launch of ChatGPT in November 2022 marked the fastest recorded adoption of a technology tool.
Over the intervening months, the release of similar text-based generative AI tools, from Microsoft’s Bing to Google’s Bard, along with improvements to existing tools, has contributed to a perception of an explosion in AI.
Indeed, the rapid proliferation of tools and advances in the technology prompted over 100 leaders in AI technology to sign an open letter urging a collective pause on the development of AI systems more powerful than GPT-4, to give time for safety and security features to mature and for regulation and governance structures to be created.
The need for such regulation and governance applies not only at the national level but also to specific sectors, such as post-secondary education, and, in turn, to McMaster University.
Broader issues related to generative AI include privacy of personal data, risks of misinformation, existential risks, environmental costs, labour exploitation, and copyright.
Generative AI tools can:
- Create informative, well-written text: prose, poetry, dialogue, code
- Provide examples and references (note that references may be ‘hallucinated’)
- Generate outlines, questions, tables, long form text
- Summarize inputted text
- Provide feedback on text – both form and structure
- Explain concepts at different levels of understanding
- Translate between languages
- Retain context within a chat thread, allowing follow-up prompts
Limitations of these tools include:
- Hallucinations: confident declarations that are factually inaccurate (e.g., references to articles that don’t exist)
- Uneven access to material after 2021: the free version of ChatGPT cannot access the web, though the subscription version can, as can Microsoft’s Bing
- Biases in training data are replicated in generated responses
- Responses vary with the wording and framing of the user’s prompt
- If using the free version of ChatGPT (GPT-3), there can be lag times or delays in access when demand is high; Bing and ChatGPT running on GPT-4 do not experience these delays.
As the use of generative Artificial Intelligence (AI) becomes widespread, McMaster University has struck a task force to explore its potential for enhancing teaching and learning, as well as the challenges associated with its use. This page shares some of the work of the Task Force, as well as resources related to generative AI and teaching and learning.
The Task Force on Generative Artificial Intelligence in Teaching and Learning started work on May 1. It is co-chaired by Kim Dej, vice-provost, Teaching and Learning, and Matheus Grasselli, deputy provost. Faculty, staff and students from across the university are on the Task Force, which will submit recommendations to Susan Tighe, provost and vice-president (Academic) by early September on the use of generative AI in teaching and learning.
Questions and suggestions for the Task Force can be directed to vptl@mcmaster.ca and deputyprovost@mcmaster.ca
Scope and Outcomes
I. Purpose
The purpose of this task force is to investigate the impacts posed by generative AI on teaching and learning at McMaster University, and to provide strategic guidance and actionable recommendations for educators planning for fall courses.
II. Background
Generative AI, exemplified by models such as ChatGPT, Bing and Bard, is reshaping the landscape of education, bringing forth both opportunities and challenges. As a research-intensive university, it is imperative that we adapt to these developments and ensure that our pedagogical approaches and academic integrity standards remain robust, relevant, and effective.
III. Objectives
The task force will pursue the following objectives:
- Review the current state of generative AI technology and its implications for higher education, with a focus on potential benefits and challenges for the McMaster context
- Discuss risks to academic integrity that may arise from the use of generative AI and recommend proactive teaching strategies to mitigate these risks and communicate these with the community
- Draft and endorse guidelines, resources and training for educators enabling them to make informed decisions in their teaching strategies, including if/when/how to use generative AI in teaching practices (e.g. designing assignments, grading/feedback)
- Draft and endorse resources for students to familiarize them with generative AI tools to build both digital literacy skills and confidence in appropriate use of generative AI for learning.
- Comment on drafted recommendations to share with the Senate Committee on Academic Integrity for consideration on the responsible adoption of generative AI in teaching and learning.
While the Task Force may discuss intersections of generative AI with research, service and staff activities at the University, its mandate is to address the impact of generative AI on teaching and learning, with a focus on student learning. Recommendations pertaining to research, staff work, or service work fall outside the scope of this task force.
IV. Membership
The task force will be co-chaired by the Vice-Provost, Teaching and Learning and Deputy Provost, with coordination provided by the MacPherson Institute. The task force will consist of members with expertise in the following areas:
- Faculty representatives from diverse disciplines
- Representatives from University Technology Services
- Educational developers
- AI and machine learning researchers
- Librarians and information specialists
- Student Affairs representatives
- Student representatives
- Privacy and academic integrity experts
V. Timeline
The task force will convene in May 2023 and aim to submit a final report with recommendations by August 2023. Updates will be shared with the Provost’s office and relevant stakeholders periodically throughout the process.
VI. Reporting
The task force will report directly to the Vice-Provost, Teaching and Learning and the Deputy Provost and will provide monthly updates to the Senate Committee on Academic Integrity through a summary report and the Academic Integrity Officer.
VII. Support
The Generative AI Task Force in Teaching and Learning will be supported by MacPherson Institute staff who will:
- Schedule and coordinate logistics of task force meetings and events
- Circulate an agenda and documentation prior to each meeting
- Prepare and distribute minutes after the completion of each meeting
- Support the writing of the Task force recommendations
VIII. Review
The task force’s recommendations will be submitted by September 10th and will be reviewed and assessed for implementation by the Vice-President, Academic in consultation with relevant stakeholders across the university. The task force may be reconvened to address any subsequent developments in generative AI that warrant further examination or action.
Proposed Meeting Topics, Dates and Prework
- Current State of AI in Teaching and Learning: May 16, 3:10-4:30
- Briefing package prepared and distributed a week in advance
- Outcome: Describe perceived benefits and challenges in the McMaster context from both student and faculty perspectives
- Academic Integrity and AI: June 26, 3:10-4:30
- Briefing package prepared and distributed a week in advance
- Outcome: Recommend proactive mitigation and communication strategies for educators to foster academic integrity at McMaster, including teaching practice and student approaches.
- Policy Considerations for AI: July 18, 3:10-4:30
- Draft of considerations for policies distributed a week in advance
- Outcome: Feedback on (and endorsement of) considerations for policies to be submitted to the Senate Committee on Academic Integrity
- Student and Faculty Guidelines and Training in AI: August 29, 3:10-4:30
- Drafted Pressbook “AI in Teaching and Learning at McMaster University” and training options distributed a week in advance
- Drafted resource “AI in Learning at McMaster University” and conversation guides distributed a week in advance
- Outcome: Feedback on (and endorsement of) drafted AI resources and training supports for faculty and students
Name | Unit Represented |
Dr. Kim Dej, Co-chair | Vice-Provost, Teaching and Learning |
Dr. Matheus Grasselli, Co-chair | Deputy Provost |
Dr. Erin Aspenlieder, Coordinator | MacPherson Institute |
Dr. Erin Allard | MacPherson Institute |
Sean Beaudette | Student Success Centre |
Dr. Ben Bolker | Faculty of Science |
Dr. Dina Brooks | Faculty of Health Sciences/SGS |
Clark Cipryk | University Technology Services |
Brad Coburn | Secretariat |
Dr. Sarah Dickson Anderson | Faculty of Engineering |
Letizia Dondi | GSA President |
Dr. Greg Flynn | Faculty of Social Sciences |
Zachary Gan | Arts and Science – undergraduate |
Richard Godsmark | University Technology Services |
Dr. Elzbieta Grodek | Faculty of Humanities |
Dr. Bhagwati Gupta | Faculty of Science/SGS |
Helen Kula | Library |
Kim Mason | Academic Integrity |
Chanel Morrison | Faculty of Science – undergraduate |
Dr. Jennifer Nash | Faculty of Health Sciences |
Dr. Richard Paige | Faculty of Engineering |
Melissa Poole | Registrar’s Office |
Jovan Popovic | Undergraduate student |
Dr. Mat Savelli | Faculty of Social Sciences/Arts and Science |
Dr. Aron Schat | Faculty of Business |
Dr. Jonathan Sherbino | Faculty of Health Sciences |
Chelsey Smith | McMaster Continuing Education |
Ian Steinberg | Graduate student |
Dr. Jonathan Stokes | Faculty of Health Sciences/SGS |
Dr. Cliff van der Linden | Faculty of Social Sciences |
Dr. Michael Wong | Faculty of Health Sciences |
Trudi Wright | Privacy and Records |
Dr. Manaf Zargoush | Faculty of Business |
These overarching provisional principles have guided the work of the Task Force on Generative AI in Teaching and Learning and will continue to be updated through conversations with our campus community.
- Students want to learn, and instructors want to support their learning.
- Participatory learning – learning which happens in relationships and community – continues to be a valuable and vital way for students to learn.
- Assessments that require students to document the process of learning continue to be meaningful for student learning.
- Generative AI poses risks, as well as opportunities. Individuals will have different reactions and different expectations for the technology.
- Disciplinary differences and departmental cultures will vary around the use of generative AI.
The intention of these guidelines is to offer instructors a starting point for understanding the potential uses of generative AI in their teaching and in student learning, and for developing courses for the fall term at McMaster University.
These guidelines were developed by the Task Force on Generative AI in Teaching and Learning and will continue to be updated as the Task Force explores additional topics and as technology rapidly changes.
Members of the Task Force also invite feedback and suggestions on these guidelines through this form. It is expected these guidelines will be updated again in time for winter course preparation.
Potential policy changes implied by these guidelines will be addressed by the relevant governance bodies.
Staff at the MacPherson Institute are available to consult with instructors regarding these guidelines; further resources for instructors and students are being developed and will be available by the fall. Instructors can email mi@mcmaster.ca for support.
Guidelines
A McMaster specific citation guide is in development through the Library and will be ready for fall 2023.
Until then, please consider citation options such as:
“[Generative AI tool]. (YYYY/MM/DD of prompt). “Text of prompt”. Generated using [Name of Tool]. Website of tool”
e.g. “ChatGPT4. (2023/05/31). “Suggest a cookie recipe that combines oatmeal, chocolate chips, eggs and sugar.” Generated using OpenAI’s ChatGPT. https://chat.openai.com”
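For instructors or students assembling many such citations, the sketch below shows one way the provisional template could be filled in programmatically. The helper function and its parameter names are hypothetical, written only to illustrate the template; it is not an official McMaster or library tool.

```python
from datetime import date

def format_genai_citation(tool: str, prompt_date: date, prompt: str,
                          generator: str, url: str) -> str:
    # Assembles the provisional template:
    # "[Generative AI tool]. (YYYY/MM/DD of prompt). "Text of prompt".
    #  Generated using [Name of Tool]. Website of tool"
    return (f'{tool}. ({prompt_date:%Y/%m/%d}). "{prompt}" '
            f'Generated using {generator}. {url}')

# Reproduces the cookie-recipe example from the guidelines above.
print(format_genai_citation(
    tool="ChatGPT4",
    prompt_date=date(2023, 5, 31),
    prompt="Suggest a cookie recipe that combines oatmeal, chocolate chips, eggs and sugar.",
    generator="OpenAI's ChatGPT",
    url="https://chat.openai.com",
))
```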
Instructors may also consider requiring students to include a reflective summary at the end of each assessment that documents what generative AI tools were used, what prompts were used – including a complete chat log – and how generated content was evaluated and incorporated.
Other citation guidelines can be viewed at:
- MLA Guidelines on citing generative AI
- APA Guidelines on citing generative AI
- Chicago FAQ on generative AI
- A quick guide provided by the University of Waterloo, with a McMaster version coming in Fall 2023.
Use Prohibited
Students are not permitted to use generative AI in this course. In alignment with McMaster’s academic integrity policy, it “shall be an offence knowingly to … submit academic work for assessment that was purchased or acquired from another source”. This includes work created by generative AI tools. Also stated in the policy is the following: “Contract Cheating is the act of ‘outsourcing of student work to third parties’ (Lancaster & Clarke, 2016, p. 639) with or without payment.” Using generative AI tools is a form of contract cheating. Charges of academic dishonesty will be brought forward to the Office of Academic Integrity.
Some Use Permitted
Example One
Students may use generative AI in this course in accordance with the guidelines outlined for each assessment, and so long as the use of generative AI is referenced and cited following the citation instructions given in the syllabus. Use of generative AI outside assessment guidelines or without citation will constitute academic dishonesty. It is the student’s responsibility to understand the limitations on use for each assessment and the expectations for citation and referencing, and to meet them appropriately.
Example Two
Students may use generative AI for [editing/translating/outlining/brainstorming/revising/etc.] their work throughout the course, so long as the use of generative AI is referenced and cited following the citation instructions given in the syllabus. Use of generative AI outside the stated use of [editing/translating/outlining/brainstorming/revising/etc.] or without citation will constitute academic dishonesty. It is the student’s responsibility to understand the limitations on use and the expectations for citation and referencing, and to meet them appropriately.
Example Three
Students may freely use generative AI in this course so long as the use of generative AI is referenced and cited following the citation instructions given in the syllabus. Use of generative AI without citation will constitute academic dishonesty. It is the student’s responsibility to understand the expectations for citation and referencing and to meet them appropriately.
Unrestricted Use
Students may use generative AI throughout this course in whatever way enhances their learning; no special documentation or citation is required.
Sample Rubrics Developed with ChatGPT:
I acknowledge the use of ChatGPT 4.0 to create sample analytic and holistic rubrics. The prompts included “Imagine you are a rubric generating robot who creates reliable and valid rubrics to assess university-level critical thinking skills. You have been tasked with generating a rubric that evaluates students critical thinking skills and incorporates their use of generative AI. Create two holistic rubrics and two analytic rubrics to assess these skills.” The output from these prompts provided examples of the kinds of rubrics that could be used to assess the integration of generative AI in course assignments.
Criteria | 4 | 3 | 2 | 1 |
Argument Structure | The argument is clearly articulated and logically structured. | The argument is generally clear and logical, with minor inconsistencies. | The argument is somewhat unclear or inconsistently structured. | The argument lacks clarity and logical structure. |
Evidence | Evidence is thorough, relevant, and convincingly supports the argument. | Evidence is generally strong and relevant, with minor lapses. | Evidence is somewhat sparse, irrelevant, or does not fully support the argument. | Evidence is lacking or largely irrelevant. |
Use of Generative AI | AI is used effectively to support arguments, demonstrating a high understanding of its capabilities and limitations. | AI is used effectively, but understanding or integration could be improved. | AI is used, but not effectively integrated or misunderstood. | AI is not used or its use does not contribute to the argument. |
Reflection on AI | The student clearly articulates how AI contributed to their critical thinking process and considers its limitations. | The student generally explains how AI contributed to their thinking, with minor lapses in considering its limitations. | The student’s explanation of how AI contributed to their thinking is unclear or superficial. | The student does not explain how AI contributed to their thinking. |
Analytic Rubric 2: Assessing Generative AI Use and Integration
Criteria | 4 | 3 | 2 | 1 |
Understanding of AI | The student demonstrates a deep understanding of the capabilities and limitations of the AI. | The student demonstrates a good understanding of the AI, with minor misconceptions. | The student shows a basic understanding of the AI, but has significant misconceptions. | The student shows little to no understanding of the AI. |
Integration of AI | AI is seamlessly integrated into the work, effectively augmenting the student’s critical thinking. | AI is generally well integrated, though at times it may seem somewhat forced or awkward. | AI integration is inconsistent or superficial, not effectively augmenting the critical thinking process. | AI is not effectively integrated into the work. |
Reflection on AI | The student clearly reflects on the role of AI in their work, considering both its contributions and its limitations. | The student generally reflects well on the AI’s role, though considerations of its limitations may be superficial. | The student’s reflection on the AI’s role is minimal or lacks depth. | The student does not reflect on the AI’s role in their work. |
Innovation with AI | The student uses AI in novel or innovative ways to enhance their argument. | The student uses AI effectively, though it may lack innovation. | The student uses AI in a straightforward or predictable way, not enhancing the argument. | The student does not use AI in an innovative or meaningful way. |
Honour pledges are formal, student-led commitments to uphold the principles of academic honesty and integrity. These pledges represent students’ personal assurance to maintain and respect academic standards, abstaining from any form of plagiarism, cheating, or other academic misconduct. They often form part of the assessment submission process, where students attach a pre-defined pledge to their work as a statement of authenticity. Several studies have investigated the relationship between honour codes and academic integrity and found such pledges effective in reducing academic dishonesty.
Instructors might consider developing honour pledges together with their students, or adapting this McMaster honour pledge to their purposes.
“I understand and believe the main purpose of McMaster and of a university to be the pursuit of knowledge and scholarship. This pursuit requires my academic integrity; I do not take credit that I have not earned. I believe that academic dishonesty, in whatever form, is ultimately destructive to the values of McMaster, and unfair to those students who pursue their studies honestly. I pledge that I completed this assessment following the guidelines of McMaster’s academic integrity policy.”
Forthcoming guidelines and resources
The Task Force on Generative AI in Teaching and Learning will continue to work over the summer to expand and enhance these provisional guidelines. Likewise, staff across the University are working to develop resources and supports.
Some known needs from the campus community include:
- Guidelines for teaching assistants on the expectations around the use of generative AI
- Guidelines for instructors on the use of generative AI for offering feedback and grading
- Comprehensive guide for instructors using generative AI
- Resources for students to understand generative AI risks and opportunities
- Overview of generative AI tools including privacy and security assessments
- Digital literacy learning outcomes and digital literacy resources