Generative AI technologies can create opportunities for learning but also heighten existing assessment challenges. This resource suggests points of discussion and provides a downloadable slide pack with tips on speaking with students about AI.
Communicating with students
It is important to provide clear guidelines around AI use and to communicate expectations for learning and assessment to students. This creates safe opportunities for students to ask questions about AI and gives them assurance that they are doing the right thing.
The AAIN Guidelines (pdf, 2023) recommend having conversations with students early in units and courses to build a shared understanding of how and when they can use AI tools. TEQSA’s resource on Conversation Starters (pdf) provides prompts to initiate these conversations in the classroom.
This resource is a living document that will be continually reviewed and updated.
PowerPoint slide pack resource for educators
This downloadable slide pack is a resource for educators to initiate conversations with students and communicate clear expectations around AI use for their learning and assessment. The slide pack also contains various talking points on the limitations of AI, and how students can engage with AI ethically and responsibly.
Educators can download the slide pack and select relevant sections to incorporate into their teaching resources. Within the slide pack, there are placeholders for educators to fill in with information applicable to their course.
ANU example in action
Check out Dr Russell Smith’s lecture slides, where the PowerPoint slide pack resource has been adapted for the first-year ENGL1200: Imagined Worlds course. In this adapted version, Dr Smith has added contextual information about AI and literary studies to discuss with students, along with clear expectations and guidelines for the use of AI in the course.
Using AI for learning purposes
Provide opportunities for in-class discussions and encourage students to raise questions and discuss the ethical use of AI, both generally and within the discipline context.
Here are some potential points for discussion:
- Why AI should not be relied on as a primary source of information
- Endorsed AI tools at ANU (Copilot, Adobe Firefly)
- Ethical considerations of AI
- Limitations of AI and AI-generated content
- Acceptable use of AI for learning purposes (e.g., asking it to provide feedback, learning about topics and concepts, seeking guidance about the assessment format)
- Opportunities and challenges from AI in the discipline or professional area
- The increasing relevance and importance of students’ discipline knowledge and skills in the age of AI
The following are examples of messaging and communication to students around AI use.
Example of messaging introducing GenAI
“Artificial Intelligence (AI) is already integrated into many tools we use every day, such as screen readers, autocorrect on your phone, voice commands with Siri or Alexa, and GPS in cars. This type of AI is often called ‘Traditional AI’ because it follows set rules and is good at specific tasks with known results. There is another type of AI called ‘Generative AI’ (or Gen AI). Unlike Traditional AI, Gen AI can adapt and generate unpredictable outputs. It learns underlying patterns and associations from massive datasets, then uses those patterns to create new images, video, text, music, programming code and even speech.”
Examples of messaging around risks and limitations of AI
“While Generative AI is useful, its predictions and generated content may be inaccurate, incomplete and subject to bias. AI is not a replacement for your own learning, thinking and research, so always verify AI-generated content and its sources.”
“Non-endorsed AI tools require you to sign up and create an account, and the ANU cannot guarantee your security when using these tools. Currently, Copilot Enterprise is available to all ANU staff and students by signing in with an ANU account; the main benefit of doing so is that personal and organisational data are protected. Be cautious about the privacy and security of any content you enter into generative AI tools, especially if it contains sensitive or confidential information.”
Examples of messaging around bias in AI
“AI systems are trained on massive datasets, and so reflect any biases or lack of diversity in that data. Before you incorporate AI tools into your studies, you need to be aware of these biases and discuss them. Biases often originate from the data used to train the AI, which may not represent all demographics equally, and this can lead to results that inaccurately represent one group over another. Be mindful that these biases and this lack of diversity can affect how included members of different equity groups feel, and may influence how individuals from various backgrounds perceive their involvement and identity.”
Examples of messaging to educate students about academic integrity and AI
“Inappropriate use of AI is unacceptable and constitutes a breach of academic integrity. Students who submit AI-generated content as their own work are not submitting original work. Misuse of AI can also overlap with other forms of academic misconduct outlined in the ANU Academic Integrity Rule 2021, such as plagiarism and contract cheating.”
“As scholars, we develop our ideas by critically engaging with the work of others. The principle of academic integrity requires us to appropriately acknowledge our intellectual sources when we incorporate the research of others into our own work. At ANU, students must act with academic integrity and reference and acknowledge the sources upon which their work is based. This extends to the use of Gen AI. Under certain circumstances (when approved by convenors), Gen AI can be permitted for some assessments; however, you must declare and acknowledge how AI was used in completing the assessment. Refer to the Best Practice When Using Generative AI page for further guidance and book an appointment with Academic Skills if required.” (based on guidance from ANU Academic Skills)
Using AI for assessment purposes
A necessary condition for trustworthy assessment is articulating to students the boundaries of what is and is not permissible when working with AI.
Whether AI is permitted for assessment will vary greatly across colleges, disciplines and assessment tasks. Educators are advised to check with their College or ADE for any College-wide advice or guidance. At some points in students’ learning it is important to assess their achievement of learning outcomes without the use of AI, while at other points AI may be allowed or even be an integral part of the assessment.
By clearly outlining expectations around AI use in student-facing documentation (e.g., the Class Summary and the assessment information section in the LMS) and reinforcing them in the first lecture, you give students greater transparency and assurance about what is expected in their assessments.
The following are examples of messaging and communication to students around AI for assessment purposes.
Communicate expectations around AI use before the course starts
List any AI requirements clearly in the Class Summary and Assessment Outlines.
Depending on the course, discipline, learning outcomes and specific assessment task, educators can decide how they want students to acknowledge AI use and/or document how AI is used to contribute to an assessment task.
If you are designing assessments where AI use is a requirement:
- Create a space in the LMS course site where students can access Copilot and Adobe Firefly. Include links to tutorials, videos and help guides for any new technologies they are required to use for a specific task.
- Update course notes and lecture slides with any AI requirements.
Explain, justify and model
Justify your reasons for including or excluding AI. If AI is not permitted for an assessment task, explain to students why it is not permitted and how using AI would hinder their development of certain knowledge and skills. If AI is permitted, outline the acceptable level of AI use and provide examples.
Outline how students should and should not use AI as part of the course, and discuss with students what ethical use of AI looks like.
Outline how you want students to acknowledge their use of AI in assessment.
Model and demonstrate acceptable use of AI, including approaches and strategies students could use to support their learning.
Check for student readiness, acceptance and skills
Conduct a pre-class survey to determine whether students are aware of AI tools, whether they have used them before, and how familiar they are with them.
Students will have varying levels of AI literacy. Create opportunities, such as a poll activity, to understand how students feel about AI and how they engage with it.
Provide formative assessment opportunities for students to practise AI literacy in a safe environment.
Provide opportunities for in-class discussions and encourage students to raise questions and discuss ethical AI use.
Create opportunities for students to experiment, collaborate and share ways to use AI.
Using AI for feedback and marking
As reflected in the university’s GenAI Privacy guidelines, academic staff are not permitted to upload student data or academic work to GenAI platforms. It is therefore not currently possible to use GenAI to generate student feedback or results on student work.
Even in the case where student data or work is not involved, any use of GenAI to generate feedback or other material must be cited.
References
AAIN Generative AI Working Group (2023). AAIN Generative Artificial Intelligence Guidelines (pdf). Australian Academic Integrity Network.
Lodge, J. M., Howard, S., Bearman, M., Dawson, P., & Associates (2023). Assessment reform for the age of Artificial Intelligence. Tertiary Education Quality and Standards Agency.