The assessment needs of each course are unique; the approaches to AI use described below are intended to help educators select or create the approach that best suits their course.
Approaches to AI use
After considering your learning outcomes and evaluating your assessment, determine an approach for each assessment and ensure that the assessment aligns with the selected approach.
Clearly communicate the approach to students. You can also download a slide pack from this page to use in your classes. The guidelines below offer sample messaging for each approach, which educators can adapt to their specific context. For example, if students are permitted to use AI for both editing and planning, educators may need to combine or modify the samples from those two approaches. The placeholders (in grey) give educators the flexibility to tailor their approach to their disciplinary context.
There are three approaches for each assessment, explained below.
AI use not permitted

Students are not permitted to use AI tools at all in this category. This approach should only be considered when the assessment is conducted in an invigilated environment. Strategies to prohibit the use of AI should be implemented sparingly and reserved for assessments where security is essential.
As AI becomes increasingly integrated into our current and future lives, ANU graduates should develop skills to critically reflect on the use of AI and understand the contributions or limitations of the tool when performing a task.
Not permitting the use of AI should only be considered if students need to demonstrate individual mastery of course and program learning outcomes (e.g., translation tasks in a translation course). In the context of assessment security, AI misuse is only one contributing challenge and there are other threats to academic integrity including contract cheating and collusion.
From an accessibility and inclusion perspective, many assistive technologies (AT) used by students have AI-enabled functionality, such as screen readers that can produce an image description if an image alt text has not been included. When considering restricting the use of AI for a specific assessment, provisions for the use of AT with AI must be reviewed to ensure the assessment remains equitable and does not discriminate against students dependent on AT.
Examples of assessment
In-class assessments (e.g., quizzes in tutorials, group tasks, essay tasks, etc.).
Viva voces*, interactive oral assessments*, or debates with unpredictable or randomised questions/prompts.
Supervised examinations (e.g., on campus exam, online proctored exam, Objective Structured Clinical examination in medical education, etc.).
Practical assessments (e.g., laboratory, clinical placements, studio work, etc.).
Live discussion-based assessments (e.g., group discussions, book clubs, peer evaluation, etc.).
Important notes:
Prompts for these assessments should not be seen by students beforehand.
Security measures required for assessments such as on campus exams may not be scalable and sustainable due to resource constraints.
*Refer to Step 2 for an explanation of viva voces and interactive oral assessments
Sample messaging to students
“For this assessment, you are not permitted to use generative AI to complete your work or include any AI-generated materials in your submission. Your submission must be entirely your own original work, with all sources properly cited. The rationale behind this is to ensure that the work submitted truly reflects your understanding, skills, and knowledge, which is a fundamental principle of academic integrity. Using AI-generated content and claiming authorship of it will be considered a breach of academic integrity. To clarify whether AI has been used, you may be asked to provide evidence of learning, including drafts of your submission. Failure to provide evidence or demonstrate your understanding of your own work may be considered a breach of academic integrity. If you are unsure about specific tools (including assistive technology) or have further questions, please contact me.”
Permitted uses
Below are examples of permitted uses of AI that educators can use, modify, or combine to suit their assessment needs. The aim is to give students clarity on what they can use AI for, and to articulate where AI may or may not help them achieve the learning outcomes of the assessment.
Permitted use: Editing
Students are permitted to use AI tools for editing purposes in this category, including grammar and spell checking, language refinement, and [add any other permitted use such as referencing]. However, submissions must not contain any AI-generated content or ideas that do not originate from the students themselves. Additionally, AI tools should not be used to paraphrase academic sources.
Examples of use
Using tools such as Grammarly for spell and tone checking.
Searching for synonyms or antonyms.
Improving conciseness in writing.
Sample messaging
“For this assessment, you are allowed to use generative AI tools for editing, including grammar checking and language refinement. However, you are not allowed to use AI to [include examples of unpermitted use such as paraphrase literature], and your submission must not contain any AI-generated ideas or content that is not your own. Use of generative AI tools must be declared and acknowledged. Attach the following completed declaration to your submission [provide declaration template].
To clarify whether unpermitted use of AI has occurred, you may be asked to provide evidence of learning, including drafts of your submission. Failure to provide evidence or demonstrate your understanding of your own work may be considered a breach of academic integrity. If you are unsure about specific uses or have further questions, please contact me.”
Permitted use: Planning and discovery
In this category, students are allowed to use AI in the planning and discovery stage of assessments with proper acknowledgement. This may include brainstorming, generating prompts, using AI as a search tool, and [add any other permitted use]. However, drafts and final submissions of assessments must not contain AI-generated content, and AI tools should not be used to paraphrase academic sources.
Consider designing assessments that are more authentic, context specific, or personalised (e.g., industry case study, PechaKucha presentation, portfolio work, reflective observations of field trips) to reduce the risk of AI misuse.
Educators should…
discuss with students the limitations and considerations of AI
explain why AI should only be used for planning and discovery or as part of assistive technology
emphasise academic integrity expectations, including authorised use and unauthorised use, with particular focus on collusion and plagiarism
advise students on how to acknowledge their use of AI
ensure equitable access to AI (e.g., CoPilot).
Examples of use
Students use AI to generate ideas, suggestions, and concepts while planning for an assessment.
Students use AI to discover and explore varying perspectives on a topic, then use keywords to do further research.
Students use AI to understand the requirements of an assessment or the text type (e.g., what is an annotated bibliography).
Sample messaging
“For this assessment, you are allowed to use generative AI tools for planning and discovery. However, you are not allowed to use AI to [include examples of unpermitted use such as paraphrase literature], and your submission must not contain any AI-generated content. The use of generative AI tools must be declared and acknowledged. Attach the following completed declaration to your submission [provide declaration template].
To clarify whether unpermitted use of AI has occurred, you may be asked to provide evidence of learning, including drafts of your submission. Failure to provide evidence or demonstrate your understanding of your own work may be considered a breach of academic integrity. If you are unsure about specific uses or have further questions, please contact me.”
Permitted use: Support tool
Students are allowed to use AI for specific, defined processes in assessment, with attribution. This may include creating outlines, drafting, generating ideas and concepts, summarising, editing, proofreading, and providing feedback. It is assumed that students’ final submissions in this category will contain AI-generated content.
Educators should…
specify which part(s) of the assessment process allows AI in the assessment instructions
provide activities and resources that scaffold student use of AI (e.g. practising prompting for specific rather than generic feedback)
discuss with students the limitations and considerations of AI
emphasise academic integrity expectations, including authorised use and unauthorised use, with particular focus on collusion and plagiarism
require students to acknowledge AI use and provide evidence of prompts and outputs
ensure equitable access to AI (e.g., CoPilot).
Examples of use
Students use AI as a starting point to generate a draft outline and structure for an assessment (e.g., a presentation).
Students use AI to receive constructive feedback on a draft assessment. They then edit the draft by choosing which parts of the feedback they agree with (e.g., AI giving feedback on tone and persuasive language in a presentation script).
Sample messaging
“For this assessment, you are allowed to use generative AI tools for [explain specific permitted use] with attribution. You are not allowed to use AI for other purposes, such as [add unpermitted use], and you are expected to keep evidence of your learning and be prepared to provide it if required. This includes any drafts, iterations of work, or initial inputs (prompts) into AI. Failure to provide evidence or demonstrate your understanding of your work may be considered a breach of academic integrity. Any use of generative AI tools must be declared and acknowledged. Attach the following completed declaration to your submission [provide declaration template]. If you are unsure about specific uses or have further questions, please contact me.”
AI as an integral component of assessment
This approach views AI as a core part of the assessment or a co-authoring tool to complete the assessment. The assessment criteria may include evaluating students’ ability to use AI ethically and productively, and to select and evaluate AI-generated content. This approach supports the development of AI literacy in a guided situation where educators can provide feedback and guidance to students.
Note that this does not mean students simply use AI to complete the assessment. The learning outcomes in these assessments will likely shift the focus to developing students’ AI literacy, critical evaluation (evaluating the accuracy and reliability of content), or evaluative judgment (assessing the quality of the product).
Student refusal of AI
It is expected that students will use AI in this category. Students who do not want to use AI will need to inform educators and provide valid reasons, and educators may need to offer alternative assessments that assess the same learning outcomes. In disciplines with accreditation requirements, check whether students’ use of AI is acceptable under those requirements.
Educators should…
ensure AI proficiency and skills are scaffolded in learning activities and/or previous assessments
add AI skills and knowledge as criteria in assessment rubrics if this aligns with course learning outcomes (e.g., ability to evaluate and critique AI-generated work)
have open conversations with students about the potential and limitations of AI
emphasise academic integrity expectations
require students to acknowledge AI use and provide evidence of prompts and outputs
monitor how students adapt to and perform in these types of assessments.
Examples of use
Students design prototypes using images generated with Adobe Firefly and CoPilot, based on their own research, to tackle a problem in their discipline area.
Students evaluate and critique AI-generated images, text, or code (e.g., critique a policy or recommendations written by AI, discuss a stereotype often found in AI-generated images).
Students work with an AI persona, designed by the educator, that joins their group assessment as a ‘group member’, then reflect on how AI provided them with a different perspective.
Students use AI to develop solutions to address particular problems and identify or elaborate on the implications of the solutions for other stakeholder groups or contexts.
Students generate content for a presentation, critically evaluate its accuracy, quality, and relevance, and use it for an oral presentation. They then demonstrate their understanding by responding to unpredictable questions from educators and peers.
Sample messaging
“For this assessment, you should use generative AI tools to [explain use] with attribution. You will be assessed on [assessment criteria], which are not limited to your ability to write prompts. Also, [add optional component, such as a short reflection on how AI contributed to your work, or how you verified the accuracy and reliability of AI-generated data analysis].
Any use of generative AI tools must be declared and acknowledged. Failure to disclose [add use that you would like students to disclose] may be considered a breach of academic integrity. For more information, refer to unacceptable use. Please attach the completed declaration [declaration template provided] to your submission. If you are unsure about specific uses or have further questions, please contact me.”
Starting out
The use of AI in learning and teaching is currently optional. Educators who are less familiar with AI can start with small steps. For example, instead of designing a summative assessment that integrates AI, begin with low-stakes formative tasks or in-class activities that incorporate AI use.
Support
Educators should be aware that there is currently no direct method to monitor or detect students’ use of AI without increasing workload. If AI use poses significant integrity issues, it is recommended to redesign the assessment to include security measures. Assessment redesign is an iterative process, and the university acknowledges that it is not a simple task. For support with educational design and assessment redesign, please contact the Centre for Learning and Teaching (CLT).