
Assessment planning Step 4: Determine your approach on AI use

In this collection

  1. Rethinking assessment design in the age of GenAI  
  2. Assessment planning Step 1: Consider the learning outcomes and validity of the assessment   
  3. Assessment planning Step 2: Evaluate your assessment
  4. Assessment planning Step 3: Determine the needs of your assessments
  5. Assessment planning Step 4: Determine your approach on AI use
  6. Considerations for using AI in assessments 

The assessment needs of each course are unique, and the Approaches on AI Use enable educators to select or create the approach that best suits their course.  

Approaches on AI use

After considering your learning outcomes and evaluating your assessment, determine an approach for each assessment and ensure that the assessment aligns with the selected approach.

Clearly communicate the approach to students. You can also download a slide pack from this page to use in your classes. The provided guidelines offer sample messaging for each approach, which educators can adapt to their specific context. For example, if students are permitted to use AI for editing and planning, educators may need to combine or modify the samples from these two approaches. The placeholders (in grey) allow flexibility for educators to tailor their approach based on the discipline context.  

There are three approaches, explained below; select one for each assessment.

Unpermitted use 

Students are not permitted to use AI tools at all in this category. This approach should only be considered when the assessment is conducted in an invigilated environment. Strategies to prohibit the use of AI should be implemented sparingly and reserved for assessments where security is essential.

As AI becomes increasingly integrated into our current and future lives, ANU graduates should develop skills to critically reflect on the use of AI and understand the contributions or limitations of the tool when performing a task.

Not permitting the use of AI should only be considered if students need to demonstrate individual mastery of course and program learning outcomes (e.g., translation tasks in a translation course). In the context of assessment security, AI misuse is only one contributing challenge, and there are other threats to academic integrity, including contract cheating and collusion. 

From an accessibility and inclusion perspective, many assistive technologies (AT) used by students have AI-enabled functionality, such as screen readers that can produce an image description when alt text has not been provided. When considering restricting the use of AI for a specific assessment, provisions for the use of AT with AI must be reviewed to ensure the assessment remains equitable and does not discriminate against students who depend on AT. 

Examples of assessments suited to this approach include: 

  • In-class assessments (e.g., quizzes in tutorials, group tasks, essay tasks). 
  • Viva voces*, interactive oral assessments*, or debates with unpredictable or randomised questions/prompts. 
  • Supervised examinations (e.g., on-campus exams, online proctored exams, Objective Structured Clinical Examinations in medical education). 
  • Practical assessments (e.g., laboratory work, clinical placements, studio work). 
  • Live discussion-based assessments (e.g., group discussions, book clubs, peer evaluation). 

Important notes: 

  1. Prompts for these assessments should not be seen by students beforehand. 
  2. Security measures required for assessments such as on-campus exams may not be scalable or sustainable due to resource constraints.  

*Refer to Step 2 for viva voce and interactive oral assessment explanation 

“For this assessment, you are not permitted to use generative AI to complete your work or include any AI-generated materials in your submission. Your submission must be entirely your own original work, with all sources properly cited. The rationale behind this is to ensure that the work submitted truly reflects your understanding, skills, and knowledge, which is a fundamental principle of academic integrity. Using AI-generated content and claiming authorship of it will be considered a breach of academic integrity. To clarify whether AI has been used, you may be asked to provide evidence of learning, including drafts of your submission. Failure to provide evidence or demonstrate your understanding of your own work may be considered a breach of academic integrity. If you are unsure about specific tools (including assistive technology) or have further questions, please contact me.” 

Examples of permitted editing use include: 

  • Using tools such as Grammarly for spelling and tone checking. 
  • Searching for synonyms or antonyms. 
  • Improving conciseness in writing. 

“For this assessment, you are allowed to use generative AI tools for editing, including grammar checking and language refinement. However, you are not allowed to use AI to [include examples of unpermitted use such as paraphrase literature], and your submission must not contain any AI-generated ideas or content that is not your own. Use of generative AI tools must be declared and acknowledged. Attach the following completed declaration to your submission [provide declaration template].

To clarify whether unpermitted use of AI has occurred, you may be asked to provide evidence of learning, including drafts of your submission. Failure to provide evidence or demonstrate your understanding of your own work may be considered a breach of academic integrity. If you are unsure about specific uses or have further questions, please contact me.” 

Educators should:

  • discuss with students the limitations and considerations of AI 
  • explain why AI should only be used for planning and discovery or as part of assistive technology 
  • emphasise academic integrity expectations, including authorised and unauthorised use, with particular focus on collusion and plagiarism 
  • advise students on how to acknowledge their use of AI 
  • ensure equitable access to AI (e.g., CoPilot).

Examples of permitted use for planning and discovery include:

  • Students use AI to generate ideas, suggestions, and concepts while planning for an assessment. 
  • Students use AI to discover and explore varying perspectives on a topic, then use keywords to do further research. 
  • Students use AI to understand the requirements of an assessment or the text type (e.g., what is an annotated bibliography). 

“For this assessment, you are allowed to use generative AI tools for planning and discovery. However, you are not allowed to use AI to [include examples of unpermitted use such as paraphrase literature], and your submission must not contain any AI-generated content. The use of generative AI tools must be declared and acknowledged. Attach the following completed declaration to your submission [provide declaration template].

To clarify whether unpermitted use of AI has occurred, you may be asked to provide evidence of learning, including drafts of your submission. Failure to provide evidence or demonstrate your understanding of your own work may be considered a breach of academic integrity. If you are unsure about specific uses or have further questions, please contact me.”

  • specify which part(s) of the assessment process allow AI in the assessment instructions 
  • provide activities and resources that scaffold student use of AI (e.g., practising prompting for specific rather than generic feedback) 
  • discuss with students the limitations and considerations of AI 
  • emphasise academic integrity expectations, including authorised and unauthorised use, with particular focus on collusion and plagiarism 
  • require students to acknowledge AI use and provide evidence of prompts and outputs 
  • ensure equitable access to AI (e.g., CoPilot).

Examples include:

  • Students use AI as a starting point to generate a draft outline and structure for an assessment (e.g., a presentation).  
  • Students use AI to receive constructive feedback on a draft assessment. They then edit the draft by choosing which parts of the feedback they agree with (e.g., AI giving feedback on tone and persuasive language in a presentation script). 

“For this assessment, you are allowed to use generative AI tools for [explain specific permitted use] with attribution. You are not allowed to use AI for other purposes, such as [add unpermitted use], and you are expected to keep evidence of your learning and be prepared to provide it if required. This includes any drafts, iterations of work, or initial inputs (prompts) into AI. Failure to provide evidence or demonstrate your understanding of your work may be considered a breach of academic integrity. Any use of generative AI tools must be declared and acknowledged. Attach the following completed declaration to your submission [provide declaration template]. If you are unsure about specific uses or have further questions, please contact me.” 

Educators should:

  • ensure AI proficiency and skills are scaffolded in learning activities and/or previous assessments 

  • add AI skills and knowledge as criteria in assessment rubrics if it aligns with course learning outcomes (e.g., ability to evaluate and critique AI generated works) 
  • have open conversation with students about the potential and limitations of AI 
  • emphasise academic integrity expectations 
  • require students to acknowledge AI use and provide evidence of prompts and outputs 
  • monitor how students adapt to and perform in these types of assessments.

Examples include:

  • Students design prototypes using Adobe Firefly and CoPilot generated images, based on their own research, to tackle a problem in their discipline area.  
  • Students evaluate and critique AI-generated images, text, or code (e.g., critique a policy or recommendations written by AI, discuss a stereotype often found in AI-generated images).  
  • Students work with an AI persona designed by the educator and joined as a ‘group member’ in their group assessment, reflecting on how AI provided them with a different perspective. 
  • Students use AI to develop solutions to address particular problems and identify or elaborate on the implications of the solutions for other stakeholder groups or contexts. 
  • Students generate content for a presentation, critically evaluate its accuracy, quality, and relevance, and use it for an oral presentation. They then demonstrate their understanding by responding to unpredictable questions from educators and peers. 

“For this assessment, you should use generative AI tools to [explain use] with attribution. You will be assessed on [assessment criteria], which are not limited to your ability to write prompts. Also, [add an optional component, such as a short reflection on how AI contributed to your work or an explanation of how you verified the accuracy and reliability of AI-generated data analysis].  

Any use of generative AI tools must be declared and acknowledged. Failure to disclose [add use that you would like students to disclose] may be considered a breach of academic integrity. For more information, refer to unacceptable use. Please attach the completed declaration [declaration template provided] to your submission. If you are unsure about specific uses or have further questions, please contact me.”

Starting out

The use of AI in learning and teaching is currently optional. Educators who are less familiar with AI can start with small steps. For example, instead of designing a summative assessment that integrates AI, begin with low-stakes formative tasks or in-class activities that incorporate AI use.  

Support

Educators should be aware that there is currently no reliable method to monitor or detect students’ use of AI without increasing workload. If AI use poses significant integrity risks, it is recommended to redesign the assessment to strengthen its security. Assessment redesign is an iterative process, and the University acknowledges that it is not a simple task. For support with educational design and assessment redesign, please contact the Centre for Learning and Teaching (CLT).