Australian National University Logo

Considerations for using AI in assessments 

In this collection

  1. Rethinking assessment design in the age of GenAI  
  2. Assessment planning Step 1: Consider the learning outcomes and validity of the assessment   
  3. Assessment planning Step 2: Evaluate your assessment
  4. Assessment planning Step 3: Determine the needs of your assessments
  5. Assessment planning Step 4: Determine your approach on AI use
  6. Considerations for using AI in assessments 

This section presents some common concerns and considerations regarding the use of AI in assessment. 

AI offers exciting possibilities for learning and teaching in higher education, but its implementation requires careful consideration. Issues such as student privacy, equity, and academic integrity must be considered before incorporating AI into assessment design.  

Academic Integrity

1. What are the permitted, unpermitted, and unacceptable uses of AI for students? 

Educators will determine the permitted and unpermitted uses of AI within their specific discipline. The section ‘Approaches on AI Use’ provides some generic approaches on unpermitted, permitted, and integrated use in assessments. Educators can decide on an approach or combine a few approaches to suit their context, ensuring they provide clear guidance, including open conversations and demonstration of use, to students on how to engage with AI without breaching the Policy: Student academic integrity at ANU.  

Unacceptable use

Unacceptable use of AI involves actions that may breach academic integrity principles and overlap with other forms of academic misconduct, including contract cheating and plagiarism. The following are examples of such actions: 

  • Misrepresenting AI-generated content as one’s own work. 
  • Failing to acknowledge, or intentionally avoiding acknowledgement of, AI use where required. 
  • Using AI as a third party to complete part or all of an assessment (i.e. contract cheating) without authorisation from the convener(s). 
  • Plagiarising academic sources through AI-generated content without proper paraphrasing and referencing. 

Refer to the Communicating with students about AI section of this document for further guidance.  

2. What are the considerations around AI detection?

Currently, there is no direct method available to detect AI use in students’ work. ANU prioritises upholding academic integrity while respecting student privacy. After careful consideration, ANU did not proceed with the adoption of Turnitin’s AI detection module (TAID) from January 2024. This decision reflects ongoing concerns regarding the ethical and technical limitations of AI detection software.  

AI detection lacks evidence

Detecting student use of AI tools is a complex challenge. Current detection tools raise concerns about privacy, equity, and data security. There is a lack of evidence showing the effectiveness of these tools due to: 

  • false positives and negatives: These tools can mistakenly identify human-written work as AI-generated (false positives) or miss AI-generated text entirely (false negatives). 
  • rapid advancements in AI: Detection tools are unlikely to keep pace with the constant development of AI technology. 

Student privacy

Uploading student work to AI detection software is a breach of student privacy. The university encourages educators to explore alternative approaches that promote academic integrity, such as educating students through discussion and modelling, and modifying assessments.

Manual detection

Other ways to detect potential misuse of AI include identifying made-up references, regularly checking students’ progress through ungraded activities, and monitoring students’ engagement in the learning management system (LMS).  

3. What is the process for reporting suspected academic misconduct by a student?

According to Policy: Student academic integrity, students have the responsibility to: 

“ensure that work submitted for assessment is genuine and original and that appropriate acknowledgement and citation is given to the work of others; and  

be prepared to provide evidence of or authenticate their learning on the assessment task; for example, by showing notes/drafts/resource materials used in the preparation of the task or undertaking a viva voce assessment*;” 

Student rights and responsibilities

Students must: 

  • be informed about what level of AI use is allowed in each assessment 
  • be prepared to provide evidence of their learning 
  • acknowledge if they have used AI in their assessment.  

If you suspect that a student may have used AI in a way that undermines the integrity of the assessment, you could initiate a conversation with the student and request evidence of learning. Please refer to the Procedure: Student academic integrity and the Office of the Registrar’s Handling a potential breach of academic integrity page for further support.  

*Refer to Step 2 for viva voce explanation 

Referencing and acknowledgement

4. How can students acknowledge the use of AI tools in their work? 

Both staff and students should be transparent about the use of AI in creating outputs. For students, understanding how to acknowledge AI tools is especially important, even for uses such as brainstorming or planning. This guideline provides a generic template; however, educators may need to adapt it for their course or refer to college guidelines for more detailed requirements. 

Template acknowledging AI use

Here is an example of how a student might acknowledge this: 

5. How should students reference the use of AI?

For more information about referencing AI, visit the ANU library Artificial Intelligence including generative AI page.  

Access and equity

6. Are there any equity considerations when using AI tools? 

There are some important equity considerations to keep in mind when incorporating AI into learning and assessment. The digital divide (unequal access to digital technology) can create barriers for students who lack reliable devices or internet access. If the use of AI has been integrated into an assessment, the requirement and the type of AI should be made clear to students from the beginning of the semester.

Available tools

Educators should check that all students can access the required AI tool before it is used. This is particularly important for any students using assistive technology (AT). To minimise access issues, we recommend using Microsoft Copilot, which is available to all ANU students and staff under our enterprise agreement. 

Diversity

AI systems are trained on massive datasets and so reflect any biases or lack of diversity in that data. Ensure that this is discussed with students as part of preparing them to use the AI tool, including how to address these issues in their assessments. Also, be mindful that bias and a lack of diversity may affect how included students from different equity groups feel. 

Equal AI output

Finally, if you plan to integrate AI into assessments by having students analyse or critique AI-generated responses, ensure all students receive the same or equivalent AI-generated output to maintain assessment equity. 

Privacy and data security

7. Can I upload students’ work into AI? 

While we understand the potential benefits of using AI tools to analyse students’ work, students retain intellectual property ownership of their submitted assignments. Therefore, for privacy and data security reasons, uploading student work into an AI platform without their consent is not permitted. The Generative AI and data governance page provides further guidance on data governance. For assistance with managing privacy concerns and personal information, contact the Privacy Officer.

ANU is evaluating new technologies and their implications, and this guideline may be reviewed in the future. In the meantime, we encourage educators to explore alternative methods of utilising AI tools that do not involve uploading student data directly.  

Students’ acceptance of using AI

8. What actions should be taken if a student prefers not to use AI tools in their assessment process? 

The use of AI tools in learning and teaching is currently optional. However, as the prevalence of AI grows in both educational institutions and workplaces, it becomes crucial to consider how to incorporate these tools into teaching and learning to develop students’ AI literacy in this digital age.  

Nonetheless, students might not want to use AI tools for various reasons, such as personal values or concerns about bias or originality. 

  • If AI literacy is required due to accrediting requirements or is an integral part of assessment, a message must be included in the Class Summary and communicated to students at the start of the semester.  
  • If AI literacy is not a compulsory requirement but AI is an integral part of assessment, students who do not want to use AI will need to inform educators with valid reasons, and educators may need to offer alternative assessments that assess the same learning outcomes.  

Educators should check students’ readiness and acceptance of AI by initiating open conversations at the beginning of the semester. Refer to the Communicating with students about AI section of this document for further guidance.