AI chatbots in projects and assignments
The TU Delft Assessment Taskforce has created practical guidelines on how lecturers can deal with the influence of AI chatbots on unsupervised assessments and what mitigating measures can be taken.
The possibilities of AI chatbots in assessment require further investigation. A group of AI experts will carry out a risk analysis of AI chatbots in assessment. We will update this page if their findings require additions to the advice below. This page was last updated on 22 February 2023.
About AI chatbots (e.g. ChatGPT)
AI chatbots like ChatGPT can create convincing answers to certain questions, depending on their nature. However, their output is not always reliable: it can contain convincingly presented factual errors. Furthermore, ChatGPT, for example, currently only uses sources that are at least two years old. Lastly, most chatbots do not list their sources. Because they can produce incorrect information, they are sometimes referred to as ‘an extra classmate’. Integrating chatbots into search engines, as in Bing Chat, has introduced other errors.
On the positive side, chatbots can help with, for example:
- checking grammar, spelling, and references in a text
- generating ideas by listing information from different sources in an accessible way
- giving feedback
- summarising texts
Use by lecturers for assessments: AI chatbots can help lecturers in creating assessments (including different versions of an assignment), answer models, and rubrics.
The following assumptions have been used as the basis of the practical guidelines in the how-to section:
- Students’ use
We must assume that AI chatbots are used by our students and graduates.
- Quality requirements for assessment
This is how the quality requirements for assessment might be impacted by AI chatbots:
- Reliability: Does the use of AI chatbots influence whether grades represent how well students master the learning objectives?
- Feasibility for students: Mitigating measures could increase the number of assessments and therefore the study load.
- Transparency: All expectations of students should be made transparent.
- Validity: Could be diminished if the assessment criteria (or their weights) are changed.
- AI chatbot detection
The reliability of AI chatbot detectors is currently unknown.
- Definition of fraud
The definition of fraud is (source: model R&G, article 7.1):
“Fraud is taken to mean any act or omission by a student that makes it fully or partially impossible to properly assess the knowledge, insight and skill of that student or another student.
Fraud is in any event understood to include the commission of plagiarism in any form; it should be clear that this includes all cases in which a student implies that the work in question is his or her own when this is not the case, such as copying the work of others and presenting it as one’s own through deliberate deception, or through carelessness or inadequate references. Fraud also includes, among others, the following:
- being in possession, during an examination, of aids (digital or otherwise), any notes, pre-programmed calculator, mobile phone, book, syllabus, notes in books for an open-book examination, the use of which has not been expressly permitted;
- looking at the work of others during an examination or exchanging information or materials inside or outside the room where the examination is taking place;
- getting someone else to take the examination or impersonating someone else during an examination;
- being in possession of the questions/assignments of an examination before the date or time on which the examination is due to be held.”
- Attribution:
The use of AI chatbots (and of tools in general) should be acknowledged and properly referenced, to ensure the distinction between the students' original ideas and those provided by AI, and to check whether the student critically checked the output of the (chat)bot (or other tools). This can be done by indicating the purpose for which the AI tool was used, and listing the (successful) prompts that the students used.
- Accessibility
Most chatbots are currently free-of-charge, but this is changing. To prevent a digital divide based on students’ financial situation, the TU Delft could consider getting a license to paid bots, while taking into account the security, privacy, and ethical issues below.
- Security, privacy & ethical issues
Bots use user input to train future versions. This can have consequences for the privacy and intellectual property of information that is fed to the (chat)bot during its use. In addition, ethical questions are arising regarding the current and future influence of AI chatbots on truth finding and society as a whole, as well as regarding the power of its owners (big tech companies).
How to assess in unsupervised projects and assignments
During classical written exams and digital exams in the TUD secure exam environment, students do not have access to the internet and therefore cannot access chatbots. The same holds for oral exams held on campus.
- Feed your assignments to chatbots and study their output. How would you assess the output using your answer model or rubric? How does it differ from student answers? You can use this information to get a sense of whether or not students used AI chatbots in their work.
- Discuss with your students the possibilities and limitations of using AI bots in unsupervised assignments. Train students not to trust the answers of AI chatbots, even for questions that are not too difficult and require mostly factual knowledge. Students need to internalise that they must double-check all output of AI chatbots, to prevent them from learning incorrect facts and reasoning.
- Give an explanation in case you advise against the use of AI chatbots:
If you consider the use of AI bots in your course detrimental to achieving the learning objectives, clearly state your reasons to your students. Make sure they understand that they will fall short in the summative assessment, where they will not have access to these chatbots. Refer students back to the definition of fraud and our TU Delft code of conduct.
- Inform your students about attribution expectations:
Inform your students on how you expect them to correctly attribute the use of bots. Have students reflect on the use of AI chatbots.
Examples:
- Attach the chatbot conversations to the documents, or at least the (successful) prompts that they used.
- Have them write a short section on how they used bots and in what ways it was and was not helpful, and what they learned.
- Have sufficient feedback moments during the course and ask students to reflect on how they processed the feedback. If possible, do this in a discussion.
- Regularly check the progress of individual students during their projects/assignments (if feasible). This is also good for learning and confidence building if you turn it into a supervision/feedback moment. Check during the creation of the deliverable (of a project/assignment) whether they are all contributing / learning, for example by brief oral discussion of a product they are working on (e.g. after finishing a specific (small) step in a project/code assignment).
- Use recent information in your assignments. ChatGPT, for example, currently only uses sources that are at least two years old, so students would need to feed this new information to the chatbot themselves, which makes the use of chatbots more cumbersome. Add elements that make it hard or impossible to copy information and let AI chatbots rewrite the text (i.e. ‘disguise’ it as original information).
Examples:
- Provide and ask for multimodal input as part of the assignment (drawings, graphs, and schematic representations). Chatbots are poor at extracting information from drawings, graphs, and schematic representations. Besides adding these to the cases that you give to your students, you could ask your students to add these to their work. Chatbots are currently not very good at creating these, either.
- Ask students to use recent articles as references (and check these).
- Cross-check references across students: if they all use the same references, they may have rewritten each other's homework (with the help of AI chatbots) and ‘disguised’ it as their own work.
- Ask for a personal touch to an essay.
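The reference cross-check above can be partially automated. A minimal sketch in Python, comparing reference lists pairwise with the Jaccard index (the student names, reference strings, and the 0.8 threshold are all hypothetical choices, not part of the guidelines):

```python
from itertools import combinations


def reference_overlap(submissions: dict[str, set[str]]) -> list[tuple[str, str, float]]:
    """Return pairs of students whose reference lists overlap suspiciously.

    Overlap is measured with the Jaccard index: |A ∩ B| / |A ∪ B|.
    """
    flagged = []
    for (a, refs_a), (b, refs_b) in combinations(submissions.items(), 2):
        union = refs_a | refs_b
        if not union:
            continue  # two empty reference lists tell us nothing
        jaccard = len(refs_a & refs_b) / len(union)
        if jaccard >= 0.8:  # threshold is a judgment call
            flagged.append((a, b, jaccard))
    return flagged


# Hypothetical data: identical lists should be flagged, distinct ones not.
submissions = {
    "student_a": {"Smith 2022", "Jones 2021", "Lee 2023"},
    "student_b": {"Smith 2022", "Jones 2021", "Lee 2023"},
    "student_c": {"Brown 2020", "Davis 2023"},
}
print(reference_overlap(submissions))
```

A high overlap score is only a starting point for a conversation with the students, not proof of fraud: identical references can also result from a small body of relevant literature.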
- Shift assessment criteria towards the process instead of the deliverable. Make sure that the assessment is valid and transparent for students.
- Version control: Track the progress of students through version control in e.g. Word or GitLab. Are they processing feedback proactively?
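For assignments handed in via Git (e.g. on GitLab), the commit history gives a quick view of each student's working process. A sketch, run here in a throwaway demo repository (the author name, e-mail address, and commit messages are hypothetical; in practice you would run the `git log` commands inside the student's clone):

```shell
# Demo repository so the commands below have something to show.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name='Student A' -c user.email='a@example.com' \
    commit --allow-empty -q -m 'step 1: problem analysis'
git -c user.name='Student A' -c user.email='a@example.com' \
    commit --allow-empty -q -m 'step 2: first draft after feedback'

# One line per commit: date, author, subject. A single large commit just
# before the deadline, with no earlier history, is worth a closer look.
git log --pretty=format:'%ad %an %s' --date=short

# Commits by one author only, to see an individual student's contribution:
git log --author='a@example.com' --oneline
```

As with reference cross-checks, a thin or bursty history is a prompt for a supervision conversation, not evidence of fraud on its own.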
Take fraud detection measures and report suspicions of fraud to your Board of Examiners:
- Oral authenticity check: Do an oral authenticity check (4a) to check whether it is likely that the student produced the text themselves. This should either be a random check, or based on justifiable parameters, to prevent bias.
- Use an AI detector, like GPTZero, to investigate the probability that an AI chatbot was used. Significantly high(er) scores should be reported to the Board of Examiners as possible fraud (if the use of AI chatbots was not allowed or not properly attributed).
- Check the transfer of skills & knowledge: Consider adding a written exam to a project in which students have to answer questions on a case that is similar to their project. That way, you can test the students’ ability to transfer their knowledge to another situation. Additionally, this aids retention of knowledge & skills, especially if you discuss the exam in class. Carefully consider the timing and the weight of the exam (consider making the exam pass/fail) to prevent students from focussing on the exam instead of on their project. Adding assessments adds to the study load and is not permitted during the academic year without the permission of the Board of Examiners (and the programme director).
Rethink your course’s assessment plan. If necessary, adjust the learning objectives. This doesn’t necessarily mean that the taxonomy level should be increased, since this could lead to an increase in the difficulty and study load of the course. Consider the relation to other courses in your decision.
Keep in mind that these changes require a study-guide adjustment and will therefore have to be approved by your programme director, the Board of Studies and the Faculty Student Council (and, during the academic year, in special circumstances by the Board of Examiners; see here).
Need support?
Get in touch with us! We are happy to help.
+31 (0)15 27 84 333