Guidance for Academic Schools Relating to PGT Dissertations and the use of Generative Artificial Intelligence Tools

Whereas traditional AI tools, which are based upon pre-defined rules and algorithms, can be used to identify patterns within a training data set and make predictions, generative AI uses machine learning algorithms to create new content or generate new ideas based on the patterns and characteristics it has learned from existing data. Generative Artificial Intelligence (AI) tools can therefore be used to create new content, including text, computer code, images, and audio, and are designed to produce, in response to human commands and inputs, original content that replicates the style of the data on which they were trained. Generative AI tools include, but are not limited to, OpenAI’s GPT-3.5, GPT-4, and ChatGPT, Google’s Gemini, Microsoft’s Bing Copilot, and Grammarly GO.

General

  1. Generative AI tools are becoming increasingly accessible and, like other higher education institutions, we are continuing to work to understand their impact on teaching, learning, assessment, and support practices, and how students are using them.
  2. This guidance has been produced to assist supervisors of postgraduate taught (PGT) dissertations to ensure that students are clear about the implications of generative AI for their projects, and to help in mitigating its effects. It is based upon guidance first circulated in 2022/23 and applies to all PGT dissertations completed by all registered students within the University.
  3. Longer-term, this guidance is intended to encourage Schools to consider the future arrangements for their PGT dissertations, and the potential impacts of generative AI upon them, ahead of future academic years. It therefore forms part of a wider programme of work that is currently underway related to student projects and dissertations in a generative AI-enabled world.
  4. The guidance here aligns with the University’s Framework for the Introduction and Use of Generative Artificial Intelligence within Teaching, Learning and Assessment and the University’s Code of Practice on Academic Integrity.
  5. Should Supervisors wish to seek further advice or guidance, they should contact, in the first instance, either their Head of Education or their School’s Academic Integrity Officer.

Use of Generative AI Tools

  1. Students should not submit, within any part of their PGT dissertation, material or content that has been generated by AI tools unless their use has been specifically permitted by the School.
  2. Where the use of generative AI tools is permitted, the University’s Framework for the Introduction and Use of Generative Artificial Intelligence within Teaching, Learning and Assessment must be followed, and students must be required to reference its use appropriately.
  3. Section A1.6 of the Code of Practice on Academic Integrity has recently been updated to provide clarity on the potential role of generative AI as a proofreading tool within essays, projects and dissertations. Where such use is made of generative AI tools, students must notify their dissertation Supervisor and include a clear statement of its use within a dedicated acknowledgments section:
  4. “When used to provide proofreading support, generative AI must not be used to:
    1. Alter text to clarify and/or develop the ideas, arguments, and explanations.
    2. Correct the accuracy of the information.
    3. Develop or change the ideas and arguments presented.
    4. Translate text into, or from, the language being studied.
    5. Reduce the length of the submission so as to comply with a word limit requirement.
    It may only be used to offer advice and guidance on:
    1. Correcting spelling and punctuation.
    2. Ensuring text follows the conventions of grammar and syntax in written English.
    3. Shortening long sentences or reducing long paragraphs without changes to the overall content.
    4. Ensuring the consistency, formatting and ordering of page numbers, headers and footers, and footnotes and endnotes.
    5. Improving the positioning of tables and figures and the clarity, grammar, spelling and punctuation of any text in table or figure legends."
  5. All PGT Students should be reminded, via an email from the School’s lead for PGT, a clear and visible statement on the dissertation Canvas page, and at a meeting with their Supervisor, of the School’s policy on the use of Generative AI within dissertations. This should include:
    1. Reminding students of the University’s Code of Practice on Academic Integrity, including where to access it in full, which specifically references the use of generative AI tools and their role in providing proofreading support, and of the need to ensure that submitted work is their own.
    2. Reference to some of the known issues or limitations of using generative AI, including: inaccuracy and misrepresentation; potential for biased ideas; potential for the inclusion of plagiarised or copyright content; and uncertainty over the ownership of intellectual property.
    3. Reminding students that whilst they may use spelling and grammar checking tools which use AI, such as Grammarly (not Grammarly GO) and those integrated within Microsoft Word, generative AI tools go beyond the acceptable use permitted above.
    4. Alerting students to the need to be cautious if using generative AI to search for information or identify literature sources.
    5. Where the use of generative AI tools is allowed, informing the students of how its use should be appropriately acknowledged and referenced within their work.
    6. Where the use of generative AI tools is not allowed, informing students of the need to include a clear statement at the start of their dissertations, confirming that other than for proofreading support, no AI generated material is included within their dissertation.
    7. Directing students to the University’s Student guidance on using Generative Artificial Intelligence tools ethically for study.

Mitigating the Effects of Generative Artificial Intelligence

  1. The following are ideas and suggestions that Schools may wish to consider in relation to their PGT dissertations.
  2. An effective strategy to help Supervisors in mitigating the effects of inappropriate use of generative AI involves becoming increasingly familiar with the written work of each student. Strategies to consider include:
    1. Staging or staggering tasks associated with the production of the dissertation, so that students are required to complete a series of smaller activities which, when combined, form their dissertation. Supervisors should ensure each task builds upon previous work.
    2. Ahead of each supervision meeting, requiring students to upload the current draft of their project. This allows progress to be tracked, provides a record of that progress, and helps Supervisors develop increased familiarity with the student’s work.
    3. Requiring students to upload, between supervision meetings, a bullet-point commentary (or, as an alternative, a video or podcast) detailing the actions they have undertaken, for example the sources of information they have used to inform their ideas or research.
    4. Relatedly, asking students to submit, immediately after each supervision meeting, a short but specific plan of the actions they will complete before the next.
    5. When providing feedback, requiring specific activities or tasks associated with the project to be completed ahead of the next supervision meeting. Progress should then be checked and discussed with the student at that meeting to determine their understanding and application of the ideas.
  3. Consider asking students to submit, ahead of an initial supervision meeting, a linked reference list of the material or sources they intend to include within their dissertation. Alternatively, Supervisors might want to specify material they expect to see included within the final submission.
  4. Working with your Dissertation Co-ordinator and Head of PGT, consider revising the weighting of criteria within your marking guides and assessment rubrics. These can be adjusted to place a greater emphasis upon criteria that assess learning beyond what AI tools are currently capable of producing. Factual accuracy and correct referencing to appropriate scholarly sources could be given an enhanced weighting, along with requiring all references to be hyperlinked.
  5. Generative AI tools are trained using a specific dataset and so may not be accurate when applied to more recent, localised, or contextualised scenarios. Consider adding a Birmingham-specific context, for example one related to a particular taught module the students have studied or to a research activity or application, or requiring students to use a specific dataset or case study to underpin their project.
  6. Consider holding joint supervision meetings with several students to allow them to discuss their ideas and work, or embedding a peer review component that encourages students to submit draft work for feedback. This also offers students an opportunity for enhanced feedback.
  7. Supervisors should ensure they keep a written record of each supervision, including a summary of student progress and agreed actions relating to future plans and the overall academic direction of the project.