Note: this document is updated regularly, so please check back for the latest version.
This is v1.8 updated on 29 May 2025
Changes
- Added security advice on AI note-taking bots provided by the University (1.8)
- Changed text "line manager/head of department" to "head of department/college officer" (1.7)
- Added a box to the text showing what users need to do before using AI (specifically getting agreement from their line manager/head of department and approval from IT and Digital Services) and added this text to the policy (1.6)
- Changed the name of the paid for version of Microsoft Copilot (it is now called Microsoft 365 Copilot) (1.5)
- Clarified that the definition of users that the policy applies to includes others such as external Directors of Studies and contractors as well as staff and fellows (1.5)
Introduction
AI tools are all around us today. Used safely, they can make our working lives easier – but there are risks we need to consider in order to use them safely.
Scope
This policy and guidance cover any use of artificial intelligence (AI) software and services on college business. The term ‘users’ includes any staff, fellows or others (such as external Directors of Studies or contractors) involved with College business. This does not include any academic usage for teaching or research – guidance is available from the University for this type of usage.
The term AI is used in this document, but this document only concerns the use of generative AI (GenAI). This typically describes any AI which can be instructed to generate content.
University AI Guidance
The University has issued guidance on the use of AI for administrative purposes which must be followed by users.
Key Risks
The University guidance identifies the following key risks:
- Bias, misinformation and inaccuracy
- Data protection law non-compliance
- Intellectual property and copyright law non-compliance
- AI legislation non-compliance
- Reputational damage and other risks
College Guidance
BEFORE you use AI in the workplace for college business: get agreement from your head of department or college officer and approval from IT and Digital Services.
AI is a useful tool in the workplace but, like all tools, needs to be used carefully. If you aren’t sure about anything, the IT Support Team is always happy to advise you. It’s really important to ask before you try anything, as it is not easy (and sometimes impossible) to put things right afterwards.
There are some key things to be careful about.
- Make sure you don’t paste text or upload documents containing personal information or confidential information to any AI tool unless you know it is approved by IT and Digital Services
For example – say you have a spreadsheet of student data that you want to format in a particular way. AI tools might be able to do this, but you must not upload the data into something like ChatGPT as that would be putting personal data at risk.
- Make sure you don’t paste text or upload documents, photos or videos where we might want to protect our intellectual property from being used to “train” AI. The same is true if we are using content (for example photographs) that is the intellectual property of someone else. They might have given us permission to use it, but not to train an AI tool. You can only do this if the tool is approved by IT and Digital Services.
For example, an AI tool might be able to help edit a photo that you are using for a presentation that we had taken as part of a professional photo shoot. You must not upload this to an AI tool as it might be used to “train” that tool meaning that bits of the photo might appear when someone else asks the tool to create an image. This puts at risk our intellectual property and possibly any agreements we made with the photographer.
- If you are using AI to make a transcript and/or automatic notes during a meeting, make sure you actively inform and ask all participants for their consent before you do this. You must only use approved tools for this purpose, and you should delete any recordings and transcripts as soon as you don’t need them. Remember, you should ask any external participants for consent too (for example a supplier we are working with).
- When running Teams meetings, amend the security settings in Teams to prevent AI note-taking bots from joining your meetings, to protect the security of your meetings and any confidential information you may discuss or show on screen. An AI note-taking bot is a program that can record online meetings, take screenshots and automatically take notes. You may not be aware that a bot has joined your meeting. Follow the step-by-step instructions provided by the University.
College Policy
University Guidance
Users must consider relevant risks, and whether they are applicable to the circumstances in which they are considering use of AI, before any use of AI is initiated. The University guidance that can be found at https://www.information-compliance.admin.cam.ac.uk/data-protection/guidance/ai-guidance must be followed as well as this policy and the college guidance attached.
Users must get agreement from their head of department or college officer and approval from IT and Digital Services before using AI tools for college business.
Personal Data and confidential information
Users must not upload any personal or confidential information into an AI tool unless the tool is specifically approved for use by IT and Digital Services. Services will only be approved for use following a rigorous security assessment.
Intellectual Property
Users must not upload digital assets or information that the college might reasonably wish to protect its intellectual property interest in (or the interests of a third party), unless the tool is specifically approved for use by IT and Digital Services. Services will only be approved for use following a rigorous security assessment.
Meetings and Transcripts
Users must:
- Inform and seek consent from all present to use AI tools to record and transcribe meetings, and only do this using Microsoft Copilot when logged into a University Office 365 account
- Only keep transcripts and recordings until minutes for the meeting have been prepared or for a period of no more than 3 months for meetings that have no minutes
- Ensure that AI note-taking bots are not permitted to join Teams meetings; to do this, follow the instructions provided by the University
Approved Services
The following services are currently approved under this policy:
| Service | Conditions |
| --- | --- |
| Microsoft Copilot and Microsoft 365 Copilot | Only when logged into a University Office 365 account |
| Google Gemini | Only when logged into a University Google Workspace account |
| Canva | Only when logged in using an account supplied by IT and Digital Services |
| ChatGPT | Only when logged in using an account supplied by IT and Digital Services |
Note: The version of Copilot accessible to most Office 365 users is called “Microsoft Copilot”. A more comprehensive version called “Microsoft 365 Copilot” is also available; a licence for this needs to be provided by IT and Digital Services. The University guidance has more information about the difference between the two. In all cases you must be logged into your University Office 365 account.
James Hargrave
Director of IT and Digital Services
29 May 2025