AI Policy for The Norman Family Trust

1 – Purpose

This policy sets out how The Norman Family Trust (NFT) uses artificial intelligence (AI) responsibly and lawfully in its operations. As a UK-based charitable funder, we are committed to ensuring that AI is used ethically, transparently, and in compliance with all applicable laws, including the UK GDPR and Data Protection Act 2018.


2 – Scope

This policy applies to all staff and trustees who may use AI tools on behalf of the organisation. It covers:

  • Administrative and communication tasks
  • Publicity and external communications
  • Internal operational support
  • The use of AI in grant applications submitted to the charity. We have also produced separate Guidelines for Applicants on the use of AI in grant applications to the NFT.

This policy does not cover grant assessment, as we do not currently use AI for that purpose. The trust will evaluate the use of AI tools in assessment at a future date.


3 – Our Use of AI

The Norman Family Trust may use AI tools for:

  • Drafting or editing publicity materials, website content, reports, and other communications
  • Summarising non-confidential documents
  • Administrative support such as scheduling assistance or template creation
  • Researching publicly available or sector-level information

We do not use AI to:

  • Assess or score grant applications
  • Make decisions about funding
  • Process personal data in ways that could meaningfully affect individuals without human oversight

All funding decisions will always be made by humans.


4 – Data Protection and Privacy

When using AI tools, we adhere to the UK GDPR and Data Protection Act 2018. This means:

4.1 – Personal Data

  • Personal data held by The Norman Family Trust must not be entered into AI systems unless the tool has been assessed for UK GDPR compliance and appropriate data processing agreements are in place.
  • Staff must not input sensitive personal data (e.g., health, ethnicity, financial hardship information) into AI tools.
  • Any data shared with AI tools must be minimised, anonymised where possible, and handled securely.

4.2 – Transparency and Accountability

  • We remain responsible for all outputs produced using AI tools.
  • Human oversight is required for all content generated by AI before publication or external sharing.

5 – Ethical Use

We commit to:

  • Ensuring AI use aligns with our charitable mission and values
  • Avoiding AI-generated content that may be misleading, discriminatory, or harmful
  • Clearly distinguishing between human and AI-authored content where appropriate

6 – Security

  • Staff may only use AI tools approved by the organisation. Approved tools currently include ChatGPT, Microsoft Copilot, and Adobe Acrobat AI.
  • AI tools must meet our security expectations, including not using data for training without consent.
  • Access to AI systems must be appropriately managed and monitored.

7 – Review and Updates

This policy will be reviewed annually or sooner if:

  • UK law or regulatory guidance on AI changes
  • Our use of AI significantly expands
  • Risks or concerns are identified through internal review

8 – Contact

Questions about this policy or the organisation’s use of AI should be directed to:
Emma Le Poidevin, Grants Administrator – emma.lepoidevin@nfct.org

Last Reviewed – March 2026