AI Use Guidance for Applicants

1 – Purpose of This Guidance

This document explains how organisations applying to The Norman Family Trust (NFT) may use artificial intelligence (AI) tools when preparing their grant applications. It aims to support applicants in using AI responsibly, while ensuring that applications remain truthful, accurate, and representative of the organisation’s real activities and needs.


2 – Principles for Using AI in Grant Applications

2.1 – AI as a Support Tool, Not a Substitute

Applicants may use AI tools to assist with:

  • Drafting or editing written sections
  • Clarifying language or structure
  • Summarising non-confidential documents
  • Generating ideas or improving readability

However, AI tools should not replace the applicant’s own understanding, programme design, financial planning, or strategic intent.


3 – Safeguards and Responsibilities

3.1 – Accuracy and Truthfulness

Applicants remain fully responsible for the accuracy of all information submitted.

  • AI-generated text must be reviewed and verified by a human within the organisation.
  • All facts, figures, claims, and descriptions of need or impact must be confirmed as true.
  • Applicants should avoid generic narratives that do not reflect their real community, beneficiaries, or organisational capacity.

3.2 – Protecting Confidential and Personal Data

To comply with UK GDPR and good data practice:

  • Do not input personal data about staff, volunteers, trustees, beneficiaries, or service users into AI tools.
  • Do not include sensitive data (e.g., health, ethnicity, financial hardship, safeguarding information) in prompts to public AI tools.

3.3 – No Fabrication of Evidence or Impact

Applicants must not use AI to:

  • Invent data, testimonials, outcomes, or evidence
  • Simulate service-user voices
  • Create misleading images, statistics, or financial projections
  • Claim organisational capabilities that do not exist

All information must reflect real activities and achievable plans.

3.4 – Transparency within Grant Applications

Via the NFT website, applicants will be asked to note in their application whether AI tools were used (e.g., “Some sections drafted with AI and reviewed internally”). This does not affect assessment but promotes openness.


4 – Ownership and Voice

Applications should represent the organisation’s authentic voice and mission.

  • AI-generated text should be adapted to ensure it reflects the organisation’s values, tone, and community context.
  • Senior staff or trustees should approve the final version before submission.

5 – Ensuring Representativeness

Organisations should take steps to ensure AI does not distort their message:

  • Avoid over-polished or jargon-heavy AI-generated language that might not reflect real practice.
  • Cross-check that descriptions of need, beneficiaries, methods, and outcomes match the organisation’s actual work.
  • Verify that AI tools have not introduced errors, stereotypes, or assumptions about communities.

6 – Accountability

The applicant organisation:

  • Retains full responsibility for all submitted material, regardless of whether AI was used
  • Confirms that the application is complete, honest, and based on accurate information
  • Understands that the funder may request clarification or evidence for any claims made

7 – Prohibited Uses of AI in Applications

Applicants should not use AI to:

  • Write entire applications without human oversight
  • Generate financial data, monitoring results, or impact data without a factual basis
  • Represent artificial or hypothetical beneficiaries as real
  • Create misleading images, reports, or organisational credentials

8 – Support

This guidance is not meant to restrict the use of helpful tools but to ensure fairness, transparency, and accuracy across all applications. Applicants with questions about responsible AI use may contact:
Emma Le Poidevin, Grants Administrator – emma.lepoidevin@nfct.org


AI in Grant Applications: Quick Dos and Don’ts

✅ DO

  • Use AI to improve clarity, grammar, and structure
  • Verify every fact, figure, or statement produced by AI
  • Ensure the final application sounds like your organisation, not a generic template
  • Anonymise any information before entering it into AI tools
  • Use AI for admin support (summaries, outlines, readability)
  • Have a staff member or trustee check the final application

❌ DON’T

  • Enter personal or sensitive data into public AI systems (GDPR risk)
  • Use AI to invent evidence, statistics, outcomes, or user stories
  • Allow AI to decide your project plan, budget, or strategy
  • Submit AI-written text without careful human review
  • Use AI to create misleading images or claims
  • Let AI overshadow your authentic community voice