AI tools are becoming increasingly accessible and offer new possibilities for supporting the grant writing process. A recent report, Use and impact of artificial intelligence in the scientific process (Publications Office of the EU), highlights how ERC-funded researchers are using artificial intelligence (AI) in their scientific work and how they see its potential impact by 2030.
Research Funder Policies on AI Use in Grant Writing
As artificial intelligence (AI) tools become more prevalent, research funders are developing policies to address their use in grant applications. Here's an overview of current policies from major funders:
UK Research and Innovation (UKRI)
UKRI has established a policy on the use of generative AI in funding applications and assessment (Generative artificial intelligence in application and assessment policy – UKRI):
- Applicants must apply caution when entering information into generative AI tools to develop an application. Sensitive or personal data of others must never be input into a generative AI tool without formal consent from the individual.
- Applicants should consider the risk of bias when using outputs from the generative AI tool or model and consider mitigation.
- Applicants must apply caution when using outputs from generative AI tools to develop applications and ensure that the application does not contain any information that is confidential and used without consent, falsified, fabricated, plagiarised or misrepresented.
- Applicants are expected to be transparent about their use of generative AI tools in developing applications.
- Reviewers are explicitly prohibited from using generative AI tools to develop their reviews of funding proposals.
European Commission
The European Commission, together with the European Research Area countries and stakeholders, has put forward a set of guidelines to support the European research community in the responsible use of generative artificial intelligence (AI).
The ERC has issued a statement on AI use in grant writing (Current position of the ERC Scientific Council on AI | ERC):
- The ERC recognizes that scientists may use AI for brainstorming, literature searches, and text revision.
- However, authors retain full responsibility for their proposals, including acknowledgments, avoiding plagiarism, and maintaining good scientific conduct.
Funders' joint statement: use of generative AI tools in funding applications and assessment
The Research Funders Policy Group includes the Association of Medical Research Charities (AMRC), the British Heart Foundation (BHF), Cancer Research UK (CRUK), the National Institute for Health and Care Research (NIHR), the Royal Academy of Engineering, the Royal Society, UK Research and Innovation (UKRI) and Wellcome.
They issued a joint position statement: Use of AI tools in funding applications (Wellcome).
Individual funders also publish detailed guidelines for applicants. For example, the Royal Academy of Engineering (RAEng) states:
- Applicants must take full responsibility for all content in their applications.
- The use of generative AI tools is not penalized but must be clearly acknowledged.
- Applicants should exercise caution to avoid including AI-generated inaccuracies or "hallucinated" references.
- It is not acceptable to rely solely on AI tools to write entire applications.
- Proper attribution of sources is essential, as AI-generated content may inadvertently include ideas from other authors.
General Considerations for Applicants
When considering the use of AI in grant writing, applicants should:
- Prioritize transparency and disclose any AI tool usage.
- Ensure the application reflects their own voice and ideas.
- Verify the accuracy of AI-generated content, especially for technical or field-specific information.
- Use AI as a supplementary tool rather than relying on it entirely.
- Be aware of potential intellectual property issues when inputting novel ideas into AI systems.
As AI technology evolves, funder policies are likely to adapt. Applicants should always check the most current guidelines from their specific funding bodies before submitting proposals.