Funding for research on AI risks

I appreciate that you can't open a newsletter or a social media feed without someone thrusting AI-generated pearls of wisdom at you, or offering "thoughtful" guidance on how not using AI will be the death of your business, but this edition of #FF brings news of new Government Grants!

Whether we embrace the benefits of AI or not, the risks that AI brings to our everyday lives are significant.


Researchers in the UK are being offered new funding opportunities to explore ways of making society more resilient to the risks posed by Artificial Intelligence (AI). These risks include emerging threats such as deepfakes, misinformation, and cyber-attacks, and the funding is intended to support work aimed at ensuring AI’s safe and responsible use.

This initiative, launched last week, is a collaboration between the government, the Engineering and Physical Sciences Research Council (EPSRC), and Innovate UK, which is part of UK Research and Innovation (UKRI). The initiative is focused on exploring how AI systems can be made safer, and will also support research to tackle the threat of AI systems failing unexpectedly, for example in the finance sector.


AI is considered to hold significant potential to drive long-term economic growth and improve public services. However, it also brings risks, including system failures and misuse. The government is keen to maximise the benefits of AI across the UK economy, and is therefore looking at ways to ensure that, as AI is adopted across different industries, it remains safe, reliable and trustworthy.

Grant opportunities

The Systemic Safety Grants Programme, overseen by the UK’s AI Safety Institute, has opened applications for its first phase, which is set to distribute up to £4 million in funding.

This programme is part of a broader £8.5 million fund that was first announced at the AI Seoul Summit in May, so further phases of grant funding will become available in the future.

Here are the key details:

  • Who can apply: UK-based organisations can apply, but projects may include international partners.
  • Funding: Up to £200,000 per project for around 20 selected projects in the first phase.
  • Focus areas: The opening phase aims to deepen understanding of the challenges AI is likely to pose to society in the near future.
  • Application deadline: Proposals must be submitted by 26th November 2024.
  • Timeline: Successful applicants will be notified by the end of January 2025, and grants will be awarded in February 2025.


For researchers interested in contributing to the future of AI safety, this funding could present a significant opportunity, so please share this news far and wide.

For more information and guidance on how to apply for the grant scheme, visit https://www.aisi.gov.uk/grants.

Mark Kreling

Director, Fenland District Brokers Ltd

Talking to a lawyer yesterday, I was told that AI can invent case law, which makes it dangerous to rely on.
