[Image: a group of people, with representation from marginalised communities, sitting around a large wooden table in a modern boardroom for a meeting. Photo by Christina @ wocintechchat.com]

It’s Trustees’ Week, a time to celebrate charity boards. 2024 marks my sixteenth year of sitting on various charity boards and undertaking other non-executive roles in the sector. This Trustees’ Week also falls two years after ChatGPT arrived, kickstarting a wave of change and innovation in how charities are using artificial intelligence (AI).

I’ve already seen how AI has changed my role as a charity trustee. On the board at Charity Digital, AI is an area of focus, both in terms of how we can best support the sector through the charity’s work and in how trustees and staff are themselves using AI. Reflecting on it now, that feels like a shift in my brief as a trustee. The role feels more challenging, but also more exciting. I can’t be the only trustee affected by this.

AI is changing our sector

Let’s be frank about the disconnect here. The Charity Commission’s latest trustee survey shows only 3% of trustees report their charity using AI. Even among larger charities (£1m+ income), that figure crawls up to just 8%. If those numbers sound surprisingly low, you’re right – and that’s exactly why we need this conversation.

Our own Charity Digital Skills Report found that 61% of charities are using AI, a huge increase from 35% last year. Charity Excellence’s helpful Charity AI Benchmarking Survey 2024 found that a similar number (60%) were using AI.

I suspect that many charity trustees are simply not aware that their charity is using AI. That worries me, because without this knowledge they cannot provide the right strategy, scrutiny and support.

If your board needs to understand more about how your staff and volunteers are using AI, it could use our AI Checklist for Charity Trustees and Leaders to structure the conversation. We launched it during last year’s Trustees’ Week.

We will be launching more resources in 2025 to help charity trustees and leaders with AI: we are currently working with Charity Digital to update The Charity Digital Code of Practice to include guidance on artificial intelligence.

AI policies: beyond the template

Earlier this year the Charity Commission published a useful blog suggesting that charities might consider having an AI policy as their use of these tools develops further. After spending two years writing and reviewing AI policies for charities of all sizes, I’ve spotted patterns in what works and what doesn’t. Here are the key things you need to know.

Firstly, I think your policy should be based on evidence. As both the Charity Digital Skills Report and Charity Excellence’s Charity AI Benchmarking Survey 2024 show, use of AI is growing rapidly across the sector. Talk to your staff, volunteers and, yes, your trustees to find out how they are using AI, where they need support with it, and what hopes and fears they may have for AI’s role in your charity’s future. If your charity is small, this might be a conversation around the kitchen table. If you’re a medium or large charity, I would undertake a survey and run a small number of interviews with staff across your charity so you have robust data and insights to inform your policy.

The scope of your policy also needs careful thought. I have seen some with a very tight remit, for example focusing solely on generative AI rather than other forms of artificial intelligence, or majoring on which tools can and can’t be used. Don’t make your policy too narrow. AI needs to be looked at holistically, with time taken to assess its impact on your operations, your governance, your staff, your ways of working, and many other areas. To get started on this, you could use the AI Checklist for Charity Trustees and Leaders, which covers the questions you need to ask about AI adoption.

Equally, don’t make your AI policy too abstract. Sure, ChatGPT, Claude and the other LLMs can generate a first draft of a policy. But my experience is that you’ll need to give these tools very detailed prompts and plenty of useful information, and have the time and patience to iterate, otherwise you may get a generic template as the output. What they come up with can be a useful first draft, but it will need shaping to fit your charity. This policy is going to need to do a lot of heavy lifting as innovation accelerates.

In AI, context is everything, because AI is a change programme as much as a radical shift in how we use technology. Your trustees need to understand this fundamental truth: AI isn’t just another IT project. It’s about cultural change, strategic direction and, ultimately, how your charity will stay relevant and have impact in a rapidly evolving digital landscape.

The questions your board needs to ask now

One of the most valuable things you can do as a trustee is ask the right questions. These might be the things that other people are scared to ask, or the question that sounds silly or obvious to you but turns out to be something others are worried about.

As use of AI grows in our charities and across our sector, we need to adopt it responsibly. I wrote a piece for Third Sector a while back about how this means implementing it carefully, testing it safely, and understanding how data will be used, including assessing any risks involved. 

We all know the pressures that the cost of living crisis has created in the sector. These issues are going to get worse because, at the time of writing, charities are battling the potential £1.4 billion overhead of the increased National Insurance contributions announced in the budget. 

Right now, it may be very tempting to ask if AI could save your charity money. It might, but you’ll also need to evaluate whether ramping up your use of AI is the right thing to do given how your charity operates and what its mission is, and whether it fits with your values. When planning how to scale your charity’s use of AI, start by asking ‘In what ways might AI help us increase our impact?’ rather than starting with how it might save you cash, however tempting that is. Asking this question, and gathering the evidence to support your ideas on how your charity might use AI, will help you stretch your resources further by identifying where best to focus your money and your time.

This discussion might well get you thinking about your decision-making criteria, and the principles which underpin them. It’s likely that you’ll have to review these as AI develops further. So why not start thinking about this sooner rather than later, so that you are on the front foot? The pace of change in AI is not going to slow down anytime soon.

The delegation dilemma: who’s really making AI decisions?

As we saw earlier, AI adoption across the sector grew hugely this year. Within this, there is a lot of delegation going on that trustees may not be aware of. 

Assuming your charity is of a size where management of the charity and its operations has been delegated to staff, it’s likely that AI is already involved in decisions being made in your charity.

Here’s what’s keeping me up at night: the growing gap between what trustees think is happening with AI in their charity and what’s actually happening on the ground. Your staff are probably already using AI tools, whether that’s drafting social media content, analysing service user data, or even writing policy documents. Whilst delegation of day-to-day operations is vital for many charities, AI brings new considerations to the table.

Consider this scenario: your fundraising team starts using AI to personalise donor communications and predict giving patterns. On the surface, this looks like routine operational activity. But what if the AI is making assumptions about donors that don’t align with your values? Or what if your service delivery team is using AI to triage service users without clear ethical guidelines? These aren’t just operational decisions anymore; they’re governance issues that need trustee oversight. It’s time to redraw those delegation boundaries with AI in mind.

I’m not trying to warn charities off using AI: quite the opposite. But as AI has quietly become part of business as usual for many of the charities I work with, it’s worth taking a step back to understand how your team are using it and what they’ve learned, and to make sure your values underpin the next steps in your charity’s AI implementation. That will help you adopt it more safely and sustainably.

Building your board’s AI confidence

Amongst the data in this year’s Charity Digital Skills Report were some stats which show how the jobs of trustees and CEOs have changed. 31% of charities told us that they want their board to learn about emerging tech and AI tools. 39% of charities want their CEOs to stay informed on emerging tech trends, and 34% want them to understand the related risks and opportunities.

These stats tell us something important about the changing nature of charity leadership. Boardrooms and senior teams need to evolve, and fast. But this isn’t about turning trustees into tech experts. It’s about having the right mix of skills and confidence to ask those strategic questions about AI, understand the implications for your charity’s future, and support your team in using it responsibly. Whether that means upskilling current trustees, bringing new expertise onto your board, or both, now is the time to have those conversations.

From insight to action

The AI landscape is moving at pace, and trustees have a vital role to play in helping their charities navigate it. Yes, it can feel daunting. But by focusing on asking the right questions, understanding how AI is being used in your charity, and ensuring you have the right governance in place, you can help your charity harness AI’s potential whilst staying true to your values. 

The gap between perception and reality in AI adoption shows us that trustees need to engage with this now. After all, good governance isn’t about perfect knowledge. It’s about learning, being curious, and making sure we’re asking the questions that matter.