Someone types "charities supporting young carers in Yorkshire" into ChatGPT. You run one. You do not come up.

Not because your charity is doing poor work. Not because you have not heard of SEO. Because your website does not give AI tools enough clear, structured, specific information to surface you with confidence.

This is increasingly how people find services. Not just donors and volunteers. Commissioners, social workers, journalists, and grant-makers are using AI tools to research sectors and shortlist organisations. If you are not appearing, you are losing conversations before they start.

What "invisible to AI" actually means

When someone asks an AI tool a question, the tool does not browse the web the way a search engine does. It draws on a combination of its training data and, in some cases, real-time web retrieval. The key difference is this: a search engine returns a list of pages and lets the user decide. An AI tool selects sources, summarises them, and often names the organisations it has drawn from.

If your charity is not named, it effectively does not exist in that response.

Rand Fishkin at SparkToro has tracked AI referral traffic trends carefully and found that the gap between "ranking on Google" and "being cited by AI" is significant and widening. A site can be on page one of Google and still not appear in an AI-generated answer about the same topic.

How AI tools decide what to cite

The exact mechanics vary between ChatGPT, Claude, Perplexity, and Google's AI Overviews. But certain patterns are consistent across all of them.

AI tools favour sources that:

  • Are clearly identifiable, with a named organisation and location
  • Publish specific, factual content rather than broad, vague descriptions of their work
  • Have content that directly answers common questions about their field
  • Have been cited or linked to by other credible sources
  • Have consistent topical depth, not a wide range of shallow content

Perplexity in particular shows its sources explicitly. If you have ever used it, you will notice it consistently cites organisations that publish clear, structured content with named people and specific detail. Vague homepages with mission statements do not get cited. Specific service descriptions with numbers, named areas, and named contacts do.

Why most charity websites fall short

Charity websites are often written for two conflicting audiences: funders reading an annual report, and service users finding the crisis line. The result is usually copy that is too formal for one and too vague for the other.

AI tools struggle with vague copy for the same reason human visitors do. "We provide holistic support to vulnerable people across the region" tells an AI tool almost nothing. It cannot work out your geography. It cannot work out your specialism. It cannot work out who qualifies.

By contrast, "We provide emergency housing support to young people aged 16 to 25 in West Yorkshire, including Bradford, Leeds, and Wakefield" gives an AI tool something to cite when someone asks about housing charities in that area.

Kevin Indig, whose work on generative engine optimisation tracks how AI tools evaluate sources, makes the same point: the organisations that appear in AI-generated answers are almost always the ones with the most specific, structured content. Not the most famous. Not the biggest. The clearest.

A quick test you can run right now

Go to Perplexity or ChatGPT. Type in a question a commissioner or grant-maker might ask about your area of work. Something like: "What charities support domestic abuse survivors in [your county]?" or "Which organisations provide employment support to people with learning disabilities in [your city]?"

See whether you appear. If you do, note what content they cite. If you do not, note which organisations do, and compare their websites to yours.

You are likely to find that the organisations appearing have two things yours may not: a clear geographic description of where they work, and specific language about who they help and what that help involves.

Three changes that improve your chances

Be specific about your geography. "We work across the South East" is not useful. "We work with charities in Berkshire, Oxfordshire, Hampshire, and Surrey" is. AI tools use location signals to match organisations to geographically specific queries. If your site does not name the areas you serve, you will not appear for those searches.

Name the people doing the work. Anonymous organisations with generic contact pages are harder to surface credibly. A named director or lead contact signals to AI tools that there is a real, identifiable person behind the work. This also improves trust signals for human visitors.

Write content that answers questions directly. Think about the questions your referrers, funders, and commissioners actually ask. Write pages or articles that answer them specifically. "Who qualifies for our service?" "What happens after someone gets in touch?" "What does your impact data show?" These are the types of queries that pull organisations into AI-generated answers.

If you want help working through this for your own site, our SEO and AI visibility service covers exactly this.

Is your site content clear enough to be cited?

The free Charity Safety Content Audit checks whether key information is easy to find on your site: contact details, service descriptions, and who you actually help. Seven questions, under three minutes.

Take the free audit →

This is not a replacement for traditional SEO

Google is still where most search journeys start. AI visibility and search ranking are two related but separate problems. The good news is that the things that improve AI citation (clear content, named people, specific geography, and genuine topical depth) are the same things that improve search rankings.

You do not need a separate strategy for each. You need one strategy built on specificity and clarity, which benefits both.

We have written more on this in The two places your charity website needs to show up in 2026, which covers both channels together.