A major financial services company watched a prospect search for options in their category — live, in a demo — and ChatGPT recommended a smaller competitor instead of them. This company had the largest market share among its peers. It was spending the most on SEO, on digital marketing, on media. And it was invisible in the AI answer.

MIT Sloan Management Review published this story in January. It’s not a warning about what might happen. It’s a description of what’s already happening.

The reason behind it is simple, and it changes almost everything about how you need to approach AI visibility: AI search pulls from a broad ecosystem of sources, and your website is only 5–10% of it. According to McKinsey’s 2025 research on AI search behavior, owned-site content makes up a small slice of what AI-powered search actually references — the rest comes from publishers, affiliates, user-generated content, and third-party platforms.

That financial services executive’s firm had spent years building domain authority on the 5%. The smaller competitor had somehow ended up in the other 95%. That’s the entire gap.

Here’s how to find out where you actually stand — and what to do about it.

The three-question audit you can run before lunch

This is not a tool-dependent process. You can do this with the AI platforms you already have access to.

Question 1: What does AI say about you when you’re not in the room?

Open ChatGPT, Perplexity, and Google’s AI Overviews. Search for your category — not your brand name. What you’d type if you were a prospect doing research. “Best [category] tools for [use case]” or “how do companies handle [problem you solve].”

Note three things: whether your brand appears at all, how it’s described when it does, and what sources the AI cites to support whatever it says. That third piece is the one most teams skip. The sources are the actual signal.

Question 2: What sources are shaping answers in your category?

In Perplexity, you’ll see citations directly. In ChatGPT with browsing enabled, you’ll see references. In AI Overviews, you’ll see the sourced snippets. Pull out the top 10 sources appearing across multiple responses in your category.

This is the ecosystem your brand needs to exist in. If you’re present on those sites — cited, mentioned, quoted — AI engines will find you. If those sites have never referenced you, you’re not in the dataset the AI is drawing from.
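Collecting those citations by hand works fine, but if you paste the cited URLs from several responses into one list, tallying the recurring domains takes a few lines. A minimal sketch — the URLs below are placeholders, not real audit data:

```python
from collections import Counter
from urllib.parse import urlparse

# Cited URLs copied from multiple AI responses in your category
# (placeholder examples; substitute your own export)
cited_urls = [
    "https://www.g2.com/categories/example",
    "https://www.reddit.com/r/example/comments/abc",
    "https://www.g2.com/products/example/reviews",
    "https://techpublication.example/best-tools",
    "https://www.reddit.com/r/example/comments/def",
]

# Count citations per domain to surface the sources that
# recur across responses — those are the ecosystem
domains = Counter(
    urlparse(u).netloc.removeprefix("www.") for u in cited_urls
)

for domain, count in domains.most_common(10):
    print(f"{domain}: {count}")
```

Domains that show up in three or more responses are the ones shaping answers; one-off citations matter less.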

Question 3: Which of those sources is your brand currently on?

Cross-reference your list of top 10 ecosystem sources against your known presence: earned placements, mentions, citations, community contributions. For most B2B companies, the gap is significant. According to McKinsey’s research, even industry leaders’ AI search visibility can trail their SEO performance by 20–50%.

That gap is the priority list. Every source on your ecosystem list where you’re absent is a specific action item.
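The cross-reference itself is just a set difference: subtract the sources where you already have presence from the ecosystem list. A sketch with hypothetical domains standing in for your audit results:

```python
# Top ecosystem sources surfaced by the audit (hypothetical examples)
ecosystem_sources = {
    "g2.com",
    "reddit.com",
    "techcrunch.com",
    "gartner.com",
    "capterra.com",
}

# Sources where the brand already has placements, mentions, or reviews
current_presence = {"g2.com", "capterra.com"}

# The gap list: every ecosystem source with no brand presence
gap = sorted(ecosystem_sources - current_presence)

for source in gap:
    print(source)  # each line is a specific action item
```

Ranking the gap list by how often each source appeared in Question 2’s tally gives you the priority order.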

What to do with the gap list

The audit gives you a ranked list of third-party sources that are shaping AI responses in your category. The work is building presence on each of them — but the work looks different depending on the source type.

For editorial publications and trade media: Earned placements are the most durable form of presence. A well-placed article in a publication that AI engines already treat as authoritative gets pulled into responses repeatedly over time. This isn’t a numbers game — one substantive placement in the right publication does more than ten mentions on low-authority blogs. Muck Rack’s research across more than one million AI prompts found that 85% of AI citations come from earned media sources. These are the sources AI already trusts.

For community platforms and user-generated content: Reddit, Quora, G2, and category-specific forums often appear in AI citations because they’re rich with natural-language responses to exactly the kind of questions prospects are asking. If your category has active community discussions and your brand isn’t represented in them, that’s a gap worth closing. Executive participation in relevant threads, direct responses to questions your buyers are asking, and systematic review cultivation on platforms like G2 all build presence in spaces AI pulls from.

For affiliate and comparison sites: Analyst sites, review aggregators, and comparison platforms serve a structural function in AI search — they’re often the first source AI references when answering “best [category]” queries. A Yext analysis of 17.2 million AI citations found that platform citation patterns vary significantly by engine; Gemini, ChatGPT, and Perplexity each have distinct source preferences. Getting on the right platforms for each engine matters.

The mistake most teams make after the audit

They try to optimize their website.

That’s not wrong — good technical structure helps. But it treats the symptom rather than the actual problem. If your website is already well-structured and AI still isn’t citing you, it’s because AI isn’t looking primarily at your website. McKinsey’s finding is worth repeating: owned pages are 5–10% of what AI search references. The other 90–95% is the ecosystem you have or haven’t built.

Perplexity’s head of communications told Business Insider that AI search will shift budgets toward brand marketing for exactly this reason — AI removes friction and lets people buy things just by asking. When that happens, what the AI says about you is determined by where it finds you in the ecosystem, not how well-optimized your homepage is.

The brands winning in AI search right now aren’t winning because of technical SEO. They’re winning because they built enough presence in the sources AI already trusts that the AI has something credible to say about them.

Why the ecosystem approach compounds over time

Every earned placement, every community citation, every analyst mention is persistent. Once an AI engine has indexed that a credible external source describes your brand in a positive, specific context, that signal stays. It gets reinforced by additional placements in related sources. The more sources in your ecosystem that mention you, the more confident AI systems become that you’re a legitimate, citable option.

There’s a downstream effect worth understanding: when AI summaries appear in search results, click-through rates on traditional links drop from 15% to 8%, according to Pew Research Center’s July 2025 study. Users read the AI answer and stop. Which means the brands being cited inside that summary are the only ones in the room. The brands not cited don’t lose traffic — they lose the decision entirely, before the prospect ever considers clicking anything.

This is what Machine Relations — machinerelations.ai — describes as the foundation of AI-era visibility: earned media placements in the publications and platforms AI engines already trust. It’s not a new observation that PR matters. It’s that the publications which shaped human brand perception for decades are the same ones AI systems treat as their source of truth. The mechanism hasn’t changed. The reader has.

Running the audit above tells you which specific publications and platforms are acting as that source of truth in your category right now. That’s where you need to be.

If you want to see the gap quantified before you build the strategy, the AI Visibility Audit maps exactly where your brand appears — and doesn’t — across the AI engines your prospects are using to make decisions.


Sources: MIT Sloan Management Review, “Can Customers Find Your Brand? Marketing Strategies for AI-Driven Search,” January 28, 2026 (sloanreview.mit.edu). McKinsey & Company, “New Front Door to the Internet: Winning in the Age of AI Search,” October 2025 (mckinsey.com). Muck Rack / Generative Pulse, “What Is AI Reading?”, December 2025 (generativepulse.ai). Yext, “AI Citation Refresh: January 2026” (yext.com). Pew Research Center, “Google Users Are Less Likely to Click on Links When an AI Summary Appears in the Results,” July 2025 (pewresearch.org).