Tutorial · April 28, 2026

How to build an AEO tool

Track which URLs AI engines cite as answers. Find your citation gaps. Run an outreach loop. The full tool in around 150 lines.

TL;DR
AEO is about being cited as the answer. Pick 30 priority queries. Pull citations from six AI engines. Find URLs cited often that do not currently mention you. Outreach the gap. Re-measure in 6 weeks. The whole tool: ~150 lines, ~$50 a month.

SEO ranks pages. AEO ranks answers.

When ChatGPT answers a question and lists 5 sources, those 5 sources are the new top of page one. The 6th source is invisible. So the question for AEO is not "where do I rank" but "am I in the citation list, and if not, what URLs are taking my slot?"

This post walks through building that tool. Pull citations from every major AI engine, normalize the URLs, build a gap list, run outreach against it, measure the lift. The whole thing is about 150 lines of code.

What an AEO tool actually does

Three jobs, in order of importance.

Citation extraction. For each priority query and each AI engine, pull the list of URLs the engine cited as sources. Normalize them: strip tracker params, resolve redirects, drop fragments.

Citation gap analysis. Find URLs that show up across multiple queries (the "authoritative" sources) and check whether they mention you. URLs that are cited often AND do not mention you are your outreach list.

Citation rate over time. Out of N queries, how often were you cited? Plot that per surface. After outreach, watch the rate climb.

In SEO you optimize the page to win the rank. In AEO you optimize the answer to win the citation. The page is what gets indexed; the answer is what gets used.

The AEO mindset shift

Why AEO is harder than it looks

Three operational problems most teams underestimate.

Citation extraction is per-engine work. ChatGPT exposes citations differently from Perplexity, which exposes them differently from Google AI Overviews. Bing Copilot has no public API at all. Wiring up all six takes about a month per engine if you do it right.

URL canonicalization is gnarly. The same article gets cited as example.com/post, example.com/post?utm=source, example.com/post#section, and www.example.com/post. Treating those as one URL is non-trivial, and maintaining the canonicalization rules as engines change citation formats is ongoing work.
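A minimal sketch of those rules in JavaScript. The tracker-param list is illustrative, and redirect resolution is skipped here because it needs a network request:

```javascript
// canonicalize.mjs — strip tracker params, drop fragments, fold the www. prefix.
const TRACKER_PARAMS = ["utm", "utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid", "ref"];

function canonicalizeUrl(raw) {
  const u = new URL(raw);
  u.hash = "";                                   // drop #fragment
  u.hostname = u.hostname.replace(/^www\./, ""); // www.example.com -> example.com
  for (const p of TRACKER_PARAMS) u.searchParams.delete(p);
  // When every param was a tracker, URL serialization drops the "?" itself.
  return u.toString();
}
```

With this, all four variants of the example article above collapse to `https://example.com/post`.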

API ≠ UI. The citation list returned by ChatGPT's API is often different from what users see in the UI. Per our 1,000-prompt teardown, the API and UI diverge on 96% of ChatGPT queries. Measuring AEO on API-only data is measuring the wrong thing.

The picks-and-shovels argument: all three problems are already solved on MentionsAPI. UI scraping for the engines that have it. Canonicalization. API/UI delta detection. Skip the 3-month build and call the API instead.
Get the citation extraction layer
One API call returns canonicalized citations from all 6 AI surfaces. PAYG from $10. $1 free signup credit.

The build

Pick the right queries for AEO

AEO queries are intent-led and answer-seeking. The kind of queries where AI engines actually cite sources. Three patterns work well.

How-to: "how does HubSpot's email automation work," "how do I track AI brand mentions."

Comparison: "Pipedrive vs HubSpot," "Linear vs Asana for engineering teams."

Definition: "what is generative engine optimization," "what is the best CRM for SaaS."

30 queries is the right number. 10 in each bucket.
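Those queries live in the queries.json file that extract-citations.mjs imports below. A minimal shape sketch, using queries from the three buckets above:

```json
{
  "queries": [
    "how do I track AI brand mentions",
    "Pipedrive vs HubSpot",
    "what is generative engine optimization"
  ]
}
```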

Pull citations from each AI engine

Call /v1/check with mode: all_live. The response includes a citations array per surface. Each citation has a canonical URL, the position in the answer, and the surrounding text.

extract-citations.mjs
import config from "./queries.json" with { type: "json" };

async function extractCitations(query) {
  const res = await fetch("https://api.mentionsapi.com/v1/check", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.MENTIONSAPI_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ mode: "all_live", query }),
  });
  const data = await res.json();
  const out = {};
  for (const [surface, payload] of Object.entries(data.providers)) {
    out[surface] = (payload.citations || []).map(c => ({
      url: c.url,
      domain: new URL(c.url).hostname.replace(/^www\./, ""),
      position: c.position,
    }));
  }
  return out;
}

const allCitations = {};
for (const q of config.queries) {
  allCitations[q] = await extractCitations(q);
}
console.log(JSON.stringify(allCitations, null, 2));

Detect 'are we cited' per query and surface

For each (query, surface) cell, check whether the citation list contains your domain. The result is a 2D matrix you can roll up two ways.

rate.mjs
const MY_DOMAIN = "mentionsapi.com";

function citationRate(allCitations, myDomain) {
  const SURFACES = ["chatgpt", "claude", "gemini", "perplexity", "ai_overview", "bing_copilot"];
  const perSurface = {};
  let totalCells = 0, totalHits = 0;

  for (const surface of SURFACES) {
    let hits = 0, total = 0;
    for (const query of Object.keys(allCitations)) {
      const cites = allCitations[query][surface];
      if (cites === undefined) continue;
      total++;
      if (cites.some(c => c.domain === myDomain)) hits++;
    }
    perSurface[surface] = total > 0 ? hits / total : 0;
    totalCells += total;
    totalHits += hits;
  }

  return {
    overall: totalCells > 0 ? totalHits / totalCells : 0,
    perSurface,
  };
}

// `allCitations` is the output of extract-citations.mjs above
console.log(citationRate(allCitations, MY_DOMAIN));

Sample output: a 24% overall citation rate, with Perplexity strongest at 42% and AI Overviews lowest at 8%.

Build the citation gap list

Now invert the question. Which URLs are cited often but do NOT mention you? Those are your outreach targets.

gap.mjs
function findCitationGaps(allCitations, myDomain) {
  const urlCounts = new Map();
  for (const cites of Object.values(allCitations)) {
    for (const surface of Object.values(cites)) {
      for (const c of surface) urlCounts.set(c.url, (urlCounts.get(c.url) || 0) + 1);
    }
  }
  const popular = [...urlCounts.entries()]
    // compare hostnames, not substrings, so third-party pages that merely
    // contain your domain in the path are not excluded
    .filter(([url, count]) => count >= 3 && new URL(url).hostname.replace(/^www\./, "") !== myDomain)
    .sort((a, b) => b[1] - a[1])
    .slice(0, 50);
  return popular.map(([url, count]) => ({ url, citation_count: count }));
}

// `allCitations` is the output of extract-citations.mjs above
const gaps = findCitationGaps(allCitations, "mentionsapi.com");
console.table(gaps);

Output is a ranked list of "URLs cited 3+ times across your priority queries that do not mention you." Most useful URLs cluster between 5 and 15 citations. Anything over 15 is a category-defining authoritative source.

Track citation rate over time

Save the rate weekly. After 6 to 12 weeks, you have a trend line. Outreach lands by 6 weeks for Perplexity, 8 to 10 for ChatGPT and Gemini, 10 to 12 for AI Overviews. Plot the rate per surface and you can see exactly which channels your investments hit.
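A sketch of the weekly snapshot step, assuming a local JSONL file (the file name and row shape are illustrative, not part of the API):

```javascript
// snapshot.mjs — append this week's rate, then read the trend back.
import { appendFileSync, readFileSync, existsSync } from "node:fs";

function saveSnapshot(rate, log = "citation-rate.jsonl") {
  const row = { date: new Date().toISOString().slice(0, 10), ...rate };
  appendFileSync(log, JSON.stringify(row) + "\n");
  return row;
}

function loadTrend(log = "citation-rate.jsonl") {
  if (!existsSync(log)) return [];
  return readFileSync(log, "utf8")
    .trim()
    .split("\n")
    .map(line => JSON.parse(line));
}
```

Call saveSnapshot(citationRate(allCitations, MY_DOMAIN)) from a weekly cron; loadTrend() gives you the per-surface series to plot.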

What this costs

30 priority queries in a typical AEO query set
$50 monthly cost at weekly snapshots
6-12 weeks for outreach to land in citations
3+ citations before a URL makes the gap list

30 queries × $0.50 × 4 weeks = $60 worst case. With 60% cache hits, real cost is $50 a month or less. Add 20 more queries for sample-size reasons and you are still under $100.
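The arithmetic above as a tiny cost model. The 60% cache-hit rate is the article's estimate; treat it as an input, not a constant:

```javascript
// Worst case assumes every call misses the cache and is billed at full price.
function monthlyCost({ queries, pricePerCall, snapshotsPerMonth, cacheHitRate }) {
  const worstCase = queries * pricePerCall * snapshotsPerMonth;
  return { worstCase, expected: worstCase * (1 - cacheHitRate) };
}

const cost = monthlyCost({ queries: 30, pricePerCall: 0.5, snapshotsPerMonth: 4, cacheHitRate: 0.6 });
// worstCase is $60; expected lands around $24, comfortably under the $50 budget
```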

The mistake everyone makes

They measure citation rate on day one and then never measure again.

AEO is a longitudinal game. The score you compute today is meaningless in isolation. The score you computed today minus the score you computed 30 days ago is the actual signal. If outreach is working, the line goes up. If it is not, the line is flat and you should change tactics.

What separates successful AEO programs from failed ones: the ones that succeed instrument the citation rate from week zero, run outreach against the gap list for 90 days, and measure the lift. The ones that fail run outreach without measurement and have no idea whether anything worked.

Frequently asked questions

What does an AEO tool actually measure?
An AEO tool measures whether your URLs are cited as the source for AI-generated answers. Three things specifically: (1) is your domain in the citation list, (2) which page on your site got cited, (3) what fraction of priority queries cite you versus competitors. The output is a citation gap list that drives outreach and content investment.
How is AEO different from GEO?
GEO is the umbrella: brand mentions, citations, ranks, sentiment, all together. AEO is the citation-focused subset. If GEO is "are we showing up at all," AEO is "are we cited as the answer." Most practitioners use the terms interchangeably; technically AEO is narrower.
Which AI engines surface citations?
Perplexity is the most citation-heavy: typically 6-10 cited URLs per answer. ChatGPT search returns 3-5 when web mode is on. Google AI Overviews show 4-8. Gemini surfaces 3-6. Claude returns the fewest, often 0-2. MentionsAPI extracts and normalizes them all into one shape.
How do I find URLs that cite competitors but not me?
Run your priority query set. Capture every cited URL. Filter to URLs that appear across multiple queries. Cross-reference against a list of URLs that already cite you. The diff is your outreach list. Most useful URLs appear on 3+ queries.
What is the cost difference between API and UI scrape?
On MentionsAPI, mode quick (API only) is $0.02 per call. mode all_live (API plus UI scrape) is $0.50. For AEO work specifically, you want all_live because UI answers and API answers diverge on 60-96% of queries depending on the engine. Cheaping out on quick mode means measuring the wrong thing.
How long until outreach moves the citation rate?
In our tracking, new citations show up in AI engines 2 to 6 weeks after the source page is indexed. Perplexity is fastest (often under a week). Google AI Overviews are slowest (often 6+ weeks). Plan a 90-day cycle: 30 days outreach, 60 days waiting for citations to land.

Ship it next month

Week one: build the citation extraction pipeline and compute your baseline rate per surface. Week two: build the gap list and pick the top 10 URLs to pitch. Week three to ten: run outreach. Week eleven: re-measure. The lift is the answer to whether your AEO program is working.

Most teams skip the measurement step. Do not be most teams.

Nikhil Kumar
Founder, MentionsAPI

Growth marketer at the intersection of marketing, product, and technology. 8+ years across startups and scale-ups in India, Switzerland, and the Netherlands. Founder of Landkit (landkit.pro).

Build your AEO tool this month.

Citation extraction, canonicalization, and UI scraping done. $1 free signup credit. Wallet from $10. The shovel for your AEO program.