Tactical · 7 min read · 2026-05-06

What To Do When AI Gets Your Business Wrong

AI models sometimes get businesses wrong. Wrong address, wrong hours, outdated phone number, misclassified service area. And unlike Google, where you can claim your listing and push corrections, there's no AI control panel you can log into to fix the record.

That's a real problem. A customer asks ChatGPT for your address and gets last year's location. Perplexity describes you as a general contractor when you specialize in kitchen renovations. Claude recommends you but cites a phone number that's been disconnected for eighteen months.

Here's what actually happens when AI gets your business wrong — and what to do about it.

Why AI models carry wrong information

AI models don't pull from a single database. They aggregate information from training data and live web retrieval, and both can carry stale or conflicting details.

Training data is collected periodically. If you moved locations, changed your name, or updated your services in the past year or two, a training-based model may still have the older version. Retrieval-based models like Perplexity do live web searches, which helps — but only if the web has been updated. If your old address still appears on a directory you forgot about, the retrieval will surface it.

The deeper issue is conflicting signals. If your website says one thing, your Google Business Profile says another, and three old directory listings say a third thing, AI models don't know which version is canonical. They make a probabilistic judgment — and that judgment is often wrong.

The sources that feed AI models bad data

Before you can fix the problem, you need to find where it's coming from.

Old directory listings are the most common culprit. If your business moved, your website and GBP may have the new address — but if Angi, Yellow Pages, or a local chamber of commerce directory still shows the old one, retrieval-based models will surface the conflict. Some models will cite the majority version. If four directories have the old address and one has the new, you can guess what gets cited.

Cached review platforms are a second source of problems. Yelp snapshots, Facebook business pages, and industry-specific directories often lag months or years behind your current information. A phone number you ported away from, a location you closed, a service you discontinued — all of these can persist on those platforms long after you've moved on.

Your own old content can also work against you. Landing pages from a campaign three years ago that still describe a service you don't offer. A PDF brochure indexed with the wrong business category. An old press release that named an address you've since vacated. If it's on the web and getting crawled, it's in the mix.

How to fix the record

Start with the canonical sources AI models weight most heavily.

Fix your Google Business Profile first. GBP is one of the primary entity records that feeds both retrieval and training data. If your address, phone, hours, or category are wrong here, correct them now. This single record carries more weight than almost anything else in the AI visibility ecosystem.

Update your schema markup to match. The JSON-LD on your website should align exactly with your GBP — same address format, same phone, same business name, same category. When your schema and your GBP conflict, AI models see ambiguity. Ambiguity reduces citation confidence and can lead to the wrong version being selected.
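As a reference point, here is a minimal JSON-LD block using the standard schema.org LocalBusiness type. Every value shown (business name, address, phone, URL) is a hypothetical placeholder — the point is that each field should be copied character-for-character from your GBP record:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Kitchen Renovations",
  "telephone": "+1-555-014-2000",
  "url": "https://www.example.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St, Suite 4",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "openingHours": "Mo-Fr 08:00-17:00"
}
```

Place it in a `<script type="application/ld+json">` tag on your homepage and contact page, and validate it with Google's Rich Results Test before shipping.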

Audit your major directory listings. The critical ones: Yelp, Angi, BBB, Facebook Business, Apple Maps, Bing Places, and any industry-specific directories with an active profile for your business. Name, address, and phone should be identical across all of them — not close, identical. One inconsistency is noise. Five inconsistencies become the version of truth an AI model believes.
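If you're auditing more than a handful of listings, a small script keeps the comparison honest. This is an illustrative sketch, not a crawler: the listing data is hypothetical and assumed to be collected by hand or exported from each platform. It normalizes away cosmetic differences (case, punctuation, phone formatting) so only real NAP mismatches surface:

```python
import re

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    """Normalize a name/address/phone triple so cosmetic differences
    don't mask real mismatches."""
    norm = lambda s: re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()
    digits = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 digits
    return (norm(name), norm(address), digits)

# Hypothetical listings, gathered manually from each platform.
listings = {
    "website": ("Acme Renovations", "123 Main St Suite 4", "(555) 014-2000"),
    "gbp":     ("Acme Renovations", "123 Main St, Suite 4", "555-014-2000"),
    "yelp":    ("Acme Renovations LLC", "98 Old Rd", "555-014-2000"),
}

# Treat the GBP record as canonical and flag anything that diverges.
canonical = normalize_nap(*listings["gbp"])
for source, nap in listings.items():
    if normalize_nap(*nap) != canonical:
        print(f"{source}: does not match the GBP record")
```

With the sample data above, only the stale Yelp listing is flagged — the website's formatting differences (parentheses in the phone, a missing comma) are correctly treated as noise.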

Remove or redirect old pages. If your site still hosts outdated content about old services or old locations, either update the pages or 301 redirect them to current content. A page that still ranks and gets crawled will keep feeding AI models the wrong picture.
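On nginx, the redirect side of this is a few lines of server config. The paths below are hypothetical stand-ins for your own retired URLs; the `return 301` directive issues the permanent redirect crawlers respect:

```nginx
# Permanently redirect retired pages to their current equivalents
# so crawlers stop ingesting the outdated versions.
location = /old-office-location {
    return 301 /contact;
}
location = /services/discontinued-service {
    return 301 /services;
}
```

Apache users can do the same with `Redirect 301` in an .htaccess file. Either way, the redirect should point to the closest current equivalent, not a generic homepage.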

Submitting corrections directly to AI platforms

Some AI platforms have limited mechanisms for flagging incorrect information. They're inconsistent and slow, but worth using.

Google's AI Overviews are often correctable through normal GBP processes — Google draws heavily from its own data, so a GBP update tends to propagate relatively quickly. Perplexity has a feedback button on individual answers where you can flag inaccuracies. OpenAI has a form for requesting removal or correction of business data in their systems.

None of these update training data immediately. They're slow feedback loops, often measured in weeks or months. The faster fix is always the underlying source data. If you clean up the web, retrieval-based models update within days of their next crawl. Training-based models update with their next training run, which could be considerably longer.

How to confirm the fix worked

Test it directly. Open Perplexity, ChatGPT, Claude, and Gemini. Ask each one about your business. "What are the hours for [business name] in [city]?" "What does [business name] specialize in?" "Is [business name] still at [old address]?"

Don't just test once. Run the same queries a week after making corrections, then again at the thirty-day mark. Retrieval-based models should reflect your fixes within a week if the source pages have been recrawled. Training-based models will lag longer — be patient.

Keep a log of what each model says before and after. If a model is consistently wrong about something structural even after you've fixed the sources, you'll have the documentation to file a formal correction with the platform.
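The log doesn't need to be fancy — a CSV appended after each test session is enough to document a persistent error. A minimal sketch, assuming you paste in each model's answer by hand (the filename and sample values are hypothetical):

```python
import csv
from datetime import date
from pathlib import Path

LOG = Path("ai_answer_log.csv")  # hypothetical log file

def log_answer(model: str, query: str, answer: str, phase: str) -> None:
    """Append one observation: what a model said about the business,
    recorded either 'before' or 'after' the source corrections."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "phase", "model", "query", "answer"])
        writer.writerow([date.today().isoformat(), phase, model, query, answer])

log_answer("perplexity",
           "What are the hours for Acme Renovations in Springfield?",
           "Open Mon-Fri 8am-5pm",
           "after")
```

Run it before your corrections, again at one week, and again at thirty days, and the CSV becomes the evidence trail you'd attach to a formal correction request.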

The upstream fix is the durable one

Running around correcting individual AI answers is reactive and temporary. The better play is eliminating the conflicting signals that cause the problem — building a clean, consistent data foundation that every AI model draws the same conclusion from.

If you haven't done a structured audit of your business information across the web, a Signal Check from Sourcepull will surface the exact discrepancies creating the conflicting picture. It maps your NAP consistency, schema accuracy, directory profile status, and citation signals into a single score — and shows you what to fix, in what order.

Most businesses we audit have at least two or three conflicting signals they didn't know existed. Finding them is the first step to making sure AI models describe you correctly.

See how your business scores on AI platforms.

Check your score — free