Cautionary Tale · 8 min read

2,000 Broken Links and a 23% Traffic Drop

By the bee2.io Engineering Team

Illustration: a massive chain with hundreds of broken links scattered on the ground

Every good mystery starts with a crime scene and no obvious suspects. In this case, the crime scene was a Search Console dashboard showing a 23% drop in organic traffic over six months, and the suspects were... everyone and no one.

I love a good SEO mystery. This one had everything.

The Setup

About eight months ago, I was brought in to look at an e-commerce site that had gone through what their team described as a "routine site restructure." They'd reorganized their product categories, updated their URL structure to be cleaner, and added some new landing pages. Standard stuff.

Except organic traffic had been quietly bleeding out for six months and nobody had figured out why.

The client's internal team had checked the obvious things: robots.txt was fine, the sitemap was updated, no manual penalties in Search Console. A few people were quietly blaming algorithm updates. One person was convinced it was a competitor doing something shady. Someone else had a theory about their CDN.

None of them were right.

The Discovery

I started by crawling the site properly. Not a quick scan of the homepage, a real full-site crawl that follows every internal link. And that's when the number appeared.

2,247 broken internal links.

Two thousand, two hundred and forty-seven links pointing to URLs that returned 404 errors. Pages that had existed before the restructure, that had been linked to from other parts of the site, that had been reorganized or renamed or deleted without anyone updating the links pointing to them.

I've found broken links before. You find 50 and you feel pretty good about yourself as a detective. You find 200 and you know something went wrong. You find 2,247 and you just stare at the number for a while.
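If you want to reproduce this kind of audit without a commercial crawler, the core idea fits in a page of Python. Here's a minimal sketch, assuming the requests and beautifulsoup4 packages and a hypothetical starting URL; a production crawler would also need robots.txt handling, rate limiting, retries, and parallelism:

```python
# Minimal broken-internal-link crawler: breadth-first walk of one domain,
# recording every internal link that answers with a 404. A sketch only.
# Assumes `requests` and `beautifulsoup4` are installed.
from collections import deque
from urllib.parse import urldefrag, urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def find_broken_links(start_url):
    domain = urlparse(start_url).netloc
    status_cache = {}                 # URL -> HTTP status, avoids re-checking
    broken = []                       # (source_page, dead_url) pairs
    seen, queue = {start_url}, deque([start_url])

    def status_of(url):
        if url not in status_cache:
            resp = requests.head(url, timeout=10, allow_redirects=True)
            status_cache[url] = resp.status_code
        return status_cache[url]

    while queue:
        page = queue.popleft()
        resp = requests.get(page, timeout=10)
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue                  # skip images, PDFs, etc.
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            url, _ = urldefrag(urljoin(page, a["href"]))  # resolve + drop #fragment
            if urlparse(url).netloc != domain:
                continue              # external links are a separate audit
            if status_of(url) == 404:
                broken.append((page, url))
            elif url not in seen:
                seen.add(url)
                queue.append(url)
    return broken

for source, dead in find_broken_links("https://example.com/"):  # hypothetical URL
    print(f"{source} -> {dead}")
```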

How Did Nobody Notice For Six Months?

This is the question that genuinely haunts me, and the answer is embarrassingly simple: nobody was looking.

The restructure team verified that the new pages worked. They verified that the redirects they knew about were in place. They tested the checkout flow, the search functionality, the major category pages. What they didn't do was systematically verify that every internal link across the entire site still pointed somewhere valid.

The site had around 15,000 pages. During the restructure, roughly 800 of those pages were moved or renamed. Sounds manageable. But each of those pages had links coming in from multiple other pages. Some product pages were linked from 30 or 40 different places: category pages, blog posts, related products sections, promotional landing pages from old campaigns.

Nobody had a complete map of all those internal links before making changes. So when the pages moved, the links became landmines.

Why Broken Links Hurt So Much

Let me explain the mechanism, because I think most people understand "broken link bad" without really understanding why it's bad, and the full picture is more alarming than you'd expect.

First, there's the link equity drain. Internal links pass authority between pages. When a link points to a 404, that authority goes nowhere. You're bleeding link equity into a void. On a site with thousands of broken links, you're essentially running an SEO system full of holes.

Second, and this one really gets me, there's crawl budget. Search engines have limited resources to crawl any given site. Every time a search engine bot follows a link and hits a 404, it's wasting a crawl request that could have gone toward discovering or refreshing an actual page. On a large site, if a significant percentage of your internal links are broken, you're actively preventing search engines from efficiently crawling your good content.

A broken link can also effectively orphan the page it was supposed to lead to. If the only internal paths to a page go through 404s, that page might as well not exist from a crawling standpoint.

Third, there's user experience. Real humans click links. When they hit a 404 on your site, they don't stick around. They leave. And a high bounce rate on 404 pages sends signals you don't want to send.

The Performance Connection

While I was in there, I also found something that surprised me: the site's page load times had gotten noticeably worse during the restructure period. Not dramatically, but measurably. Average load time had crept up by about 400 milliseconds on key category pages.

Four hundred milliseconds sounds like nothing. But published research from major e-commerce platforms suggests that every 100ms improvement in load time can translate into measurably higher conversions, with some studies citing gains of up to a few percentage points per 100ms. The exact impact varies by site and audience, but slower pages generally correlate with lower conversion rates.

In this case, the culprit was JavaScript added during the restructure that loaded synchronously and blocked rendering. Not malicious, just careless. But combined with the broken-link disaster, the performance degradation had quietly compounded the damage.
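This class of problem is easy to check for mechanically: an external script in the head that carries neither async nor defer stalls HTML parsing while it downloads and executes. A minimal sketch of that check, with the same assumed Python dependencies as the crawler above and a placeholder URL:

```python
# Flag potentially render-blocking scripts: external scripts in <head>
# with neither `async` nor `defer` block HTML parsing while they
# download and execute. Assumes `requests` and `beautifulsoup4`.
import requests
from bs4 import BeautifulSoup

def render_blocking_scripts(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    head = soup.find("head")
    if head is None:
        return []
    return [
        tag["src"]
        for tag in head.find_all("script", src=True)
        if not tag.has_attr("async")
        and not tag.has_attr("defer")
        and tag.get("type") != "module"  # module scripts defer by default
    ]

for src in render_blocking_scripts("https://example.com/"):  # hypothetical URL
    print("render-blocking:", src)
```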

It's almost never one thing. It's usually three things that nobody noticed because they were each small on their own.

The Fix

The fix was tedious but not complicated. We exported the full list of broken links, organized them by source page, and prioritized by the authority of the source pages. High-authority pages with broken links got fixed first.

We also set up proper 301 redirects for the most important destination URLs that had changed, so any external links pointing to old URLs would still work.
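Setting redirects up is only half the job; verifying them after deployment is the other half. A minimal sketch of a verifier, assuming a hypothetical redirect_map.csv of old-URL,new-URL pairs; each old URL should answer with a 301 whose Location resolves to the mapped target:

```python
# Verify a deployed redirect map: each old URL should return a 301
# whose Location header points at its new URL. The file name and its
# old_url,new_url format are hypothetical. Assumes `requests`.
import csv
from urllib.parse import urljoin

import requests

def check_redirects(path="redirect_map.csv"):
    with open(path, newline="") as fh:
        for old, new in csv.reader(fh):
            resp = requests.head(old, timeout=10, allow_redirects=False)
            # Location may be relative, so resolve it against the old URL.
            target = urljoin(old, resp.headers.get("Location", ""))
            if resp.status_code != 301:
                print(f"NOT 301 ({resp.status_code}): {old}")
            elif target.rstrip("/") != new.rstrip("/"):
                print(f"WRONG TARGET: {old} -> {target} (expected {new})")

check_redirects()
```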

The recovery wasn't immediate. Search engines don't just flip a switch when you fix 2,000 broken links. It took about ten weeks to see meaningful recovery in organic traffic, and the full six months of loss never came back completely. Some of that link equity drain is permanent. Some of those pages that went uncrawled for six months lost rankings that took more work to recover.

The lesson isn't that this kind of damage can be repaired after the fact. The lesson is that you need to not create it in the first place.

What Should Have Happened

Before any site restructure, you need a complete internal link map. You need to know which pages link to which other pages, so when you move or rename something, you know exactly what needs to be updated.
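Usefully, the same crawl that finds broken links can produce that map; only the bookkeeping changes. A sketch of the reverse index against stand-in crawl output, so the data shape is visible:

```python
# Reverse internal-link index: target URL -> set of pages linking to it.
# The crawl loop is the same as the broken-link sketch above; only the
# bookkeeping changes. Shown here against hypothetical sample data.
from collections import defaultdict

def build_link_map(pages):
    """pages: iterable of (page_url, [link_urls_found_on_that_page])."""
    sources = defaultdict(set)
    for page, links in pages:
        for url in links:
            sources[url].add(page)
    return sources

crawl_output = [  # stand-in for real crawl results
    ("https://example.com/", ["https://example.com/widgets/"]),
    ("https://example.com/blog/post-1/", ["https://example.com/widgets/"]),
]
link_map = build_link_map(crawl_output)
# Before moving /widgets/, this is every page whose links must be updated:
print(sorted(link_map["https://example.com/widgets/"]))
```

With that map in hand, "move page X" stops being guesswork and becomes a checklist of exactly which source pages need their links updated.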

After any significant change, you need a full broken link audit. Not a spot check, a full crawl. There are multiple ways to do this. A desktop crawling tool is a popular option for technical SEOs. For smaller sites or quick checks, a browser extension that audits broken links on the live rendered page is much faster than setting up a dedicated crawl tool.

SCOUTb2's free tier checks broken links as part of its single-page audit, which means you can at least verify that any given page's outbound links are valid. For multi-page scanning across your whole site, the PRO tier can crawl up to 10,000 pages in the background and surface every detected issue at once. Either way, the point is to have eyes on this before it becomes a six-month forensics project.

Also: after any restructure, set up Search Console alerts so you're notified of crawl errors immediately. Not six months later, when the damage is already done.

The Ending

The retailer recovered about 17% of their lost traffic over the following three months. Not back to baseline, but much better. The broken links got fixed, the redirect chains got cleaned up, the JavaScript performance issue got resolved.

The detective work was satisfying. Finding 2,247 broken links that six months of internal investigation had missed is a genuinely good find. But I'd have much preferred to be there before the restructure, running the audit before the changes went live, when fixing it would have been a day of work instead of a month.

Mysteries are more fun in books than in dashboards.

Aisha is an SEO consultant who specializes in technical audits and traffic forensics. She's found a lot of broken links over the years.

Disclaimer: This article is for informational purposes only and does not constitute legal, professional, or compliance advice. SCOUTb2 is an automated scanning tool that helps identify common issues but does not guarantee full compliance with any standard or regulation.

seo · broken-links · site-audit · organic-traffic · crawl-budget · performance
