How to Find (and Fix) Broken Links on Any Website
Broken links waste crawl budget, harm user experience, and lose link equity. Here's how to find them all — and track down which pages are causing the problem.
8 May 2026 · 5 min read
Broken links are one of the most common technical SEO problems, and one of the most straightforward to fix. A broken link is any hyperlink that returns a 4xx or 5xx response - most commonly a 404 Not Found.
The problem is finding them. On a site with more than a few hundred pages, manually clicking through links is not realistic. You need a crawler.
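In code terms, the definition above is a one-line check. A minimal helper, assuming you already have a status code in hand:

```python
def is_broken(status_code):
    """True for client (4xx) and server (5xx) error responses."""
    return status_code is not None and 400 <= status_code < 600

print(is_broken(404))  # True - a missing page counts as broken
print(is_broken(200))  # False - a healthy page does not
print(is_broken(301))  # False - redirects are not broken links
```

A `None` status (connection refused, DNS failure, timeout) is treated as not-broken here, but in practice you would want to flag those URLs separately.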
Why broken links matter for SEO
Broken links hurt a site in three ways:
- Wasted crawl budget. Googlebot follows links to discover and index pages. If a significant number of those links lead to 404s, Googlebot is spending time on pages that do not exist instead of finding and indexing real content.
- Lost link equity. When a page that has external links pointing to it returns a 404, those links are effectively dead. The link equity that should flow from those external links to your site goes nowhere.
- Poor user experience. Visitors who click a broken link see an error page. Depending on how your 404 page is set up, they may have no clear path forward. This increases bounce rate and erodes trust.
Step 1: Crawl the site with Crawly
Open Crawly, click New Crawl, and enter the site's root URL. You want a full spider crawl - the default settings are fine. Crawly will follow every internal link it finds and record the response code for each URL.
Let the crawl complete. On a site of a few hundred pages this takes a minute or two. Larger sites take longer, but Crawly crawls 10 pages concurrently by default so it moves quickly.
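What a spider crawl does under the hood can be sketched in a few dozen lines. This is not Crawly's actual implementation - just a minimal illustration of the same idea: follow internal links, record each URL's status code, and fetch up to 10 pages concurrently.

```python
# Minimal spider-crawl sketch (illustrative, not Crawly's code):
# follow internal links and record each URL's status code.
from concurrent.futures import ThreadPoolExecutor
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def is_internal(url, root_host):
    # Relative URLs have no host; absolute ones must match the site.
    return urlparse(url).netloc in ("", root_host)

def fetch(url):
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status, resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError as e:
        return e.code, ""   # 4xx/5xx responses still carry a status code
    except Exception:
        return None, ""     # DNS failure, timeout, bad scheme

def crawl(start_url, max_pages=500):
    root_host = urlparse(start_url).netloc
    seen, queue, results = {start_url}, [start_url], {}
    with ThreadPoolExecutor(max_workers=10) as pool:  # 10 concurrent fetches
        while queue and len(results) < max_pages:
            batch, queue = queue[:10], queue[10:]
            for url, (status, body) in zip(batch, pool.map(fetch, batch)):
                results[url] = status
                if status != 200:
                    continue
                parser = LinkParser()
                parser.feed(body)
                for href in parser.links:
                    absolute = urljoin(url, href)
                    if (urlparse(absolute).scheme in ("http", "https")
                            and is_internal(absolute, root_host)
                            and absolute not in seen):
                        seen.add(absolute)
                        queue.append(absolute)
    return results  # {url: status_code or None}

# results = crawl("https://example.com/")  # example.com is a placeholder
```

A real crawler also handles robots.txt, redirects, non-HTML content types, and politeness delays - which is exactly why you want a purpose-built tool rather than a script.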
Step 2: Find the broken pages
Once the crawl finishes, go to the Issues tab. Look for Broken pages (4xx) in the errors section. The number next to it is the count of broken URLs found.
Click the issue to expand the full list of affected URLs. You will see every 4xx URL that Crawly found during the crawl, with its status code (404, 410, 403, etc.) and the page title if the URL returned any content.
You can also filter the Response Codes tab to show only 4xx pages if you prefer to work from there.
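If you export the crawl results and prefer to work in a script, the same 4xx filter is a one-liner. The `results` mapping below is a stand-in for an exported crawl, not Crawly's actual export format:

```python
# Stand-in for an exported crawl: {url: status_code}
results = {
    "https://example.com/": 200,
    "https://example.com/old-page": 404,
    "https://example.com/gone": 410,
    "https://example.com/private": 403,
}

# Keep only the broken (4xx) pages.
broken = {url: code for url, code in results.items() if 400 <= code < 500}
print(broken)
```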
Step 3: Find what is linking to each broken page
Knowing a page is broken is only half the job. The more important question is: which pages on the site are linking to it? Those are the pages you need to fix.
In the Pages tab, click any broken URL to open its detail panel. The detail panel shows inbound links - the pages on the site that contain a link to this broken URL. This tells you exactly where to go to update or remove the link.
If you have Crawly's MCP server connected to Claude Code, you can get this in one shot:
"Show me all 4xx errors and the pages that link to them."
Claude calls the get_broken_with_inlinks tool and returns a structured list - each broken URL alongside every page that links to it, with anchor text. A complete broken link report in seconds.
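The shape of that report is simple to picture: group every link record whose target is broken by its target URL. The record layout below (source, target, anchor text) is an assumption for illustration, not Crawly's export format:

```python
from collections import defaultdict

# Hypothetical link records from a crawl: (source, target, anchor text)
links = [
    ("https://example.com/", "https://example.com/old-page", "old guide"),
    ("https://example.com/blog", "https://example.com/old-page", "read more"),
    ("https://example.com/blog", "https://example.com/gone", "archive"),
]
statuses = {"https://example.com/old-page": 404, "https://example.com/gone": 410}

# Group the pages linking to each broken URL.
report = defaultdict(list)
for source, target, anchor in links:
    if statuses.get(target, 200) >= 400:
        report[target].append((source, anchor))

for url, inlinks in report.items():
    print(url, "<-", inlinks)
```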
Step 4: Prioritise what to fix
Not all broken links are equal. Prioritise fixes in this order:
- Broken pages with external inbound links. If a 404 URL has links pointing to it from other websites, fixing or redirecting it recovers that link equity. Check your backlink tool (Ahrefs, Majestic, etc.) for these.
- Broken pages in your XML sitemap. A sitemap URL that returns a 404 is telling Google that a page exists when it does not. Remove it from the sitemap or fix the URL.
- Broken internal links on high-traffic pages. If your homepage or a top landing page links to a broken URL, that is a priority fix regardless of link equity.
- Everything else. Fix remaining internal links to broken pages in bulk - update the link, remove it, or redirect the destination URL.
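The priority order above translates directly into a sort key. The flags on each broken URL are assumptions about data you would gather yourself - from your backlink tool, your sitemap, and your analytics:

```python
# Sketch: sort broken URLs by the fix-priority order described above.
def priority(page):
    if page["has_external_backlinks"]:
        return 0  # recovers link equity - fix first
    if page["in_sitemap"]:
        return 1  # misleads Google about what exists
    if page["linked_from_high_traffic"]:
        return 2  # most visible to real users
    return 3      # everything else, fixed in bulk

pages = [
    {"url": "/a", "has_external_backlinks": False,
     "in_sitemap": False, "linked_from_high_traffic": True},
    {"url": "/b", "has_external_backlinks": True,
     "in_sitemap": False, "linked_from_high_traffic": False},
    {"url": "/c", "has_external_backlinks": False,
     "in_sitemap": True, "linked_from_high_traffic": False},
]
queue = sorted(pages, key=priority)
print([p["url"] for p in queue])
```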
Step 5: Decide how to fix each broken URL
For each broken URL, you have three options:
- Update the link. If the content has moved to a different URL, update the internal link to point to the correct destination.
- Remove the link. If the content no longer exists and there is no equivalent, remove the link from the source page.
- Set up a 301 redirect. If the URL used to exist and has external links or bookmarks pointing to it, redirect it to the most relevant existing page. Do not redirect to the homepage unless there is genuinely no relevant alternative.
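If you go the redirect route, the server config is usually a couple of lines. A hypothetical nginx example - the paths are placeholders, and the equivalent on Apache would be a `Redirect 301` directive:

```nginx
# 301-redirect a removed URL to its closest existing replacement.
location = /old-page {
    return 301 /new-page;
}
```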
Checking for broken external links
Crawly also tracks outbound links. The External Links tab lists every link from your site to another domain. While Crawly does not crawl external URLs by default, you can use List Mode to run a targeted check: export your external link list, paste the URLs into List Mode, and run a quick audit to check their status codes.
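A quick script can do the same spot-check, similar in spirit to a List Mode audit. This sketch uses HEAD requests so no page bodies are downloaded; the URL list is a placeholder:

```python
import urllib.request

def check(url, timeout=5):
    """Return the status code for a URL, or None if unreachable."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code   # 4xx/5xx still gives us a status code
    except Exception:
        return None     # DNS failure, timeout, connection refused

# Placeholder list - in practice, paste in your exported external links.
external_urls = ["https://example.com/", "https://example.com/missing"]
# statuses = {url: check(url) for url in external_urls}
# broken = [u for u, c in statuses.items() if c is None or c >= 400]
```

Note that some servers reject HEAD requests; a fallback to GET is worth adding if you see unexpected 405s.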
Broken external links matter less for SEO than broken internal links, but they still degrade user experience and can indicate that content you are citing or referencing has been removed.
Setting up regular checks
Broken links are not a one-time problem. Pages get deleted, URLs get restructured, external sites go offline. The best practice is to run a crawl regularly - monthly for most sites, weekly for high-velocity sites that publish frequently or undergo regular structural changes.
Crawly saves every crawl to disk. When you run a new one, use the Crawl Comparison feature to see what has changed since last time - including newly broken URLs that appeared in the latest crawl.
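The comparison logic is a set difference: a URL is newly broken if it is broken in the latest crawl but was not broken in the previous one. The crawl dicts below are stand-ins for two saved crawls:

```python
# Stand-ins for two saved crawls: {url: status_code}
previous = {"https://example.com/": 200,
            "https://example.com/old": 404}
latest = {"https://example.com/": 200,
          "https://example.com/old": 404,
          "https://example.com/new-page": 404}

def broken_set(crawl):
    return {url for url, code in crawl.items() if 400 <= code < 500}

# Broken now, but not broken last time.
newly_broken = broken_set(latest) - broken_set(previous)
print(newly_broken)
```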
Finding broken links used to mean either paying for an expensive enterprise tool or running a slow manual check. With Crawly, a full broken link audit on most sites takes under five minutes - crawl, check issues, export, fix.
Broken links are one part of a complete technical SEO audit. See the technical SEO audit guide for the full workflow.
Download Crawly and run your first broken link check today.