Why do working pages on my website appear as broken?
Occasionally, Site Audit may report some of your internal pages as broken even though they are actually fully up and running. While this does not happen often, it can be confusing.
Generally, this happens because of a false positive. The three most common causes are:
- Our Site Audit crawler may have been blocked from certain pages by robots.txt rules or by noindex tags
- Hosting providers might block SEMrush bots because they interpret the large number of hits over a short period of time as a DDoS attack
- At the moment of the campaign re-crawl, the domain could not be resolved by DNS
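As an illustration of the first cause, a robots.txt rule like the one below would prevent an audit crawler from reaching otherwise working pages (the `SemrushBot` user-agent string is illustrative; check Semrush's documentation for the exact names its crawlers use). A `<meta name="robots" content="noindex">` tag in a page's `<head>` can have a similar effect.

```
# robots.txt — hypothetical rule blocking a crawler that identifies
# itself as "SemrushBot" from an entire section of the site
User-agent: SemrushBot
Disallow: /private/
```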
If you believe this is happening because of a crawler issue, you can learn how to troubleshoot your robots.txt with this full article.
You can also lower the crawl speed to avoid sending a large number of hits to your pages at one time. This is why you may see a page as working (which it is) while our bot was unable to reach it and therefore reported a false-positive result.
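One quick way to check whether a robots.txt rule is the culprit is Python's standard-library `urllib.robotparser`. This is a minimal sketch, assuming the crawler identifies itself as `SemrushBot` (the exact user-agent string and the sample rules are assumptions for illustration):

```python
from urllib import robotparser

# Sample robots.txt content that blocks a bot named "SemrushBot"
# from /private/ while allowing every other user agent.
ROBOTS_TXT = """\
User-agent: SemrushBot
Disallow: /private/

User-agent: *
Disallow:
"""

def is_allowed(user_agent: str, url: str) -> bool:
    """Return True if the robots.txt rules permit user_agent to fetch url."""
    parser = robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())
    return parser.can_fetch(user_agent, url)

# A browser-style agent can fetch the page...
print(is_allowed("Mozilla/5.0", "https://example.com/private/page.html"))
# ...while the audit bot is blocked, so it would report the page as broken.
print(is_allowed("SemrushBot", "https://example.com/private/page.html"))
```

In practice you would fetch your live robots.txt with `parser.set_url(...)` and `parser.read()` instead of parsing an inline string; the inline version just makes the blocking behavior easy to reproduce.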
- How is Site Health Score calculated in the Site Audit tool?
- Why does SEMrush say I have duplicate content?
- How many pages can I crawl in a Site Audit?
- How do I audit a subdomain?
- What Issues Can Site Audit Identify?
- Why does SEMrush say I have an incorrect certificate?
- What are unoptimized anchors and how does Site Audit identify them?
- What do the Structured Data Markup Items in Site Audit Mean?
- Configuring Site Audit
- Site Audit Overview Report
- Site Audit Thematic Reports
- Reviewing Your Site Audit Issues
- Site Audit Crawled Pages Report
- Site Audit Statistics
- Compare Crawls and Progress
- Troubleshooting Site Audit
- Exporting Site Audit Results
- How to Optimize your Site Audit Crawl Speed
- How To Integrate Site Audit with Zapier
- Using SEMrush to Find Areas for Improvement on Your Website
- Optimizing Page Title and Meta Description via SEMrush