Part of having a well-SEO'd website is staying on top of the technical issues that arise from time to time. Google Search Console is an essential tool for keeping on top of common errors: with its simple-to-navigate interface and handy crawl error reports, it serves as a go-to tool for technical SEO, alongside a whole host of other features.
So, it's all well and good knowing that there are errors on your website, but how do you go about fixing them? In this blog post, I aim to explain six of the most common website crawl errors in Google Search Console, along with instructions on how to fix each one:
#1 Soft 404
Google has found a page with minimal/no content.
There are a couple of ways of tackling this one:
- Add more content to the page so that Google recognises it as a page with a purpose;
- Change it so that it returns a proper 404 error page, or
- 301 redirect it to a more useful page with more content (see the sketch below).
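If your site runs on something like Flask, the last two options boil down to returning the right status code. This is only a minimal sketch with made-up routes, assuming a Flask app, not a drop-in fix:

```python
# Minimal Flask sketch (hypothetical routes): return a genuine 404 for a
# page that's gone, or 301 redirect a thin page to a stronger one.
from flask import Flask, abort, redirect

app = Flask(__name__)

@app.route("/gone-page")
def gone_page():
    abort(404)  # tells Google the page genuinely doesn't exist

@app.route("/thin-page")
def thin_page():
    # permanent redirect to a more useful page with more content
    return redirect("/useful-page", code=301)
```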
#2 Not Found
A page that's been linked to from somewhere (or is in Google's index) is showing a 404 error page where there should be content.
This one is relatively straightforward. The best thing to do is either:
- 301 redirect the page to more useful content (you can verify the redirect with the quick check below), or
- As long as it won’t cause issues with duplicate content, publish the page again.
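Once the redirect (or republished page) is in place, a quick look at the status code confirms Google will now see the right thing. A rough sketch, assuming the Python requests library and a placeholder URL:

```python
# Spot-check what an old URL now returns: 301 if the redirect is in place,
# 200 if the page has been republished, 404 if the error is still there.
import requests

resp = requests.get("https://example.com/missing-page", allow_redirects=False)
print(resp.status_code)
print(resp.headers.get("Location"))  # where a redirect points, if any
```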
#3 Server error
Google has tried to access your website but couldn't because of response issues with the website's server.
Get in touch with your hosting company and ask whether they've had any problems with their servers recently and whether those problems have been fixed.
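Before (or after) you speak to them, you can spot-check whether the server is currently returning errors. A minimal sketch, assuming the Python requests library and a placeholder URL:

```python
# A 5xx status code or a timeout here points to a server-side problem
# rather than anything wrong with the page itself.
import requests

try:
    resp = requests.get("https://example.com/", timeout=10)
    if resp.status_code >= 500:
        print(f"Server error: {resp.status_code}")
    else:
        print(f"Server responded normally: {resp.status_code}")
except requests.exceptions.RequestException as exc:
    print(f"Could not reach the server at all: {exc}")
```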
#4 DNS error
When trying to access your website, Google has encountered issues with your domain name’s settings.
Check with the company you purchased your website's domain name from that it's all been set up properly. You may have to change one or two DNS records, but this is relatively straightforward, and either your website developer or your domain name provider should be able to help you with it.
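You can also do a quick sanity check yourself to see whether the domain resolves at all. A minimal sketch using only the Python standard library, with a placeholder domain:

```python
# If this lookup fails, the domain's DNS records aren't resolving,
# which is the same problem Googlebot is running into.
import socket

domain = "example.com"  # replace with your own domain
try:
    print(f"{domain} resolves to {socket.gethostbyname(domain)}")
except socket.gaierror as exc:
    print(f"DNS lookup failed: {exc}")
```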
#5 Access Denied
Google has tried to crawl a page on your website but hasn't been allowed access to it, likely because of a rule in your robots.txt file, a login being required, or Googlebot being blocked by your hosting provider.
- If the page should be blocked in your robots.txt, noindex the page and mark the error as fixed in Search Console; if not, remove the rule blocking Googlebot from accessing the page (you can test a URL against your robots.txt with the sketch after this list);
- If the page should require login details, block it in your robots.txt and noindex it; if not, remove the login prompt and allow Google to crawl the page;
- Otherwise, contact your hosting provider and ask if they've blocked Googlebot from accessing your website. If they have, this is a serious issue, as you could be missing out on search rankings and losing valuable traffic.
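To test whether a specific URL is blocked for Googlebot by your robots.txt, you can use Python's built-in robots.txt parser. A small sketch with placeholder URLs:

```python
# Fetches your live robots.txt and reports whether Googlebot is allowed
# to crawl a given URL under its rules.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

url = "https://example.com/private-page"
if parser.can_fetch("Googlebot", url):
    print("Googlebot is allowed to crawl this URL")
else:
    print("Googlebot is blocked by robots.txt for this URL")
```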
#6 Not Followed
Google has found a link it can't follow through to its destination, usually because of problems with redirects.
- Check for redirect hops. You can use httpstatus.io to check for redirects and how many there are, or count them with the quick script below. If there are lots, try to cut the chain down to a single redirect.
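If you'd rather check from the command line than use httpstatus.io, a short script can follow the chain and count the hops. A sketch assuming the Python requests library and a placeholder URL:

```python
# Follows all redirects from an old URL and prints each hop; a long
# chain here is what can trigger the Not Followed error.
import requests

resp = requests.get("https://example.com/old-url", allow_redirects=True)
for hop in resp.history:
    print(hop.status_code, hop.url)
print("Final:", resp.status_code, resp.url)
print("Number of hops:", len(resp.history))  # aim for one at most
```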