The Blog

Digital Marketing Blog From SocialB


6 Common Website Crawl Errors In Google Search Console And How To Fix Them

Part of maintaining a well-optimised website is staying on top of the technical issues that arise from time to time. Google Search Console is an essential tool for keeping on top of common errors: with its simple-to-navigate interface and handy site crawl error lists, it serves as a go-to tool for technical SEO, alongside a whole host of other tools and features.

So, it’s all well and good knowing that there are errors on your website, but how do you go about fixing them? In this blog post, I aim to explain 6 of the most common website crawl errors in Google Search Console, along with instructions on how to fix each one:

#1 Soft 404

The problem

Google has found a page that returns a normal (200) response but has minimal or no content, so it treats the page as if it were a 404.

The solution

There are a couple of ways of tackling this one:

  • Add more content to the page so that Google recognises it as a page with a purpose;
  • Change it to return a proper 404 error page, or
  • 301 redirect it to a more useful page with more content.
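To make the first option concrete, here is a minimal sketch of a heuristic that flags pages at risk of being treated as soft 404s. The word-count threshold and “not found” phrases are illustrative assumptions, not anything Google publishes:

```python
# Heuristic check for pages at risk of being flagged as soft 404s.
# The word-count cut-off and phrase list below are illustrative
# assumptions, not published Google rules.

THIN_CONTENT_WORDS = 50          # assumed cut-off for "minimal content"
NOT_FOUND_PHRASES = ("page not found", "no results", "nothing here")

def soft_404_risk(status_code: int, body_text: str) -> bool:
    """Return True if a page that returns 200 looks like an error page."""
    if status_code != 200:
        return False             # a real 404 or redirect is not a *soft* 404
    text = body_text.lower()
    too_thin = len(text.split()) < THIN_CONTENT_WORDS
    says_not_found = any(phrase in text for phrase in NOT_FOUND_PHRASES)
    return too_thin or says_not_found

print(soft_404_risk(200, "Sorry, page not found."))   # True
print(soft_404_risk(404, "Sorry, page not found."))   # False
```

A page that trips this check is a candidate for one of the three fixes above: beef up the content, return a real 404, or 301 it somewhere useful.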

#2 Not found

The problem

A page that’s been linked to from somewhere (or in Google’s index) is showing a 404 error page where there should be content.

The solution

This one is relatively straightforward. The best thing to do is either:

  • 301 redirect the page to more useful content, or
  • As long as it won’t cause issues with duplicate content, publish the page again.
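The first option above can be sketched with Python’s built-in `http.server`. The old-URL-to-new-URL mappings here are hypothetical examples standing in for whatever pages your site has retired:

```python
# Sketch of 301-redirecting retired URLs to more useful pages, using
# only the standard library. The path mappings are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {
    "/old-blog-post": "/blog/new-post",   # hypothetical old -> new mapping
    "/summer-sale":   "/offers",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)            # permanent redirect
            self.send_header("Location", target)
        else:
            self.send_response(404)            # page is genuinely gone
        self.end_headers()

# To serve: HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

In practice you would set these rules in your web server or CMS rather than in application code, but the mechanics are the same: a 301 status plus a `Location` header pointing at the replacement page.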

#3 Server error

The problem

Google has tried to access your website but cannot due to response issues with the website’s server.

The solution

Get in touch with your hosting company and ask if they’ve had any problems with their servers recently and if they’ve fixed them.

#4 DNS error

The problem

When trying to access your website, Google has encountered issues with your domain name’s settings.

The solution

Check with the company you purchased your website’s domain name from that it’s all been set up properly. You may have to change one or two DNS records, but this is relatively straightforward to do, and either your website developer or your domain name provider should be able to help you with it.
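Before contacting anyone, you can confirm whether the domain resolves at all from your own machine with a one-liner from the standard library. Note this only checks resolution from where you run it, not from Google’s resolvers:

```python
# Quick sanity check that a hostname resolves at all, using only the
# standard library. A failure here mirrors the kind of DNS error
# Googlebot reports.
import socket

def resolves(hostname: str) -> bool:
    """Return True if the hostname resolves to at least one address."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        return False

print(resolves("localhost"))   # True on any normally configured machine
# resolves("example.invalid")  # False: the .invalid TLD never resolves
```

If this returns False for your own domain, the problem sits with the DNS records themselves rather than with your website’s server.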

#5 Access denied

The problem

When Google has tried to crawl a page on your website, it hasn’t been allowed access, most likely because of settings in your robots.txt file, a login being required, or Googlebot being blocked by your hosting provider.

The solution

  • If the page should be blocked in your robots.txt, then noindex the page and mark this error as fixed in Search Console; if not, remove the line of code blocking Googlebot from accessing the page;
  • If the page should require login details, add it to your robots.txt and noindex the page; if not, remove the login prompt and allow Google to crawl the page;
  • Otherwise, contact your hosting provider and ask if they’ve blocked Googlebot from accessing your website. If they have, this is a serious issue, as you could be missing out on ranking in the search engines and potentially losing valuable traffic.
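To work out whether it’s your robots.txt doing the blocking, you can test a URL against your rules with the standard library’s robots.txt parser. The rules below are a hypothetical example; substitute your site’s actual file:

```python
# Check whether a robots.txt rule is what blocks Googlebot from a URL,
# using the standard library's parser. The rules are a hypothetical
# example of a site blocking a members-only area.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /members/

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("Googlebot", "/members/profile"))  # False: blocked
print(parser.can_fetch("Googlebot", "/blog/post"))        # True: crawlable
```

If `can_fetch` returns True for the affected page and there’s no login prompt, the block is most likely coming from your hosting provider instead.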

#6 Not followed

The problem

Google has tried to follow a URL to your page and got stuck somewhere, most likely because of a redirect chain, or Flash or JavaScript content on the page.

The solution

  • Check for redirect hops. You can use httpstatus.io to see whether a URL redirects and how many hops there are. If there are several, try to cut the chain down to a single redirect;
  • Use the Fetch as Google tool in Search Console to see what Google sees when it tries to crawl your page, then move any content hidden behind JavaScript or Flash into HTML so that Google and other search engines can read it easily.
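The hop check in the first bullet can be sketched as a simple walk over your redirect rules, counting hops and catching loops. The URL map is a hypothetical example standing in for the `Location` headers a live check would return:

```python
# Sketch of the redirect-hop check: walk a chain of redirects, count
# the hops, and flag loops. The map is a hypothetical example standing
# in for the Location headers a live crawl would see.

REDIRECT_MAP = {   # old URL -> where its redirect points
    "/a": "/b",
    "/b": "/c",
    "/c": "/final-page",
}

def count_hops(start: str, redirects: dict, limit: int = 10) -> int:
    """Return the number of redirect hops from start to a final URL.

    Raises RuntimeError on a loop or a chain longer than `limit`.
    """
    seen = {start}
    hops = 0
    url = start
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > limit:
            raise RuntimeError(f"redirect loop or over-long chain at {url}")
        seen.add(url)
    return hops

print(count_hops("/a", REDIRECT_MAP))   # 3 hops -- worth collapsing to 1
```

A chain like `/a -> /b -> /c -> /final-page` should be collapsed so that `/a`, `/b` and `/c` each redirect straight to `/final-page` in a single hop.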

