Published April 5, 2019
Updated October 30, 2020
Google Search Console (GSC) is a free tool that helps you monitor your website’s performance to identify and fix any potential problems that prevent it from appearing in Google’s search results as expected. We’ve put together a list of some of the most common Google Search Console error reports, as well as how to determine what might be causing the error, and, in most cases, how to fix it.
Submitted URL Errors: Pages With Errors Have Not Been Indexed
There are at least three different reasons why Google might not be indexing specific pages on your site. Why does it matter? A page that is not (or cannot be) indexed will not appear in the organic search results, meaning you get zero SEO value from it.
1. Pages Are Blocked by robots.txt
- Cause: You submitted this page for indexing, but the page is being blocked by the robots.txt file. The robots.txt file is a list of rules for Google’s crawler (aka Googlebot) to follow. If any of these rules tells the bot not to crawl a given page, then the page won’t be crawled and generally won’t be indexed. And since you’ve submitted the page, the two signals conflict.
- Fixing this error: Test your page using the robots.txt tester to confirm the problem. You may have a broad rule that is blocking more pages from indexation than you realize. If you want a given page to be indexed, change the robots.txt file so that the page is no longer blocked, or change the page’s URL so that the rule no longer applies.
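You can also reproduce what the tester does locally with Python’s standard-library robots.txt parser; the rule and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt with a broad rule that blocks an entire directory
parser = RobotFileParser()
parser.parse("""
User-agent: *
Disallow: /blog/drafts/
""".splitlines())

# Every URL under /blog/drafts/ is blocked for Googlebot...
print(parser.can_fetch("Googlebot", "https://example.com/blog/drafts/new-post"))  # False
# ...while the rest of the blog remains crawlable
print(parser.can_fetch("Googlebot", "https://example.com/blog/new-post"))  # True
```

A check like this makes it easy to see when a single directory-level Disallow rule is sweeping up pages you actually want indexed.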
2. Page Is Marked “No Index”
- Cause: You submitted this page for indexing, but the page has a “noindex” directive either in a meta tag or HTTP header. This is similar to the previous error – you’re sending Google two conflicting signals on what to do with the page.
- Fixing this error: If you want this page to be indexed, remove the “noindex” meta robots tag or HTTP header.
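For reference, the meta tag form of the directive looks like this (the HTTP header equivalent is a response header reading X-Robots-Tag: noindex); remove whichever one is present if you want the page indexed:

```html
<!-- Meta tag form, placed in the page's <head>; delete it to allow indexing -->
<meta name="robots" content="noindex">
```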
3. Page Has a Crawl Issue
- Cause: Google has encountered an unspecified crawling error. Unfortunately, a number of different issues could cause this, but Google does not provide more detail.
- Fixing this error: Google recommends using the URL inspection tool in Google Search Console to debug your page. You may need to discuss this type of error with your developers to determine the underlying cause.
404 Errors
A “404 error” means that Googlebot cannot find a page. Typically, it either no longer exists in a place that is accessible to the bot or the page is now blank. 404 errors are not uncommon, since websites grow and change, and they are not always a problem. Here are a couple of distinctions on when this error will occur and what you need to do (if anything) to fix it.
1. Submitted URL Seems to Be a Soft 404
- Cause: You submitted this page for indexing at some point, but the server is now returning a blank or nearly blank page with a 200 (success) status code rather than a proper 404.
- Fixing this error: If the page is no longer available and has no clear replacement, configure your server to return a 404 (not found) or 410 (gone) response code. If the page has moved or has a clear replacement, set up the appropriate 301 redirect (permanent redirect).
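On an Apache server, for example, both fixes can be expressed in the site configuration or an .htaccess file (the paths here are hypothetical; other servers have equivalent directives):

```apache
# Page permanently removed with no replacement: return 410 (Gone)
Redirect gone /discontinued-product

# Page moved or has a clear replacement: 301 permanent redirect
Redirect permanent /old-guide /new-guide
```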
2. Submitted URL Not Found (404)
- Cause: A URL included in your sitemap no longer exists. Sometimes, a page needs to be removed (creating a 404) and that is completely acceptable. For example, you might remove a discontinued product that has no equivalent replacement or delete old blog posts that have received no traffic, have no links, and are not ranking for any keywords.
- Fixing this error: Only some 404s need to be fixed. If a URL should exist but has moved, simply add a 301 redirect; you can do the same if a discontinued product has a good replacement version (such as the newer model). If the URL is unknown or the content has been permanently deleted, you can ignore the 404 error. Eventually, Googlebot will stop looking for these pages.
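To audit sitemap 404s in bulk, you can extract every URL from your sitemap and then check each one’s status code. A minimal sketch using only the standard library (the sitemap content is inline and hypothetical; in practice you would fetch your own sitemap and request each URL):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap content; in practice, fetch https://example.com/sitemap.xml
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/discontinued-product</loc></url>
</urlset>"""

# Sitemaps use a namespace, so qualify the tag names when searching
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(SITEMAP).findall("sm:url/sm:loc", ns)]

# Next step (omitted): request each URL, e.g. with urllib.request,
# and flag any that come back with a 404 status code
print(urls)
```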
Server Error (5XX)
If Google Search Console reports a server error, it means that Googlebot couldn’t access your URL, the request timed out, or your site was busy. As a result, Googlebot was forced to abandon the request. There are a variety of possible causes for this type of error, and you may need to address this issue with your development team or server host in some cases.
- Cause: Your server returned a 500-level error when the page was requested.
- Fixing these errors:
- Dynamic page requests can cause excessive load times; check your pages’ response times and reduce them if needed.
- Make sure your site’s hosting server is not down, overloaded, or misconfigured.
- Check that your site is not inadvertently blocking Google.
- Control search engine site crawling and indexing wisely – some webmasters intentionally prevent Googlebot from reaching their sites to control how it is crawled and indexed. Consult your developers to ensure that your site is set up for optimal performance.
Redirect Errors
When you move (or remove) a page on your site, it’s best to set up a 301 redirect to tell the web browser that the page has moved and to direct it to the new page. If pages move multiple times or get sent back to the original location, however, errors can occur.
- Cause: Redirect errors are typically caused by one of the following:
- There was a redirect chain that was too long (keep chains to three redirects at most)
- There was a redirect loop (A points to B which points to A…)
- The redirect URL eventually exceeded the maximum length (this can be a problem when you’re redirecting within layered navigation and/or when adding query parameters)
- There was a bad or empty URL in the redirect chain
- Fixing these errors: Review all of your redirects and ensure there is only a single redirect whenever possible. Ayima’s “Redirect Path” Chrome extension is great at detecting whether a page has been redirected, where it redirects to, and the number of redirect hops present, which should help identify issues on a page-by-page basis.
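For a site-wide audit, you can also check your redirect rules for chains and loops before they reach production. A small sketch, assuming a simple path-to-path redirect map (the paths are hypothetical):

```python
# Hypothetical redirect map: source path -> destination path
redirects = {
    "/a": "/b",
    "/b": "/c",   # /a reaches its final page in 2 hops: acceptable
    "/x": "/y",
    "/y": "/x",   # loop: /x and /y point at each other
}

def count_hops(path, redirects):
    """Return the number of redirect hops from path, or -1 if a loop is found."""
    seen, hops = set(), 0
    while path in redirects:
        if path in seen:
            return -1  # we have been here before: redirect loop
        seen.add(path)
        path = redirects[path]
        hops += 1
    return hops

print(count_hops("/a", redirects))  # 2 hops: fine, but could be collapsed to 1
print(count_hops("/x", redirects))  # -1: loop detected
```

Collapsing every chain so each source points directly at its final destination keeps you safely under any hop limit.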
New Product Issues Detected for Site
The following warnings and Google Search Console errors are directly related to product-specific Structured Data Markup (SDM). Use the Structured Data Testing Tool to determine which elements are missing or are set up incorrectly and use schema.org and Google Search Console’s guide to understanding how structured data works for more information on how to fix these missing or incorrectly specified elements.
1. Either “Offers”, “Review”, or “aggregateRating” Should Be Specified
- Cause: Product pages must have at least one of these elements specified within the SDM.
- Fixing these errors: Add at least one of these elements to the page or template HTML. These are typically a parent element with child elements underneath them, such as type, url, ratingValue, or reviewCount.
2. Missing Field “Price”
- Cause: This required product field is missing from the SDM; it should be entered in the format ##.##
- Fixing these errors: Ensure you include the price in the SDM in the format of “87.99” (no dollar sign should be present in this field; use the separate priceCurrency property for the currency)
3. Rating Is Missing Required Best and/or Worst Values
- Cause: If including product ratings or reviews, you will need to specify the range for “bestRating” and/or “worstRating” for each product.
- Fixing these errors: bestRating is typically 5, while worstRating is typically 1.
4. Value in Property “ratingCount” Must Be Positive
- Cause: This property should contain the total number of ratings on the product page.
- Fixing these errors: This number must be greater than zero and can never be negative. If the product has no reviews yet, omit the aggregateRating markup entirely rather than setting ratingCount to 0.
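Putting these four fixes together, a Product block that satisfies the checks above might look like this in JSON-LD, placed inside a script tag of type application/ld+json (all values are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "87.99",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "bestRating": "5",
    "worstRating": "1",
    "ratingCount": "27"
  }
}
```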
Mobile Usability Errors
With the rapid increase in internet usage on mobile phones, making sure that you have a website that is fast and easy to use on the majority of mobile devices is important. Mobile usability errors alert you to problems that make pages on your site difficult for some users to navigate.
1. Clickable Elements Too Close Together
- Cause: This report shows sites where touch elements, such as buttons and navigational links, are so close to each other that a mobile user cannot easily tap the desired element with their finger without also tapping a neighboring element.
- Fixing these errors: Correctly size and space touch elements to be suitable for your mobile visitors. You can learn more in Google’s Accessibility Styles Guidelines; the minimum recommended target size is 48 pixels, with at least 8 pixels of spacing between neighboring elements.
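In CSS, those numbers translate to something like the following (the class name is hypothetical):

```css
/* 48px minimum touch target, with at least 8px between neighboring targets */
.nav-link {
  display: inline-block;
  min-width: 48px;
  min-height: 48px;
  margin: 8px;
}
```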
2. Viewport Not Set
- Cause: Your page does not define a viewport property, which tells browsers how to adjust the page’s dimension and scaling to suit the screen size.
- Fixing this error: Because visitors to your site use a variety of devices with varying screen sizes, your pages should specify a viewport using the meta viewport tag. The recommended setting is <meta name="viewport" content="width=device-width, initial-scale=1.0">. Related to this, avoid using large, fixed-width elements on the site (such as images) that cannot scale to a reduced screen size, requiring users to scroll from side to side.
3. Content Wider Than Screen
- Cause: This report indicates pages where horizontal scrolling is necessary to see words and images on the page. This happens when pages use absolute values in CSS declarations, or use images designed to look best at a specific browser width (such as 980px).
- Fixing this error: Make sure the pages use relative width and position values for CSS elements.
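For example, media elements can be allowed to shrink with the viewport instead of forcing a fixed pixel width:

```css
/* Scales media down to the container instead of forcing a 980px-wide layout */
img, video {
  max-width: 100%;
  height: auto;
}
```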
4. Text Too Small to Read
- Cause: This report identifies pages where the font size for the page is too small to be legible and would require mobile visitors to zoom in to read it.
- Fixing this error: After specifying a viewport for your web pages, set your font sizes to scale properly within the viewport. Use relative units (em or rem) for font size rather than pixel values.
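A minimal example of viewport-friendly font sizing with relative units:

```css
/* Relative units scale with user settings; fixed px values do not */
html { font-size: 100%; }  /* typically 16px by default */
body { font-size: 1rem; }
h1   { font-size: 2rem; }  /* twice the base size, at any base */
```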
Excluded From Indexing Errors
Sometimes, in the Coverage report, you may see hundreds or even thousands of pages that have been excluded from Google’s index. Not all of these should be considered errors or issues (e.g. “alternate page with proper canonical tag”), but some could be cause for concern. Let’s take a look at the top 5 exclusions we typically see:
1. Crawled – Currently Not Indexed
- Cause: Googlebot crawled your page, but for one reason or another, chose not to index it.
- Fixing this error: Use a site: search in Google to confirm whether or not the pages listed are actually indexed. If they’re not (and you think they should be), confirm that they are in your sitemap, are not disallowed in robots.txt, are not canonicalized to another page, are not duplicates of another page, and have the proper meta robots tags. If all of this checks out, you can either use the URL Inspection tool to resubmit them (if it’s back up and running when you read this – as of 10/14/2020, the tool has been disabled for maintenance) or wait patiently for Google to recrawl your site and hope it sees and indexes those pages next time.
2. Crawl Anomaly
- Cause: Google was unable to access the page(s).
- Fixing this error: Use the URL Inspection tool to see if any obvious errors exist. You could also try crawling the page(s) in an SEO tool, such as ScreamingFrog, to look for issues.
3. Duplicate without User-Selected Canonical
- Cause: Googlebot found more than one version of this page, but none of the pages have a specified canonical tag. Googlebot doesn’t think this should be the primary page, so it has chosen not to index it.
- Fixing this error: Find all versions of the page(s) and add appropriate canonical tags. If all versions of the page aren’t needed, you may want to redirect additional variations to the primary page. If all versions are needed, consider canonicalizing the additional variations to the primary URL, so you don’t deplete crawl budget or cannibalize keywords.
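A canonical tag is a single line in each duplicate page's <head>, pointing at the version you want indexed (the URL here is hypothetical):

```html
<!-- On every duplicate or variant page, point to the primary version -->
<link rel="canonical" href="https://example.com/primary-page">
```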
4. Duplicate, Submitted URL Not Selected As Canonical
- Cause: Similar to the error above, this is one of the pages in a group of duplicates for which none of the pages have a canonical tag identified. The difference between the two errors is that this one appears when you have specifically requested indexing on a URL and Google feels that a different page deserves the canonical instead.
- Fixing this error: Add appropriate canonical tags to your pages. Similar to the fix above, consider canonicalizing the URL to the primary page (if this one is not it). If this is the primary page, canonicalize the additional URLs to this page.
5. Discovered – Currently Not Indexed
- Cause: Google knows the page exists, but was unable to crawl it.
- Fixing this error: Make sure that Googlebot can access your site and that your server isn’t bogged down by an excessive crawl delay (specified in your robots.txt file) or generally overloaded.
While these are just the top 5 we often see, there are actually 15 types of exclusions you may find in your Google Search Console dashboard. Check out the Search Console Help page for more information about each type of exclusion.
Google Search Console Resources and Next Steps
The Google Search Console Help Center and the Webmaster FAQ are great resources if you have any questions about organic search.
Want more information on Google search optimization? Download our 2019 Ecommerce Paid Search Report for industry insights from Google.