Google makes 4 changes to the index coverage report

The Google Search Console index coverage report receives four updates to keep site owners better informed about indexing issues.

The Index Coverage report is relatively new compared to other reports offered by Google, having first been introduced with the revamped version of Search Console in 2018.

Since the launch of the Index Coverage report, site owners have shared feedback with Google about improvements they want to see in the future.

Changes to the index coverage report, released today, are based on the feedback provided by the webmaster community.

"Based on the feedback we've received from the community, we're implementing significant improvements to this report today so that you can be better informed about issues that could prevent Google from crawling and indexing your pages. The change is focused on providing a more accurate state of existing issues, which will help you solve them more easily."


Changes to the Search Console Index Coverage Report

The list of changes to the Index Coverage report in Search Console includes:

  • Removal of the generic "crawl anomaly" issue type – all crawl errors should now be mapped to an issue with a more precise resolution.
  • Pages that were submitted and indexed despite being blocked by robots.txt are now reported as "indexed but blocked" (warning) instead of "submitted but blocked" (error)
  • Addition of a new issue type: "indexed without content" (warning)
  • Soft 404 reporting is now more accurate
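The "soft 404" item above refers to pages that return a success status code but display error-style content. Google's actual detection is far more sophisticated; the sketch below is only an illustrative heuristic, and the phrase list and function name are hypothetical.

```python
# Hypothetical heuristic: a soft 404 is a page that responds HTTP 200
# but whose content reads like a "not found" page. The phrase list is
# illustrative only, not how Google actually classifies soft 404s.
ERROR_PHRASES = ("page not found", "404", "no longer available")

def looks_like_soft_404(status_code: int, body_text: str) -> bool:
    """Flag pages that respond 200 but read like an error page."""
    if status_code != 200:
        return False  # a real 404/410 response is not a *soft* 404
    text = body_text.lower()
    return any(phrase in text for phrase in ERROR_PHRASES)

print(looks_like_soft_404(200, "Sorry, page not found."))  # True
print(looks_like_soft_404(200, "Welcome to our shop."))    # False
print(looks_like_soft_404(404, "Page not found."))         # False
```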

The overarching theme of these updates is improved data accuracy.

There is no more guesswork involved with crawl errors, because the generic "crawl anomaly" issue has been replaced with specific issues and resolutions.

Site owners will know for sure whether a page indexed by Google is blocked by robots.txt, because it is now reported as "indexed but blocked" rather than "submitted but blocked." Submitting a URL is not the same as having it indexed, and the report has been updated to reflect that distinction.
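The distinction above hinges on the fact that robots.txt blocks crawling, not indexing: a blocked page can still end up in the index if other pages link to it. A minimal sketch of checking whether a URL is disallowed for Googlebot, using Python's standard-library robots.txt parser (the rules shown are hypothetical):

```python
# Sketch: check whether a URL is blocked for Googlebot by robots.txt.
# A page blocked from crawling can still be indexed via external links,
# which is what the "indexed but blocked" warning reports.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration only.
rules = """\
User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False: crawling blocked
print(parser.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True: crawling allowed
```

Note that `can_fetch` only answers whether crawling is allowed; it says nothing about whether the URL is indexed, which is exactly the gap the new warning surfaces.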


Soft 404 reporting is said to be more accurate, and a brand new issue type called "indexed without content" has been added. Let's take a closer look at that issue in case it appears in one of your reports.

This is what the Search Console Help page says about indexed without content:

"This page appears in the Google index, but for some reason Google could not read the content. Possible reasons are that the page might be cloaked to Google or the page might be in a format that Google cannot index. This is not a case of robots.txt blocking."

In other words, the "indexed without content" issue means the URL is in Google's index but its content cannot be seen by Google's crawlers.

This could mean that you accidentally published a blank page, or that there was an error on the page that prevented Google from displaying the content.
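One way to spot the accidental-blank-page case described above is to check whether a page's rendered HTML contains any visible text at all. The sketch below is a rough, hypothetical heuristic (the `is_effectively_blank` helper is not part of any Google tooling):

```python
# Sketch: a rough heuristic for pages that might trigger the
# "indexed without content" warning, e.g. a template that rendered
# with an empty body. Hypothetical helper, not Google tooling.
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collects all character data found between tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def is_effectively_blank(html: str) -> bool:
    """Return True when the page has no text content at all."""
    extractor = _TextExtractor()
    extractor.feed(html)
    return not "".join(extractor.chunks).strip()

print(is_effectively_blank("<html><body></body></html>"))             # True
print(is_effectively_blank("<html><body><p>Hello</p></body></html>")) # False
```

A check like this only catches the simplest case (an empty body); pages whose content is assembled by JavaScript would need to be rendered first, which is one reason the URL Inspection Tool is the more reliable diagnostic.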

For further guidance on resolving an "indexed without content" issue, site owners are encouraged to run the specific page through Google's URL Inspection Tool.

The URL Inspection Tool will display the page as Google sees it, which may help to understand why the content is not visible to Google crawlers.

These changes are now reflected in the Index Coverage report. Site owners may see new types of issues, or changes to the counts of existing issues.

For more information, see Google’s official blog post.
