Authors: Niki Morock, SEO Content Manager | Robyn Riley, SEO Technical Manager | Tyler Mosher, Senior SEO Content Specialist
Google Now Verifies Official Release Dates for Ranking Updates
If you’re noticing higher-than-usual keyword ranking volatility in your preferred keyword tracking tool, you can now visit Google Search Central to confirm whether a Google algorithm update has been rolled out!
Google’s Ranking Updates page provides ranking update release history dating back to January 2020, including the purpose and duration of each rollout. This information can not only help you pinpoint specific keyword wins or declines and the reasoning behind those fluctuations, but also help you frame your existing content strategy around Google’s current preferences.
Google Search Console’s New Video Indexing Report Now Rolling Out
Search Engine Land reported on July 11th that the Google Search Console video indexing report had started to become visible and would be completely rolled out over the next few months.
This report shows how many indexed pages on your site contain at least one video and how many of those videos are indexed. This information can help identify areas of opportunity and improvement, as well as show what’s working well.
Search Console Insights Can Now Use Google Analytics 4 Data
Until recently, Google’s Search Console Insights was only useful with Universal Analytics (the version of GA that will sunset next summer). Earlier this month, Google announced that GA4 users can now link to GSC Insights to:
- Learn how content is performing
- See which pieces are performing well
- See where users discover content
It is highly recommended that you link GA4 to GSC to make the most of your website’s data.
Google’s Top Ranked SERP Feature Leads to Critic Reviews & Product Review Articles on Mobile
On mobile, Google’s search results have a feature called Top Ranked. This feature lists products and shows how each one stacks up on a subtopic related to the product.
Google says this is based on “web reviews and mentions.”
When one of the products in the list is tapped, the user is taken to a new list of search results that contain “critic reviews” and more information via product reviews. If your products often appear on “top product” lists, you’ll want to keep an eye on these features, especially as we approach the holiday shopping season.
Google: Search Query Pages Are Equivalent to Low Effort Category Pages
John Mueller recently reminded us on Twitter that it is generally not recommended to allow indexing of site search results (search query results) on your site. He equated these to “low-effort category pages,” which would be deemed thin content and unhelpful to your SEO.
If you find that certain queries are often searched on your site, you may want to consider creating a robust category or subcategory page to make your site more useful for your site visitors.
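If you do need to keep internal search results out of Google’s reach in the meantime, one common approach is to block the search path in robots.txt. The sketch below assumes a hypothetical `/search` path — substitute your site’s actual search URL pattern. Keep in mind that robots.txt only blocks crawling; if the result pages are already indexed, a `noindex` meta robots tag on the results template is the more reliable fix.

```
# Hypothetical robots.txt rule: block crawling of internal search results
# (assumes search results live under /search — adjust to your site)
User-agent: *
Disallow: /search
```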
Almost Half of GSC Clicks Go to Hidden Terms
A study by Ahrefs recently showed that nearly half of all clicks reported in Google Search Console come from anonymized or hidden queries. Google has shared that some queries are hidden to protect privacy, while others are long-tail, unique queries.
Site owners should keep in mind that GSC data is not complete and that it’s best to use multiple sources of data to form conclusions. By comparing the queries provided in GSC to organic keyword reports from third parties, you can determine the kinds of terms that drive traffic.
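That comparison can be as simple as a set difference between the two query lists. The sketch below uses hard-coded sample queries; in practice you would export the lists from GSC and your third-party tool (e.g. as CSV) and load them first — the specific queries here are purely illustrative.

```python
# Hypothetical query lists, as if exported from GSC and a third-party tool.
gsc_queries = {"blue widgets", "widget repair", "buy widgets online"}
third_party_queries = {"blue widgets", "widget repair", "widget reviews", "cheap widgets"}

# Queries the third-party tool reports that GSC does not show —
# candidates for the anonymized/hidden bucket.
hidden_candidates = third_party_queries - gsc_queries

# Queries both sources agree on.
shared = gsc_queries & third_party_queries

print(sorted(hidden_candidates))  # ['cheap widgets', 'widget reviews']
print(sorted(shared))             # ['blue widgets', 'widget repair']
```

The terms only visible in the third-party report give you a rough sense of the kinds of queries GSC may be hiding, even though the two data sets are never a perfect match.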
Google Clarifies How to Use Product Rich Results on Product Variants Pages
Google has updated its documentation with guidance on how best to use rich results markup when handling multiple product variants on a single page.
If site owners choose to include multiple product variants that share the same URL on a single page, the page may be ineligible for product rich results and create a poor Google Shopping experience. Using a distinct URL per variant is a better option since site owners can choose a canonical URL to help Google understand which variant is best to show in search results.
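In practice, that means each variant gets its own URL and the non-preferred variants point at the one the site owner wants shown in search. A minimal sketch, assuming variants distinguished by a hypothetical `?color` parameter:

```html
<!-- Hypothetical variant page: https://example.com/shirt?color=green -->
<!-- The canonical points at the variant the site owner prefers to show -->
<link rel="canonical" href="https://example.com/shirt?color=blue" />
```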
Google Updates Crawl Stats Report Help Documentation
In late June, Google clarified its help docs to better explain what happens when a 404 Page Not Found is shown for a robots.txt file: Google treats it as a success and can crawl any URLs on the site.
If Google gets an unsuccessful response when requesting the robots.txt file, it takes different actions depending on how long the errors persist.
- First 12 hours: Google stops crawling your site but continues to request the robots.txt file.
- Between 12 hours and 30 days: Google uses the last successfully fetched file while still requesting a new one.
- After 30 days: If the robots.txt file still hasn’t been successfully fetched, Google will look at your homepage. If your homepage is available, Google will act as if the robots.txt file returned a 404 and crawl the site. If your homepage is not available, Google will stop crawling your site. However, it will still request the robots.txt file occasionally.
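The timeline above can be sketched as a simple lookup. This is just an illustration of the documented behavior — the function name and return strings are our own, not anything Google exposes.

```python
def crawl_behavior(hours_since_last_success: float) -> str:
    """Rough sketch of Google's documented fallback behavior when
    robots.txt requests keep failing. Illustrative only."""
    if hours_since_last_success <= 12:
        # First 12 hours: crawling pauses, robots.txt is still requested.
        return "stop crawling; keep requesting robots.txt"
    if hours_since_last_success <= 30 * 24:
        # Up to 30 days: fall back to the last good copy.
        return "use last successfully fetched robots.txt; keep requesting a new one"
    # Beyond 30 days: behavior depends on whether the homepage is reachable.
    return "check homepage: crawl as if robots.txt 404s if available, else stop crawling"

print(crawl_behavior(6))        # within the first 12 hours
print(crawl_behavior(48))       # two days in
print(crawl_behavior(31 * 24))  # past the 30-day mark
```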
Google to Pay Wikipedia for Content in Knowledge Panel & Search
Despite already being one of Wikipedia’s largest donors, Google will now start paying Wikipedia for use of its content displayed in the Knowledge Panel and search results. The Wikimedia Foundation announced Wikimedia Enterprise, a paid service for large commercial organizations like Google that use and repurpose its content in other contexts.
The goals of this new service are to improve user experience beyond Wikipedia’s website, increase reach and discoverability of content, and make it easier for users to verify data they find.
Is Google Using AI for Its New Pros and Cons Snippets?
A new search results snippet appeared on some Google pages within the last month that has many SEOs speculating about how Google comes up with its content.
The snippet lists pros and cons for a product that is the subject of a user’s query. While the information could be helpful to the searcher, in the examples we’ve seen, the words “pros” and “cons” do not appear on the entry’s linked page. An educated guess is that Google uses natural language processing (NLP) to understand the sentiment of the descriptions of the product on the page. If this is the case, it’s another leap forward in Google’s use of AI to enhance the search experience.
Google Updates Documentation & Causes Panic Among SEOs
Google added a single line to a help center article that explained that Googlebot won’t crawl more than 15 MB of an HTML file. After quite a commotion among SEOs on social media who were concerned their pages wouldn’t be crawled, Google posted a clarification to the Search Central Blog explaining that it’s not a new rule – it’s just a new line in its documentation.
Google Search Advocate John Mueller further stated on Twitter that most pages aren’t even close to 15 MB in size, and in fact, if you want to create one that large, “it’s a *lot of work* and almost nobody does that.”
Tying It All Together
The digital marketing landscape is always changing, and the SEO world is no exception. That’s why we keep you updated monthly with all of the latest organic search news. Interested in talking with one of our knowledgeable experts on how optimizing your organic search presence can allow you to tap into more revenue? Reach out today to get in touch.