Welcome to our January 2025 SEO News Recap! In this edition, we cover Google’s December 2024 spam update, faceted navigation recommendations, warnings against using generative AI for links and more! Also, be sure to check out our latest installment of “ROI Answers” at the bottom of this article to find answers to commonly asked SEO questions.
Google’s December 2024 Spam Update
Google recently rolled out a significant spam update that affected many websites. This was the third spam update of 2024:
March 2024 Spam Update: This update aimed to improve Google’s ability to detect and combat low-quality content created at scale. Google may take manual action against websites that do not comply with its guidelines.
June 2024 Spam Update: This update also focused on enhancing Google’s spam-detection capabilities. Google aimed to present users with more relevant, trustworthy, and helpful search results. It covered a wider range of spam tactics, including deceptive practices and link manipulation.
December 2024 Spam Update: This update was a broad spam update that hit many websites, with some experiencing significant deindexing and de-ranking issues. This update started shortly after the December 2024 core update and just before the holiday season, causing a significant increase in ranking volatility during peak holiday sales.
Google’s spam updates are part of the search engine’s ongoing efforts to maintain a high-quality search experience for users by combating spam and ensuring that unique content written by real people is rewarded. Websites that provide high-quality, original, and helpful content are less likely to be negatively impacted by the update.
Export Hourly Data from Google Search Console
You can now view and export the freshest hourly data (from the last 24 hours) in Google Search Console’s Performance reports. This data drilldown can provide more granular insights to help identify trends, anomalies, and patterns over time. It can also aid in real-time decision making, tracking and resolving issues, optimizing for peak hours, and helping to better understand user behavior at different times of day.
Does Google’s “Validate Fix” Button Really Speed Up Crawling?
Have you ever fixed issues found in the Indexing reports of Google Search Console (GSC), clicked “Validate Fix,” and then felt like it took ages for the issues to disappear? Well, that’s because clicking that button doesn’t speed up how quickly your changes are reviewed, according to Google’s John Mueller. Barry Schwartz of Search Engine Roundtable writes, “It’s just a reporting mechanism for you to track when Google recrawls the issue and for you to see if it was ‘fixed’ according to Google or not…”
If you need a handful of pages reprocessed quickly, you can use the URL Inspection tool and ask Google to recrawl the page(s) in question, but validating fixes in the Indexing report doesn’t do that for you.
In typical Google fashion, though, John posted an update saying that in some cases, the “Validate Fix” button does speed up crawling. Stay tuned for more details on that, perhaps next month.
Faceted Navigation Recommendations
Google recently updated their faceted navigation (AKA layered navigation) URL documentation, which recommends preventing crawling of these parameterized navigation URLs if they don’t need to be indexed. When URLs are created any time a facet is selected (i.e., color, size, dimensions, etc.), it can waste Googlebot’s time, slow down its crawling of your site, and waste server resources. According to the documentation, the optimal solution is to use robots.txt to disallow crawling of faceted navigation URLs. You could also apply a canonical tag or a rel="nofollow" attribute to facet links, but the documentation says that for this situation, these are often less effective than robots.txt disallows. If you do want your parameterized URLs indexed, make sure you follow Google’s recommendations in the guidelines linked above.
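As a rough sketch of the robots.txt approach Google recommends, the rules below block crawling of URLs containing facet parameters. Note that `color` and `size` are hypothetical placeholders; substitute the actual parameter names your faceted navigation uses:

```txt
# Block crawling of faceted navigation URLs
# ("color" and "size" are example parameter names)
User-agent: *
Disallow: /*?*color=
Disallow: /*?*size=
```

Google’s robots.txt parser treats `*` as a wildcard, so these rules match any URL whose query string contains the listed parameters, while leaving the unparameterized category pages crawlable.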
Google Rep Warns Against Using Generative AI for Links
A social media discussion may have confirmed suspicions about Google’s recent spam updates targeting AI-generated content since the large language model (LLM) boom. In response to a business owner who expressed frustration about her SEO firm outsourcing blog creation to a partner that used AI to create low-quality and “factually incorrect” content, Google’s John Mueller empathized and then added, “this is almost certainly against Google’s spam policies too.”
Although generative AI can be an incredible tool for aiding research and finding efficiencies in workflows, experts continue to caution against using the tool to try to game the system. Creating a ton of content using chatbots for the purpose of gaining backlinks and rankings will almost certainly have the opposite of your desired effect and result in demotions and penalties from Google. When using AI for content generation, we recommend thorough human reviews and incorporating unique, first-hand knowledge into each piece. See Google’s helpful content guidelines for more tips.
Apple Requests Involvement in Google’s Antitrust Case
In late December, Apple “issued a motion of intervention in the DOJ vs Google antitrust case,” according to a report on Search Engine Roundtable. Analyzing the situation, publisher Barry Schwartz suggests several reasons the tech giant may want to be involved in the case, including:
- Protecting its estimated $20 billion in annual payments from Google
- Not being interested in building its own search engine
- Thinking that Google may not try to protect their deal
This is an interesting twist in a case that the tech world is watching (and we are, too!).
ChatGPT Search Is Widely Available
When OpenAI first launched ChatGPT Search, it was only available to paid users. Now, anyone with an account can ask the generative AI tool a question about recent events and receive an answer pulled directly from the web. Here’s how it looks:
[Screenshot: a ChatGPT Search answer citing TALKSPORT as its web source]
You can see the small “TALKSPORT” button next to the text portion of the answer. Hovering over that button provides a pop-up with a link to follow directly to the source.
If you would like to know whether your website is receiving traffic from AI tools like ChatGPT Search, reach out to our ROI Revolution SEO team and we’ll be happy to chat!
Google’s Market Share Takes a Hit
As competition in the search industry heats up, Google Search’s market share has dropped below 90% for the first time since the first quarter of 2015. At 89.73%, Google is still firmly in control, but searchers seem to be looking elsewhere for their answers. Bing, Yandex, and Yahoo all gained a little share in November and December as Google saw a slight decline, but generative AI tools, like ChatGPT Search or Perplexity, might also be taking a portion of the market.
Emarketer reported that, despite Google’s efforts, an influx of spammy, inaccurate, and questionable results may be contributing to its decline. Amsive’s Vice President of SEO Strategy and Research, Lily Ray, recently pointed out that even after receiving manual actions from Google for violating its spam policies, websites may still appear as sources in AI Overviews.
The general rule for SEO has been that if a tactic works for Google Search, it should also work for the other search engines, but as the others gain ground, the rule could change. In the meantime, we’ll keep an eye on all of them for you.
Expect More AI from Google in 2025
Google’s Sundar Pichai held an employee meeting recently and said the company’s top priority for 2025 is to build “big, new business.” “Scaling Gemini on the consumer side will be our biggest focus,” he said. It sounds like we can expect more robust consumer-facing AI from Google in addition to new products and more disruptive moments from the tech giant. If you feel your website isn’t prepared for all that Google has in store, let us know – we’d love to help!
Google Might Be Blocking Rank-Checking Tools
In mid-January, some rank-checking tools were showing lower SERP volatility while websites were seeing wild swings in their rankings. As confirmed by Similarweb, one reason for the disconnect could be that Google is increasing or improving its anti-scraping measures.
Google does not approve of third-party tools scraping the SERPs, but most reporting platforms that rely on scraped data have figured out how to work around its blocks. Last week’s apparent adjustment by Google required a quick response and flexibility from the reporting platforms that were affected, so if you noticed a blip in your reports that correlated with this timing, the tool you use might have been affected.
Bing Pulls One Over on Searchers
We won’t speculate on its motivation, but as recently as last week, when a user queried the word “Google” on Bing, the Microsoft search engine got a little spicy. It appeared to take users to another page that looked like Google’s search homepage. Then, just this week, it was reported that Bing was hiding part of its results for the search giant when a user typed in “Google.”
As of this writing, we couldn’t recreate either result, but hats off to Bing for making us giggle!
ROI Answers: SEO FAQ of the Month
Q: How much should I change on a page to avoid a duplicate content penalty?
A: Before answering this, we want to make it clear that according to Google, there is no such thing as a duplicate content “penalty.” While that is the official word, experienced SEOs know that when Google finds duplicated content on the internet, a couple of things might happen:
- Google could choose its own canonical, which may or may not be the original page.
- Google could avoid showing either page, or it could rank both pages lower in the SERPs.
It’s never a good idea to copy and paste content from one page to another without giving credit to the originator. If you’re copying a page from another website, make sure to give credit and a backlink to the original publisher and canonicalize your page to the original one. This way, the creator receives due credit and should be the first shown by Google.
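As a minimal sketch of what canonicalizing to the original looks like, you would add a tag like the following to the `<head>` of the republished page (both URLs here are hypothetical examples):

```html
<!-- In the <head> of the republished page -->
<link rel="canonical" href="https://original-publisher.example.com/original-article/">
```

This tells Google which URL you consider the authoritative version, though Google treats the canonical tag as a hint rather than a directive.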
Even when your page canonicalizes to the original, it can still rank well. It just depends on how much competition there is for the subject matter.
If you want your website to reap the rewards of helpful content based on an article published by another site, a good rule of thumb (not an official Google rule) is to change about 70% of the content and add your own original, useful information.
AI tools are available to detect plagiarism, and we can safely assume that Google has similar tools built into its algorithms. If you want your website to have integrity in Google’s eyes without using canonicals – even if you (or your company) wrote the original article published on the other site – avoid simply copying, pasting, and calling it a day.
Still confused or have questions about anything you’ve read in the January 2025 SEO News Update? Reach out to ROI Revolution’s SEO team to see how we can help!