Welcome to our August 2024 SEO News Recap! This month, we cover a new core algorithm update, third-party cookies, good SEO advice, deepfakes, and an uproar over an Olympics advertisement. Also, be sure to check out our latest installment of “ROI Answers” at the bottom of this article to find answers to commonly asked SEO questions.

The August 2024 Core Algorithm Update is Rolling Out

On Thursday, August 15, Google announced the launch of the August 2024 core update. The update incorporates feedback that creators and others provided to Google after the March 2024 update, with the goal of improving the quality of search results by surfacing more helpful content and fewer pages that appear to be written for search engines.

If this sounds familiar, it’s because the March 2024 update was the first to include the Helpful Content System in the core algorithm, so every core update will consider helpful content as part of the ranking system.

Google said this update should take about four weeks to complete, and your ROI Revolution SEO team will keep an eye on it as we do with all major algorithm updates.

Google Will Not Deprecate Third-Party Cookies

The struggle to find alternatives to third-party cookies on Chrome is at an end, for now. After months of delays, Google announced it would not deprecate them and would instead find other ways to protect Chrome users’ privacy while still “preserving an ad-supported internet.” The Privacy Sandbox will continue to be available for developers to work toward meaningful alternatives.

It’s still worth noting that Apple and other browsers already block third-party cookies. First-party data is still the most reliable way to track engagement and user interaction on your website.

High-Quality Sites Get Crawled More Often

Google’s Lizzi Sassman & Gary Illyes recently chatted about ways to get Googlebot to crawl your site more frequently. These include making sure your site’s content is consistently high-quality, helpful, and appealing to a broad audience, and publishing that kind of content often. While these are not groundbreaking thoughts, they can be helpful when evaluating non-indexed pages in Google Search Console’s “crawled – currently not indexed” and “discovered – currently not indexed” reports.
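If you’d rather pull those statuses programmatically than click through the report one URL at a time, the Search Console URL Inspection API can return the same coverage state. Here’s a minimal sketch in Python, assuming you have the google-api-python-client and google-auth libraries installed and a service account with access to your property; the key file, property URL, and page URLs below are placeholders:

```python
# A minimal sketch: pull coverage states via the Search Console URL
# Inspection API. The key file, property URL, and page URLs are
# placeholders; swap in your own.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

SITE = "https://www.example.com/"  # your verified property
urls = [
    "https://www.example.com/blog/post-1",
    "https://www.example.com/blog/post-2",
]

for url in urls:
    response = service.urlInspection().index().inspect(
        body={"inspectionUrl": url, "siteUrl": SITE}
    ).execute()
    status = response["inspectionResult"]["indexStatusResult"]
    # coverageState holds strings like "Crawled - currently not indexed"
    print(url, "->", status.get("coverageState"))
```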

Reminder: Don’t Change Canonicals to Your M. Subdomain

If you’re still using an m. mobile subdomain (ex: m.example.com), Google’s John Mueller reminds you to keep the canonical tag set to your non-mobile site rather than changing it to the m. subdomain. He recommends working towards a responsive design that uses the same URLs across device types, but if you don’t have the resources to devote to that and already use a mobile subdomain, leave your canonical tags in place as they are; changing them could cause major issues for your site.
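If you want to spot-check that your mobile pages still canonicalize to the desktop site, a short script can audit them in bulk. Here’s a rough sketch using Python’s requests and beautifulsoup4 packages; the URL pairs are hypothetical, so substitute your own:

```python
# A rough sketch for bulk-checking m. subdomain canonicals.
# The URL pairs below are hypothetical examples.
import requests
from bs4 import BeautifulSoup

pairs = [
    # (mobile URL, canonical it should point to on the desktop site)
    ("https://m.example.com/products/widget",
     "https://www.example.com/products/widget"),
]

for mobile_url, expected in pairs:
    html = requests.get(mobile_url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    canonical = tag["href"] if tag else None
    marker = "OK" if canonical == expected else "MISMATCH"
    print(f"{marker}: {mobile_url} -> {canonical}")
```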

Google Enhances Gemini with Verification & Deep Research Features

As we’ve likely all experienced, any AI chatbot is prone to hallucinating now and then. To combat hallucinations from Google’s Gemini (formerly “Bard”), a few new features have been rolled out that help users fact-check AI-generated responses. You may now encounter links to related content, which seem to be the newest iteration of the previously discontinued link citations, accessible via a chip at the end of the paragraph.

In addition, the new double-check feature helps users verify Gemini’s statements by highlighting sections that it can or cannot corroborate with information found on the web.

The latest Gemini enhancement doesn’t have a clear launch date yet. The “Deep Research” feature was teased as part of the Gemini 1.5 Pro demo earlier this month. For more complex requests, Gemini will outline its approach to multi-step research and how it plans to prepare a deliverable for the user before getting started. Google shared that this tool is “designed to help you with making big decisions, navigating multiple sources, or getting started on a project you might not know about.”

AI Actually Isn’t Disrupting Traditional Search All That Much

If you regularly keep up with digital marketing news, you’ve seen headlines about how the rise of AI will lead to the death of traditional search. Well, recent insights based on Datos’ clickstream panel show that even the major players in AI/LLM search are barely making a dent in Google’s traffic. The current top competitor, Perplexity, still only has about 15 searches per searcher per month compared to Google’s ~200. Check out some more key takeaways shared by Rand Fishkin if you want to dive into the data.

Google’s Bigger, Badder, Faster Trends Tool

Google Trends has gotten some upgrades geared towards marketers and researchers, including new and improved real-time search trend analysis. The Trending Now feature will update every 10 minutes, and Google says it can forecast 10x more emerging trends than before.

Trends now covers 125 countries, including 40 region-specific hot spots, expanding capabilities for local trend research.

On the UX front, Google Trends also updated its data visualization options. You can now break down trends by emergence time and duration, view an interest-over-time chart, compare multiple trends, and fine-tune with enhanced filters. For those who want to dive deeper, the data can now be exported for further manipulation and analysis.
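If you plan to work with that exported data, a few lines of pandas will get you started. This is a quick sketch only; the file name, skipped header rows, and column names below are assumptions, so adjust them to match your actual export:

```python
# A quick sketch for analyzing an exported Trends CSV with pandas.
# File name, skiprows value, and column names are assumptions.
import pandas as pd

# Trends exports often lead with a couple of metadata rows;
# drop skiprows if yours doesn't.
df = pd.read_csv("trends_export.csv", skiprows=2)
df.columns = ["week", "interest"]
df["week"] = pd.to_datetime(df["week"])

# Smooth weekly noise with a 4-week rolling average, then flag the peak.
df["rolling_avg"] = df["interest"].rolling(4).mean()
peak = df.loc[df["interest"].idxmax()]
print(f"Peak interest ({peak['interest']}) in week of {peak['week']:%Y-%m-%d}")
```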

Watch this walk-through of the new Trends from the Google Search Central team to learn how to navigate the interface and its new features.

International Sites’ ccTLDs & Copy Language

Keywords in URLs may not matter much for SEO; however, Gary Illyes of Google recently said that using a ccTLD and relevant language for the copy on your site can help you rank a bit better in the regional version of Google.

For example, if your site’s copy is in German and you use a .de ccTLD (www.example.de), you may see a little boost in SERPs. He highlighted, though, that the copy on the site being in German (in this example) and matching the user’s query language is likely more impactful than the domain name.

Google Can Crawl Reddit While Bing and Others Cannot

In a move that surprised many, Reddit has blocked Microsoft’s Bing from crawling its site. While the exact reason remains unclear, Reddit claims it’s unrelated to its recent Google deal. This could impact Bing search results related to Reddit content.

Reddit told The Verge, “We have been in discussions with multiple search engines. We have been unable to reach agreements with all of them, since some are unable or unwilling to make enforceable promises regarding their use of Reddit content, including their use for AI.”

Along with Bing, many other search engines and AI bots are no longer allowed to crawl Reddit. Google, however, can still crawl the site, and its “hidden gems” algorithm update from November 2023 has already helped Reddit gain a lot of traffic from Google. That ranking update aims to surface high-quality, authentic content from lesser-known sources; by prioritizing personal insights and experiences shared on platforms like forums and blogs, Google hopes to provide more diverse and informative search results.
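Curious which crawlers a site like Reddit currently shuts out? You can check its public robots.txt with Python’s standard library. This sketch only reads the live file, so the output depends on whatever Reddit is serving at the time, and keep in mind that access deals like Google’s can also be enforced server-side, beyond robots.txt:

```python
# Check which user agents Reddit's public robots.txt allows, using only
# the standard library. The agent tokens below are illustrative.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.reddit.com/robots.txt")
rp.read()  # fetch and parse the live file

for agent in ["Googlebot", "Bingbot", "GPTBot", "*"]:
    allowed = rp.can_fetch(agent, "https://www.reddit.com/r/SEO/")
    print(f"{agent:>10} allowed: {allowed}")
```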

Google Declared a Monopoly

This news reaches beyond the SEO world. On August 5th, a federal judge ruled that Google has been operating as a monopoly in search and advertising markets. The ruling is a huge win for the Department of Justice, which argued the case at trial last fall, and it could set the stage for future rulings against other big tech companies.

The next step in this case is determining how to break the monopoly up, and at this point, anything you might have heard is speculation. Google may be made to divest the Chrome web browser, Android operating system, or Google Ads. Or, the company may be required to share intelligence and data with its competitors, but that will be determined in future court proceedings.

A separate trial involving Google and the DOJ is scheduled to start on September 9th. That one will be focused on Google’s ad tech business.

Google Updates Policies Around Deepfakes

At the end of July, Google announced updated policies focused on non-consensual, AI-generated explicit images called “deepfakes” and explained how its algorithms combat that content:

  • It’s now easier to report and request the removal of deepfakes.
  • Google will remove the image and all copies of the image that it can find.
  • Google will look for additional related queries and address those automatically.
  • Google assumes that if a website has many removal requests against it, its content is not up to standards; Google will downgrade that content (and the entire website) in search results while simultaneously promoting legitimate news articles about the person named in the query.

Backlash Over Google’s Gemini AI Ad Sparks Debate

Google pulled its Gemini AI ad from Olympics coverage after people took to the internet to complain. The ad’s narrative is told from the perspective of a proud father who wants to help his daughter write a fan letter to US track star Sydney McLaughlin-Levrone. The controversy centers on the father asking Gemini to write the letter for her.

The complaints online ranged from Google missing the point of a fan letter to general outrage over the increased use of generative AI.

The lesson we’re taking away from this debate is that humans crave authentic connection, and marketers should double down on what it means to be human. Generative AI may be able to finish your sentences and use words that evoke emotion, but a heartfelt message written by an adoring fan, misspellings and all, will always beat a bot’s predictive text.

ICYMI: Other Recent SEO Blog Posts

  • July SEO News Recap
  • Web Development and Ecommerce

ROI Answers: SEO FAQ of the Month – Toxic Backlinks

Q: What are toxic backlinks? Should I worry about them? Can I do anything about them?

A: Let’s break this one down into each of its parts.

Toxic backlinks are defined as “unnatural” links from low-quality websites that have the potential to negatively affect your site’s visibility in search.

You generally don’t need to worry about them unless your website team has intentionally purchased spammy backlinks or otherwise gained links with the intent to game search engine algorithms; those algorithms are simply too smart for old-school black-hat SEO tactics. If you haven’t used any spammy tactics but see alarm bells going off in a third-party SEO tool like SEMrush, this can stay relatively low on your list of priorities.

There isn’t a whole lot you can do about toxic backlinks, either. In the past, we’ve advised clients to use the Disavow tool to tell Google that these “toxic” backlinks may be coming from spammy sources and that we don’t want to be associated with those domains. This can be a lot of work for a low payoff, especially because Google has recently said that the Disavow tool is going away (at some point), and Bing already dropped its version back in September 2023.
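If you do decide to submit a disavow file while the tool is still around, the format itself is simple: comment lines start with #, and each entry is either a full URL or a domain: rule. Here’s a minimal sketch in Python that turns a list of suspect domains (say, exported from a third-party tool) into that format; the domains and file name are placeholders:

```python
# Write a disavow file in Google's expected format. The domains below
# are placeholders; in practice you'd load them from a tool export.
domains = ["spammy-links.example", "link-farm.example"]

with open("disavow.txt", "w") as f:
    f.write("# Disavow list, generated for manual review before upload\n")
    for domain in domains:
        f.write(f"domain:{domain}\n")
```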

Moral of the story: there are bigger fish to fry, but if you’re still worried, ask your SEO team about it!