Google’s March 2024 core update is bringing a seismic shift to the SEO industry. This major update might sweep the web the same way the Panda and Penguin updates did.
Its impact is enormous and widespread, and every seasoned marketer I’ve talked to agrees this update is a big deal. Has your website been hit by the March 2024 core update?
Here are five things you need to know to avoid getting penalized and maintain your good standing in search results.
1. Google is completely deindexing websites
Imagine waking up to find your website gone from Google’s search results. That’s exactly what happened to many websites after the March 2024 update rolled out.
In its March 5 announcement, Google emphasized its goal of reducing unhelpful, irrelevant, unoriginal content from search results.
This cleanup drive promises to reduce low-quality, unoriginal content in search results by 40% – pages that provide useless information and a poor user experience, created just to match specific search queries.
Websites found violating Google’s guidelines or employing questionable SEO tactics will not only be penalized but completely removed from both the search results and Google’s index.
In the days following the announcement, the SEO world turned upside down.
Here’s a tweet from Jeff Coyle:
Please note that completely deindexing a website is usually tied to a Google manual action and is often not the result of a Google core update.
2. Penalties are swift
What’s notable about this update is that the penalties were hard and fast, with site owners scrambling for answers.
Here are 10 examples of websites that were deindexed:
To check if your site was impacted by the Google update, search Google for “site:website.com” (using your own domain) and see whether your pages still show up in the search results.
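If you manage several domains, this check can be semi-automated. The sketch below (Python; `example.com` is a placeholder domain, not from the article) only builds the `site:` search URL for you to open in a browser, since programmatically scraping Google’s results is against its terms of service:

```python
from urllib.parse import quote_plus

def site_query_url(domain: str) -> str:
    """Build the Google 'site:' search URL for a manual indexation check."""
    return "https://www.google.com/search?q=" + quote_plus(f"site:{domain}")

# Open the printed URL in a browser; zero results suggests the site
# has been dropped from Google's index.
print(site_query_url("example.com"))
# https://www.google.com/search?q=site%3Aexample.com
```

For a definitive, per-URL answer, the URL Inspection tool in Google Search Console is the more reliable option.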
It’s important to note that sites affected by the March 2024 core update or spam update “would not get notified of a ranking decline through Google Search Console’s manual action viewer,” as pointed out by Barry Schwartz.
“Algorithmic updates are automated and Google does not notify the site owner when a site is negatively (or positively) impacted by an algorithm update,” Schwartz writes.
3. Websites that survived previous updates are not immune
Older websites aren’t spared, either. Those untouched by previous algorithm updates now face repercussions, too.
Some folks are cheering, saying it’s about time those low-quality sites got the boot after years of gaming the system and staying in SERPs.
This is one example of a “link building agency” that got kicked out:
But others are left scratching their heads, wondering why their entire network has vanished.
In a discussion within the private Affiliate SEO Mastermind group, a member shared the unfortunate experience of a publisher whose entire network crumbled.
According to the post, the publisher’s eight affected sites, all established within the last two years, covered various niche topics. The author emphasized that the content was primarily human-written with minimal assistance from AI.
All eight websites are now showing zero impressions.
This is where E-E-A-T might have played a crucial factor.
The broad range of topics these sites covered indicates a lack of deep topical knowledge – experience and expertise – which Google prioritizes when ranking websites.
4. Outdated, error-filled sites are vulnerable
Speaking of E-E-A-T, look at how Google has now added another factor for flagging content as untrustworthy, with the lowest E-E-A-T: content that is AI-generated, outdated or riddled with errors.
These are surefire ways to get de-ranked!
5. Even small AI content sites were affected
An interesting development is how small AI content sites have come under scrutiny. Google’s ability to detect AI-generated content has become more refined, resulting in penalties even for smaller players in this field.
Craig Griffiths thinks the biggest indicator of an AI website is the frequency with which it publishes content.
But that is not the case here:
Digging further into this conversation reveals more specific signals that Google might be looking for:
Let’s get one thing straight: Google is not going after AI content. What Google wants to clean up from its search results is useless, repetitive, unoriginal content, whether it’s written by humans or AI or both.
The latest update aims to remove bad websites that do nothing for people while bringing good websites with valuable content to the surface.
This is a wake-up call for website owners who thought their legacy would keep them safe. It proves again that adaptation is key to survival in the digital world.
If you’re managing one of these older, smaller domains, don’t panic yet. Start by conducting a comprehensive site audit and fix these issues immediately.
Takeaway: Quality trumps quantity
Google’s emphasis has always been on originality, depth and value for the reader. Websites that were removed and deindexed often relied heavily on thin or duplicated content without providing unique insights or perspectives.
To combat this issue head-on, improving E-E-A-T becomes essential. E-E-A-T principles show Google – and, more importantly, your readers – that you’re an authority worth listening to. This includes citing reputable sources within your niche and showcasing author expertise clearly on your site.
User experience also takes center stage with the latest algorithm changes. Your site should look good and be easy and intuitive for visitors. Websites focusing solely on keyword optimization instead of holistic UX design principles will be penalized.
Paying attention to page speed, mobile-friendliness and clear calls to action will help keep users engaged longer, sending positive signals back up the SEO food chain.
Remember, by making these adjustments, you’re not just playing nice with search engines; you’re building a better online space for everyone who visits.
Adapt or fall behind
The sheer number of deindexed websites reported by the search marketing community confirms Google’s plan to implement a major shakeup in its ranking systems.
It is prudent to study the patterns of websites affected by algorithmic events and spot common trends, so you can uncover your own website’s weaknesses and recalibrate your strategy.
- Focus on content quality.
- Enhance user experience.
- Start auditing your site and make changes.
The road ahead might seem daunting, but remember, every step toward improvement is a step away from Google hell.
Contributing authors are invited to create content for Search Engine Land and are chosen for their expertise and contribution to the search community. Our contributors work under the oversight of the editorial staff and contributions are checked for quality and relevance to our readers. The opinions they express are their own.