5 things to check if your traffic suddenly drops

No one is immune to traffic drops, and the causes are easy to overlook. Still, to avoid barking up the wrong tree, gather the evidence first: look closely at your overall Google Analytics stats over a long enough period to see whether the traffic drop really looks out of place, or whether it could be down to seasonality, holidays or other patterns inherent to your niche.

Also make sure you aren't missing data for certain days or devices. Double-check that the Google Analytics code is implemented properly on all pages of your site (here's a tracking code checker), that the correct snippets for your property are being used and that the code formatting is preserved (your GA tracking code can be found under Admin > Tracking Info > Tracking Code).
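If you'd rather script that check, here's a minimal sketch, assuming Python with the requests library; the property ID and page list are placeholders to replace with your own:

```python
# Minimal sketch: verify that the GA snippet appears on every page.
# GA_ID and PAGES are placeholders -- use your own property ID and URLs.
import requests

GA_ID = "UA-12345678-1"  # hypothetical property ID
PAGES = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog",
]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    status = "OK" if GA_ID in html else "MISSING"
    print(f"{status:7} {url}")
```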

Once you know for sure the error isn’t statistical, it’s time to have a closer look at the symptoms and start digging to find the root cause.

1. Significant changes to your site

Substantial site changes like a redesign, a migration or a content clearout can carry pitfalls for your SEO. If you have recently been through any of these and see a traffic dip out of the blue, pay close attention to the aspects that were involved. The most commonly overlooked issues concern indexation and crawlability.

First, go to Crawl > Crawl Errors in your Google Search Console (GSC) and examine the graphs closely to spot any abrupt changes after the renovation.

To track down the broken URLs reported by GSC, you can turn to a tool like WebSite Auditor: create a project for your site and allow a few moments for the app to crawl it in-depth. Find the URLs in the All Resources tab with a Quick Search filter and check the Links to Page section at the bottom to see where the broken links hide.

Next, go to Google Index > Index Status in GSC and check whether the number of indexed pages for your site has drastically decreased. To make sure you haven't disallowed anything by mistake, rebuild your WebSite Auditor project to recrawl the site on behalf of Googlebot. Check the Robots Instructions column under All Resources, which shows the directives from your robots.txt as well as page-level restrictions (X-Robots-Tag headers or noindex tags) and how they apply to each URL.
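A quick way to spot-check crawlability and indexability outside the tools is a short script like the sketch below, assuming Python with requests and beautifulsoup4; the example.com URLs are placeholders:

```python
# Sketch: check robots.txt crawlability for Googlebot, plus header- and
# meta-level noindex, for a handful of URLs. example.com is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"
URLS = [f"{SITE}/", f"{SITE}/category/widgets"]

rp = RobotFileParser(f"{SITE}/robots.txt")
rp.read()

for url in URLS:
    crawlable = rp.can_fetch("Googlebot", url)
    resp = requests.get(url, timeout=10)
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    meta_noindex = bool(meta and "noindex" in meta.get("content", "").lower())
    print(f"{url}: crawlable={crawlable}, "
          f"noindex={header_noindex or meta_noindex}")
```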

2. Manual search engine penalty

If you have called down the wrath of Google by violating webmaster quality guidelines and been caught by Google's human reviewers, you will see a notice in your GSC account in the Search Traffic > Manual Actions section. The site being hacked is one possible reason that lies beyond your control; for that case, Google has a comprehensive recovery guide. Most other reasons are fully under your control and are easier to recover from.

User-generated spam

You can be penalized for spammy and irrelevant links on pages of your site that allow user-generated content. The cure is to detect the violations and clean up carefully. Preventive treatment includes moderation, anti-spam measures like a reCAPTCHA plugin, and defaulting user-generated links to "nofollow".
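For illustration, here's a minimal sketch of the nofollow approach, assuming Python with beautifulsoup4 and that you control the rendering pipeline for user-submitted HTML:

```python
# Sketch: force rel="nofollow ugc" on every link inside user-submitted
# HTML before it's rendered. Requires beautifulsoup4.
from bs4 import BeautifulSoup

def nofollow_ugc(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        a["rel"] = "nofollow ugc"
    return str(soup)

# Hypothetical comment submitted by a user:
print(nofollow_ugc('<p>Nice post! <a href="https://spam.example">pills</a></p>'))
```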

Unnatural outgoing links, cloaking and sneaky redirects

These on-page violations may be hard to spot on a larger-than-average site, but WebSite Auditor can lift that burden for you again.

If the penalty refers to unnatural linking, go to All Resources and review the list of all external links from your site. Get rid of paid links and links gained through obvious link exchanges.
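If you want a quick external-link inventory for a single page outside the tool, a sketch like this can help (Python with requests and beautifulsoup4; the page URL is a placeholder):

```python
# Sketch: list outgoing external links from one page so paid or
# exchanged links can be reviewed by hand.
from urllib.parse import urlparse
import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/blog/post"  # placeholder URL
own_host = urlparse(PAGE).netloc

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    host = urlparse(a["href"]).netloc
    if host and host != own_host:  # skip internal and relative links
        print(a["href"], "(nofollow)" if "nofollow" in a.get("rel", []) else "")
```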

If you're penalized for sneaky redirects, make sure none of the redirects on your site lead anywhere unforeseen or suspicious. Check the Pages with 301/302 redirects and meta refresh sections under Site Audit to review all the destination URLs.
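To spot-check redirect destinations by hand, a short script can print each hop chain; a minimal sketch with placeholder URLs:

```python
# Sketch: resolve each redirecting URL and print the full hop chain
# so unexpected destinations stand out.
import requests

REDIRECTING = [
    "https://example.com/old-page",
    "https://example.com/promo",
]

for url in REDIRECTING:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    chain = [r.url for r in resp.history] + [resp.url]
    print(" -> ".join(chain))
```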

As for cloaking, ensure that your pages return the same content to a user in a browser and to a search engine bot. To see your site from Google’s perspective, crawl it with the Fetch as Google tool, and check for any discrepancies.
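A rough complementary check is to fetch the same URL with a regular browser user agent and a Googlebot user agent and compare the responses; a sketch with a placeholder URL (identical sizes prove nothing on their own, but a large gap deserves a manual look):

```python
# Sketch: compare the response served to a browser UA vs. a Googlebot UA.
import requests

URL = "https://example.com/landing"  # placeholder
BROWSER = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
BOT = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                     "+http://www.google.com/bot.html)"}

human = requests.get(URL, headers=BROWSER, timeout=10).text
bot = requests.get(URL, headers=BOT, timeout=10).text
print(f"browser: {len(human)} bytes, bot: {len(bot)} bytes, "
      f"identical: {human == bot}")
```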

Thin or duplicate content

Having too many pages that provide little meaningful content may trigger a thin content penalty. These may be category pages with loads of links and only a few lines of text, thin local landing pages where one information-rich store locator would do, or other must-have pages with no real content value. Such pages are better off merged or hidden, as all the accessible content adds up to the overall quality of your site.

To find the offenders, turn back to your WebSite Auditor project. Add the Word Count column to your workspace, click its header to sort the URLs, and check whether there are too many pages with thin content. You may also pay attention to the ratio between the word count and the number of Links From Page.
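For a rough scripted version of the same check, here's a sketch that counts visible-text words per page (Python with requests and beautifulsoup4; the 250-word threshold is an arbitrary example, not a Google figure):

```python
# Sketch: rough visible-text word count per page to flag thin content.
import requests
from bs4 import BeautifulSoup

PAGES = ["https://example.com/", "https://example.com/category/widgets"]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):  # drop non-visible text
        tag.decompose()
    words = len(soup.get_text(separator=" ").split())
    flag = "  <-- thin?" if words < 250 else ""  # arbitrary threshold
    print(f"{words:6} words  {url}{flag}")
```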

To spot the pages that are likely to be duplicated or very similar, check the Duplicate Titles and Duplicate Meta Descriptions sections under Site Audit.
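The same duplicate-title check can be scripted; a minimal sketch with a placeholder page list:

```python
# Sketch: group pages by <title> to surface exact duplicates.
# Requires requests and beautifulsoup4.
from collections import defaultdict
import requests
from bs4 import BeautifulSoup

PAGES = ["https://example.com/a", "https://example.com/b",
         "https://example.com/c"]

by_title = defaultdict(list)
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    by_title[title].append(url)

for title, urls in by_title.items():
    if len(urls) > 1:
        print(f"Duplicate title {title!r}:", *urls, sep="\n  ")
```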

Unnatural links to your site

To find the culprits here, fire up SEO SpyGlass: create a project for your site and choose SEO PowerSuite Backlink Explorer in tandem with Google Search Console (connect the GSC account associated with your site). Once the backlinks are gathered, switch to Linking Domains, go to the Link Penalty Risk workspace, select the domains and click Update Penalty Risk.

The Penalty Risk column shows the likelihood of getting penalized for any link, based on the linking sites' quality and the diversity of your backlink profile. Domains with a penalty risk higher than 50 percent are worth manual review. The ones you decide to take down can be disavowed right from the tool (select > right-click > Disavow Domains). The ready-to-submit disavow file can be exported from the Preferences > Disavow/Blacklist Backlinks menu.
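For reference, the disavow file Google expects is plain UTF-8 text, one entry per line, with "domain:" for disavowing whole domains and "#" for comments; the domains below are placeholders:

```
# Disavow file: one entry per line.
# "domain:" disavows every link from that domain.
domain:spammy-directory.example.com
domain:link-farm.example.net
# A single URL can also be disavowed on its own:
http://blog.example.org/comments/page7.html
```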

Once you've cleaned up the mess, submit a reconsideration request to Google: click Request a Review under Manual Actions in your GSC.

3. Google algorithm update

Google evolves without a break, tweaking and rolling something out every once in a while. If a traffic decline coincides with an update rollout, investigate the factors that triggered it.

New core updates (and refreshes of well-known algorithms) are normally all over the SEO news, so you should be able to find out what they are about.

Narrowly focused and niche-specific updates, however, may not get covered right away, so it's a good idea to keep an eye on SERP fluctuations for your niche keywords to spot any unusual shakeups. Rank Tracker is there to help: in the bottom tab of the Rank Tracking module, go to SERP Analysis and click Record SERP Data. The tool will gather the top 30 pages for your keywords with each ranking check and accumulate the results in the SERP History table.

The Fluctuation Graph will reflect any suspicious movement, both for individual keywords and for the whole project. The SERP History table will show the dropped URLs so that you can investigate a possible cause by analyzing the pages' common traits.

4. Valuable backlinks lost

Losing backlinks may hurt your visibility and traffic big time, especially if your site doesn't have loads of backlinks overall. If the traffic drop affected most keywords and pages, check for any noticeable changes to your backlink profile. Back in SEO SpyGlass, look through the Summary section to see if the Backlink Progress graph went down recently.

If so, you can track the lost links by updating the Backlink Page Info factor for the backlinks. In the Links Back column, you'll see the real-time status of each link; the Last Found Date column will hint at when you may have lost any. If possible, contact the website owners personally to get the crucial links back.
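To recheck a handful of known backlink pages yourself, a sketch like this works (Python with requests and beautifulsoup4; the domain and page list are placeholders):

```python
# Sketch: revisit known backlink pages and report whether they still
# link to your domain.
import requests
from bs4 import BeautifulSoup

YOUR_DOMAIN = "example.com"  # placeholder
BACKLINK_PAGES = [
    "https://partner.example.net/resources",
    "https://blog.example.org/roundup",
]

for url in BACKLINK_PAGES:
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        links_back = any(YOUR_DOMAIN in a["href"]
                         for a in soup.find_all("a", href=True))
    except requests.RequestException:
        links_back = False  # page gone or unreachable
    print(f"{'LINKS BACK' if links_back else 'LOST':10} {url}")
```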

5. Competitor and SERP changes

If the traffic drop is rather moderate or specific to certain keywords, some of your pages may have slipped a few results down the SERPs. To investigate, turn back to the SERP History tab in Rank Tracker: you may notice new kinds of results on some SERPs that now answer your target queries directly in a Featured Snippet, Knowledge Graph or Answer Box. That would be one good reason to consider optimizing for position 0.

If rankings and traffic are lost to a competitor, try reverse-engineering the pages that outranked you to find the aspects where you fall behind. Are their pages mobile-optimized while yours aren't? Was their content significantly refreshed, or were new backlinks gained? Perhaps schema markup made their result stand out and drew your clicks away. Work on whatever weak spots you discover.
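To see which structured data a competitor's page carries, you can pull its JSON-LD blocks; a minimal sketch with a placeholder URL:

```python
# Sketch: extract JSON-LD structured data from a competitor's page to
# see what markup may be earning them rich results.
import json
import requests
from bs4 import BeautifulSoup

URL = "https://competitor.example.com/product"  # placeholder
soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        if isinstance(item, dict):
            print("Found schema type:", item.get("@type"))
```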

Finally, it may happen that a competitor set up a PPC campaign and started bidding on keywords you rank high for, and their ads now show up on SERPs and drive traffic away from the top organic results, including yours. If that's the case, consider outbidding the competitor if the keywords are of high importance to you, or shifting your focus to other target queries.

