5 "Foundations" Of SEO That Were Torched In 2013

The basics of SEO haven’t changed much in the last 15 years. If you followed the mantra of creating good content and obtaining quality links, they still haven’t changed… or have they?

Here are five SEO “foundations” that were absolutely torched in 2013. If you are still counting on any of these, stop now and get up to date on what SEO will mean in 2014.

1. Keywords Are The Key To Search Results

Many lamented the finality of “(not provided)” when it was announced on September 23, 2013, that keywords would no longer be passed in the referral string from Google. But what many failed to see (and many still fail to see) is that search is not about keywords; it’s about intent. It always has been, more or less; SEOs have simply used keywords as a proxy for those intentions.

Is someone using the keyword “buy”? They must be looking to buy something. Makes total sense. But as the search engine algorithms have progressed, the average user now realizes they don’t have to put the keyword “buy” in their query. All they have to do is click on one of the conveniently placed shopping results, or better yet, skip the search engine altogether and use a vertical (shopping) search engine like Nextag.

Want further proof of this? Look at Google Trends. For almost any comparison of a keyword vs. “buy” + that keyword, you’ll see a trend similar to this, showing that while people are increasingly interested in a product, their tendency to add “buy” to that product keyword is diminishing. Below is an example with “power tools” (red) and “buy power tools” (blue), but you can find many others if you feel like digging a little.


Google Trends for “Buy Power Tools” and “Power Tools”
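
If you’d rather pull this comparison programmatically than eyeball the Trends interface, here’s a minimal sketch using pytrends, an unofficial Python client for Google Trends (my choice of tool, not something Google provides, and its interface may change):

    # A minimal sketch using the unofficial pytrends library
    # (pip install pytrends); this is not an official Google API.
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US")

    # Compare interest in the bare product keyword vs. the "buy" variant.
    pytrends.build_payload(["power tools", "buy power tools"], timeframe="all")
    df = pytrends.interest_over_time()

    # A falling ratio of "buy" queries to bare-keyword queries suggests
    # searchers are dropping the word "buy" over time.
    df["buy_ratio"] = df["buy power tools"] / df["power tools"]
    print(df[["power tools", "buy power tools", "buy_ratio"]].tail())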

2. Geo-Location Keywords Are Important

It used to be that if you wanted your site to show up for a specific city, all you had to do was create a page that showed the city or city and state along with the keyword. This led to millions of pages on the internet like these:

SERP for a site with geo-located pages

Not only is this no longer a recommended tactic; the Panda algorithm was created in part to stamp out the practice. Instead, you now have to have a Google Places/Pages/whatever-they’re-calling-it-now profile for a verified address in the local area to rank well.

Plus, you have to use schema tagging (or hCard, or whatever) to mark up your address. For some reason, the search engines think that because our office is physically located in Raleigh, NC, we’re more relevant there than anywhere else in the country – even though the majority of our clients are not from NC. So, we also have the death of common sense to celebrate.
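
For illustration, here’s a minimal sketch of that address markup as schema.org JSON-LD, built with Python. The business details are hypothetical placeholders, and JSON-LD is just one of the serializations the engines accept (microdata and hCard embed the same fields inline in the HTML instead):

    # A minimal sketch: schema.org LocalBusiness markup as JSON-LD.
    # The business details below are hypothetical placeholders.
    import json

    local_business = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Example SEO Agency",           # hypothetical
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Example St",  # hypothetical
            "addressLocality": "Raleigh",
            "addressRegion": "NC",
            "postalCode": "27601",
            "addressCountry": "US",
        },
    }

    # Drop the output into a <script type="application/ld+json"> tag.
    print(json.dumps(local_business, indent=2))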

Hopefully, this particular issue will be short-lived, since again, searchers are becoming more savvy and realizing they don’t have to enter the city name to get a local result – unless, of course, they want a location outside of the area they are currently in… but I digress. Suffice it to say this particular practice is torched, but there’s not yet a good replacement solution.

3. 302 Redirects Have A Function For SEO

302 redirects were used for SEO back in the day because the search engines would only crawl sites every now and then (not multiple times a day like they do now). If you were making a page change that you didn’t expect to stick for a long period of time, you would post a 302 redirect so the search engine wouldn’t change your listing during the time before you went back to the old URL.

The official reasoning was that you didn’t want the search engines to update all your inbound links to the new URL, as indicated by this directive in the W3C’s HTTP specification: “302 Found… The requested resource resides temporarily under a different URI. Since the redirection might be altered on occasion, the client SHOULD continue to use the Request-URI for future requests. This response is only cacheable if indicated by a Cache-Control or Expires header field.”
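
To make the distinction concrete, here’s a minimal sketch using Python’s standard library, with hypothetical paths: the 301 tells crawlers the move is permanent (update your records), while the 302 says it’s temporary (keep the old URL):

    # A minimal sketch of issuing 301 vs. 302 redirects with Python's
    # standard library. The paths are hypothetical examples.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/old-page":
                # 301 Moved Permanently: crawlers should update their
                # records (and inbound-link credit) to the new URL.
                self.send_response(301)
                self.send_header("Location", "/new-page")
            elif self.path == "/sale":
                # 302 Found: temporary; crawlers should keep requesting
                # and listing the original URL.
                self.send_response(302)
                self.send_header("Location", "/holiday-sale")
            else:
                self.send_response(200)
            self.end_headers()

    HTTPServer(("", 8000), RedirectHandler).serve_forever()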

However, Google went on record in August of 2012 with John Mueller stating: “the 302 is something where we would still pass the PageRank. Technically with the 302, what can happen is that we keep the redirected URL and just basically use the content of the redirected URL.”

We began to see this really play out in 2013, as more and more sites were penalized for bad links pointing at them – especially affiliate links, which often go through 302s with the (incorrect) intention of stopping the flow of PageRank.

4. 404 Error Pages Should Be Reserved For Outdated Pages

This is another very frustrating development in Google’s quest to kill spammers. Hundreds of sites are being forced to deal with inbound links they can’t control by making the destination page return a 404 or 410 status. This means there are thousands of new broken links on the web as a result of Google’s heavy-handed penalties.
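
If you’re forced down this road, a 410 (Gone) is arguably the cleaner signal than a 404, since it says the removal was deliberate. A minimal sketch with hypothetical paths, again using Python’s standard library:

    # A minimal sketch: return 410 Gone for pages deliberately retired
    # because of bad inbound links. The paths are hypothetical.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    RETIRED_PATHS = {"/old-press-release", "/spammed-landing-page"}

    class GoneHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path in RETIRED_PATHS:
                # 410 tells crawlers the removal is intentional and
                # permanent, not an accident to retry later.
                self.send_response(410)
            else:
                self.send_response(200)
            self.end_headers()

    HTTPServer(("", 8000), GoneHandler).serve_forever()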

This is terrible for user experience, but if you can’t control the links into your site and can’t get the webmaster to respond to you to remove them, it’s really the only option to get back in Google’s good graces. They say you can disavow bad links, but they also say you have to make a concerted effort to remove the links first – and they don’t consider a spreadsheet with hundreds of “no response” entries to qualify, in my experience.
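
For reference, the disavow file itself is just plain text: lines starting with “#” are comments, “domain:” entries disavow an entire domain, and bare URLs disavow individual pages. A minimal sketch that writes one (the entries are hypothetical):

    # A minimal sketch of Google's disavow-file format: one entry per
    # line, "#" for comments, "domain:" for whole domains, bare URLs
    # for individual pages. The entries below are hypothetical.
    bad_domains = ["spammy-directory.example", "link-farm.example"]
    bad_urls = ["http://blog.example/comment-spam-page.html"]

    with open("disavow.txt", "w") as f:
        f.write("# Removal requests sent; no response received.\n")
        for domain in bad_domains:
            f.write(f"domain:{domain}\n")
        for url in bad_urls:
            f.write(f"{url}\n")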

You can see just how big this problem has become if you start looking at search results at a site: level – especially for news sites that posted a lot of press releases and later removed them due to pressure from the webmasters who syndicated them in the first place.

Linkrot is a real problem on the web, and now Google’s contributing to it mightily. I’ll be interested to see how the numbers change over the next few months, but this study (released just this month) shows a pretty clear trend:


Linkrot Study by the Chesapeake Digital Preservation Group – 2013

5. Links To You Can’t Hurt You

This is possibly the biggest SEO understanding that was torched in 2012 and 2013. Back in October of 2007, Wysz (a well-known Googler at the time) said this: “I wouldn’t really worry about spam sites hurting your ranking by linking to you, as we understand that you can’t (for the most part) control who links to your sites.”

This week, Matt Cutts said this: “But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits.” I think commenter “hGn” puts it best: “The collateral victims of the [Google] experiments are much more than the spammers that these algorithms are really stopping or frustrating.”

What I see from my seat as an SEO consultant is that the people with broken spirits are the companies that hired the spammers, not knowing what they were going to do. The spammers have already moved on to their next victim.

So there you have it. A sad and trite listing of some of the more disturbing ways SEO has changed this year. While building quality content that visitors want to use and share is a beautiful, altruistic and idealistic goal, I somehow doubt that the search engines (Google especially) will abandon their vendetta against spammers long enough to help the new SEO best practices actually work the way they are intended to. Here’s hoping for better news in 2014.

Image used under license from Archology Inc.


