
Google’s Need For Freshness Sours Search Results

By Simone Design Blog @HomeSpire

Every so often SEO professionals produce a list of what they believe to be the top factors influencing search engine rankings. The latest update to this list of proposed factors looks much like past lists, focusing on traditional factors like links, content, HTML tags, and domain registration age, and some new ones like geographical factors and personalized search history. But one term might be new to many people: link velocity.

Link velocity refers to the speed at which new links to a webpage are formed, and from this term we may gain some new and vital insight. Historically, great bursts of new links to a specific page have been considered a red flag, the quickest way to identify a spammer trying to manipulate the results by creating the appearance of user trust. This led to Google's famous assaults on link farms and paid link directories.
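To make the idea concrete, link velocity can be thought of as links gained per unit of time over a recent window. The sketch below is a toy illustration of that definition only; the function name, window size, and data are invented here, and this is not how any search engine actually computes the metric.

```python
from datetime import datetime, timedelta

def link_velocity(link_timestamps, window_days=7):
    """Hypothetical illustration: new inbound links per day,
    counted over the most recent window."""
    if not link_timestamps:
        return 0.0
    cutoff = max(link_timestamps) - timedelta(days=window_days)
    recent = [t for t in link_timestamps if t >= cutoff]
    return len(recent) / window_days

base = datetime(2009, 6, 1)
# A steady page gains a link every few days...
steady = [base + timedelta(days=i * 3) for i in range(10)]
# ...while a burst gains hundreds in a matter of hours.
burst = [base + timedelta(hours=i) for i in range(200)]

print(link_velocity(steady))  # low velocity
print(link_velocity(burst))   # high velocity
```

The same page can move between these regimes: a burst followed by silence is exactly the "slowing velocity" pattern the article describes as a staleness signal.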

But the Web has changed, becoming more of a live Web than a static document Web. We now have social bookmarking; embedded videos, links, buttons, and badges; social networks; and real-time networks like Twitter and FriendFeed. Certainly the age of a website is still an indication of success and trustworthiness, but in an environment of live, real-time updating, the age of a link, as well as the slowing velocity of incoming links, may be an indicator of stale content in a world that values freshness.


This puts Google and other search engines in a tight spot for determining the relevance of any given destination. With information suddenly so viral and speedy, bursts of links to content are key indicators of freshness and what is at the top of mind for searchers. That means the balance has changed, and we may have caught Google in an awkward stage of transition.

Obviously Google can't go full-on real time like Twitter; search results would become a spammer's ball. At the same time, Google must reevaluate what is considered relevant. The age of content and websites is then devalued in favor of freshness, and link bursts regain value so long as they appear naturally viral. But that's a real trick, isn't it? How does a search engine differentiate between spam bursts and natural viral bursts?
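One naive way to frame that differentiation, purely as an illustration and not as Google's actual method, is source diversity: a natural viral burst tends to arrive from many distinct domains with varied anchor text, while a spam burst repeats a few sources and phrases. Every field, threshold, and name below is invented for this sketch.

```python
def looks_viral(links):
    """Toy heuristic (not a real ranking signal): treat a burst as
    natural if it comes from many distinct domains and uses varied
    anchor text, rather than a few sources repeating one phrase."""
    domains = {link["domain"] for link in links}
    anchors = {link["anchor"] for link in links}
    return (len(domains) / len(links) > 0.5 and
            len(anchors) / len(links) > 0.3)

# A spam burst: hundreds of links from three domains, identical anchor text.
spam = [{"domain": f"farm{i % 3}.example", "anchor": "ford parts"}
        for i in range(300)]
# A viral burst: links from many different sites with varied anchors.
viral = [{"domain": f"blog{i}.example", "anchor": f"story {i}"}
         for i in range(300)]

print(looks_viral(spam))   # False
print(looks_viral(viral))  # True
```

The weakness the article goes on to describe is visible even in this toy: drop spam links into comment sections across many trusted domains and the "diversity" test is fooled.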

Part of that equation involves evaluating the source of the link, which means link farms and paid directories are out, and trusted sites, including and especially trusted sites with social features, are in. The other part appears to be where Google is currently failing. By devaluing apparently stale content and slowed link velocity, Google artificially elevates the value of fresh content and the link bursts pointing to it, but it isn't so good at determining which fresh, apparently viral content is legitimate and which is full-on malicious spam.

Encyclopedia Britannica's criticism of Google for giving so much weight and relevance to Wikipedia was dismissed as the sour grapes of an old-world information source failing to keep up with the times. But I think there is a major clue: Wikipedia's content is continually updated by its editors, giving it the appearance of constant freshness. Combined with continual linking, that gave it live Web relevance. As Twitter, YouTube, Facebook, et alia, skyrocketed in popularity, gaining links and continuous user-generated freshness, they also enjoyed sudden boosts in search engine ranking. Just as everybody was talking about and linking to Twitter, Twitter started dominating search results.

But this emphasis on freshness and link velocity has an inherent flaw, one that blackhat SEOers have been disturbingly effective at exploiting for malicious ends. Yesterday, we reported how PandaLabs had discovered a million links targeting Ford-related search queries in order to dupe searchers landing at targeted destination pages into paying to download phony security software.

These blackhatters were astonishingly successful at getting at least ten (I stopped looking after ten) full pages of search results to point to malicious web pages, despite the pages' obvious spamminess and relative youth, despite very suspicious machine-generated URLs resolving to Polish domains, and despite the fact that all but the first result were largely irrelevant to the query (a search for Nissan motor parts brought back Nissan door part results, etc.).

They were able to do so, in part, by dropping multitudes of links into comments sections and discussion areas of current and trusted websites and forums. I think the blackhatters were so successful precisely because of Google's current need for freshness in the age of the real time live Web.

They didn't create link bursts from telltale link farms and paid directories; they created them, complete with buzzy anchor text, by piggybacking on Google's inherent trust of social media. As a result, the targeted webpages take over the search results, and the average user, trusting Google more than their own analysis of what should be obvious spam URLs, gets directed to harmful sites.
