Tuesday, 13 December 2011

Freshness Factor: 10 Illustrations on How Fresh Content Can Influence Rankings
Posted: 12 Dec 2011 01:06 PM PST

Posted by Cyrus Shepard

In 2003, engineers at Google filed a patent that would rock the SEO world. Titled “Document Scoring Based on Document Content Update,” the patent not only offered insight into the mind of the world’s largest search engine, but also provided an accurate roadmap of the path Google would take for years to come.

In his series on the 10 most important search patents of all time, Bill Slawski shows how this patent spawned many child patents. These are often near-duplicate patents with slightly modified passages – the latest discovered as recently as October 2011. Many of the algorithmic changes we see today are simply improvements of these original ideas conceived years ago by Google engineers.

One of these recent updates was Google’s Freshness Update, which places greater emphasis on returning fresher web content for certain queries. Exactly how Google determines freshness was brilliantly explored by Justin Briggs in his analysis of original Google patents. Justin deserves a lot of credit for bringing this analysis to light and helping to inspire this post.

Although the recent Freshness Update received a lot of attention, in truth Google has scored content based on freshness for years.

How Google Scores Fresh Content

Google Fellow Amit Singhal explains that “Different searches have different freshness needs.”

The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query. While some queries need fresh content, Google still uses older content for other queries (more on this later).

Singhal describes the types of keyword searches most likely to require fresh content:

  • Recent events or hot topics: “occupy oakland protest” “nba lockout”
  • Regularly recurring events: “NFL scores” “dancing with the stars” “exxon earnings”
  • Frequent updates: “best slr cameras” “subaru impreza reviews”

Google’s patents offer incredible insight as to how web content can be evaluated using freshness signals, and rankings of that content adjusted accordingly.

Understand that these are not hard and fast rules, but rather theories consistent with patent filings, the experiences of other SEOs, and experiments performed over the years. Nothing substitutes for direct experience, so use your best judgment and feel free to perform your own experiments based on the information below.

Images courtesy of my favorite graphic designer, Dawn Shepard.

1. Freshness by Inception Date

A webpage is given a “freshness” score based on its inception date, which decays over time. This freshness score can boost a piece of content for certain search queries, but degrades as the content becomes older.

The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.

Inception Date for Freshness

"For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set."
    
  - All quotes from US Patent Application Document Scoring Based on Document Content Update
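To make the decay idea concrete, here is a minimal sketch of how an inception-date freshness score might be computed. It is purely illustrative – the patent publishes no formula, and the 90-day half-life is an assumption.

    import math
    from datetime import date

    def freshness_score(inception: date, today: date, half_life_days: float = 90.0) -> float:
        """Illustrative only: freshness starts at 1.0 on the inception date
        and decays exponentially as the document ages."""
        age_days = (today - inception).days
        return math.exp(-math.log(2) * age_days / half_life_days)

    # A page first crawled three months ago carries roughly half the freshness
    # boost of a brand-new page (under the assumed 90-day half-life).
    print(round(freshness_score(date(2011, 9, 12), date(2011, 12, 12)), 2))  # ~0.5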

2. Document Changes (How Much) Influences Freshness

The age of a webpage or domain isn’t the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn’t change. In this case, the amount of change on your webpage plays a role.

For example, changing a single sentence won’t have as big a freshness impact as a large change to the main body text.

Content Changes for Freshness

"Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."

3. The Rate of Document Change (How Often) Impacts Freshness

Content that changes more often is scored differently than content that only changes every few years. In this case, consider the homepage of the New York Times, which updates every day and has a high degree of change.

How Often Content Changes for Freshness

"For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."

4. Freshness Influenced by New Page Creation

Instead of revising individual pages, websites add completely new pages over time. This is the case with most blogs. Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.

Some SEOs insist you should add 20-30% new pages to your site every year. This provides the opportunity to create fresh, relevant content, although you shouldn’t neglect your old content if it needs attention.

New Pages Influence Freshness

"UA may also be determined as a function of one or more factors, such as the number of “new” or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document."

5. Changes to Important Content Matter More

Changes made in “important” areas of a document will signal freshness differently than changes made in less important content. Less important content includes navigation, advertisements, and content well below the fold. Important content is generally in the main body text above the fold.

Boilerplate Changes Count Less for Freshness

"…content deemed to be unimportant if updated/changed, such as Javascript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, may be given relatively little weight or even ignored altogether when determining UA."

6. Rate of New Link Growth Signals Freshness

If a webpage sees an increase in its link growth rate, this can signal relevance to search engines. For example, if folks start linking to your personal website because you are about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).

That said, an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Be careful, as engines are likely to devalue such behavior.

Link Growth Rate for Freshness

"…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document’s score."

7. Links from Fresh Sites Pass Fresh Value

Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.

For example, a link from an old, static site that hasn’t been updated in years doesn't pass the same level of freshness value as a link from a fresh page – for example, the homepage of Wired.com. Justin Briggs coined the term “FreshRank” for this.

Freshrank Illustration

"Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh."

8. Changes in Anchor Text Signals May Devalue Links

If a website changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.

For example, if you buy a domain about automobiles and then change the format to content about baking, over time your new incoming anchor text will shift from cars to cookies.

In this instance, Google might determine that your site has changed so much that the old anchor text is no longer relevant, and devalue those older links entirely.

Anchor Text Freshness Signals

"The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good."

9. User Behavior Indicates Freshness

What happens when your once wonderful content becomes old and outdated? For example, your website hosts a local bus schedule... for 2009. As content becomes outdated, folks spend less time on your site. They hit the back button to Google's results and choose another URL.

Google picks up on these user behavior metrics and scores your content accordingly.

User Behavior for Freshness

"If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively."

10. Older Documents Still Win Certain Queries

Google understands the newest result isn’t always the best. Consider a search query for “Magna Carta”. An older, authoritative result is probably best here. In this case, having a well-aged document may actually help you.

Google’s patent suggests they determine the freshness requirement for a query based on the average age of documents returned for the query.

Older Content Wins Query

"For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set."

Conclusion

The goal of a search engine is to return the most relevant results to users. For your part, this requires an honest assessment of your own content. What part of your site would benefit most from freshness?

Old content that exists simply to generate pageviews, but accomplishes little else, does more harm than good for the web. On the other hand, great content that continually answers a user's query may remain fresh forever.

Be fresh. Be relevant. Most important, be useful.



Wake Up SEOs, the New Google is Here

Posted: 12 Dec 2011 01:51 AM PST

Posted by gfiorelli1

Puppy reaction while watching the Google updates list of 2011

I must admit that lately Google is the cause of my headaches.

No, not just because it decided that I was going to be “(not provided)” with useful information about my sites. Nor because it is changing practically every tool I have used since my first days as an SEO (Google Analytics, Webmaster Tools, Gmail…). And, honestly, not only because it released a ravenous Panda.

No, the real question that is causing my headaches is: where the hell does Google want to go with all these changes?

Let me start by quoting the definition of SEO that Google gives in its guidelines:

Search engine optimization is about putting your site's best foot forward when it comes to visibility in search engines, but your ultimate consumers are your users, not search engines.

Technical SEO still matters, a lot!

If you want to put your site’s best foot forward and make it as visible as possible in search engines, then you have to be a master of technical SEO.

We all know that if we do not pay attention to the navigation architecture of our site, if we don't care about on-page optimization, if we mess up the rel=”canonical” tag, the pagination and the faceted navigation of our site, and if we don’t pay attention to internal content duplication, then we are not going to get very far in search.

Is all this obvious? Yes, it is. But people in our circle tend to pay attention only to the latest bright shiny object and forget one of the basic pillars of our discipline: making a site optimized to be visible in the search engines.

The next time you hear someone saying “Content is King” or “Social is the new link building”, snap her face and ask her when she last logged in to Google Webmaster Tools.

Go fix your site, make it indexable and solve all the technical problems it may have. Only after you have done that can you start doing all the rest.

User is king

Technical SEO still matters, but that does not mean it is a synonym for SEO. So, if you hear someone claiming that it is, please snap her face too.

No... content is not the only King. User is the King! Image by Jeff Gregory

“User” and “useful” have the same root: use. A user finds a website useful when it offers an answer to her needs and when it is easy and fast to use.

From Google's point of view of the user, this means that, in order to rank, a site:

  1. must be fast;
  2. must have useful content that is related to what it claims to be about;
  3. must be presented to Google in a way that lets it understand, as well as possible, what the site is about.

The first point explains the emphasis Google places on site speed, which is strongly correlated with a better user experience.

The second is related to the quality of a site's content, and it is essentially what Panda is all about. Panda, reduced to its simplest terms, is Google's attempt to clean its SERPs of any content it does not consider useful to end users.

The third explains the adoption of Schema.org and why Google (and the other search engines) are moving decisively toward the Semantic Web: it helps search engines organize the bazillion pieces of content they index every second. The better they understand what your content is really about, the better they can deliver it in the SERPs.
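As a small illustration of the Schema.org point, here is one way to build structured data for an article using Schema.org's Article vocabulary and print it as JSON-LD. The values are made up, and in 2011 the usual way to embed Schema.org data was microdata in the HTML rather than JSON-LD; the snippet just shows the kind of machine-readable description search engines are asking for.

    import json

    # Hypothetical article; only the Schema.org vocabulary itself is real.
    article_markup = {
        "@context": "http://schema.org",
        "@type": "Article",
        "headline": "Wake Up SEOs, the New Google is Here",
        "author": {"@type": "Person", "name": "Example Author"},
        "datePublished": "2011-12-12",
    }

    # A machine-readable description like this tells the engine what the page
    # is about, instead of leaving it to infer the topic from raw text.
    print(json.dumps(article_markup, indent=2))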

The link graph mapped

The decline of the link graph

We all know that just with on-site optimization we cannot win the SERPs war, and that we need links to our site to make it authoritative. But we all know how much the link graph can be gamed.

Even though we still have tons of reasons to complain to Google about the quality of its SERPs, especially because of sites that rank thanks to manipulative link building tactics, it is hard for me to believe that Google is doing nothing to counteract this situation. What I believe is that Google has decided to solve the problem not with patches but with a totally new kind of graph.

That does not mean that links are no longer needed – not at all, as link-related factors still represent (and will continue to represent) a large portion of all ranking factors – but other factors are now being cooked into the ranking pot.

Be Social and become a trusted seed

In a social, Caffeinated era, the fastest way to understand whether a piece of content is popular is to check its “relative” popularity in the social media environment. I say “relative” because not all content is the same: a meme needs many tweets, +1s and likes/shares to be considered more popular than its peers, but that is not the case for more niche kinds of content. By combining social signals with the traditional link graph, Google can understand the real popularity of a page.

The problem, as many have been saying for almost a year, is that it is quite easy to spam in social media.

The Facebook Social Graph from Silicon Angle

For this reason Google introduced the concepts of Author and Publisher and, even more importantly, linked them to Google Profiles and is pushing Google+, which is not just another social network but what Google aims to be in the future: a social search engine.

Rel=”author” and rel=”publisher” are the solution Google is adopting in order to better control, among other things, the spam pollution of the SERPs.

If you are a blogger, you are incentivized to mark up your content with rel=”author” and link it to your Google+ profile; as a site, you are incentivized to create your Google+ business page and to promote it with a badge on your site that includes rel=”publisher” in its code.

Trusted seeds are no longer only sites; they can also be people (e.g., Rand or Danny Sullivan) or social facets of an entity… so the closer I am in the social graph to those people and entities, the more trusted I am in Google's eyes.

The new Google graph

As we can see, Google is not trying to rely only on the link graph, which is quite easy to game, but neither is it simply adding social signals to the link graph, because they too can be gamed. What Google is doing is creating and refining a new graph in which the link graph, the social graph and the trust graph cooperate, and which is much harder to game. It can still be gamed, but – hopefully – only with so much effort that gaming it becomes non-viable as a practice.
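Nobody outside Google knows how, or even whether, these graphs are weighted against each other, but conceptually the combination described above could be as simple as the sketch below. Every weight is an invention for illustration; the only point is that a weakness in one graph can no longer be fully compensated by gaming another.

    def combined_rank_score(link_score: float, social_score: float, trust_score: float,
                            weights=(0.6, 0.2, 0.2)) -> float:
        """Blend link graph, social graph and trust graph signals, each assumed
        to be pre-normalized to the 0..1 range. Weights are invented."""
        w_link, w_social, w_trust = weights
        return w_link * link_score + w_social * social_score + w_trust * trust_score

    # A page with strong links but no social traction and an untrusted author
    # no longer outranks a moderately linked page backed by trusted people.
    print(round(combined_rank_score(0.9, 0.0, 0.1), 2))  # 0.56
    print(round(combined_rank_score(0.6, 0.7, 0.8), 2))  # 0.66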

Wake up SEOs, the new Google is here

As a conclusion, let me borrow what Larry Page wrote on Google+ (bold is mine):

Our ultimate ambition is to transform the overall Google experience […] because we understand what you want and can deliver it instantly.

This means baking identity and sharing into all of our products so that we build a real relationship with our users. Sharing on the web will be like sharing in real life across all your stuff. You’ll have better, more relevant search results and ads.

Think about it this way … last quarter, we’ve shipped the +, and now we’re going to ship the Google part.

I think that says it all: what we have been living through for a year now is explained clearly by Larry Page's words.

What can we do as SEOs? Evolve, because SEO is not dying, but SEOs can die if they don't accept that winter – oops – the new Google is coming.

The New SEO graph

 



Interactive Timeline: Ending the War in Iraq

The White House – Your Daily Snapshot for Tuesday, Dec. 13, 2011
 

Interactive Timeline: Ending the War in Iraq

Before the year ends, the last of our troops will cross the border and return home. After nine years, the war is over.

Check out the timeline of the Iraq war:

In Case You Missed It

Here are some of the top stories from the White House blog:

President Obama Welcomes Iraqi Prime Minister Nouri al-Maliki
President Obama met with Iraqi Prime Minister Nouri al-Maliki on Monday to discuss the end of the Iraq war and the steps necessary to realize a new phase in the relationship between the two countries.

By the Numbers: $1 Trillion
The war in Iraq cost $1 trillion, money that can now be invested in the American people as the war draws to a close.

From the Archives: President Obama Meets with Prime Minister Maliki in 2009
A look back at Prime Minister Nouri al-Maliki's visit to Washington in July of 2009.

Today's Schedule

All times are Eastern Standard Time (EST).

10:30 AM: The President is interviewed by regional television outlets

11:25 AM: The President delivers remarks at a campaign event

11:30 AM: The Vice President hosts a Cabinet meeting as part of the Administration’s Campaign to Cut Waste WhiteHouse.gov/live

12:45 PM: Press Briefing by Press Secretary Jay Carney WhiteHouse.gov/live

WhiteHouse.gov/live indicates that the event will be live-streamed on WhiteHouse.gov/live.


SEOptimise


Can Meeting Your Followers Face-to-Face Increase Loyalty?

Posted: 13 Dec 2011 04:35 AM PST

A few days ago I was looking through my mentions on Twitter and I noticed a rough correlation between the strength of my relationship with a person and the frequency/consistency of their interaction with me. I also noticed that the strength of the relationship roughly correlated with the medium(s) I'd communicated with them on. In other words, the followers I'd only spoken to over e-mail or Twitter weren't interacting with me as often as those I'd met in real life.

It got me thinking: if the depth of a relationship impacts the frequency of interaction online, and the medium I communicate with people on impacts the depth of the relationship, is it possible to increase the loyalty of your followers by meeting them face-to-face?

How real are you to your Twitter followers? Image Credit: Aristocrats-hat

Meeting Face-to-Face Increases the Likelihood of Social Interaction, IF You Have Shared Interests.

When you meet someone face-to-face, you become more likely to then interact with them online, provided that you have shared interests and consider them to add value to your newsfeed. Having met someone in real life is a 'filter' that some people (subconsciously) use to prioritise who they interact with socially.

I decided to ask my followers whether they thought meeting someone in real life improves the likelihood of engaging with them on Twitter afterwards. The general consensus was that meeting someone face-to-face does have a positive impact on subsequent interaction on Twitter.

Can Skype Increase the Loyalty of Your Audience?

Nowadays, the process of meeting someone has become more of a spectrum than a binary choice, as you can virtually meet people through video calls, instant messaging and audio technology.

Research has proven that the use of voice and video communications helps when building relationships. What this means is that hypothetically if you were to Skype someone, they'd be more likely to interact with you in the future than if you were to have solely had correspondence over e-mail or Twitter.

I personally love Skype and use it regularly as a way of getting to know new people whose blogs I find interesting or tweets I enjoy. I find that it adds so much more context to the relationship I end up building with that person, as quite often I will end up finding something I have in common with that person or learning about the interesting things that they're up to.

Skyping with one person a week will increase your followers’ loyalty
I strongly recommend Skype as a means of getting to know your followers. Deepening your relationships with your followers will not only allow you to get to know your audience, but will also increase the interaction between you and them. I recommend putting aside a little time each week to have a Skype call with your followers, even if it's just one call a week.

Final Thoughts – How Many of Your Most Loyal Followers Have You Met?

I ran some data on my Twitter followers and found out that I've met 80% of my ten most loyal Twitter followers (those who retweet me the most) in real life. I asked around the office and on Twitter and the general consensus was that most people had met between 60 – 90% of their most loyal Twitter followers (the average being 76%). Although based on a small sample size, I think it's fair to assume that for most of us, the people who interact with us the most are those who we meet in person. Which, if you turn it around, makes a lot of sense – those who we share the most interests with (and thus interact with online) are the people we're most likely to meet and develop relationships with in person.

A big thanks to Stuart Duff, Richard Fergie, James Carson, Eloi Casali, Gavin Llewelyn, Bas Van Den Beld, Ruben Martinez, and the SEOptimise team for inputting into this post!

© SEOptimise

Related posts:

  1. How to clean up your act and your timeline on Twitter
  2. Everything I Know About Effective Blogger Outreach
  3. 30 Ways to Use Social Media for Business People

Agent Rank: Google’s Internal Klout Score

Posted: 09 Dec 2011 04:18 AM PST

Agent 007

Recently I've written about Klout score optimisation. Since then, I and others who have outed themselves as actively using Klout have been attacked by self-proclaimed SEO stars and other people who seemingly “hate Klout”. Can you hate a metric? Obviously people get very emotional when it comes to Klout.

Klout measures the social media influence of people. While it fails at determining your real life influence, it's quite accurate for measuring how active and influential you are on social media, including Facebook, LinkedIn, Twitter and Google+.

That's why some people hate Klout: they are only influential within a small, closed group, and they have never shared enough with the general public on social media to get appreciation from the masses.

What did I say when people ridiculed me for using Klout to determine people's influence? I said that I am quite sure Google internally has a similar system for finding out who exerts influence on the social web and who does not. It wasn't a very daring prediction; it was just an extrapolation based on the steps Google has taken in the past. Google has already been focusing on authorship, real names and the social graph for a while.

Now Bill Slawski has written an article on the reputation systems Google uses, might use or will use in the future. Three are mentioned in the post. The most interesting one is Agent Rank. Not only does the name sound familiar and somewhat self-explanatory, but it's also a patent Google has filed. It most probably is, or will be, used for Google +1 votes.

The Agent Rank patent does not describe in detail how such an Agent Rank might rank people, but the other papers mentioned in that post suggest a few ways to determine trust on a collaborative social site. At Wikipedia, for example:

"Users gain reputation when they make edits that are preserved by subsequent authors, and lose reputation when their work is partially or wholly undone."

What does this mean? Opportunism and mainstream opinions pay off.
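The mechanism in the quote can be pictured with a tiny simulation: an author's reputation grows with edits that survive later revisions and shrinks with edits that get reverted. Everything below, including the +1/-1 scoring, is a made-up illustration rather than the actual Wikipedia or Google model.

    from collections import defaultdict

    def update_reputation(edit_history: list) -> dict:
        """Each edit records its author and whether later revisions kept it.
        Preserved edits add reputation, reverted edits subtract it (toy model)."""
        reputation = defaultdict(float)
        for edit in edit_history:
            reputation[edit["author"]] += 1.0 if edit["preserved"] else -1.0
        return dict(reputation)

    history = [
        {"author": "alice", "preserved": True},
        {"author": "alice", "preserved": True},
        {"author": "bob", "preserved": False},   # reverted by later authors
    ]
    print(update_reputation(history))  # {'alice': 2.0, 'bob': -1.0}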

This is similar to early Web 2.0 sites such as Digg or Hacker News, where a few dominant users gained reputation by submitting content from sites everybody likes and agrees on. Back then it was TechCrunch, for example. I have seen this phenomenon on my own first-generation social sites.

The more "one size fits all" and "smallest common denominator" a page was, the more likely it succeed.

With it the submitter succeeded as well. Thus people were always in a race to submit TechCrunch articles. The ones who submitted the most TechCrunch articles were the most reputable.

I've seen a similar phenomenon on the newer social sites that you couldn't game as easily: people submitting every story by the

  • NYT
  • Mashable
  • Search Engine Land
  • SEOmoz

or whatever the main authority in your field is. Most automated accounts do it. They gain authority by simply feeding RSS feeds of popular sites to Twitter, even though nobody clicks the links.

I can tell because I see the stats from my blogs that get retweeted by these bots, the bit.ly stats for the URLs and the reputation metrics of these bot accounts. Many of them were able to game Topsy's algorithm. Topsy is not as precise as Klout: it just has three kinds of user – not influential, influential and highly influential.

Google will measure trust with more complexity than Klout

I can only guess, but elements from all the measurement systems above will be among the likely factors. We're still in a very early phase of this, but it's already clear that Google no longer wants to trust websites, but authors instead.

The insistence on real names and the many incentives to verify your identity on Google services all point in the same direction: Google will focus more on people than on websites in the future. Thus an author publishing on a completely new website will be able to push it quickly to the top – or at least the articles they transfer their reputation to.

Also, the more reputable documents and pieces of content you support, the more reputable you become. As with Klout, it will most likely be sheer activity that makes you more trustworthy. Google's Agent Rank won't be able to compute your reputation from just one or two articles and a few votes.

The more you participate and the more content you create, the more of an authority you will become. In the long run, Google will have to focus more and more on real authority, not just the sheer number of votes. In the beginning, though, the search giant has no choice: it has to reward sheer activity, as it doesn't yet have enough users, votes and other social signals.

You may have noticed that I use terms such as

  • reputation
  • trust
  • authority
  • influence

quite interchangeably in this article. As of yet, there is no clear standard on the web for measuring the worth or value of one's contributions. We already know that no model is perfect yet, but the importance of measuring people rather than websites is growing.

You don't have to be a prophet to extrapolate the most likely ranking signals for people that Google will have to use. Consider these (a rough sketch of how they might combine follows the list):

Activity – as noted above, you can't measure something where there is nothing – without activity there is not enough to measure. On the other hand there will be some limits to that. We know that having a million followers on Twitter does not necessarily mean that you are more important. Also, sending 50 automated messages a day may be too much.

Altruism – nobody, either on social media or in real life, likes constant self-promotion. Many marketers still get that wrong. You get what you give. Science has proven that only through altruism can a whole species survive. Algorithms can't rely on egoists to offer the best advice, as there would be no popularity at all: everybody would just promote their own works. So an algorithm has to count the other people who share content and have no direct connection to its creator.

Authority – one of the reasons Google succeeded in becoming the biggest search engine in the world is its reliance on authority. The more experts consider something to be a good resource, the better. It worked for a while with websites, as the original PageRank formula reflected the reputation model of the traditional scientific community: the more a document got cited, the better. We know that it's no longer enough; ever since Google became the leading search engine, PageRank has been quite easy to game with paid links. You can't bribe thousands of people as easily as you can pay for a few links, though.

Expertise – authority can't be measured without measuring expertise as well. You can get very popular despite being dead wrong. So an Agent Rank will have to measure whether a given author gets supported by thousands or even millions of people who have no clue or whether they get supported by a few experts who are really knowledgeable in a given area of expertise.

Impartiality – just consider a Google +1 user who constantly votes up Fox News. Can Google count on this person to be an expert on news? Well, most probably the algorithm will consider such a user just an expert on US conservative views. In contrast, consider a user who gives +1 to all kinds of resources, including CNN, BBC, Al Jazeera etc. Will this user be more of an impartial expert?

Popularity – you may be right, but as long as you don't tell the world, or convince no one beyond a bunch of bookworms who do nothing but deal with the issue all day, it won't be a sign of influence. You have to be able to appeal to the masses. Google already favours Wikipedia in its results not because it always has the best result, but because most people can understand it. Whether you search for SEO, film or God, Wikipedia will show up on top. You will surely agree that there are bigger authorities or better results for all three examples.

Quality – the aforementioned factor, mass appeal, can be gamed easily, though. Over the years we have seen content farms embrace the shallow-but-popular approach, until Google had to curb it. Quality will have to be measured as well. How can quality be determined? This is very difficult; I could write a huge post about it. Google's ongoing high-quality update, aka Panda, has been about exactly this in 2011. The quality of the texts that authors publish and vote for will itself have to be determined by a complex mix of signals.

Reputation – someone can have mass appeal, be considered an expert by other experts, even be considered an authority. The reputation of this person can still be a nightmare. Just think about people like Jason Calacanis, Derek Powazek or Steve Rubel who declared SEO dead or rubbish. They are not even famous – they are infamous. People know them because they shout louder than others. So their reputation is awful no matter how much they can game other simpler social media metrics.

Topicality – as you can see above, these “experts”, who indeed have enormous mass appeal, gained great success from their anti-SEO rants when measured by sheer reach and attention. Most of their other contributions haven't been about SEO at all. So an Agent Rank will have to measure whether you are an expert on SEO, gardening or homeopathy. For example, Klout assumes I'm an expert on homeopathy because I've been involved in many online arguments with people who have never tried it but attempt to convince me that it cannot work.

Trust – trust is not influence, and it is not reputation either. Trust is about telling the truth, being reliable and not tricking people in order to gain something. How on earth do you measure that? You can be trustworthy without being influential or without having a reputation. You don't need to be an expert or have mass appeal to be trustworthy either. It's a very important but easy-to-grasp concept. Nonetheless, you need it to survive, and Google will have to measure it as well. Can a person be trusted not to favour their own clients, colleagues or advertisers? Most people will have a bias, and the less bias, the better placed someone is to point to a good resource or author. So Google will have to measure the trust other people ascribe to you.

Velocity – news that spreads fast is in many cases more important than news that spreads slowly. Of course, this signal is not enough on its own in most cases. Is the royal wedding or Osama Bin Laden's death really the most important news? It depends on many other factors. Still, the speed with which articles by a particular author or social media user spread is one metric that has to be included among the above. Some ideas need a decade or a century to spread; they aren't less important, they just need more time. But in many cases there's a reason viral ideas spread like wildfire. Google will have to measure velocity, as it already does with breaking news.
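Pulling the signals above together, one could imagine an Agent Rank behaving like a weighted blend in which a very low score on a heavily weighted dimension, say trust, drags the whole thing down. The sketch below uses a weighted geometric mean purely to illustrate that property; none of the weights, signal names as used here, or numbers come from Google.

    import math

    # Hypothetical weights for the signals discussed above.
    SIGNAL_WEIGHTS = {
        "activity": 1.0, "altruism": 1.0, "authority": 2.0, "expertise": 2.0,
        "impartiality": 1.0, "popularity": 1.0, "quality": 2.0,
        "reputation": 2.0, "topicality": 1.0, "trust": 3.0, "velocity": 0.5,
    }

    def agent_rank(signals: dict) -> float:
        """Weighted geometric mean of 0..1 signal scores: scoring near zero on
        any heavily weighted signal collapses the overall score (toy model)."""
        total_weight = sum(SIGNAL_WEIGHTS.values())
        log_sum = sum(weight * math.log(max(signals.get(name, 0.0), 1e-9))
                      for name, weight in SIGNAL_WEIGHTS.items())
        return math.exp(log_sum / total_weight)

    loud_but_untrusted = {name: 0.9 for name in SIGNAL_WEIGHTS}
    loud_but_untrusted["trust"] = 0.01
    quietly_reliable = {name: 0.6 for name in SIGNAL_WEIGHTS}

    print(round(agent_rank(loud_but_untrusted), 2))  # 0.4  (trust collapse hurts)
    print(round(agent_rank(quietly_reliable), 2))    # 0.6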

 

It's a huge task to measure these abstract concepts, but at the end of the day they determine how important a person, a source or a document is.

Some old school SEOs who are envious of the social media influence of more active users frantically try to outpace the competition by making their employees vote them up on social sites or by bragging that they work for big brands and only accept the highest quality.

Telling people is not enough these days; you have to show or rather offer this quality while sharing your know-how free of charge, otherwise others will do it. Most people will look at the measurable social proof and not the clandestine contracts you have with a large corporation. Google will likewise care more for what other people say about you than what you say yourself or make your employees tell the world.

Already there are tendencies such as selling employee attention to the highest bidder, as Walmart does with its more than one million underpaid workers. Google will have to learn to determine quickly whether there are suspicious voting patterns within a particular group of people.

Still I'm quite optimistic, overall; authors will be judged by what they give to the world, not what they sell to a chosen few. That's a great way to find out what's important. I believe that 99% of the people know better than just the top 1%.

Let’s just hope that Google doesn’t mistake mob mentality for democracy.

© SEOptimise

Related posts:

  1. Klout Score Optimisation or Influencer SEO
  2. Linking Out Instead of Link Building to Rank in Google
  3. Where is Google Going with Google+ Pages?