Tuesday, 13 December 2011

Damn Cool Pics



How Safe is Pepper Spray? [infographic]

Posted: 13 Dec 2011 12:25 PM PST



Pepper spray has become the crowd-control method of choice for law enforcement around the world – even when the crowd is not out of control. We've recently seen it used against non-violent protesters on the University of California, Davis campus, and against non-violent Occupy Wall Street protesters. Just how dangerous is pepper spray? It can kill you.


Source: onlinecriminaljusticedegree


Taekwondo Finger Guy in Korea

Posted: 12 Dec 2011 08:49 PM PST



There is a new karate kid in town that I wouldn't want to tango with. His name is Taekwondo Finger Guy. He shows off his skills by breaking wafers with one finger, which is not an easy thing to do. If you ever see this finger guy in the streets, run. Nothing good can come of fighting a guy like that.


Lady Gaga Before She Was Famous

Posted: 12 Dec 2011 08:45 PM PST

These are some photos of Lady Gaga, shot before she became famous, when she was still known as Stefani Germanotta. I think she was very lovely even then. Don't you?



Tips for Single Ladies from 1938

Posted: 12 Dec 2011 08:31 PM PST

Apparently, the only keys to successful dating for ladies in the 1930s were: don't talk too much, wear a bra, and don't pass out in the middle of your date because you're drunk.


Freshness Factor: 10 Illustrations on How Fresh Content Can Influence Rankings


Posted: 12 Dec 2011 01:06 PM PST

Posted by Cyrus Shepard

In 2003, engineers at Google filed a patent that would rock the SEO world. Named Document Scoring Based on Document Content Update, the patent not only offered insight into the mind of the world’s largest search engine, but provided an accurate roadmap of the path Google would take for years to come.

In his series on the 10 most important search patents of all time, Bill Slawski shows how this patent spawned many child patents. These are often near-duplicate patents with slightly modified passages – the latest discovered as recently as October 2011. Many of the algorithmic changes we see today are simply improvements of these original ideas conceived years ago by Google engineers.

One of these recent updates was Google’s Freshness Update, which places greater emphasis on returning fresher web content for certain queries. Exactly how Google determines freshness was brilliantly explored by Justin Briggs in his analysis of original Google patents. Justin deserves a lot of credit for bringing this analysis to light and helping to inspire this post.

Although the recent Freshness Update received a lot of attention, in truth Google has scored content based on freshness for years.

How Google Scores Fresh Content

Google Fellow Amit Singhal explains that “Different searches have different freshness needs.”

The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query. While some queries need fresh content, Google still favors older content for other queries (more on this later).

Singhal describes the types of keyword searches most likely to require fresh content:

  • Recent events or hot topics: “occupy oakland protest” “nba lockout”
  • Regularly recurring events: “NFL scores” “dancing with the stars” “exxon earnings”
  • Frequent updates: “best slr cameras” “subaru impreza reviews”

Google’s patents offer incredible insight as to how web content can be evaluated using freshness signals, and rankings of that content adjusted accordingly.

Understand that these are not hard and fast rules, but rather theories consistent with patent filings, experiences of other SEOs, and experiments performed over the years. Nothing substitutes for direct experience, so use your best judgement and feel free to perform your own experiments based on the information below.

Images courtesy of my favorite graphic designer, Dawn Shepard.

1. Freshness by Inception Date

A webpage is given a “freshness” score based on its inception date, which decays over time. This freshness score can boost a piece of content for certain search queries, but degrades as the content becomes older.

The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.

Inception Date for Freshness

"For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set."
    
  - All quotes from US Patent Application Document Scoring Based on Document Content Update
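As an illustration only, the patent's decayed inception-date score could be sketched like this; the exponential form and the 90-day half-life are hypothetical choices of mine, not values from the patent:

```python
import math

def freshness_score(age_in_days, half_life_days=90):
    """Hypothetical freshness score that starts at 1.0 for a brand-new
    document and decays exponentially toward 0.0 as the document ages."""
    return math.exp(-math.log(2) * age_in_days / half_life_days)

# A new page scores 1.0; after one half-life the score has halved.
print(round(freshness_score(0), 2))    # 1.0
print(round(freshness_score(90), 2))   # 0.5
```

How quickly the boost fades would depend on the query: a decay tuned for breaking news would be far steeper than one for product reviews.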

2. Document Changes (How Much) Influences Freshness

The age of a webpage or domain isn’t the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn’t change. In this case, the amount of change on your webpage plays a role.

For example, the change of a single sentence won’t have as big of a freshness impact as a large change to the main body text.

Content Changes for Freshness

"Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."

3. The Rate of Document Change (How Often) Impacts Freshness

Content that changes more often is scored differently than content that only changes every few years. In this case, consider the homepage of the New York Times, which updates every day and has a high degree of change.

How Often Content Changes for Freshness

"For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time."
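A toy sketch of how the amount of change (point 2) and the rate of change (point 3) might combine into one update signal; the formula and the daily-edit cap are invented for illustration, as the patent does not specify one:

```python
def update_signal(chars_changed, total_chars, edits_per_month):
    """Toy signal combining how much of a document changed (fraction
    of its content) with how often it is edited (capped at daily)."""
    amount = chars_changed / total_chars        # 0.0 .. 1.0
    frequency = min(edits_per_month / 30, 1.0)  # 30+ edits/month ~ daily
    return amount * frequency

# A daily-updated page with half its content changing scores far higher
# than a page where one sentence changed once this month.
print(update_signal(5000, 10000, 30))  # 0.5
print(update_signal(80, 10000, 1))
```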

4. Freshness Influenced by New Page Creation

Instead of revising individual pages, websites add completely new pages over time. This is the case with most blogs. Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.

Some SEOs insist you should add 20-30% new pages to your site every year. This provides the opportunity to create fresh, relevant content, although you shouldn’t neglect your old content if it needs attention.

New Pages Influence Freshness

"UA may also be determined as a function of one or more factors, such as the number of “new” or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document."

5. Changes to Important Content Matter More

Changes made in “important” areas of a document will signal freshness differently than changes made in less important content. Less important content includes navigation, advertisements, and content well below the fold. Important content is generally in the main body text above the fold.

Boilerplate Changes Count Less for Freshness

"…content deemed to be unimportant if updated/changed, such as Javascript, comments, advertisements, navigational elements, boilerplate material, or date/time tags, may be given relatively little weight or even ignored altogether when determining UA."

6. Rate of New Link Growth Signals Freshness

If a webpage sees an increase in its link growth rate, this could signal relevance to search engines. For example, if folks start linking to your personal website because you are about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).

That said, an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Be careful, as engines are likely to devalue such behavior.

Link Growth Rate for Freshness

"…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document’s score."
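One way to sketch the quoted "downward trend" comparison; the equal-length windows and the interpretation of the ratio are assumptions for illustration, not patent specifics:

```python
def link_trend_signal(links_recent_period, links_older_period):
    """Toy staleness check: compare the number of new links gained in a
    recent window against an older window of equal length. A ratio
    below 1.0 suggests a downward trend (possible staleness)."""
    if links_older_period == 0:
        return None  # no baseline to compare against
    return links_recent_period / links_older_period

# 12 new links last quarter vs. 40 the quarter before: downward trend.
print(link_trend_signal(12, 40))  # 0.3
```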

7. Links from Fresh Sites Pass Fresh Value

Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.

For example, a link from an old, static site that hasn't been updated in years doesn't pass the same level of freshness value as a link from a fresh page – for example, the homepage of Wired.com. Justin Briggs coined the term FreshRank for this.

Freshrank Illustration

"Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh."
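The quoted n% rule is concrete enough to sketch directly; here the `n_percent` threshold and the boolean per-link freshness input are illustrative simplifications:

```python
def is_fresh_by_links(linking_pages_fresh, n_percent=50):
    """FreshRank-style test per the patent language: document S is
    considered fresh if at least n% of the pages linking to S are
    themselves fresh.

    linking_pages_fresh: list of booleans, one per inbound link."""
    if not linking_pages_fresh:
        return False
    share = 100 * sum(linking_pages_fresh) / len(linking_pages_fresh)
    return share >= n_percent

print(is_fresh_by_links([True, True, False], n_percent=50))   # True
print(is_fresh_by_links([True, False, False], n_percent=50))  # False
```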

8. Changes in Anchor Text Signals May Devalue Links

If a website changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.

For example, if you buy a domain about automobiles and then change the format to content about baking, over time your new incoming anchor text will shift from cars to cookies.

In this instance, Google might determine that your site has changed so much that the old anchor text is no longer relevant, and devalue those older links entirely.

Anchor Text Freshness Signals

"The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good."

9. User Behavior Indicates Freshness

What happens when your once-wonderful content becomes old and outdated? For example, say your website hosts a local bus schedule... for 2009. As content becomes outdated, folks spend less time on your site. They press the back button to Google's results and choose another URL.

Google picks up on these user behavior metrics and scores your content accordingly.

User Behavior for Freshness

"If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively."

10. Older Documents Still Win Certain Queries

Google understands the newest result isn't always the best. Consider a search query for "Magna Carta". An older, authoritative result is probably best here. In this case, having a well-aged document may actually help you.

Google’s patent suggests they determine the freshness requirement for a query based on the average age of documents returned for the query.

Older Content Wins Query

"For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set."
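A hypothetical sketch of adjusting a score by the difference from the result set's average age, as the quote describes; the linear factor and the 10% cap are invented for illustration:

```python
def adjust_for_query_age(doc_age_days, result_set_ages, base_score):
    """Toy adjustment: nudge a document's score according to how its
    age compares with the average age of the result set, so that for
    'old' queries a well-aged page gains a small boost."""
    avg_age = sum(result_set_ages) / len(result_set_ages)
    # Linear factor capped at +/-10% of the base score (arbitrary cap).
    factor = max(-0.1, min(0.1, (doc_age_days - avg_age) / avg_age * 0.1))
    return base_score * (1 + factor)

# Results for this query average ~1900 days old; a 2190-day-old page
# gets a small boost, a 1000-day-old page a small penalty.
ages = [1800, 1900, 2000]
print(adjust_for_query_age(2190, ages, base_score=1.0))
print(adjust_for_query_age(1000, ages, base_score=1.0))
```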

Conclusion

The goal of a search engine is to return the most relevant results to users. For your part, this requires an honest assessment of your own content. What part of your site would benefit most from freshness?

Old content that exists simply to generate pageviews, but accomplishes little else, does more harm than good for the web. On the other hand, great content that continually answers a user's query may remain fresh forever.

Be fresh. Be relevant. Most important, be useful.



Wake Up SEOs, the New Google is Here

Posted: 12 Dec 2011 01:51 AM PST

Posted by gfiorelli1

Puppy reaction while watching the Google updates list of 2011

I must admit that lately Google is the cause of my headaches.

No, not just because it decided that useful information about my sites would now be (not provided). And not just because it is changing practically every tool I've gotten used to since my first days as an SEO (Google Analytics, Webmaster Tools, Gmail…). And, honestly, not only because it released a ravenous Panda.

No, the real question that is causing my headaches is: where the hell does Google want to go with all these changes?

Let me start by quoting the definition of SEO Google gives in its guidelines:

Search engine optimization is about putting your site's best foot forward when it comes to visibility in search engines, but your ultimate consumers are your users, not search engines.

Technical SEO still matters, a lot!

If you want to put your site's best foot forward and make it as visible as possible in search engines, then you have to be a master of technical SEO.

We all know that if we do not pay attention to the navigation architecture of our site, if we don't care about on-page optimization, if we mess up the rel=”canonical” tag, the pagination, or the faceted navigation of our site, and if we don't pay attention to internal content duplication, etc., well, we are not going to get very far with search.

Is all this obvious? Yes, it is. But people in our circle tend to pay attention only to the latest bright shiny object and forget one of the basic pillars of our discipline: making a site optimized to be visible in the search engines.

The next time you hear someone saying “Content is King” or “Social is the new link building”, slap her face and ask her when she last logged in to Google Webmaster Tools.

Go fix your site, make it indexable, and solve all the technical problems it may have. Only once that's done can you start doing all the rest.

User is king

Technical SEO still matters, but that does not mean it is a synonym for SEO. So, if you hear someone claiming it is, please slap her face too.

No... content is not the only king. The user is the king! (Image by Jeff Gregory)

User and useful share the same root: use. And a user finds a website useful when it answers her needs and is easy and fast to use.

From Google's point of view of the user, that means that, to rank, a site:

  1. must be fast;
  2. must have useful content, related to what it claims to be about;
  3. must be presented to Google so that Google can understand as well as possible what it is about.

The first point explains the emphasis Google gives to site speed, because it is really highly correlated to a better user experience.

The second is related to the quality of a site's content, and it is substantially what Panda is all about. Panda, reduced to its simplest terms, is Google's attempt to clean its SERPs of any content it does not consider useful to end users.

The third explains the adoption of Schema.org and why Google (and the other search engines) are definitely moving to the Semantic Web: it helps search engines organize the bazillion pieces of content they index every second. And the better they understand what your content is really about, the better they can deliver it in the SERPs.

The link graph mapped

The Decline of the Link Graph

We all know that just with on-site optimization we cannot win the SERPs war, and that we need links to our site to make it authoritative. But we all know how much the link graph can be gamed.

Even though we still have tons of reasons to complain to Google about the quality of its SERPs, especially due to sites that rank thanks to manipulative link building tactics, it is hard for me to believe that Google is doing nothing to counteract this situation. What I believe is that Google has decided to solve the problem not with patches but with a totally new kind of graph.

That does not mean that links are not needed anymore, not at all, as links related factors still represent (and will represent) a great portion of all the ranking factors, but other factors are now cooked in the ranking pot.

Be Social and become a trusted seed

In a Social-Caffeinated era, the fastest way to understand whether a piece of content is popular is to check its "relative" popularity in the social media environment. I say “relative” because not all content is the same: while a meme needs many tweets, +1s, and likes/shares to be considered more popular than others, the same is not true for more niche kinds of content. Combining social signals with the traditional link graph, Google can understand the real popularity of a page.

The problem, as many have been saying for almost a year now, is that it is quite easy to spam in social media.

The Facebook Social Graph from Silicon Angle

For this reason Google introduced the concepts of Author and Publisher and, even more importantly, linked them to Google Profiles and is pushing Google+, which is not just another social network but what Google aims to be in the future: a social search engine.

Rel=”author” and rel=”publisher” are the solution Google is adopting in order to better control, among other things, the spam pollution of the SERPs.

If you are a blogger, you are incentivized to mark your content with rel=”author” and link it to your Google+ profile; as a site, you are incentivized to create your Google+ business page and to promote it with a badge on your site that carries rel=”publisher” in its code.

Trusted seeds are no longer only sites; they can also be people (e.g., Rand or Danny Sullivan) or social facets of an entity… so the closer I am in the social graph to those people/entities, the more trusted I am in Google's eyes.

The new Google graph

As we can see, Google is not trying to rely only on the link graph, as it is quite easy to game, but neither is it simply adding social signals to the link graph, because those can be gamed too. What Google is doing is creating and refining a new graph in which the link graph, the social graph, and the trust graph cooperate, and which is possibly harder to game. It can still be gamed, but – hopefully – only with so much effort that gaming it becomes non-viable as a practice.

Wake up SEOs, the new Google is here

As a conclusion, let me borrow what Larry Page wrote on Google+ (bold is mine):

Our ultimate ambition is to transform the overall Google experience […] because we understand what you want and can deliver it instantly.

This means baking identity and sharing into all of our products so that we build a real relationship with our users. Sharing on the web will be like sharing in real life across all your stuff. You’ll have better, more relevant search results and ads.

Think about it this way … last quarter, we’ve shipped the +, and now we’re going to ship the Google part.

I think that says it all; what we have lived through this past year is explained clearly by Larry Page's words.

What can we do as SEOs? Evolve, because SEO is not dying – but SEOs can die off if they don't accept that winter – oops – the new Google is coming.

The New SEO graph

 

