vineri, 25 martie 2011

Damn Cool Pics

Munchkin Cats

Posted: 24 Mar 2011 09:58 PM PDT

The Munchkin is a cat breed created by a naturally occurring genetic mutation that results in cats with abnormally short legs. However, the shortness of their legs does not seem to interfere with their running and leaping abilities.

The gene responsible has been compared to the one that gives Welsh Corgis, Basset Hounds and Dachshunds their short stature; however, Munchkins do not suffer from the many spinal problems that are typically associated with those canine breeds, as cats' spines are physically different from dogs'. The spine of a Munchkin cat is usually indistinguishable from that of other cats.

Creative Double Page Magazine Ads

Posted: 24 Mar 2011 09:42 PM PDT

Today, when the average person sees about 3,000 ads each day and we are becoming more and more blind to advertising, companies must find new and creative ways to catch consumers' attention. If you are having a hard time coming up with new ideas, or you are just looking for some inspiration, this collection of Creative Double Page Magazine Ads should really help you!

Source: boredpanda


SEOmoz Daily SEO Blog

Which Link Metrics Should I Use? Part 1 of 2 - Whiteboard Friday

Posted: 24 Mar 2011 02:13 PM PDT

Posted by Aaron Wheeler

 For both personal knowledge and client satisfaction, it's really important to be able to track SEO progress quantitatively as well as qualitatively. One of the benefits and detriments of the field of SEO is that there is a lot of data out there, which helps make SEO tracking easier but at the same time can be overwhelming to even advanced SEOs. In a lot of ways, it's just a matter of choosing which data to use. For instance, just as you wouldn't use a katana to spread chevre, you wouldn't use the PR of a homepage to track a domain's success in search results. In a two part series beginning today, Rand is going to go over the definitions of some of the most popular metrics available right now, as well as the best ways to use metrics in your SEO analysis. Check back next Friday for part 2!

 

Video Transcription

Howdy SEOmoz fans. Welcome to another edition of Whiteboard Friday. I'm very excited to have you with us today. Today we're talking about a question that plagues a lot of webmasters, a lot of SEOs, and a lot of marketers. And that is: How should I use metrics? There are so many metrics on the Web. How should I use metrics to analyze links, analyze pages, analyze sites, use them in my link building practices, and use them in my outreach efforts?

Today I'm going to start with part one of two. In part one of two, what we're going to cover is all of the metrics that are available for links, well, many of the metrics that are available for links from the primary sources, and what each of these mean, because it can be really confusing if you don't know and aren't familiar with these metrics, to apply them in your day-to-day work. But if you know what these mean, you'll be able to have a lot of insight into how they can be used to analyze the sites and pages that you're looking at and what you can do with them. So, let's get started.

First off, we've got three big groups of places that link metrics come from, at least from the SEO perspective. There's Open Site Explorer and Linkscape. There's Google, Yahoo!, and Bing. There's Majestic SEO. All three of these – Google, Yahoo!, and Bing of course being separate ones – but all three of these have metrics that they compose on their own.

That means if you're looking in lots of different tools, for example, I saw someone asking in the SEOmoz Q&A, I'm using this link diagnostics tool, or I'm using Raven SEO. Or I'm using some other tool sources. I'm using Google Webmaster Tools. I'm using the search engines. I'm using a third party tool that's pulling in different information. Virtually all of those sources come from one of these three.

Essentially, they all build their own indices, right? Open Site Explorer and Linkscape builds its own index of the Web. Majestic SEO has their index of the Web. Google and Bing have their own indices of the Web. Yahoo!'s is going away. But all of these can produce individual metrics.

So let's start with Open Site Explorer and Linkscape. Linkscape powers Open Site Explorer. It powers the link intersect tool. It powers your pro web app, and it powers lots and lots of other tool bars, so the mozBar, the quirk search status bar, the SEO book toolbar. And you'll find these metrics in tools like Raven, as well.

First off, mozRank. mozRank is analogous to Google's PageRank. It's essentially using virtually the same formula. Ours is slightly different, so as not to be patent infringing, but it runs the same way. So it's an iterative algorithm running across the Web's link graph, and essentially, it says links are votes. The pages with more votes have higher page rank, and therefore, they cast more important votes. It's sort of like a representational democracy with mozRank and PageRank.
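To make the "links are votes" idea concrete, here is a minimal sketch of an iterative, PageRank-style calculation in Python. The tiny link graph, the 0.85 damping factor and the iteration count are illustrative assumptions for the sketch, not SEOmoz's or Google's actual formula or parameters.

    # Minimal illustration of an iterative, PageRank-style calculation.
    # The graph, damping factor and iteration count are made up for this example.
    links = {                    # page -> pages it links to
        "a": ["b", "c"],
        "b": ["c"],
        "c": ["a"],
        "d": ["c"],
    }

    damping = 0.85
    rank = {page: 1.0 / len(links) for page in links}   # start with equal "votes"

    for _ in range(50):                                  # iterate until scores settle
        new_rank = {}
        for page in links:
            # a small base amount, plus a share of the rank of every page
            # that links here (more important pages cast bigger votes)
            incoming = sum(rank[src] / len(outs)
                           for src, outs in links.items() if page in outs)
            new_rank[page] = (1 - damping) / len(links) + damping * incoming
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 4))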

mozRank is for an individual page only, so it's assigned to each URL on the Web. For the 40 billion to 50 billion pages that are in our Web index, most of those pages have very, very tiny amounts of mozRank. So when you see 3, 4, 5, those may not look like big scores, but remember that mozRank is logarithmic. That means that there's a tremendous number of pages with very low scores, and then it's increasingly harder and harder to get more and more mozRank.

For example, a mozRank 5 is actually, I think, eight and a half times more important. It means it has eight and a half times more link juice, or more mozRank, than a mozRank 4 web page. So this is a very good thing to know. And that's why it's so granular, why it shows two significant digits, like a 4.56, because 4.56 might actually be substantially more than, say, a 4.21. That's quite a big difference.
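To see what that logarithmic scale means in arithmetic terms, here is a quick back-of-the-envelope calculation. The base of 8.5 is simply the rough figure quoted above, used purely for illustration.

    # Rough illustration of a logarithmic score: each whole point is taken to be
    # about 8.5x more "link juice" than the previous one (the figure quoted above).
    BASE = 8.5

    def relative_importance(score_a, score_b, base=BASE):
        """How many times more link juice score_a represents versus score_b."""
        return base ** (score_a - score_b)

    print(round(relative_importance(5.0, 4.0), 2))    # one full point: 8.5x
    print(round(relative_importance(4.56, 4.21), 2))  # 0.35 points: still ~2.1x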

mozTrust is similar to mozRank, but it does something very unique. It biases so that mozRank, or link juice, can only flow from trusted sites, and then it calculates the same type of thing. Essentially, what it's saying is not every page on the Web passes mozRank, only these initial trusted seed sets of sites, which we essentially gather the same way as we've seen search engines do it in their research papers and patents. Identifying those sites and having those, I believe it's around 250 or 350 sites passing out mozTrust.

So the more mozTrust you have, if I've got a 3.75 mozTrust and I move up to a 5.05, if your mozTrust moves up in this fashion, this means that not only have you gotten more links, which will be reflected in your mozRank as well, but it means that you've gotten more trusted links. It means that the sources that are linking to you are coming from better and better places, or the sources that are already linking to you are getting more trust from the sites that are linking to them. Those are both possibilities.

Domain mozRank and Domain mozTrust are essentially exactly the same as mozRank and mozTrust, but they happen on the domain-wide level. So the problem with looking at mozRank on a homepage or PageRank on a homepage is that it's not actually for the domain as a whole. If I go to SEOmoz.org and I look at its homepage page rank, it used to be an 8 and now it's a 7. That doesn't actually tell me how important the whole domain is. It just tells me how important the homepage of that site is. That's not what I want. What I want is how important is this domain on the Web compared to all the other domains. That's exactly what these two metrics will do. The first one, mozRank, will look at raw popularity, raw importance. The second one, mozTrust, looks at trustworthiness of that domain.

Next, you get some metrics that you should be pretty familiar with. They're fairly self-explanatory. So there's number of links, and number of links will include all the kinds of links that we know about – followed and no-followed links, external and internal links to a page, 301 redirects to a page. Soon, they're going to include rel=canonicals, the number of pages that rel=canonical to a page, and we'll be marking those out as that's become a pretty big part of the Web now. With each of these, you can dig deeper. So if you're in Open Site Explorer or if you're in the mozBar, you can dig deeper into a full list of metrics and get all of those.

Number of linking root domains is similar in that it describes the number of links. But rather than saying this is how many unique pages have a link here, it's how many domains as a whole have a link here. Number of linking root domains is well correlated with Google's rankings, generally indicating that domain diversity, getting links from lots of different places, is quite good. In fact, the best single metric, non-aggregated metric, that we've got to predict Google's rankings with correlation data is the number of Linking C-Blocks. C-Blocks is a little bit tricky. A domain might be something like SEOmoz.org, but a C-Block might include SEOmoz.org and OpenSiteExplorer.org, and I think we might host a few other domains, SEOmoz.com, which redirects to SEOmoz.org.

Linking C-Blocks is essentially saying, "This C-Block of IP addresses, how many of those are there on the Web that contain at least one link to this page?" So looking at linking C-Blocks can be quite a good metric, as well. This is currently in our API, so you can download and look at it. I think a few different tools use it. Soon it will be in more of our places. The new version of Open Site Explorer that's coming out in July should have that.
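As a rough illustration of what "linking C-Blocks" counts, the sketch below groups linking IP addresses by their first three octets (the /24, or "class C", block) and counts the unique blocks. The IP list is made up for the example; real data would come from a backlink index.

    # Count unique "C-blocks" (first three octets of an IPv4 address) among
    # the IPs hosting pages that link to a target. Example IPs are invented.
    linking_ips = [
        "203.0.113.10", "203.0.113.55",   # same C-block: two hosts, one block
        "198.51.100.7",
        "192.0.2.99",
    ]

    def c_block(ip):
        """Return the /24 block an IPv4 address belongs to, e.g. '203.0.113'."""
        return ".".join(ip.split(".")[:3])

    unique_blocks = {c_block(ip) for ip in linking_ips}
    print(len(linking_ips), "linking IPs across", len(unique_blocks), "C-Blocks")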

Page authority and domain authority are a little complicated. Basically, imagine all of these metrics and lots more of these metrics, lots more pieces and factors of these metrics being calculated using a machine learning model against Google's search results to try and predict what single metric best correlates by aggregating all of these and multiplying them and dividing them and using all sorts of fragments of them to get the highest correlation numbers.

Page authority is a number from 0 to 100 that describes how important or how potentially well this page could rank given no other features about it. So we don't know what query it's trying to rank for. We don't know the anchor text that it's trying to rank for. We don't know what keywords are on that page or anything. All we know is based on the links, how well-correlated is this particular page with rankings?

The second one, domain authority, is the same thing of 0 to 100, but doing a similar thing for domains. It's essentially saying, "How well would this domain overall perform?" As you can imagine, correlation with page authority is better than domain authority. We're going to talk about how to use all these metrics in the next segment, where I might possibly be wearing this shirt. That's just a coincidence, never mind that.
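As a toy illustration of the idea, and emphatically not SEOmoz's actual model: take a few link metrics per page, fit a model against observed ranking positions, and treat the fitted output as a single aggregate score. The data, the scikit-learn linear model and the 0 to 100 rescaling below are all assumptions made up for the sketch.

    # Toy sketch: aggregate several link metrics into one score by fitting a
    # model against observed ranking positions. Data and model are illustrative
    # only; this is not SEOmoz's actual methodology.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # columns: mozRank, mozTrust, linking root domains, linking C-blocks (made up)
    X = np.array([
        [5.2, 5.8, 1200,  900],
        [4.1, 4.5,  300,  210],
        [6.0, 6.3, 5400, 3800],
        [3.3, 3.0,   40,   35],
    ])
    y = np.array([3, 12, 1, 45])   # observed ranking positions (lower is better)

    model = LinearRegression().fit(X, y)
    predicted_position = model.predict(X)

    # Flip and rescale the prediction into a 0-100 "authority"-style score.
    score = 100 * (predicted_position.max() - predicted_position) / np.ptp(predicted_position)
    print(np.round(score, 1))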

Let's jump over to Google, Yahoo!, Bing, and Majestic and talk about some of their metrics, as well. Google obviously calculates PageRank internally. They have PageRank scores for every page on the Web. My understanding is that's updated multiple times per day. But in the toolbar, which is where we get the data, we get the data in the Google toolbar, which looks like this and it's colored in and it's a 4 out of 10, that amount of PageRank.

The Toolbar PageRank is only updated every three to nine months, so not particularly regularly. It doesn't always reflect the true numbers. Sometimes Google will penalize pages or sites by removing their Toolbar PageRank, bringing down their Toolbar PageRank if they think they've been selling links and they want to show that they know about those sold links, that kind of thing.

The toolbar number from 0 to 10 is a rough indication of how important Google thinks that page is. But I'd be careful about relying on it because it's not updated very frequently. You could launch a new site tomorrow and it would be three or four or five months before it showed PageRank, and yet it would have PageRank probably starting the next day when you got links into it. PageRank does correlate very well to mozRank. They're usually just .5 or .6 apart. But mozRank updates every time the Linkscape web index updates, which is once a month. Toolbar PageRank updates much less frequently.

Homepage PageRank or what some people call Site PR – they'll say my website is PR 6 or my website is a PR 5. This is a fallacy. There are no PR 6 websites. There is only a website whose homepage has a PageRank score of a 5 or a 6 or a 7, whatever it is. Site PR is not a particularly good metric, and it doesn't actually describe the site. Sometimes you'll find sites that have a Homepage PageRank of a 5, but internal pages that are a 6 or a 7. That happens quite frequently when there are important resources on those sites that get more linked to than the homepage.

This number can be found in Google's toolbar, and there are lots of other toolbars that you can add in and many tools show it. The SEOmoz toolset doesn't show it. Some of you might know that Google asked us a couple years ago not to show PageRank in our web app or our toolbar anymore. So we took it out because we wanted to stay on good terms with those guys.

The number of links that Google shows, this is via the link colon command, so link:www.SEOmoz.org will show a number of links that are not particularly interesting or accurate. It's usually a very small sub-sample. I think for SEOmoz they show maybe 1,500 or 1,800 links. Obviously, there are several hundred thousand, maybe millions of links pointing to SEOmoz.

The reason that they do that is because they don't want to show all the link information that they've got. If you go inside Google Webmaster Tools, they will show you a more accurate, but still not wholly accurate, link count. But that will only be seen for your particular site that you've registered in Google Webmaster Tools. So do be aware of that.

Yahoo! also shows the number of links. They show kind of two link numbers. One comes from Site Explorer, which may or may not be going away. We still haven't heard from Bing about what's happening with that. The other one is from the Yahoo! Web Index, which has gone away in a lot of places, but you can still find it in a few countries. For example, if you go to Yahoo! India, which I think is IN.Yahoo.com, you can still run link commands against that web index and see numbers of links. They don't exactly match up to the Site Explorer numbers, but it's okay. When you're using Site Explorer and the Yahoo! index, it's more about trying to find who's linking to this page, particularly if Open Site Explorer or Majestic is not showing that data. Yahoo! is often fresher and crawls more deeply than some of those other ones.

Let's move into Majestic SEO. Before I get started, I just want to say, although they're a competitor, I have a lot of respect for these guys. They've done some great work. I am not intimately familiar, so I hope I'm going to describe them accurately. As far as I know, from talking to the guys over there, what I've got is pretty right, but someone can correct me in the comments if I'm wrong.

Majestic does show, like SEOmoz, the number of links. That's just the raw number of pages that are linking to a particular site. That's not a key metric, though. The key metrics that they usually show right on top in the new Majestic explorer is, I think, number of external back links, which essentially says how many links come from sites that are not this site, not counting internal links.

They also show referring domains, which is their word for linking root domains, I believe. I think that's root domains, not sub-domains, when you look at referring domains. They have numbers of unique IP addresses as well as number of Class C subnets. So, Class-C subnets correlates to Linking C-Blocks.

Here's the tough part. When you look at these numbers, SEOmoz's numbers, Majestic's numbers, Yahoo!'s numbers, Google's numbers, they're all different. The reason is pretty obvious, because they all maintain different web indices. Majestic has an extremely large web index, much larger than Yahoo!'s or Linkscape's, but it's quite old. There is a lot of old data in there that hasn't necessarily been recrawled that might exist or might not. There's not a lot of canonicalization and de-duplication of content, which the search engines and Linkscape are relatively better at.

Google and Yahoo!, obviously, have great web indices, but they expose much less data about them and a lot fewer metrics. SEOmoz has a smaller web index that's updated once a month, 40 billion or 50 billion pages, versus Google, Yahoo!, and Bing, which are probably in the 100 to 110, maybe 120 billion range. So when you're comparing these numbers, you're not always going to get good similarity between them.

What you will find, though, is if you compare competitors or if you compare different websites with each other, so if I compare SEOmoz.org and Search Engine Land and SEO Book and Webmaster World, I should see that usually, in each of these cases, the link numbers are going to be higher for one of them and lower for another one within a certain percentage range. Those are the numbers that you want to pay some more attention to.

Now you've got a really good background on all these link metrics, a ton of link metrics. Next week, we're going to talk about how to use and apply these link metrics in your link building outreach processes. Take care. See you later.

Video transcription by SpeechPad.com



West Wing Week: "OCONUS II: Mamalluca"

The White House Your Daily Snapshot for
Friday, March 25, 2011

West Wing Week: "OCONUS II: Mamalluca"

West Wing Week is your guide to everything that's happening at 1600 Pennsylvania Avenue. This week, President Obama remained focused on Libya, receiving secure communications from his national security team as the First Family visited Latin America. The President made stops in Brazil, Chile, and El Salvador to promote American exports and economic cooperation among the neighbors in our hemisphere.

Watch the video.

In Case You Missed It

Here are some of the top stories from the White House blog.

Better Benefits, Better Health for Women
As part of the one-year anniversary of the Affordable Care Act, the Secretary of Health and Human Services talks about the law's benefits for women's health care.

What You Missed: Open for Questions on the Startup America Initiative to Grow Entrepreneurship and Innovation
Director of the National Economic Council, Gene Sperling, and SBA Administrator, Karen Mills, along with Fast Company Editor Nancy Cook, answer your questions on the Startup America initiative to grow entrepreneurship and innovation.

Small Businesses Contribute to NASA's Mission
NASA Administrator Charles Bolden reports on the important role that small businesses play in shaping NASA's present and future.

Today's Schedule

All times are Eastern Daylight Time (EDT).

9:30 AM: The President receives the Presidential Daily Briefing

10:30 AM: The President meets with senior advisors

1:00 PM: Briefing by Press Secretary Jay Carney WhiteHouse.gov/live

4:35 PM: The President hosts a reception for Greek Independence Day WhiteHouse.gov/live

WhiteHouse.gov/live indicates events that will be live streamed on WhiteHouse.gov/live.


SEOptimise

What Checking Broken Links Can Teach You About the Web & Linking Out

Posted: 24 Mar 2011 07:25 AM PDT

You probably haven't noticed yet, since I haven't told you yet: starting in 2011 I'm stepping up my efforts on SEOptimise. I'll write more regularly for the blog and help the SEOptimise team with their blog SEO, among other things. So I'm starting with the basics: broken links.

Blogs amass a large number of broken links over the years, as they often deal with short-lived developments or news.

The Web deteriorates fast, and with it your site quality when you link out. In my last post I argued that linking out is important for SEO. That's true, but you have to keep in mind who you link out to. Seeing hundreds of websites either disappear or move without proper redirects can teach you a lot about the Web and links. What did I learn?

 

Google, Microsoft and Yahoo don’t care for broken links

You might assume that companies that own search engines would get the issue of broken links right on their own sites. They don't. There are plenty of links to fix here. Google is really awful: many URLs return neither a 404 nor a 301 to the new version. Yahoo and Microsoft sites like MSN tend to offer proper 404 pages but don't bother to redirect to the newer URLs. So if you care about longevity, you are better off linking to the blog posts that announce a new Google service, as these tend to stay online longer.

 

Startups tend to disappear after a year or two

In my highly popular tools list, which I compiled over the years, and even in a list covering web analytics, I saw a considerable number of dead links. I've seen the same thing before on my own blog over at SEO 2.0: tools disappear fast, especially tools based on third-party services like Twitter. Almost all of the first-generation Twitter tools from around 2008 had disappeared by the time I updated the list in late 2010. At the same time most of the blog links were still working. Even tools that get acquired are often gone, either swallowed completely by the company that bought them or lost to sheer neglect.

 

Sites by people who comment for SEO vanish fast

When you install Broken Link Checker for WordPress and let it run for a while on a blog that has been online for a few years, you might experience a shock: hundreds of links are broken! Soon you'll discover, though, that most broken links stem from blog comments. Most of these are not from your regular commenters but from those who offer less value and comment mostly for SEO purposes. It's amazing how many comments, trackbacks and pingbacks lead nowhere after a while. It's often generic sites along the lines of expert-internet-marketing-services.biz that vanish the fastest.

 

Some well known SEO publications do not redirect all posts to new URLs

Pronetadvertising.com was one of my favorite sites over the years, and Kevin has linked to their posts as far back as 2007. They are still around, but some seemingly still valid posts have changed URLs or disappeared completely. I could restore some of those links manually, both on SEOptimise and on SEO 2.0, but I was surprised to see SEO experts forget this. There are many reasons for such issues: sometimes the blog gets rid of old, outdated posts; in other cases the company restructures or the blog gets underfunded in the long run, so that obvious tasks like redirecting to new URLs get forgotten.

 

Even flagship blogs sometimes go offline after a few years

This is probably the saddest part of checking broken links: witnessing the demise of your idols' blogs. Well-known and respected industry blogs doshdosh.com and tropicalseo.com are just two sad examples of blogs from acknowledged experts that are no more. So linking out is always a bit of a hazard in the long run. Even the best of us can fail to keep up our blogs after a few years.

 

Black hat SEOs hijack your legitimate links

Linking out to bad neighborhoods is even worse than linking to broken pages: your site is not just full of holes due to broken links, but may also inadvertently become part of a spammy link network. Thus you have to check redirects as well, not just 404s. Often formerly legitimate links end up getting redirected to exactly these kinds of spammy destinations.
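One straightforward way to spot hijacked links is to follow each outgoing URL and see where it finally lands. The sketch below assumes Python with the requests library; the example URL and the simple same-domain comparison are illustrative choices, not a complete audit.

    # Follow an outgoing link and report where it actually ends up, so a
    # once-legitimate URL that now redirects somewhere suspicious stands out.
    from urllib.parse import urlparse
    import requests

    def final_destination(url):
        """Return (status_code, final_url) after following any redirects."""
        response = requests.get(url, allow_redirects=True, timeout=10)
        return response.status_code, response.url

    url = "http://example.com/some-old-post"   # hypothetical outgoing link
    status, final_url = final_destination(url)

    if urlparse(url).netloc != urlparse(final_url).netloc:
        print("Redirected off-domain:", url, "->", final_url, status)
    else:
        print("OK:", url, status)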

 

How can you check your blog/site for broken links?

You can use lots of free or affordable tools.

  1. Google Webmaster Tools
  2. Screaming Frog SEO Spider
  3. Broken Link Checker for WordPress
  4. Xenu LinkSleuth
  5. Website Auditor
  6. Web Link Validator
  7. Orchid Box Broken Links Checker
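If none of the tools above fits your workflow, the core check is simple enough to script yourself. Here is a minimal sketch, assuming Python with the requests library; the page URL is a placeholder, and a real crawler would also need relative-URL handling, politeness delays and retries.

    # Minimal broken-link check for a single page: fetch it, pull out the
    # href targets, and report anything that does not answer with 200.
    from html.parser import HTMLParser
    import requests

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value and value.startswith("http"):
                        self.links.append(value)

    page = "http://example.com/old-blog-post"   # placeholder URL
    parser = LinkExtractor()
    parser.feed(requests.get(page, timeout=10).text)

    for link in parser.links:
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = "unreachable"
        if status != 200:
            print(status, link)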

 

How to link out for longevity?

Link out to news coverage on stable websites or blogs, not to the actual tools you describe. The tool might be long gone while the article will be there for years. So you are better off linking to an NYT article about a new Google feature than to the Google feature itself, which might disappear altogether in a year. Remember Google Wave and Google SearchWiki? They were gone within a year.

 

How to fix outgoing links without breaking your posts?

  • Link out to archive.org for important sites. When doshdosh.com went down I linked to the archived versions of Maki's articles (a quick way to look up archived copies is sketched after this list).
  • Use a site:brokenlinkurl.com "headline of linked post" search to find the post elsewhere on the site.
  • Remove the link altogether when doing so doesn't hurt the value of the post.
  • Delete the whole post whenever it consists only of the link or is built around it. Many early posts on SEOptimise were tumblog-style short notes consisting mainly of a link.
  • Take a screenshot and link to it for pages where you expect a quick change (like AFP stories on Google News or Wikipedia articles that might get deleted soon).
  • Save a local copy and set up a mirror once the original disappears; you can use the Firefox extension ScrapBook to save web pages with ease.
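For the archive.org suggestion above, a small script can do the lookup for you. This sketch assumes the Wayback Machine's public availability endpoint (archive.org/wayback/available) and its usual JSON response shape; treat both as assumptions to verify before relying on them.

    # Look up an archived copy of a dead URL via the Wayback Machine's
    # availability endpoint (assumed API shape; verify against current docs).
    import requests

    def archived_copy(url):
        response = requests.get("https://archive.org/wayback/available",
                                params={"url": url}, timeout=10)
        snapshot = response.json().get("archived_snapshots", {}).get("closest")
        return snapshot["url"] if snapshot else None

    dead_link = "http://www.doshdosh.com/"   # example mentioned in the post
    print(archived_copy(dead_link) or "No archived copy found")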

 

Why check and fix broken links at all?

The SEOptimise blog has existed since 2006; yes, it's already five years old, so it's only logical that many links are no longer valid. Google seems to assume that older blogs have lots of broken links, as I couldn't see much of a positive impact in search results after fixing hundreds of broken links on my own blog over at SEO 2.0.

At the same time, I've seen static pages jump back to top results after an update with the obligatory link check. It may depend on the extent of the change or simply on the updated date.

Still, I highly recommend fixing your old posts, as it directly harms the user experience and ultimately your reputation when visitors end up on outdated content full of broken or misleading outgoing links. Sometimes even internal links break. As search engines often surface older content, people will end up on your outdated pages as well.

So at the end of the day it doesn’t matter whether Google really takes broken links into account as a direct ranking factor.

You have to fix links anyway to stay a credible source over the years. It's also not just about the broken links: years-old advice or suggestions are often no longer accurate, or there are better solutions by now. Thus I sometimes replace old links with new ones pointing to a completely different resource.

© SEOptimise – Download our free business guide to blogging whitepaper and sign-up for the SEOptimise monthly newsletter. What Checking Broken Links Can Teach You About the Web & Linking Out

Related posts:

  1. Linking Out Instead of Link Building to Rank in Google
  2. How to Write Highly Popular 30+ Item List Blog Posts to Get Links and Traffic
  3. Advanced Blog & WordPress SEO: 30 Points Most Bloggers Overlook

Seth's Blog : "How much can I get away with?"

"How much can I get away with?"

There are two ways to parse that question.

The usual way is, "How little can I do and not get caught?" Variations include, "Can we do less service? Cut our costs? Put less cereal in the box? Charge more?" In short: "How little can I get away with?"

The other way, the more effective way: "How much can we afford to give away? How much service can we pile on top of what we're selling without seeming like we're out of our minds? How big a portion can we give and still stay in business? How fast can we get this order filled?"

In an era in which the middle is rapidly emptying out, both edges are competitive. Hint: The overdelivery edge is an easier place to make a name for yourself.

 