Wednesday, March 16, 2011

Damn Cool Pics


Malawi Boy Builds Windmill Out of Junk

Posted: 15 Mar 2011 08:06 PM PDT

William Kamkwamba of Malawi was expelled from his high school for non-payment of fees. At 14 years old he went to the library and started reading an American textbook called 'Using Energy.' People thought the kid was nuts. Turns out he was just a genius.


Leather Hands 'Vertical Lines'

Posted: 15 Mar 2011 08:05 PM PDT

Sex sells, and this music clip is straight to the point.


Bizarre Vending Machines

Posted: 15 Mar 2011 06:56 PM PDT

Vending machines are typically known as the best place to get an overpriced bottle of soda, or some chips that never quite replace the two meals you've missed that day. In our narrow Western way of thinking, a vending machine is a small, mechanized snack food dispensary and nothing more, and one with a credit card slot counts as a massive leap forward. That's not even close to the astoundingly strange, yet magnificently cool vending machines that are actually out there.

Here are some of them.

Used Schoolgirl Panties


Live Lobsters


Live Bait


Eggs


Pizza


Umbrellas


Porn


Kosher Foods


Batteries


Neckties


Sex Toys


Crabs


Proactiv


Pringles


Beer


Instant Noodles


Sneakers


Bread


Pet Beetles


Toilet Paper


Designer Condoms


iPods


SEOmoz Daily SEO Blog


Restricting Robot Access for Improved SEO

Posted: 15 Mar 2011 02:03 PM PDT

Posted by Lindsay

Left to their own devices, search engine spiders will often perceive important pages as junk, index content that shouldn't serve as a user entry point, generate duplicate content, and run into a slew of other issues. Are you doing everything you can to guide bots through your website and make the most of each visit from search engine spiders?

It is a little like child-proofing a home. We use child safety gates to block access to certain rooms, add inserts to electrical outlets to ensure nobody gets electrocuted, and place dangerous items out of reach. At the same time, we provide educational, entertaining, and safe items within easy access. You wouldn't open the front door of your unprepared home to a toddler, then pop out for a coffee and hope for the best.

Think of Googlebot as a toddler (if you need a more believable visual, try a really rich and very well-connected toddler). Left to roam the hazards unguided, it will likely leave you with a mess and some missed potential on your hands. Remove the option to access the troublesome areas of your website and it's more likely to focus on the good quality options at hand instead.

Restricting access to junk and hazards while making quality choices easily accessible is an important and often overlooked component of SEO.

Luckily, there are a number of tools that allow us to make the most of bot activity and keep the bots out of trouble on our websites. Let's look at the four main robot restriction methods: the Meta Robots Tag, the Robots.txt file, the X-Robots-Tag, and the Canonical Tag. We'll quickly summarize how each method is implemented, cover the pros and cons of each, and provide examples of how each one can be best used.

CANONICAL TAG

The canonical tag is a page-level meta tag that is placed in the HTML header of a web page. It tells the search engines which URL is the canonical version of the page being displayed. Its purpose is to keep duplicate content out of the search engine index while consolidating your pages' strength into one 'canonical' page.

The code looks like this:

<link rel="canonical" href="http://example.com/quality-wrenches.htm"/>

There is a good example of this tag in action over at MyWedding, where it is used to take care of the tracking parameters important to the marketing team. Try this URL - http://www.mywedding.com/?utm_source=whatever-they-want-to-track - then right-click on the page and view the source. You'll see the rel="canonical" entry on the page.

Pros

  • Relatively easy to implement. Your dev group can move on to bigger fish.
  • Can be used to source content across domains. This may be a good solution if you have syndication deals in the works but don't want to compromise your own search engine presence.

Cons

  • Relatively easy to implement incorrectly (see catastrophic canonicalization)
  • Search engine support can be spotty. The tag is a signal more than a command.
  • Doesn't correct the core issue.

Example Uses

  • There are usually other ways to canonicalize content, but sometimes this is a solid solution given all variables.
  • Cindy Krum, a Moz associate, recommends canonical tag use if you run into a sticky situation and your mobile site version is outranking your traditional site.
  • If you don't want to track your referral parameters with a cookie, the canonical tag is a good alternative, as sketched below.
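
For instance, a campaign URL like http://example.com/landing-page?utm_source=newsletter (a hypothetical address, purely for illustration) can serve a canonical tag pointing back at the clean URL, so the parameterized version never competes in the index:

<link rel="canonical" href="http://example.com/landing-page"/>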

ROBOTS.TXT

Robots.txt allows for some control of search engine robot access to a site; however, it does not guarantee a page won't be indexed. It should be employed only when necessary. I generally recommend using the Meta tag "noindex" for keeping pages out of the index instead.
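
For reference, a minimal robots.txt file sits at the root of the domain and looks like this (example.com and the /junk-pages/ path are placeholders; leaving the Disallow value empty would restrict nothing at all):

User-agent: *
Disallow: /junk-pages/
Sitemap: http://example.com/sitemap.xml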

Pros

  • So easy a monkey could do it.
  • Great place to point out XML Sitemap files.

Cons

  • So easy a monkey could do it (see Serious Robots.txt Misuse)
  • Serves as a link juice block. Search engines are restricted from crawling the page content, so (internal) links aren't followed and don't pass the value they deserve.

Example Uses

  • I recommend only using the robots.txt file to show that you have one. It shouldn't really restrict anything, but it serves to point to your XML Sitemaps or an XML Sitemap directory file.
  • Check out the SEOmoz robots.txt file. It is fun and useful.

META ROBOTS TAG

The Meta robots tag creates page-level instructions for search engine bots and should be included in the head section of the HTML document. Here is how the tag should look in your code:

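A minimal example, using the 'noindex' value discussed below, keeps the page out of the index while still letting the spiders follow its links:

<meta name="robots" content="noindex, follow">
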
The Meta Robots Tag is my very favorite option. By using the 'noindex' tag, you keep content out of the index but the search engine spiders will still follow the links and pass the link love.

Pros

  • Use of 'noindex' keeps a page out of the search index better than other options like a robots.txt file entry.
  • As long as you don't use the 'nofollow' tag, link juice can pass. Woot!
  • Fine-tune your entries in the SERPs by specifying NOSNIPPET, NOODP, or NOYDIR. (You're getting all fancy on me now!)

Cons

  • Many quite smart folks use 'noindex, nofollow' together and miss out on the important link juice flow piece. :(

Example Uses

  • Imagine that your log-in page is the most linked to (and powerful) page on your website. You don't want it in the index, but you certainly don't want to add it to the robots.txt file because that is a link juice block.
  • Search result sort pages.
  • Paginated versions of pages.

X-ROBOTS-TAG

Since 2007, Google and other search engines have supported the X-Robots-Tag as a way to inform the bots about crawling and indexing preferences via the HTTP headers used to serve the file. The X-Robots-Tag is very useful for controlling indexation of non-HTML media types such as PDF documents.
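
In the raw HTTP response, the header rides alongside the file itself:

HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex

And here is a sketch of how you might set it site-wide, assuming Apache with mod_headers enabled (the .pdf pattern is purely illustrative):

<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex, follow"
</FilesMatch>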

Pros

  • Allows you to control indexation of unusual content like Excel files, PDFs, PPTs, and whatever else you've got hanging around.

Cons

  • This kind of weird content can be troublesome in the first place. Why not publish an HTML version on the web for indexation and offer the secondary file type for download?

Example Uses

  • You offer product information on your site in HTML, but your marketing department also wants to make the beautiful PDF version available. You'd add the X-Robots to the PDFs.
  • You have an awesome set of Excel templates that are link bait. If you're bothered by the Excel files outranking your HTML landing pages, you could add noindex to your X-Robots-Tag in the HTTP header.

Let's Turn This Ship Back Around

What was all the baby talk you started out with, Lindsay? Oh, that's right. Thanks. In your quest to bot-proof your website, you have a number of tools at your disposal. These differ greatly from those used for baby-proofing, but the end result is the same: everybody (babies and bots) stays safe, on track, out of trouble, and focused on the most important stuff that is going to make a difference. Instead of baby gates and electric socket protectors, you've got the Meta Robots Tag, the Robots.txt file, the X-Robots-Tag, and the Canonical Tag.

In my personal order of preference, I'd go with...

  1. Meta Robots Tag
  2. Canonical Tag
  3. X-Robots-Tag
  4. Robots.txt file

Your Turn!

I would love, love, love to hear how you use each of the above robot control protocols for effective SEO. Please share your uses and experience in the comments and let the conversation flow.

Happy Optimizing!

Stock Photography by Photoxpress



Removing Mercury and Other Toxics From the Air We Breathe

The White House
Wednesday, March 16, 2011

Today, EPA Administrator Lisa Jackson announced proposed Mercury and Air Toxics Standards - commonsense goals for reducing harmful pollution in the air we breathe that can save lives, prevent illnesses and promote the creation of new jobs.

We want to be sure you saw this blog post from EPA Administrator Lisa Jackson on WhiteHouse.gov.

Tomorrow at 10:55 a.m. EST, Administrator Jackson will host a special live Open for Questions discussion on this important issue.

Tune in to WhiteHouse.gov/live to watch and submit your questions on Facebook.

Removing Mercury and Other Toxics From the Air We Breathe
By EPA Administrator Lisa P. Jackson

This week, the EPA proposed Mercury and Air Toxics Standards, a Clean Air Act protection that sets the first-ever national safeguards to limit power plant releases of mercury, arsenic, chromium, nickel and acid gases into the air we breathe.

America’s power plants are the source of half of the mercury emissions, half of the acid gases, and a quarter of all toxic metal pollution in the U.S., and almost half of America’s coal plants lack advanced pollution controls. Instead of operating without set limits for these pollutants – which are linked to costly and often fatal health threats like asthma, cancer and developmental disorders – American power plants will install widely available, American-made pollution control technology to cut emissions.

Setting commonsense goals for reducing harmful pollution in the air we breathe can save lives, prevent illnesses and promote the creation of new jobs. We’re confident in these expectations for the Mercury and Air Toxics Standards because this has been the history of Clean Air Act protections for the last forty years.

In 2010 alone, protections in the Clean Air Act prevented 160,000 premature deaths and 170,000 hospital visits. Cleaner air has meant trillions of dollars in benefits to our nation – not only through fewer medical bills, but by keeping our kids in school and our workers on the job. The Clean Air Act has also helped create jobs. As of 2008 the environmental technology industry – which develops, manufactures and maintains the tools that help keep our air clean – employed more than 1.7 million Americans.

The Mercury and Air Toxics Standards build on this decades-long success. Once the standards are in place, widespread use of existing pollution control technology will prevent an estimated 17,000 premature deaths and 11,000 heart attacks each year. These safeguards will also protect against 120,000 incidents of childhood asthma symptoms and ensure 11,000 fewer cases of acute bronchitis in children each year, making this one of the largest steps forward in protecting our kids from toxic air pollution in a generation.

Implementing these proposed standards is also expected to create jobs. The Mercury and Air Toxics Standards will increase demand for pollution control technology that is already being produced by American companies. And new workers will be needed to install, operate and maintain pollution control technology. We estimate these first-ever standards will support 31,000 construction jobs and 9,000 long-term utility jobs.

The Mercury and Air Toxics Standards will also be beneficial to American utilities. Setting clear standards alleviates 20 years of uncertainty and opens a long-awaited path for investments in multi-pollutant reduction planning, energy efficiency and clean technology. It will level the playing field, closing loopholes for big polluters and putting our cleanest power generators at a competitive advantage. Consistent with the President’s Executive Order, EPA is ensuring flexibility, cost effectiveness and robust public comment before finalizing the standards.

The Mercury and Air Toxics Standards represent a milestone in the Clean Air Act’s already unprecedented record of defending the health of American families. At the EPA, we are eager to work with the American people through the coming public comment period, so that we can craft safeguards that best protect our health and strengthen our economy.
