Friday, 17 July 2015


Controlling Search Engine Crawlers for Better Indexation and Rankings - Whiteboard Friday

Posted by randfish

When should you disallow search engines in your robots.txt file, and when should you use meta robots tags in a page header? What about nofollowing links? In today's Whiteboard Friday, Rand covers these tools and their appropriate use in four situations that SEOs commonly find themselves facing.

Controlling Search Engine Crawlers for Better Indexation and Rankings Whiteboard

For reference, here's a still of this week's whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to talk about controlling search engine crawlers, blocking bots, sending bots where we want, restricting them from where we don't want them to go. We're going to talk a little bit about crawl budget and what you should and shouldn't have indexed.

As a start, what I want to do is discuss the ways in which we can control robots. Those include the three primary ones: robots.txt, meta robots, and—well, the nofollow tag is a little bit less about controlling bots.

There are a few others that we're going to discuss as well, including Webmaster Tools (Search Console) and URL status codes. But let's dive into those first few first.

Robots.txt lives at yoursite.com/robots.txt. It tells crawlers what they should and shouldn't access, and it doesn't always get respected by Google and Bing. So a lot of folks, when you say, "hey, disallow this," and then you suddenly see those URLs popping up, wonder what's going on. Look, Google and Bing oftentimes think that they just know better. They think that maybe you've made a mistake, they think "hey, there's a lot of links pointing to this content, there's a lot of people who are visiting and caring about this content, maybe you didn't intend for us to block it." The more specific you get about an individual URL, the better they usually are about respecting it. The less specific, meaning the more you use wildcards or say "everything behind this entire big directory," the worse they are about necessarily believing you.
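Just to make that concrete, a robots.txt file is nothing more than groups of user-agent lines followed by disallow rules. Here's a minimal sketch; the directory names are purely hypothetical:

    # Rules for every crawler
    User-agent: *
    Disallow: /staging/

    # An extra rule just for Googlebot
    User-agent: Googlebot
    Disallow: /print-versions/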

Meta robots—a little different—that lives in the headers of individual pages, so you can only control a single page with a meta robots tag. That tells the engines whether or not they should keep a page in the index, and whether they should follow the links on that page, and it's usually a lot more respected, because it's at an individual-page level; Google and Bing tend to believe you about the meta robots tag.

And then the nofollow tag, that lives on an individual link on a page. It doesn't tell engines where to crawl or not to crawl. All it's saying is whether you editorially vouch for a page that is being linked to, and whether you want to pass the PageRank and link equity metrics to that page.
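To be clear, nofollow is an attribute on the anchor itself, not something in the page's head. A hypothetical example link would look like:

    <!-- rel="nofollow" says: don't pass link equity through this particular link -->
    <a href="https://example.com/some-page" rel="nofollow">some page</a>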

Interesting point about meta robots and robots.txt working together (or not working together so well)—many, many folks in the SEO world do this and then get frustrated.

What if, for example, we take a page like "blogtest.html" on our domain and we say "all user agents, you are not allowed to crawl blogtest.html. Okay—that's a good way to keep that page away from being crawled, but just because something is not crawled doesn't necessarily mean it won't be in the search results.
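In robots.txt terms, that instruction is a rule roughly like this:

    User-agent: *
    # Keep crawlers away from this one page (it can still end up indexed, as described below)
    Disallow: /blogtest.html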

So then we have our SEO folks go, "you know what, let's make doubly sure that doesn't show up in search results; we'll put in the meta robots tag:"

    <meta name="robots" content="noindex, follow">

So, "noindex, follow" tells the search engine crawler they can follow the links on the page, but they shouldn't index this particular one.

Then, you go and run a search for "blog test" in this case, and everybody on the team's like "What the heck!? WTF? Why am I seeing this page show up in search results?"

The answer is, you told the engines that they couldn't crawl the page, so they didn't. But they are still putting it in the results. They're actually probably not going to include a meta description; they might have something like "we can't include a meta description because of this site's robots.txt file." The reason it's showing up is because they can't see the noindex; all they see is the disallow.

So, if you want something truly removed, unable to be seen in search results, you can't just disallow a crawler. You have to say meta "noindex" and you have to let them crawl it.

So this creates some complications. Robots.txt can be great if we're trying to save crawl bandwidth, but it isn't necessarily ideal for preventing a page from being shown in the search results. I would not recommend, by the way, that you do what we think Twitter recently tried to do, where they tried to canonicalize www and non-www by saying "Google, don't crawl the www version of twitter.com." What you should be doing is rel canonical-ing or using a 301.
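For reference, the rel canonical version of that fix is just a link element in the head of the duplicate (www) pages pointing at the version you actually want indexed; I'm using twitter.com here only because it's the example above:

    <!-- Placed on pages of the www version, pointing at the preferred non-www URL -->
    <link rel="canonical" href="https://twitter.com/" />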

Meta robots can allow crawling and link-following while disallowing indexation, which is great, but it still uses crawl budget even as it conserves space in the index.

The nofollow tag, generally speaking, is not particularly useful for controlling bots or conserving indexation.

Webmaster Tools (now Google Search Console) has some special things that allow you to restrict access or remove a result from the search results. For example, if you have 404'd something or if you've told them not to crawl something but it's still showing up in there, you can manually say "don't do that." There are a few other crawl protocol things that you can do.

And then URL status codes—these are a valid way to do things, but they're going to obviously change what's going on on your pages, too.

If you're not having a lot of luck using a 404 to remove something, you can use a 410 to permanently remove something from the index. Just be aware that once you use a 410, it can take a long time to get that page re-crawled or re-indexed if you later want to tell the search engines "it's back!" A 410 is permanent removal.

301—permanent redirect, we've talked about those here—and 302, temporary redirect.
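For reference, these codes are just the first line of the server's HTTP response; the URLs below are hypothetical. A permanent removal responds with:

    HTTP/1.1 410 Gone

A permanent redirect responds with:

    HTTP/1.1 301 Moved Permanently
    Location: https://example.com/new-page

And a temporary redirect swaps that status line for:

    HTTP/1.1 302 Found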

Now let's jump into a few specific use cases of "what kinds of content should and shouldn't I allow engines to crawl and index" in this next version...

[Rand moves at superhuman speed to erase the board and draw part two of this Whiteboard Friday. Seriously, we showed Roger how fast it was, and even he was impressed.]

Four crawling/indexing problems to solve

So we've got these four big problems that I want to talk about as they relate to crawling and indexing.

1. Content that isn't ready yet

The first one here is around, "If I have content of quality I'm still trying to improve—it's not yet ready for primetime, it's not ready for Google, maybe I have a bunch of products and I only have the descriptions from the manufacturer and I need people to be able to access them, so I'm rewriting the content and creating unique value on those pages... they're just not ready yet—what should I do with those?"

My options around crawling and indexing? If I have a large quantity of those—maybe thousands, tens of thousands, hundreds of thousands—I would probably go the robots.txt route. I'd disallow those pages from being crawled, and then eventually as I get (folder by folder) those sets of URLs ready, I can then allow crawling and maybe even submit them to Google via an XML sitemap.
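As a sketch of that approach, you'd disallow the not-yet-ready sections (the folder name below is made up), delete each Disallow line as that folder's content is rewritten, and point the engines at your sitemap:

    User-agent: *
    # Block the unedited manufacturer-description pages for now
    Disallow: /products-unedited/

    # Once sections are ready, list them in the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml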

If I'm talking about a small quantity—a few dozen, a few hundred pages—well, I'd probably just use the meta robots noindex, and then I'd pull that noindex off of those pages as they are made ready for Google's consumption. And then again, I would probably use the XML sitemap and start submitting those once they're ready.

2. Dealing with duplicate or thin content

What about, "Should I noindex, nofollow, or potentially disallow crawling on largely duplicate URLs or thin content?" I've got an example. Let's say I'm an ecommerce shop, I'm selling this nice Star Wars t-shirt which I think is kind of hilarious, so I've got starwarsshirt.html, and it links out to a larger version of an image, and that's an individual HTML page. It links out to different colors, which change the URL of the page, so I have a gray, blue, and black version. Well, these four pages are really all part of this same one, so I wouldn't recommend disallowing crawling on these, and I wouldn't recommend noindexing them. What I would do there is a rel canonical.

Remember, rel canonical is one of those things that can be precluded by disallowing. So, if I were to disallow these from being crawled, Google couldn't see the rel canonical back, so if someone linked to the blue version instead of the default version, now I potentially don't get link credit for that. So what I really want to do is use the rel canonical, allow the indexing, and allow it to be crawled. If you really feel like it, you could also put a meta "noindex, follow" on these pages, but I don't really think that's necessary, and again that might interfere with the rel canonical.
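So each of the color-variant pages (gray, blue, black) would carry a canonical tag back to the main product page, roughly like this (the domain is hypothetical; the filename comes from the example above):

    <!-- On starwarsshirt-blue.html and the other variants -->
    <link rel="canonical" href="https://www.example.com/starwarsshirt.html" />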

3. Passing link equity without appearing in search results

Number three: "If I want to pass link equity (or at least crawling) through a set of pages without those pages actually appearing in search results—so maybe I have navigational stuff, ways that humans are going to navigate through my pages, but I don't need those appearing in search results—what should I use then?"

What I would say here is, you can use the meta robots to say "don't index the page, but do follow the links that are on that page." That's a pretty nice, handy use case for that.

Do NOT, however, disallow those in robots.txt; many, many folks make this mistake. If you disallow crawling on those pages, Google can't see the noindex, and they don't know that they're allowed to follow the links. Granted, as we talked about before, sometimes Google doesn't obey the robots.txt, but you can't rely on that behavior; assume that the disallow in robots.txt will prevent them from crawling. So I would say the meta robots "noindex, follow" is the way to do this.
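In other words, those navigational pages keep a tag like this in their head, and they stay out of robots.txt entirely so the engines can actually read it:

    <!-- Keep the page out of the index, but let its links pass equity -->
    <meta name="robots" content="noindex, follow">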

4. Search results-type pages

Finally, fourth, "What should I do with search results-type pages?" Google has said many times that they don't like your search results from your own internal engine appearing in their search results, and so this can be a tricky use case.

Sometimes a search result page—a page that lists many types of results that might come from a database of types of content that you've got on your site—could actually be a very good result for a searcher who is looking for a wide variety of content, or who wants to see what you have on offer. Yelp does this: When you say, "I'm looking for restaurants in Seattle, WA," they'll give you what is essentially a list of search results, and Google does want those to appear because that page provides a great result. But you should be doing what Yelp does there, and make the most common or popular individual sets of those search results into category-style pages. A page that provides real, unique value, that's not just a list of search results, that is more of a landing page than a search results page.

However, that being said, if you've got a long tail of these, or if you'd say, "hey, our internal search engine is really for internal visitors only; it's not useful to have those pages show up in search results, and we don't think we need to make the effort to turn them into category landing pages," then you can use the disallow in robots.txt to prevent those pages from being crawled.

Just be cautious here, because I have sometimes seen an over-swinging of the pendulum toward blocking all types of search results, and sometimes that can actually hurt your SEO and your traffic. Sometimes those pages can be really useful to people. So check your analytics, and make sure those aren't valuable pages that should be served up and turned into landing pages. If you're sure, then go ahead and disallow all your search results-style pages. You'll see a lot of sites doing this in their robots.txt file.
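If you do decide to block them, the usual pattern is a disallow on your internal search path; the path and query parameter below are hypothetical, and the * wildcard is supported by Google and Bing:

    User-agent: *
    # Block internal search results pages
    Disallow: /search
    Disallow: /*?q=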

That being said, I hope you have some great questions about crawling and indexing, controlling robots, blocking robots, allowing robots, and I'll try and tackle those in the comments below.

We'll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care!




Seth's Blog : Raising money is not the same thing as making a sale

Raising money is not the same thing as making a sale

Both add to your bank balance...

But raising money (borrowing it or selling equity) creates an obligation, while selling something delivers value to a customer.

Raising money is hard to repeat. Selling something repeatedly is why you do this work.

If things are going well, it might be time to sell more things to even more customers, so you won't ever need to raise money.

And if things aren't going well, the money you'll be able to raise will come with expectations or a price you probably won't be happy to live with.

When in doubt, make a customer happy.

[My exception: it pays to borrow money to pay for something (an asset) that delivers significantly more value to more customers more profitably over time. In the right situation, it's an essential building block to significance, but it's too often used as a crutch.]

[A different myth, re book publishing.]

       


Thursday, 16 July 2015

Mish's Global Economic Trend Analysis



Bailout Marches On, Decision Makers Ignore IMF; Schäuble Puts Grexit Back on Table; Banks to Open, With a Catch

Posted: 16 Jul 2015 04:02 PM PDT

As the Greek bailout marches on, no one that matters dares ask the pertinent question: How can Greece pay back over €400 billion, when they could not pay back either of the last two bailouts?

Paul Pays Peter to Pay Paul

Supposedly we have progress. After all, Greece will be able to make its required July 20 ECB repayment.

How?

Thanks to another €86 billion bailout, Greece will be handed the money and will hand it right back. The first installment will magically be just enough for Greece to repay the ECB.

Draghi Affirms Faith

In this faith-based, can-kicking exercise, Draghi Affirms Faith in Greece's Place in Euro.
Mario Draghi, head of the European Central Bank, affirmed his faith in Greece remaining in the euro as the central bank raised its limit on emergency loans to Greek banks by €900m over one week.

"The ECB continues to act on the assumption that Greece is and will remain a member of the euro area," Mr Draghi said.
Banks to Open, With a Catch

Greek banks purportedly will open on Monday after having been shut for about two weeks.

Is there a catch? You bet.

Banks will be open for "all services which do not give rise to capital flight," and capital controls will remain in place.

Want to make a deposit? Sure, that will be allowed. Want your money back? Well, you are going to have a problem getting it.

How big a problem? We find out Monday.

To give the appearance that things are improving, I suspect customers will find they can take out another €10 or so per day, perhaps €70, up from the current €60.

If my guess is correct (and you are foolish enough to have €25,550 in the bank), it will take you a year to get your money, even if you religiously take out the maximum every day.

Of course, that assumes everyone else does not do the same. If they do, the ECB will put a halt to it.

Schäuble Says Grexit a Better Idea

Reuters reports Schäuble Casts Doubt on Chance of Greek Bailout Success.
German Finance Minister Wolfgang Schaeuble questioned whether Greece will ever get a third bailout programme on Thursday, a day after the Greek parliament passed a package of stringent measures required to open negotiations on financial aid.

Schaeuble has submitted a request to parliament to agree to opening talks, but he has said it would be hard to make Greece's debt sustainable without writing some of it off - an idea Berlin considers to be illegal as long as Greece remains within the euro zone.

Schaeuble, who has raised the idea that Greece take a "time-out" from the euro zone, said a haircut would be incompatible with the currency union's rules. "But this would perhaps be the better way for Greece," he said.

The proposal for a temporary 'Grexit' has already caused ructions in Merkel's ruling coalition, upsetting some senior Social Democrats.
Question of Haircuts

Schäuble's position, and that of the German constitution, is that there can be no haircuts within the eurozone.

No one of any importance really cares about the German constitution or logical assumptions on debt sustainability.

Merkel will ram through the bailout package, oblivious to Schäuble's objections, and also oblivious to the IMF position that Greek debt is not remotely sustainable and haircuts are needed.

For details on the IMF's position, please see White Knight Irony: IMF Threatens to Walk Away From Bailout Deal Citing Unsustainable Debt.

Will the IMF do what they threaten?

Good question, but so far no one but Germany has done what they threatened.

Addendum: Grexit Back on Table

Shortly after I typed the above, the Financial Times provided more details in Germany's Wolfgang Schäuble Puts Grexit Back on the Agenda.
Days after Greece appeared to escape crashing out of the euro, hawkish German finance minister Wolfgang Schäuble has put Grexit back on the political agenda, raising tensions in Berlin and across the EU.

Speaking before a key Bundestag vote on Friday, Mr Schäuble said voluntary departure from the eurozone "could perhaps be a better way" for Greece than a proposed €86bn bailout package, which was painfully assembled at a marathon eurozone summit in Brussels over the weekend.

It is uncertain how much leeway he has been given by chancellor Angela Merkel to advance a historic rupture of the eurozone that he believes would ultimately strengthen both Greece and the single currency.

Mr Schäuble said in a radio interview there was widespread concern — including at the International Monetary Fund — that Greece needed a debt cut for the rescue to work. But, he noted, a "debt cut is incompatible with membership of the currency union".

Some EU officials believe Mr Schäuble's repeated insistence that the IMF, which has partnered the EU in previous rescues, be included in a new bailout may be intended to engineer an eventual Grexit.
Another Hour Another Twist

If Germany approves the bailout package, which seems overwhelmingly likely, then Merkel has a huge problem if the IMF insists on haircuts.

Clearly that is Schäuble's fear, and quite a reasonable one.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com

"Too Soon to Fret" Says ECRI, No Recession on Horizon: Another Blown ECRI Call?

Posted: 16 Jul 2015 12:49 PM PDT

After insisting for over a year that the US was on the verge of recession, if not already in one (I made the same mistake), the ECRI now maintains Recoveries Remain Resilient.
Growing ranks of the great and the good are worried that the global economy, like Humpty Dumpty, will have a great fall, never to be put together again. We understand their apprehension, given our concern since the summer of 2008 about collapsing trend growth.

[Mish comment: what about the ECRI recession call made in September 2011?]

But in terms of our current assessment of global recession risk, we aren't ready to join in. This is because the major developed economies aren't yet in windows of vulnerability that our leading indexes are designed to detect.

Right now, no matter what the source of the potential shock – Greece, China or the Fed – the major developed economies are unlikely to be tipped into recession. No doubt this will change in time, and the U.S., in particular, bears close watching. But it's too soon to fret about recession just yet.
False Alarm

I was wondering if the ECRI ever admitted their mistake. I found it, on May 11, 2015, quite a few years late.

Doug Short at Advisor Perspectives writes ACHUTHAN: The US recession I've been warning about for years was actually a 'false alarm'
The ECRI finally admits to a bad recession call in September 2011, referring to it as a "false alarm". They describe the situation as "Greater Moderation", where the 2012-2013 downturn was the worst "non-recession" in 50 years and is unlikely to be repeated.

History of ECRI's 2011 Recession Call

ECRI's weekly leading index has become a major focus and source of controversy ever since September 30, 2011, when ECRI publicly announced that the U.S. is tipping into a recession, a call the Institute had announced to its private clients on September 21st.

Chronology of ECRI's Recession Call

  • September 30, 2011 : Recession Is "Inescapable"
  • February 24, 2012 : GDP Data Signals U.S. Recession
  • May 9, 2012 : Renewed U.S. Recession Call
  • July 10, 2012 : "We're in Recession Already"
  • September 13, 2012 : "U.S. Economy Is in a Recession"

Tipping Into Recession

Here's the September 30, 2011 U.S. Economy Tipping into Recession call.
Early last week, ECRI notified clients that the U.S. economy is indeed tipping into a new recession. And there's nothing that policy makers can do to head it off.

ECRI's recession call isn't based on just one or two leading indexes, but on dozens of specialized leading indexes, including the U.S. Long Leading Index, which was the first to turn down – before the Arab Spring and Japanese earthquake – to be followed by downturns in the Weekly Leading Index and other shorter-leading indexes. In fact, the most reliable forward-looking indicators are now collectively behaving as they did on the cusp of full-blown recessions, not "soft landings."

Why should ECRI's recession call be heeded? Perhaps because, as The Economist has noted, we've correctly called three recessions without any false alarms in-between.
Repeated Lies

Quite frankly that last statement is a lie as I have pointed out before. The ECRI called the 2007 recession way late.

They did not make that call until March of 2008, and when they finally did, they called it a "recession of choice".

I don't mind blown calls. I certainly have had a number of them. What I do mind is repeated lies about them.

A Look at ECRI's Recession Predicting Track Record

On October 13, 2009 I penned A Look at ECRI's Recession Predicting Track Record.

I pointed to the November-December 2007 ECRI Outlook. Unfortunately, the link I had no longer works, but I did capture this image.



Window of Opportunity 

On January 25, 2008, the ECRI claimed There Is A Window of Opportunity for the US Economy.
The U.S. economy is now in a clear window of vulnerability, given the plunge in ECRI's Weekly Leading Index (WLI) since last spring. Yet there is a brief window of opportunity within that window of vulnerability to avert a recession. That is why ECRI has not yet forecast a recession.

If we have a recession this year, it will be the best advertised in history. Recently, several Wall Street houses joined the 70% of Americans who have been expecting a recession for the last few months. A number of other prominent economists boosted their estimates of the probability of a recession above 50%.

Yet such probability estimates imply that a recession is a matter of chance, whereas it is still a matter of choice. This is why, having correctly predicted the last two recessions in real time without crying wolf in between, we are not forecasting one yet.
ECRI Denial

The ECRI laid it on pretty thick, openly mocking the "best advertised [recession] in history" while claiming "This is why, having correctly predicted the last two recessions in real time without crying wolf in between, we are not forecasting one yet."

The irony is the recession was about 2 months old at the time.

Recession of Choice

Finally, on Friday, March 28, 2008, the ECRI pronounced "A Recession of Choice".
The U.S. economy is now on a recession track. Yet this is a recession that could have been averted. In January, given the plunge in the Weekly Leading Index, we declared that the economy had entered a clear window of vulnerability. Yet we emphasized the brief window of opportunity within that window of vulnerability for timely policy stimulus to head off a recession.

The bottom line is that the outcome was not pre-ordained. Policy-makers had a choice about the speed with which stimulus took effect. If they had understood this, their actions could indeed have averted this recessionary downturn.
A Choice in 2008, but No Choice in 2011?

We are supposed to believe there was a choice in 2008 (two or three months into a recession), but no choice in 2011 for a recession that never happened!

Third Blown Call Coming Up?

And the ECRI still brags about not missing calls, while attempting to sweep two blown calls under the rug.

Is a third consecutive blown call on the way?

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com

Damn Cool Pics



Fun And Interesting Facts About The Terminator Movies

Posted: 16 Jul 2015 07:05 PM PDT

The Terminator movies are like a juggernaut that just won't stop. The first film was released in 1984 and a new film "Terminator: Genisys" was just released a few weeks ago. Load up on Terminator knowledge with these fun facts. 

The idea for the Terminator series came to James Cameron when he was in Rome during the release of his film 'Piranha II: The Spawning'. Cameron fell ill and had a fever-dream in his hotel room about "this metal death figure coming out of a fire … the implication was that it had been stripped of its skin by the fire and exposed for what it really was."



Cameron admits that the final design for the T-800 is identical to the 'death metal' figure he saw in his dream. Which is actually kind of…



…Terrifying.



Cameron had more than nine months before work began on 'The Terminator', as Arnold was busy with 'Conan The Destroyer'. Since that wasn't enough time to make another full movie, Cameron spent it writing the screenplay that later developed into 'Aliens'.



Arnold was initially apprehensive about playing the T-800, as it was a villain's role. Cameron, however, convinced the actor that the movie would be shot in such a way that audiences would cheer the killing machine.



However, in Judgment Day, Arnold didn't like that the T-800 would now be a good guy and would not be killing people. So he convinced James Cameron that the character would only stop killing when John Connor asked him to.



Arnold Schwarzenegger and James Cameron violently disagreed on the former's iconic catchphrase, "I'll be back." Arnold wanted to say "I will be back" because he thought it sounded more machine-like, while "I'll" sounded too feminine. All Cameron had to say to that was "I don't tell you how to act, so don't tell me how to write."



Given Arnold Schwarzenegger's $15-million salary and his total of 700 words of dialog, he was paid $21,429 per word. "Hasta la vista, baby" cost $85,716.



In Poland, The Terminator was renamed The Electronic Murderer. You see, in Polish "terminator" more or less means "an apprentice".



The Terminator's famous laser pistol was a custom-built Colt .45 longslide. The laser sight was custom-made for the movie, with a 10,000-volt power supply hidden in Arnold's pocket.



Surprisingly for a James Cameron film, 'The Terminator' had hardly any special effects. Most effects were created and shot in-camera, while some other sequences used miniature sets. In fact, the skulls being crushed in the opening sequence were about the size of marbles.



Cameron also shot most of the scenes of 'The Terminator' at night, on streets lit by mercury-vapour lamps, which helped keep filming costs low. This later gave the film its neo-noir look.



Industrial Light and Magic's computer graphics department had to grow from six artists to almost 36 to accommodate all the work required to bring the T-1000 to life, costing $5.5 million and taking 8 months to produce, which ultimately amounted to 3.5 minutes of screen time.



The Terminators seen at the beginning of the movie were fully workable animatronic models.



In the movie's last scene at the steel mill, Linda Hamilton's twin sister Leslie doubled for her when the T-1000 acquires Sarah Connor's form.



At the time T2 was being shot, Edward Furlong (who plays a young John Connor) was only 13 years of age. Because of a sudden voice change, Furlong had to re-record a substantial amount of his dialogue.



Arnold's 'gunflip' during the chase scene in Terminator 2: Judgment Day is arguably one of the most memorable moments in the series. For that scene to materialize, however, the studio had to make a gun with a larger lever, as doing it with a regular gun could have broken the actor's fingers.



For the scene where the naked T-1000 arrives and steals the cop's clothes, the effects team had to digitally remove a sensitive part of Robert Patrick's anatomy.



'Terminator 2: Judgment Day' was the first film that cost more than $100 million to make, and the investment was well worth it. This movie also set a precedent by winning an Oscar when its predecessor was not even nominated.



More explicit shots of the arm cutting scene were removed as director James Cameron felt they were tasteless and unnecessary.



To make the Terminator seem more unsettling, Schwarzenegger refrained from blinking wherever possible and spent the majority of the film with his skin covered in a thin layer of Vaseline so that his face had a perpetually waxy appearance.



The damaged Terminator look in the climax of the film took five hours to apply and an hour to remove.



With the film's domestic box office adjusted for inflation, Terminator 2: Judgment Day is the top-grossing R-rated action film of all time.

A Quick Look At The Last 21 Years Of Scarlett Johansson's Career

Posted: 16 Jul 2015 05:03 PM PDT

Scarlett Johansson has been lighting up our screens for 21 years now. See how much she's changed throughout the course of her career. 

1994 - North



1995 - Just Cause



1996 - Manny & Lo



1997 - Home Alone 3



1998 - The Horse Whisperer



1999 - My Brother the Pig



2000 - An American Rhapsody



2001 - Ghost World



2002 - Eight Legged Freaks



2003 - Lost in Translation



2004 - In Good Company



2005 - Match Point



2006 - The Prestige



2007 - The Nanny Diaries



2008 - Vicky Cristina Barcelona



2009 - He's Just Not That Into You



2010 - Iron Man 2



2011 - We Bought a Zoo



2012 - The Avengers



2013 - Don Jon



2014 - Lucy



2015 - Avengers: Age of Ultron

If You Want A Cat That Stays A Kitten Forever You Need A Sand Cat

Posted: 15 Jul 2015 09:13 PM PDT

Cat owners often talk about how they want their cats to stay kittens forever. Well, the good news is that it can happen; you just need to get a sand cat.











You'll Be Much Smarter After You Learn These Fun Facts

Posted: 15 Jul 2015 09:01 PM PDT

You just never know when these fun and random facts might come in handy.