Friday, September 4, 2015

Mish's Global Economic Trend Analysis


US Role in Europe's Refugee Crisis; Migration in Numbers; Dead Baby Syndrome; Australia PM Promotes Hard-Line Stance

Posted: 04 Sep 2015 10:33 AM PDT

Crisis in Numbers

With an influx of 800,000 migrants per year, and rising steeply, Europe struggles with what to do with the refugees.

Here's the Migrant Crisis in Numbers.
The EU is struggling to respond to a surge of desperate migrants, thousands of whom have perished in their attempts to seek a better life in Europe. Where are they going and where are they coming from?

The largest group of people reaching Europe through the Mediterranean or the western Balkans are Syrians fleeing a civil war, but there are also many from Eritrea and Afghanistan, as well as Kosovo and Nigeria.
[Charts: EU migrant arrivals, with panels for Italy, Greece, and Hungary]

EU rules require refugees to apply for asylum in the country in which they first arrive. That rule is unworkable because nearly all refugees first arrive in peripheral countries.

Dead Baby Syndrome

Pictures of mass migration camps did not change sympathies much, but an image of a small child washed up on shore did.

For similar reasons, hunger donation sites nearly always use an image of a single child for their poster campaigns.



With that and other images, German chancellor Angela Merkel sought to shame the UK into taking more refugees.

EU Leaders Soften Stance

The Financial Times reported EU Leaders Soften Stance on Migrants Amid Harrowing Images.
EU leaders began to soften their resistance to sharing the burden of Europe's migrant crisis, spurred by its harrowing images of human misery, as Brussels prepared a plan to quadruple the number of asylum seekers member states must accept.

The shifting stance remained uneven, with some countries in eastern Europe, particularly Hungary, insisting those fleeing the war-torn Middle East and north Africa were still not welcome in Europe and should remain in stable neighbouring countries such as Turkey.

Viktor Orban, Hungarian prime minister, said the crisis was "a German problem".

But France said it would now support a push for mandatory quotas on the number of asylum seekers member states would be required to take in. It will make joint proposals with Germany in the coming days.

Jean-Claude Juncker, the European Commission president, will next week call for EU countries to take in 160,000 migrants from Greece, Italy and Hungary, a fourfold increase on the 40,000 proposed less than two months ago.

The most outspoken critic of mandatory relocations has been Mr Orban who has come under intense criticism for erecting a razor-wire fence on his country's border with Serbia, which has become a major crossing point for Syrians and other refugees fleeing the region.

But German chancellor Angela Merkel, who has put intense pressure on other European leaders to be more forthcoming, again laid down the gauntlet, saying all of the EU must be willing to take in more refugees, something that in the past has fallen on only a handful of countries.

Trapped in Budapest

The Guardian reports Hundreds Set Off From Budapest on Foot.

The curious thing about Orban's detainment policy is that the refugees do not want to stay in Hungary. They want to go to Germany.

Would Germany be happy if Orban decided to send them on their way? Of course, Juncker wants to force Hungary to accept more.

Let Them All In



Giles Fraser says the Christian thing to do is "Let Them All In".

Bizarre Ideas

Al Arabiya reports Egypt Billionaire Offers to Buy Island for Refugees.
Egyptian billionaire Naguib Sawiris has offered to buy an island off Greece or Italy and develop it to help hundreds of thousands of people fleeing from Syria and other conflicts.

The telecoms tycoon first announced the initiative on Twitter.

"Greece or Italy sell me an island, I'll call its independence and host the migrants and provide jobs for them building their new country," he wrote.

He is going to provide jobs for them? Food? Housing?

Australia Says Send Them Back

At the other extreme, Abbott Urges EU to Follow Australia's Hardline Migrant Policy
Tony Abbott has urged Europe to follow Australia's hardline asylum policy and begin "turning back boats" as a way to stop drownings at sea and smash people-smuggling operations.

"It's obviously a crisis right now on the borders of Europe," Mr Abbott, Australia's prime minister, said on Friday.

"I think a lot of people right around the world are looking at what we've done and said, 'well, if Australia can stop the people-smuggling trade, if Australia can end deaths at sea, perhaps we can learn from them'."

The comments were criticised by human rights advocates, who warn that Australia's harsh policies force refugees to flee to countries less able to cope with a humanitarian crisis, violate human rights and undermine UN refugee conventions.

Mr Abbott was responding to publication of photographs of the body of Aylan Kurdi, the three-year-old Syrian boy washed up on a Turkish beach, and chaotic scenes of migrants in Europe, which is struggling to deal with an influx of asylum seekers fleeing war-torn Syria. 

When Mr Abbott was elected in September 2013 he deployed the Australian navy to turn boats back into Indonesian waters and began resettling refugees in Cambodia, one of Southeast Asia's poorest countries. After a bitter debate at its annual conference, Labor supported the controversial "turn back" policy in July.

Infinite Demand for Free Services

Those who say "let them all in" are good-hearted fools.

There is unlimited demand for free services. If countries let them all in, half of Africa would move to Europe. And along with the influx of refugees (none of them with jobs), crime and resentment would build.

Unemployment in Greece and Spain is over 20%.

Greece is extremely short of funds, in case no one noticed. Where is Greece going to put the refugees? At what cost? Who pays?

US Role

One thing reporters fail to mention is the US role in this mess. US policy created ISIS. Arguably the EU should sue us for damages.

What's needed is a stable Syria. Instead, witness scenes like this one.



Russia Presence in Syria

Fox News reports US Monitoring Reports Russia has Stepped up Syria Presence.
The White House and State Department said Thursday that it was monitoring reports that Russia is carrying out military operations in Syria's civil war on behalf of President Bashar al-Assad, with both warning that such actions would further destabilize Syria's perilous situation.

"Russia has asked for clearances for military flight to Syria," a U.S. official was quoted as telling Britain's Daily Telegraph, "[but] we don't know what their goals are ... Evidence has been inconclusive so far as to what this activity is."

Russian military involvement in Syria, if confirmed, would add a new layer of complexity to a war that has killed an estimated 220,000 people and displaced over 4 million, according to United Nations estimates.

The conflict has facilitated the rise of the ISIS terror group, drawn in the United States as the head of a coalition launching airstrikes against ISIS, as well as the trainer and supplier of rebel groups who are asked to fight a three-way battle against Assad and ISIS.

Reflections on Destabilizing Syria

The US claims: "Russia is destabilizing Syria". That's a hoot. The US is at the heart of a crisis that has displaced over 4 million people.

As with Iraq, the US thought it could overthrow a government with no repercussions.

So much for another brutal lesson in "nation building" madness.

Mike "Mish" Shedlock

Establishment Survey +173K Jobs, Private Jobs +140,000; Unemployment Rate 5.1%

Posted: 04 Sep 2015 08:20 AM PDT

Initial Reaction

The establishment survey came in at a weaker-than-expected 173,000 jobs. The Bloomberg Consensus estimate was 223,000 jobs.

However, the preceding two months were revised up by 44,000 and wages were strong. Bloomberg provides a nice summation of the strengths and weaknesses.
The headline may not look it but there's plenty of strength in the August employment report. Nonfarm payrolls rose only 173,000 which is at the low-end estimate, but the two prior months are now revised up a total of 44,000. The unemployment rate fell 2 tenths to 5.1 percent which is below the low end estimate and the lowest of the recovery, since April 2008. And wages are strong, with average hourly earnings up 0.3 percent for a 2.2 percent year-on-year rate that's 1 tenth higher than July. Debate will definitely be lively at the September 17 FOMC!

Private payroll growth proved very weak, at only 140,000. Government added 33,000 jobs vs July's 21,000. Manufacturing, held back by weak exports and trouble in energy equipment, shed a steep 17,000 jobs followed by a 9,000 loss for mining which is getting hit by low commodity prices. A plus is a 33,000 rise in professional & business services and a respectable 11,000 rise in the temporary help subcomponent. This subcomponent is considered a leading indicator for long-term labor demand. Retail rose 11,000 with vehicle dealers, who have been very busy, adding 2,000 jobs following July's gain of 11,000.

Seasonality, especially the timing of the beginning of the school year, always plays an outsized role in August employment data which are often revised higher. Policy makers are certain to take this into consideration at this month's FOMC. There's something for everybody in this report which won't likely settle expectations whether the Fed lifts off or not this month.

Revisions

The employment change for June was revised up from +231,000 to +245,000, and the change for July was revised up from +215,000 to +245,000. Incorporating revisions, employment has increased by an average of 221,000 per month over the past 3 months.
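
For reference, the 221,000 three-month average follows directly from the two revised months plus the initial August print:

$$\frac{245{,}000 + 245{,}000 + 173{,}000}{3} = \frac{663{,}000}{3} = 221{,}000$$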

Wages

Average hourly earnings for all employees on private, nonfarm payrolls rose by 8 cents in August, following a 6-cent gain in July. Hourly earnings are up 2.2 percent over the year. In August, average weekly hours of all employees edged up 0.1 hour to 34.6 hours.

BLS Jobs Statistics at a Glance

  • Nonfarm Payroll: +173,000 - Establishment Survey
  • Employment: +196,000 - Household Survey
  • Unemployment: -33,000 - Household Survey
  • Involuntary Part-Time Work: +158,000 - Household Survey
  • Voluntary Part-Time Work: -131,000 - Household Survey
  • Baseline Unemployment Rate: -0.2 to 5.1% - Household Survey
  • U-6 unemployment: -0.1 to 10.3% - Household Survey
  • Civilian Non-institutional Population: +220,000
  • Civilian Labor Force: -41,000 - Household Survey
  • Not in Labor Force: +261,000 - Household Survey
  • Participation Rate: +0.0 to 62.6 - Household Survey

August 2015 Employment Report

Please consider the Bureau of Labor Statistics (BLS) Current Employment Report.

Total nonfarm payroll employment increased by 173,000 in August, and the unemployment rate edged down to 5.1 percent. Job gains occurred in health care and social assistance and in financial activities. Manufacturing and mining lost jobs.

Unemployment Rate - Seasonally Adjusted



Nonfarm Employment



Click on Any Chart in this Report to See a Sharper Image

Nonfarm Employment Change from Previous Month by Job Type



Hours and Wages

Average weekly hours of all private employees rose by 0.1 hours to 34.6 hours (last month's figure was revised down to 34.5 hours). Average weekly hours of all private service-providing employees were flat at 33.4 hours.

Average hourly earnings of production and non-supervisory private workers rose $0.05 to $21.07. Average hourly earnings of production and non-supervisory private service-providing employees rose $0.06 to $20.88.

For discussion of income distribution, please see What's "Really" Behind Gross Inequalities In Income Distribution?

Birth Death Model

Starting January 2014, I dropped the Birth/Death Model charts from this report. For those who follow the numbers, I retain this caution: Do not subtract the reported Birth-Death number from the reported headline number. That approach is statistically invalid. Should anything interesting arise in the Birth/Death numbers, I will add the charts back.

Table A-15 BLS Alternate Measures of Unemployment



click on chart for sharper image

Table A-15 is where one can find a better approximation of what the unemployment rate really is.

Notice I said "better" approximation, not to be confused with "good" approximation.

The official unemployment rate is 5.1%. However, if you start counting all the people who want a job but gave up, all the people with part-time jobs who want full-time work, all the people who dropped off the unemployment rolls because their unemployment benefits ran out, etc., you get a closer picture of what the unemployment rate really is. That number is in the last row, labeled U-6.
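
For readers who want the mechanics, this is roughly how the BLS constructs the two rates (paraphrased from the official definitions behind Table A-15):

$$\text{U-3} = \frac{\text{unemployed}}{\text{civilian labor force}}$$

$$\text{U-6} = \frac{\text{unemployed} + \text{marginally attached} + \text{employed part time for economic reasons}}{\text{civilian labor force} + \text{marginally attached}}$$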

U-6 is much higher at 10.3%. Both numbers would be way higher still, were it not for millions dropping out of the labor force over the past few years.

Some of those dropping out of the labor force retired because they wanted to retire. The rest is disability fraud, forced retirement, discouraged workers, and kids moving back home because they cannot find a job.

Mike "Mish" Shedlock
http://globaleconomicanalysis.blogspot.com

Clean Your Site's Cruft Before It Causes Rankings Problems - Whiteboard Friday - Moz Blog

Clean Your Site's Cruft Before It Causes Rankings Problems - Whiteboard Friday

Posted by randfish

We all have it. The cruft. The low-quality, or even duplicate-content pages on our sites that we just haven't had time to find and clean up. It may seem harmless, but that cruft might just be harming your entire site's ranking potential. In today's Whiteboard Friday, Rand gives you a bit of momentum, showing you how you can go about finding and taking care of the cruft on your site.

Cleaning the Cruft from Your Site Before it Causes Pain and Problems with your Rankings Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're chatting about cleaning out the cruft from your website. By cruft what I mean is low quality, thin quality, duplicate content types of pages that can cause issues even if they don't seem to be causing a problem today.

What is cruft?

If you were to, for example, launch a large number of low quality pages, pages that Google thought were of poor quality, that users didn't interact with, you could find yourself in a seriously bad situation, and that's for a number of reasons. So Google, yes, certainly they're going to look at content on a page by page basis, but they're also considering things domain wide.

So they might look at a domain and see lots of these green pages, high quality, high performing pages with unique content, exactly what you want. But then they're going to see like these pink and orange blobs of content in there, thin content pages with low engagement metrics that don't seem to perform well, duplicate content pages that don't have proper canonicalization on them yet. This is really what I'm calling cruft, kind of these two things, and many variations of them can fit inside those.

One issue with cruft, for sure, is that it can cause Panda issues. So Google's Panda algorithm is designed to look at a site and say, "You know what? You're tipping over the balance of what a high quality site looks like to us. We see too many low quality pages on the site, and therefore we're not just going to hurt the ranking ability of the low quality pages, we're going to hurt the whole site." Very problematic, really, really challenging, and many folks who've encountered Panda issues over time have seen this.

There are also other probably non-directly Panda kinds of related things, like site-wide analysis of things like algorithmic looks at engagement and quality. So, for example, there was a recent analysis of the Phantom II update that Google did, which hasn't really been formalized very much and Google hasn't said anything about it. But one of the things that they looked at in that Phantom update was the engagement of pages on the sites that got hurt versus the engagement of pages on the sites that benefited, and you saw a clear pattern. Engagement on sites that benefited tended to be higher. On those that were hurt, it tended to be lower. So again, it could be not just Panda but other things that will hurt you here.

It can waste crawl bandwidth, which sucks. Especially if you have a large site or complex site, if the engine has to go crawl a bunch of pages that are cruft, that is potentially less crawl bandwidth and less frequent updates for crawling to your good pages.

It can also hurt from a user perspective. User happiness may be lowered, and that could mean a hit to your brand perception. It could also drive down better converting pages. It's not always the case that Google is perfect about this. They could see some of these duplicate content, some of these thin content pages, poorly performing pages and still rank them ahead of the page you wish ranked there, the high quality one that has good conversion, good engagement, and that sucks just for your conversion funnel.

So all sorts of problems here, which is why we want to try and proactively clean out the cruft. This is part of the SEO auditing process. If you look at a site audit document, if you look at site auditing software, or step-by-step how-to's, like the one from Annie that we use here at Moz, you will see this problem addressed.

How do I identify what's cruft on my site(s)?

So let's talk about some ways to proactively identify cruft and then some tips for what we should do afterwards.

Filter that cruft away!

One of those ways for sure that a lot of folks use is Google Analytics or Omniture or Webtrends, whatever your analytics system is. What you're trying to design there is a cruft filter. So I got my little filter. I keep all my good pages inside, and I filter out the low quality ones.

What I can use is one of two things. First, a threshold for bounces or bounce rate or time on site or pages per visit; any engagement metric I like can serve as a potential filter. I could also do some sort of a percentage, meaning in scenario one I basically say, "Hey, the threshold is anything with a bounce rate higher than 90%; I want my cruft filter to show me what's going on there." I'd create that filter inside GA or inside Omniture. I'd look at all the pages that match those criteria, and then I'd try to see what was wrong with them and fix those up.

The second one is basically to say, "Hey, here's the average time on site, here's the median time on site, here's the average bounce rate, median bounce rate, average pages per visit, median, great. Now take me 50% below that, or one standard deviation below that. Now show me all that stuff; filter that out."
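
As a very rough sketch of that kind of filter outside the analytics UI, here is what it might look like in Python against an exported report. The file name pages.csv and the column names (page, sessions, bounce_rate, avg_time_on_page) are assumptions for illustration, not what any analytics package exports under those exact names by default.

```python
# Sketch of a "cruft filter" over a hypothetical analytics export.
# Assumed columns: page, sessions, bounce_rate (0-1), avg_time_on_page (seconds).
import pandas as pd

df = pd.read_csv("pages.csv")  # hypothetical export from GA / Omniture / Webtrends

# Skip pages with very few visits; tiny samples give misleading engagement stats.
df = df[df["sessions"] >= 100]

# Scenario one: a hard threshold, e.g. bounce rate above 90%.
threshold_cruft = df[df["bounce_rate"] > 0.90]

# Scenario two: anything more than one standard deviation worse than the mean.
bounce_cutoff = df["bounce_rate"].mean() + df["bounce_rate"].std()
time_cutoff = df["avg_time_on_page"].mean() - df["avg_time_on_page"].std()
stat_cruft = df[(df["bounce_rate"] > bounce_cutoff) |
                (df["avg_time_on_page"] < time_cutoff)]

candidates = pd.concat([threshold_cruft, stat_cruft]).drop_duplicates()
print(candidates["page"].tolist())
```

The same filters can be built directly inside GA or Omniture; a script version is mainly useful if you want to join the resulting list against crawl data later.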

This process is going to capture thin and low quality pages, the ones I've been showing you in pink. It's not going to catch the orange ones. Duplicate content pages are likely to perform very similarly to the thing that they are a duplicate of. So this process is helpful for one of those, not so helpful for other ones.

Sort that cruft!

For that process, you might want to use something like Screaming Frog or OnPage.org, which is a great tool, or Moz Analytics, which comes from some company I've heard of.

Basically, in this case, you've got a cruft sorter that is essentially looking at filtration, items that you can identify in things like the URL string or in title elements that match or content that matches, those kinds of things, and so you might use a duplicate content filter. Most of these pieces of software already have a default setting. In some of them you can change that. I think OnPage.org and Screaming Frog both let you change the duplicate content filter. Moz Analytics not so much, same thing with Google Webmaster Tools, now Search Console, which I'll talk about in a sec.

So I might say like, "Hey, identify anything that's more than 80% duplicate content." Or if I know that I have a site with a lot of pages that have only a few images and a little bit of text, but a lot of navigation and HTML on them, well, maybe I'd turn that up to 90% or even 95% depending.

I can also use some rules to identify known duplicate content violators. So for example, maybe I've identified that everything with a URL parameter like "?ref=" or "?partner=" is a duplicate. Well, okay, now I just need to filter for that particular URL string. Or I could look for titles. So if I know that, for example, one of my pages or a certain type of page has been heavily duplicated throughout the site, I can look for all the titles containing those and then filter out the dupes.

I can also do this for content length. Many folks will look at content length and say, "Hey, if there's a page with fewer than 50 unique words on it in my blog, show that to me. I want to figure out why that is, and then I might want to do some work on those pages."
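
A companion sketch for the crawl side, assuming a crawl export (from Screaming Frog, OnPage.org, or similar) saved as a CSV; the file name crawl.csv and the columns url, title, and word_count are hypothetical stand-ins for whatever your tool actually produces.

```python
# Sketch: flag thin pages, duplicated titles, and known URL-parameter offenders
# in a hypothetical crawl export with columns: url, title, word_count.
import csv
from collections import defaultdict

thin, by_title, param_offenders = [], defaultdict(list), []

with open("crawl.csv", newline="") as f:
    for row in csv.DictReader(f):
        if int(row["word_count"]) < 50:                         # thin content
            thin.append(row["url"])
        by_title[row["title"].strip().lower()].append(row["url"])  # group by title
        if "?ref=" in row["url"] or "?partner=" in row["url"]:  # known dupe patterns
            param_offenders.append(row["url"])

dupe_titles = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(f"{len(thin)} thin pages, {len(dupe_titles)} duplicated titles, "
      f"{len(param_offenders)} parameter URLs to review")
```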

Ask the SERP providers (cautiously)

Then the last one that we can do for this identification process is Google and Bing Webmaster Tools/Search Console. They have existing filters and features that aren't very malleable. We can't do a whole lot with them, but they will show you potential site crawl issues, broken pages, sometimes dupe content. They're not going to catch everything though. Part of this process is to proactively find things before Google and Bing find them and start considering them a problem on our site. So we may want to do some of this work before we go, "Oh, let's just shove an XML sitemap to Google and let them crawl everything, and then they'll tell us what's broken." A little risky.

Additional tips, tricks, and robots

A couple of additional tips. Analytics stats, like the ones from GA or Omniture or Webtrends, can totally mislead you, especially for pages with very few visits, where you just don't have enough of a sample set to know how they're performing, or pages that the engines haven't indexed yet. So if something hasn't been indexed or it just isn't getting search traffic, it might show you misleading metrics about how users are engaging with it that could bias you in ways that you don't want to be biased. So be aware of that. You can control for it generally by looking at other stats or by using these other methods.

When you're doing this, the first thing you should do any time you identify cruft is remove it from your XML sitemaps. That's just good hygiene, good practice. Oftentimes that alone is enough to provide at least some protection from getting hurt here.
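
If your sitemap is a plain XML file, pruning the flagged URLs can be as simple as the sketch below. The file names sitemap.xml and sitemap.clean.xml and the example URL are placeholders, and cruft_urls would come from whatever identification process you used above.

```python
# Sketch: drop flagged URLs from a standard <urlset>-style XML sitemap.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

cruft_urls = {"https://example.com/print/old-page"}  # placeholder set of flagged URLs

tree = ET.parse("sitemap.xml")
root = tree.getroot()
for url_el in list(root.findall(f"{{{NS}}}url")):
    loc = url_el.find(f"{{{NS}}}loc")
    if loc is not None and loc.text and loc.text.strip() in cruft_urls:
        root.remove(url_el)  # drop the whole <url> entry for flagged pages

tree.write("sitemap.clean.xml", xml_declaration=True, encoding="utf-8")
```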

However, there's no one-size-fits-all methodology beyond "don't include it in your XML sitemap." If it's a duplicate, you want to canonicalize it. I don't necessarily want to delete all these pages. Maybe I want to delete some of them, but I need to be careful about that. Maybe they're printer-friendly pages. Maybe they're pages that have a specific format, a PDF version instead of an HTML version. Whatever it is, you want to identify those and probably canonicalize.

Is it useful to no one? Like literally, absolutely no one. You don't want engines visiting it. You don't want people visiting it. There's no channel you care about sending traffic to that page. Well, you have two options: 301 it, if it's already ranking for something or it's on the topic of something, and send it to the page that will perform well that you wish that traffic was going to; or you can completely 404 it. Of course, if you're having serious trouble or you need to remove it entirely from engines ASAP, you can use the 410 (gone) status to permanently delete it. Just be careful with that.

Is it useful to some visitors, but not search engines? Like you don't want searchers to find it in the engines, but if somebody goes and is paging through a bunch of pages and that kind of thing, okay, great. I can use noindex, follow for that in the meta robots tag of the page.

If there's no reason bots should access it at all, like you don't care about them following the links on it, that's a very rare use case. But there can be certain types of internal content that maybe you don't want bots even trying to access, like a huge internal file system that particular kinds of your visitors might want to get access to but nobody else. For that, you can use the robots.txt file to block crawlers from visiting it. Just be aware it can still get into the engines if it's blocked in robots.txt. It just won't show any description. They'll say, "We are not showing a site description for this page because it's blocked by robots."

If the page is almost good, like it's on the borderline between pink and green here, well, just make it good. Fix it up. Make that page a winner, get it back in the engines, make sure it's performing well, find all the pages that have those problems, fix them up, or consider recreating them and then 301'ing them over if you want to do that.
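
As a quick recap of those branches, here is a small sketch that maps each case to the directive you would typically reach for; the example URL and the /internal-files/ path are placeholders, and the classification itself is a judgment call rather than something a script can make for you.

```python
# Recap sketch: page classification -> typical cleanup directive.
CRUFT_ACTIONS = {
    "duplicate of a better page":
        '<link rel="canonical" href="https://example.com/better-page/">',
    "useful to no one, but ranks or overlaps another topic":
        "301 redirect it to the page you wish had that traffic",
    "useful to no one at all":
        "404 it (or 410 to drop it faster; use with care)",
    "useful to some visitors, not searchers":
        '<meta name="robots" content="noindex, follow">',
    "bots should not even crawl it":
        "robots.txt: Disallow: /internal-files/  (can still be indexed, just without a description)",
    "almost good":
        "fix it up, or recreate it and 301 the old URL to the new one",
}

for case, action in CRUFT_ACTIONS.items():
    print(f"- {case}: {action}")
```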

With this process, hopefully you can prevent yourself from getting hit by the potential penalties, or being algorithmically filtered, or just being identified as not that great a website. You want Google to consider your site as high quality as they possibly can. You want the same for your visitors, and this process can really help you do that.

Looking forward to the comments, and we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Seth's Blog : On doing your best

On doing your best

It's a pretty easy way to let ourselves (or someone else) off the hook. "Hey, you did your best."

But it fails to explain the improvement in the 100-meter dash. Or the way we're able to somehow summon more energy and more insight when there's a lot on the line. Or the tremendous amount of care and love we can bring to a fellow human who needs it.

By defining "our best" as the thing we did when we merely put a lot of effort into a task, I fear we're letting ourselves off the hook.

In fact, it might not require a lot of effort, but a ridiculous amount of effort, an unreasonable amount of preparation, a silly amount of focus... and even then, there might be a little bit left to give.

It's entirely possible that it's not worth the commitment or the risk or the fear to go that far along in creating something that's actually our best. But when we make that compromise, we should own it. "It's not worth doing my best" is actually more honest and powerful than failing while being sort of focused.

       
