Thursday, 5 December 2013

New Moz-Builtwith Study Examines Big Website Tech and Google Rankings


Posted: 04 Dec 2013 03:01 PM PST

Posted by Cyrus Shepard

BuiltWith knows about your website.

Go ahead. Try it out.

BuiltWith also knows about your competitors' websites. They've cataloged over 5,000 different website technologies on over 190 million sites. Want to know how many sites use your competitor's analytics software? Or who accepts Bitcoin? Or how many sites run WordPress?

Like BuiltWith, Moz also has a lot of data. Every two years, we run a Search Engine Ranking Factors study where we examine over 180,000 websites in order to better understand how they rank in Google's search results.

We thought, "Wouldn't it be fun to combine the two data sets?"

That's exactly what our data science team, led by Dr. Matt Peters, did. We wanted to find out what technologies websites were using, and also see if those technologies correlated with Google rankings.

How we conducted the study

BuiltWith supplied Moz with tech info on 180,000 domains that were previously analyzed for the Search Engine Ranking Factors study. Dr. Peters then calculated the correlations for over 50 website technologies.

The ranking data for the domains was gathered last summer (you can read more about it here), and the BuiltWith data is updated once per quarter. We made the assumption that basic web technology, like hosting platforms and web servers, doesn't change often.

It's very important to note that the website technologies we studied are not believed to be actual ranking factors in Google's algorithm. There are huge causation/correlation issues at hand. Google likely doesn't care too much what framework or content management system you use, but because SEOs often believe one technology to be superior to another, we thought it best to take a look.

Web hosting platforms performance

One of the cool things about BuiltWith is that you can see not only what technology a website uses, but also trends across the entire Internet.

One of the most important questions a webmaster has to answer is who to use as a hosting provider. Here's BuiltWith's breakdown of the hosting providers for the top 1,000,000 websites:

Holy GoDaddy! That's a testament to the power of marketing.

Webmasters often credit good hosting as a key to their success. We wanted to find out if certain web hosts were correlated with higher Google rankings.

Interestingly, the data showed very little correlation between web hosting providers and higher rankings. The results, in fact, were close enough to zero to be considered null.

Web Hosting Correlation
Rackspace 0.024958629
Amazon 0.043836395
Softlayer -0.02036524
GoDaddy -0.045295217
Liquid Web -0.000872457
CloudFlare Hosting -0.036254475

Statistically, Dr. Peters assures me, these correlations are so small they don't carry much weight.
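For readers who want to sanity-check numbers like these on their own data: a correlation between a yes/no feature (does the site use this host?) and a ranking position is just a Pearson correlation computed with a binary variable (the point-biserial correlation). The sketch below is purely illustrative, with made-up data; it is not Moz's actual methodology or dataset.

```python
from math import sqrt

def correlation(xs, ys):
    """Pearson correlation; with a binary xs this is the
    point-biserial correlation between tech usage and rank."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: 1 = site uses the technology, 0 = it doesn't,
# paired with each site's search position (lower number = better rank).
uses_tech = [1, 0, 1, 1, 0, 0, 1, 0]
positions = [3, 7, 1, 4, 9, 6, 2, 8]

r = correlation(uses_tech, positions)
print(round(r, 3))  # -0.913: in this toy data, tech users rank higher
```

Because position 1 is the best rank, a negative value here means sites using the technology tend to rank higher; values near zero, like most of those in the table above, indicate no meaningful relationship.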

The lesson here is that web hosting, at least for the major providers, does not appear to be correlated with higher or lower rankings one way or the other. To put this another way, simply hosting your site on GoDaddy should neither help nor hurt you in the larger SEO scheme of things.

That said, there are a lot of bad hosts out there as well. Uptime, cost, customer service and other factors are all important considerations.

CMS battle: WordPress vs. Joomla vs. Drupal

Looking at the most popular content management systems for the top million websites, it's easy to spot the absolute dominance of WordPress.

Nearly a quarter of the top million sites run WordPress.

You may be surprised to see that only 6,400 of the top million sites run Tumblr. If you expand the data to look at all known sites in BuiltWith's index, the number grows to over 900,000. That's still a fraction of the 158 million blogs Tumblr claims, compared with the 73 million claimed by WordPress.

This seems to be a matter of quality over quantity. Tumblr has many more blogs, but it appears fewer of them gain significant traffic or visibility.

Does any of this correlate to Google rankings? We sampled five of the most popular CMS's and again found very little correlation.

CMS Correlation
WordPress -0.009457206
Drupal 0.019447922
Joomla! 0.032998891
vBulletin -0.024481161
ExpressionEngine 0.027008018

Again, these numbers are statistically insignificant. It would appear that the content management system you use is not nearly as important as how you use it.

While configuring these systems for SEO varies in difficulty, plugins and best practices can be applied to all.

Popular social widgets: Twitter vs. Facebook

To be honest, the following chart surprised me. I'm a huge advocate of Google+, but I never thought more websites would display the Google +1 button than Twitter's Tweet button.

That's not to say people actually click the Google+ button as much. With folks sending over 58 million tweets per day, it's fair to guess that far more people are clicking relatively few Twitter buttons, although Google+ may be catching up.

Sadly, our correlation data on social widgets is highly suspect. That's because the BuiltWith data is aggregated at the domain level, and social widgets are a page-level feature.

Even though we found a very slight positive correlation between social share widgets and higher rankings, we can't conclusively say there is a relationship.

More important are the significant correlations that exist between Google rankings and actual social shares. While we don't know how, or even if, Google uses social metrics in its algorithm (Matt Cutts specifically says they don't use +1s), we do know that social shares are significantly associated with higher rankings.

Again, correlation is not causation, but it makes sense that adding social share widgets to your best content can encourage sharing, which in turn helps with increased visibility, mentions, and links, all of which can lead to higher search engine rankings.

Ecommerce technology: show us the platform

Mirror, mirror on the wall, who is the biggest ecommerce platform of them all?

Magento wins this one, but the distribution is more even than other technologies we've looked at.

When we looked at the correlation data, again we found very little relationship between the ecommerce platform a website used and how it performed in Google search results.

Here's how each ecommerce platform performed in our study.

Ecommerce Correlation
Magento -0.005569493
Yahoo Store -0.008279856
Volusion -0.016793737
Miva Merchant -0.027214854
osCommerce -0.012115017
WooCommerce -0.033716129
BigCommerce SSL -0.044259375
Magento Enterprise 0.001235127
VirtueMart -0.049429445
Demandware 0.021544097

Although huge differences exist among ecommerce platforms, and some are easier to configure for SEO than others, it would appear that the platform you choose is not a huge factor in your eventual search performance.

Content delivery networks: fast, fast, faster

One of the major pushes marketers have made in the past 12 months has been to improve page speed and loading times. The benefits touted include improved customer satisfaction, conversions and possible SEO benefits.

The race to improve page speed has led to huge adoption of content delivery networks.

In our Ranking Factors Survey, the response time of a web page showed a -0.10 correlation with rankings. While this can't be considered a significant correlation, it offered a hint that faster pages may perform better in search results, a result we've heard anecdotally, at least at the outliers of webpage speed performance.

We might expect websites using CDNs to gain the upper hand in ranking, but the evidence doesn't yet support this theory. Again, these values are basically null.

CDN Correlation
AJAX Libraries API 0.031412968
Akamai 0.046785574
GStatic Google Static Content 0.017903898
Facebook CDN 0.0005199
CloudFront 0.046000385
CloudFlare -0.036867599

While using a CDN is an important step in speeding up your site, it is only one of many optimizations you should make when improving webpage performance.

SSL certificates, web servers, and framework: Do they stack up?

We ran rankings correlations on several more data points that BuiltWith supplied us. We wanted to find out if things like your website framework (PHP, ASP.NET), your web server (Apache, IIS) or whether or not your website used an SSL certificate was correlated with higher or lower rankings.

While we found a few outliers around Varnish software and Symantec VeriSign SSL certificates, overall the data suggests no strong relationships between these technologies and Google rankings.

Framework Correlation
PHP 0.032731241
ASP.NET 0.042271235
Shockwave Flash Embed 0.046545556
Adobe Dreamweaver 0.007224319
FrontPage Extensions -0.02056009

SSL Certificate Correlation
GoDaddy SSL 0.006470096
GeoTrust SSL -0.007319401
Comodo SSL -0.003843119
RapidSSL -0.00941283
Symantec VeriSign 0.089825587

Web Server Correlation
Apache 0.029671122
IIS 0.040990108
nginx 0.069745949
Varnish 0.085090249

What we can learn

We had high hopes for finding "silver bullets" among website technologies that could launch us all to higher rankings.

The reality turns out to be much more complex.

While technologies like great hosting, CDNs, and social widgets can help set up an environment for improving SEO, they don't do the work for us. Even our own Moz Analytics, with all its SEO-specific software, can't help improve your website visibility unless you actually put the work in.

Are there any website technologies you'd like us to study next time around? Let us know in the comments below!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Hummingbird's Unsung Impact on Local Search

Posted: 04 Dec 2013 03:15 AM PST

Posted by David Mihm

Though I no longer actively consult for clients, there seems to have been a significant qualitative shift in local results since Google's release of Hummingbird that I haven't seen reported on search engine blogs and media outlets. The columns I have seen have generally espoused advice to take advantage of what Hummingbird was designed to do rather than looking at the outcome of the update.

From where I sit, the outcome has been a slightly lower overall quality in Google's local results, possibly due in part to a "purer" ranking algorithm in local packs. While these kinds of egregious results reported soon after Hummingbird's release have mostly disappeared, it's the secondary Hummingbird flutter, which may have coincided with the November 14th "update," that seems to have caused the most noticeable changes.

I'll be working with Dr. Pete to put together more quantitative local components of Mozcast in the coming months, but for the time being, I'll just have to describe what I'm seeing today with a fairly simplistic analysis.

To do the analysis, I performed manual searches for five keywords, both geo-modified and generic, in five diverse markets around the country. I selected these keywords based on terms that I knew Google considered to have "local intent" across as broad a range of industries as I could think of. After performing the searches, I took note of the top position and number of occurrences of four types of sites, as well as position and number of results in each "pack."

Keywords | Markets | Result Type Taxonomy
personal injury lawyer | Chicago | national directory (e.g., Yelp)
assisted living facility | Portland | regional directory (e.g., ArizonaGolf.com)
wedding photographer | Tampa | local business website (e.g., AcmeElectric.com)
electrician | Burlington | barnacle webpage (e.g., facebook.com/acmeelectric)
pet store | Flagstaff | national brand (e.g., Petsmart.com)

I also performed an even smaller analysis using three keywords that returned carousel results (thanks to SIM Partners for this sample list of keywords): "golf course," "restaurant," and "dance club."

Again, a very simple analysis that is by no means intended to be a statistically significant study. I fully realize that these results may be skewed by my Portland IP address (even though I geo-located each time I searched for each market), data center, time of day, etc.

I'll share with you some interim takeaways that I found interesting, though, as I work on a more complete version with Dr. Pete over the winter.

1. Search results in search results have made a comeback in a big way

If anything, Hummingbird or the November 14th update seem to have accelerated the trend that started with the Venice update: more and more localized organic results for generic (un-geo-modified) keywords.

But the winners of this update haven't necessarily been small businesses. Google is now returning specific metro-level pages from national directories like Yelp, TripAdvisor, Findlaw, and others for these generic keywords.

This trend is even more pronounced for keywords that do include geo-modifiers, as the example below for "pet store portland" demonstrates.

Results like the one above call into question Google's longstanding practice of minimizing the frequency with which these pages occur in Google search results. While the Yelp example above is one of the more blatant instances that I came across, plenty of directories (including WeddingWire, below) are benefitting from similar algorithmic behavior. In many cases the pages that are ranking are content-thin directory pages: exactly the kind of content whose visibility Panda, and to some extent Penguin, were supposed to minimize.

Overall, national directories were the most frequently occurring type of organic result for the phrases I looked at, a performance amplified when considering geo-modified keywords alone.

National brands as a result type is underrepresented due to 'personal injury lawyer,' 'electrician,' and 'wedding photographer' keyword choices. For the keywords where there are relevant national brands ('assisted living facility' and 'pet store'), they performed quite well.

2. Well-optimized regional-vertical directories accompanied by content still perform well

While a number of thriving directories were wiped out by the initial Panda update, here's an area where the Penguin and Hummingbird updates have been effective. There are plenty of examples of high-quality regionally focused content rewarded with a first-page position, in some cases above the fold. I don't remember seeing as many of these kinds of sites over the last 18 months as I do now.

Especially if keywords these sites are targeting return carousels instead of packs, there's still plenty of opportunity to rank: in my limited sample, an average of 2.3 first-page results below carousels were for regional directory-style sites.

3. There's little-to-no blending going on in local search anymore

While Mike Blumenthal and Darren Shaw have theorized that the organic algorithm still carries weight in terms of ranking Place results, visually, authorship has been separated from place in post-Hummingbird SERPs.

Numerous "lucky" small businesses (read: well-optimized small businesses) earned both organic and map results across all industries and geographies I looked at.

4. When it comes to packs, position 4 is the new 1

The overwhelming majority of packs seem to be displaying in position 4 these days, especially for "generic" local intent searches. Geo-modified searches seem slightly more likely to show packs in position #1, which makes sense since the local intent is explicitly stronger for those searches.

Together with point #3 in this post, this is yet another factor that is helping national and regional directories compete in local results where they couldn't before: additional spots appear to have opened up above the fold, with authorship-enabled small business sites typically shown below rather than above or inside the pack. 82% of the searches in my little mini-experiment returned a national directory in the top three organic results.

5. The number of pack results seems now more dependent on industry than geography

This is REALLY hypothetical, but prior to this summer, the number of Place-related results on a page (whether blended or in packs) seemed to depend largely on the quality of Google's structured local business data in a given geographic area. The more Place-related signals Google had about businesses in a given region, and the more confidence Google had in those signals, the more local results they'd show on a page. In smaller metro areas for example, it was commonplace to find 2- and 3-packs across a wide range of industries.

At least from this admittedly small sample size, Google increasingly seems to show a consistent number of pack results by industry, regardless of the size of the market.

Keyword | # in Pack | Reason for Variance
assisted living facility | 6.9 | 6-pack in Burlington
electrician | 6.9 | 6-pack in Portland
personal injury lawyer | 6.4 | Authoritative OneBox / bug in Chicago
pet store | 3.0 |
wedding photographer | 7.0 |

This change may have more to do with the advent of the carousel than with Hummingbird, however. Since the ranking of carousel results doesn't reliably differ from that of (former) packs, it stands to reason that visual display of all local results might now be controlled by a single back-end mechanism.

6. Small businesses are still missing a big opportunity with basic geographic keyword optimization

This is more of an observational bullet point than the others. While there were plenty of localized organic results featuring small business websites, these tended to rank lower than well-optimized national directories (like Yelp, Angie's List, Yellowpages.com, and others) for small-market geo-modified phrases (such as "electrician burlington").

For non-competitive phrases like this, even a simple website with no incoming links of note can rank on the first page (#7) just by including "Burlington, VT" in its homepage title tag. With just a little TLC (maybe a link to a contact page that says "contact our Burlington electricians"), sites like this one might be able to displace those national directories in positions 1-2-3.

7. The Barnacle SEO strategy is underutilized in a lot of industries

Look at the number of times Facebook and Yelp show up in last year's citation study I co-authored with Whitespark's Darren Shaw. Clearly these are major "fixed objects" to which small businesses should be attaching their exoskeletons.

Yet 74% of searches I conducted as part of this experiment returned no Barnacle results.

This result for "pet store chicago" is one of the few barnacles that I came across, and it's a darn good result! Not only is Liz (unintentionally?) leveraging the power of the Yelp domain, but she gets five schema'd stars right on the main Google SERP, which has to increase her clickthrough rate relative to her neighbors.

Interestingly, the club industry is one outlier where small businesses are making the most of power profiles. This might have been my favorite result: the surprisingly competitive "dance club flagstaff," where Jax is absolutely crushing it on Facebook despite no presence in the carousel.

What does all this mean?

I have to admit, I don't really know the answer to this question yet. Why would Google downgrade the visibility of its Place-related results just as the quality of its Places backend has finally come up to par in the last year? Why favor search-results-in-local-search-results, something Google has actively and successfully fought to keep out of other types of searches for ages? Why minimize the impact of authorship profiles just as they are starting to gain widespread adoption by small business owners and webmasters?

One possible reason might be in preparation for more card-style layouts on mobile phones and wearable technology. But why force these (I believe slightly inferior) results on users of desktop computers, and so far in advance of when cards will be the norm?

At any rate, here are five takeaways from my qualitative review of local results in the last couple of months.

  1. Reports of directories' demise have been greatly exaggerated. For whatever reason (?), Google seems to be giving directories a renewed lease on life. With packs overwhelmingly in the fourth position, they can now compete for above-the-fold visibility in positions 1-2-3, especially in smaller and mid-size metro areas.
  2. Less-successful horizontal directories (non-Yelps and TripAdvisors, e.g.) should consider the economics of their situation. Their ship has largely sailed in larger metro areas like Chicago and Portland. But they still have the opportunity to dominate smaller markets. I realize you probably can't charge a personal injury lawyer in Burlington what you charge his colleague in downtown Chicago. But, in terms of the lifetime value of who will actually get business from your advertising packages, the happy Burlington attorney probably exceeds the furious one from Chicago (if she is even able to stay in business through the end of her contract with you).
  3. The Barnacle opportunity is huge, for independent and national businesses alike. With Google's new weighting towards directories in organic results and the unblending of packs, barnacle listings present an opportunity for savvy businesses to earn three first-page positions for the same keyword: one pack listing, one web listing, and one (or more) barnacle listings.
  4. National brands who haven't taken my advice to put in a decent store locator yet should surely do so now. Well-structured regional pages, and easily-crawled store-level pages, can get great visibility pretty easily.
  5. Andrew Shotland already said it in the last section of his Search Engine Land column, but regionally-focused sites, whether directories or businesses, should absolutely invest in great content. With Penguin and Hummingbird combined, thin-content websites of all sizes are having a harder time ranking relative to slightly thicker content directories.

Well, that's my take on what's happening in local search these days...is the Moz community seeing the same things? Do you think the quality of local results has improved or declined since Hummingbird? Have you perceived a shift since November 14th? I'd be particularly interested to hear comments from SEOs in non-U.S. markets, as I don't get the chance to dive into those results nearly as often as I'd like.


6 Steps to Finding all Your Website URLs


Posted: 04 Dec 2013 06:59 AM PST

Removing pages from your website? Going through a site redesign or migration but not sure you have all the URLs on your website? It's an issue that we will all face at some point in our SEO lives. I have been involved in a lot of projects recently where I've needed to find all the URLs that were on a website, and it can be a pain to do! However, I've now managed to get it down to 6 easy steps, and I wanted to share them with you.

Step 1. Crawl your website

This is an obvious one in my opinion. Looking at your website and gathering all the URLs that you can find should be easy, right? Well, maybe if you only have tens of pages, but if you are at enterprise level then this isn't so easy.

Most of you will already know that you can use tools such as Xenu and ScreamingFrog, but if your site is at enterprise level, these may not be robust enough. In this case you could turn to DeepCrawl, which specialises in crawling large websites.

Once you have run the crawl, place those URLs into a spreadsheet on a tab labelled ‘Website Crawl’. You will also want to start a ‘Master List’ so that you have a single URL list, so go ahead and create that too. We will be constantly adding to this list as we go through the steps.
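If you'd rather script part of the crawl step than rely solely on a desktop tool, the core of any crawler is extracting same-site links from each fetched page. Here is a minimal, illustrative sketch using only Python's standard library; the site and URLs are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects absolute same-site links from one page's HTML."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)
        # Keep only URLs on the same domain as the page we crawled.
        if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
            self.links.add(absolute)

# Hypothetical HTML fetched from a hypothetical page:
html = '<a href="/about">About</a> <a href="http://other.example/x">Out</a>'
parser = LinkExtractor("http://www.example.com/")
parser.feed(html)
print(sorted(parser.links))  # ['http://www.example.com/about']
```

A full crawler would fetch each collected URL in turn and feed its HTML back through the extractor until no new links appear, which is essentially what Xenu, ScreamingFrog, and DeepCrawl do at scale.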

Step 2. Visit your Analytics Package

Reviewing your analytics data is extremely important for all levels of marketing, but many people don’t realise that this is a good place to see what pages you have on your website.

We are looking for all the pages on the website, so you need to head over to Content or Behaviour in your analytics package, change the date so you have a range of at least 18 months, and hit the download button. If you have too many URLs for the download or it is taking a considerable amount of time, you may want to investigate using the API.

Once you have these URLs, create a new tab called ‘Analytics Data’ and place them there; then add a copy to the bottom of the list in the ‘Master List’ tab. Now to step 3.

Step 3. XML Sitemaps

This is another place that is commonly forgotten when looking for URLs. The XML sitemap should, ideally, be the place to find the most up to date version of all URLs for your website. After all, it is the place you are asking the search engines to look to help improve your visibility.

Luckily, with an XML file you can open it straight into Excel, so do that now. You will need to do some formatting to remove the unnecessary tags that accompany the XML sitemap, but once you've done this it will leave you with a list of URLs. Copy this list into your original spreadsheet onto a worksheet called ‘Sitemaps’, and then copy it into the ‘Master List’ tab.
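If the Excel formatting gets tedious, pulling the <loc> entries out of a sitemap programmatically takes only a few lines of standard-library Python. A sketch (the sitemap content and URLs here are made up):

```python
import xml.etree.ElementTree as ET

# Default namespace defined by the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Returns every <loc> URL listed in an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

# A minimal sitemap with hypothetical URLs:
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/contact</loc></url>
</urlset>"""

print(sitemap_urls(sitemap))
```

For a real site you would read the sitemap from disk or fetch it over HTTP first; the parsing is the same either way.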

Step 4. Get a list of your most linked pages

Everyone likes a good link right? So you wouldn’t want to do anything that may lose an influential link. The next step is to download a list of the URLs that are your most linked to pages. You will probably need to do this from multiple tools if you have access to them, but at the very minimum you should download a list of URLs from GWT.

Tools to download most linked to pages from include:

  • Google Webmaster Tools
  • Bing Webmaster Tools
  • Open Site Explorer Top Pages
  • MajesticSEO

Once you have these URLs, collate them and add them to a new tab called ‘Most Linked Pages’, and add a copy to the bottom of the URL list on the ‘Master List’ tab. Now move on to the fifth step.

Step 5. Scraping the SERPs

So far, we have used numerous tools to get all our URLs, but we haven't checked the search engines! So let's do this now. You will need a scraping extension for your browser. I use Scrape Similar for Chrome.

Go to your search engine of choice (you will want to check at least Google and Bing in the UK) and type in site:domain.com. Now, change the settings of the SERP page to show 100 results (we want to do this as quickly as possible!); this can be done by going to your account settings and changing the view from 10 to 100. You may also need to untick Instant Search.

Now you should be able to see 100 results from your domain. Hover over the first result's title tag, right-click, and select "scrape similar". This should bring up a dialog box with the list of URLs from the first 100 results and give you the option to put it straight into Excel or Google Drive. Either option is good at this point. You will need to go through all the listings that the search engines have returned; this could take a bit of time! There might be a quicker way to do this, and if you know one I would be happy to hear about it in the comments below.

Once you have gone through the results and collated the URLs, put them in a new tab called ‘SERP Scraped URLs’ and add the list to the bottom of the URLs you have gathered from Steps 1-4 in the ‘Master List’ tab.

Step 6. De-dupe & Check

Wow, you have come a long way and more than likely have a lot of URLs in your spreadsheet. Most of those are likely to be duplicates; at least we hope they are, as that means you are doing a good job. Excel has a feature that removes all duplicates and leaves you with a unique list of URLs, found under Data > Remove Duplicates. Go ahead and use it.

Hopefully this will leave you with a good number of URLs. Now for the final step: copy the list of URLs and run them through a crawler (I’d use ScreamingFrog) to check the HTTP status of each URL. Once you have the status codes, copy the list back into your spreadsheet, which will leave you with as complete a list of URLs, with status codes, as possible. Now you are done!
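The de-duping and status-checking steps can also be scripted if your list outgrows Excel. A standard-library sketch; the URLs are placeholders, and the network call is left commented out so you can run the de-dupe on its own:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def dedupe(urls):
    """Removes duplicates while keeping first-seen order,
    mirroring Excel's Data > Remove Duplicates."""
    seen = set()
    unique = []
    for url in urls:
        if url not in seen:
            seen.add(url)
            unique.append(url)
    return unique

def http_status(url):
    """Fetches one URL with a HEAD request and returns its status code."""
    try:
        return urlopen(Request(url, method="HEAD")).status
    except HTTPError as err:
        return err.code   # e.g. 404 or 500
    except URLError:
        return None       # DNS failure, refused connection, etc.

master_list = [
    "http://www.example.com/",
    "http://www.example.com/about",
    "http://www.example.com/",      # duplicate from another tab
]
unique_urls = dedupe(master_list)
# statuses = {url: http_status(url) for url in unique_urls}  # needs network
print(unique_urls)
```

The resulting dictionary of URL-to-status pairs can be pasted straight back into the ‘Master List’ tab alongside each URL.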

If you have completed all six steps, then you should have a pretty thorough list of the URLs that are located on your website. I hope this was helpful and provides some structure to finding all the URLs that you need. Have I missed anything out? Is there a quicker, more reliable way of getting all the URLs? I would love to hear your thoughts in the comments below or over on Twitter @danielbianchini.

The post 6 Steps to Finding all Your Website URLs appeared first on White Noise.

Seth's Blog: The moderation glitch

 

The moderation glitch

More doesn't scale forever. Why are we so bad at engaging with this obvious truth?

In Malcolm's new book, he points out that our expectation is that most things will respond in a linear way. More input gets us more output. If you want a hotter fire, add more wood. If you want more sales, run more ads.

In fact, it turns out, most things don't respond in a linear way. It's more of a steep curve (he calls it an inverted U). For a while, more inputs get you more results, but then, inevitably, things level off, and then, perversely, get worse. One brownie makes you happy, a second brownie, maybe a little more. The third brownie doesn't make you happy at all, and the fourth brownie makes you sick.

[Inverted-U curve illustration]

Health care is a fine example of this. First aid makes a huge difference. Smart medical care can increase our health dramatically. But over time, too much investment in invasive medicine, particularly at the end of life, ends up making us worse, not better. Or, in a less intuitive example, it turns out that class size works the same way. Small classes (going from 40 to 25 in the room) make a huge difference, but then diminishing class size (without changing teaching methods) doesn't pay much, and eventually ends up hurting traditional classroom education outputs.

But here's the unanswered question: if the data shows us that in so many things, moderation is a better approach than endless linearity, why does our culture keep pushing us to ignore this?

First, there are the situations where one person (or an organization) is trying to change someone else. Consider the high-end omakase sushi bar, where, for $200, you're buying a once-in-a-lifetime meal. The chef certainly has enough experience to know that he should stop bringing you more food, that one more piece of fish isn't going to make you happier; it's quite likely to make you uncomfortable. But he doesn't stop.

Or consider the zero-tolerance policy in some schools. We know that ever more punishment doesn't create better outcomes.

Here's the problem with the inverted U: We aren't certain when it's going to turn. We can't be sure when more won't actually be better.

As a result of this uncertainty, we're likely to make one of two mistakes. Either we will stop too soon, leaving stones unturned, patrons unsatisfied, criminals unpunished... or we will stop too late, wasting some money and possibly missing the moderation sweet spot.

You can already guess what we do: we avoid the embarrassment of not doing enough. The sushi chef doesn't want someone to say, "it was great, but he wasn't generous." The politician says, "I don't want any voter to say that even one criminal got away because I was soft on crime."

We always start with intent, as Omar Wasow has pointed out. It's intent that gets us to take action and to start marketing and spending. But intent and results are different things.

We market our solution (to ourselves and to others) and that marketing drives our actions. As long as we're uncertain as to where the curve turns, we're going to have to push that marketing message forward. It's a lot more difficult to sell the idea of moderation than it is to sell the earnest intent of joy or punishment or health or education.

Moderation is a marketing problem.

(this is getting long, sorry, but I hope it's worth it)

The other category of interventions are the things we do to ourselves. This is the wine drinker who goes from the health benefits of a daily glass of wine to the health detriments of a daily bottle or two. This is the runner who goes from the benefits of five miles a day to knees that no longer work because he overdid it.

Here, the reason we can't stop is self marketing plus habit. Habits are the other half of the glitch. We learn a habit when it pays off for us, but we're hardwired to keep doing the habit, even after it doesn't.

Hence the two lessons:

1. Smart organizations need to build moderation-as-a-goal into every plan they make. Every budget and every initiative ought to be on the lookout for the sweet spot, not merely "more." It's not natural to look for this, nor is it easy, which is why, like all smart organizational shifts, we need to work at it. How often does the boss ask, "have we hit the sweet spot of moderation yet?"

If doctors were required to report on quality of life instead of tests run, you can bet quality of life would improve faster than the number of tests run does.

2. Habits matter. When good habits turn into bad ones, call them out, write them down and if you can, find someone to help you change them.

"Because it used to work," is not a sensible reason to keep doing something.

[But please! Don't forget the local max.]
