Monday, 18 March 2013

SEO Blog

The PEO Services That Can Make Your Business More Efficient

Posted: 17 Mar 2013 11:11 PM PDT

It is high time that you understood the benefits of Professional Employer Organization (PEO) services in making your business more efficient. A PEO company is an organization that specializes in outsourcing the human resource management of business companies. They have experts from the varied fields of the HR department...
Read more »

Social Media Highs and Lows of 2012 [Infographics]

Posted: 17 Mar 2013 11:05 PM PDT

Social media networks offer up some of the best promotional mediums in our modern world, thanks to enormous, worldwide audiences that use them – but a social media campaign can go either way. In 2012, many companies learned this the hard way. American Apparel was forced to apologize for its...
Read more »

Mathematical Ideas for Marketers

Mathematical Ideas for Marketers

Posted: 17 Mar 2013 07:06 PM PDT

Posted by willcritchlow

I've been hiding from my natural geekiness recently. My last few blog posts and my most recent presentations have all been about broad marketing ideas, things that play out well in the boardroom, and big picture "future of the industry" stuff.

Although those topics are all well and good, sometimes I need to feed the geek. And my geek lives on logic and maths (yes, I'm going to use the *s* throughout - it's how we roll in the UK and that's where I studied). One of our most recent hires in our London office is a fellow maths graduate and I've been enjoying the little discussions and puzzles.

(The last one we worked on together: in how many number bases does the number 2013 end in a "3"? Feel free to share your answers and workings in the comments.)
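(If you'd rather brute-force it than work it out by hand, here's a quick Python sketch - it prints the answer, so try the puzzle first. The key observation: the last digit of 2013 in base b is 2013 mod b, and "3" is only a valid digit when b > 3.)

    # Count the bases in which 2013 ends in a "3".
    # The last digit of 2013 in base b is 2013 % b, and "3" is only a
    # valid digit when b > 3. For b > 2013, 2013 is a single digit anyway.
    bases = [b for b in range(4, 2014) if 2013 % b == 3]
    print(len(bases), bases)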

Rather than just purely geek out over pointless things, I have been casting my mind over the ways that mathematical ideas can help us out as marketers; either by making us better at our jobs, or by helping us understand more advanced or abstract concepts. Obviously a post like this can only scratch the surface, so I've designed it to link out to a bunch of resources and further reading. In approximate ascending order of difficulty and prerequisites, here are some of my favourite mathematical ideas for marketers:

Averaging averages

The first and simplest idea is really a correction of a common misconception. We were talking about it here in the context of some data we were visualising for a client. The problem goes like this:

Our client had data for average income broken down by all combinations of age, location, and gender (details changed to protect the innocent). We wanted to get the average income by gender.

It's tempting to think that you can do this from the data provided by averaging all the female values and averaging all the male values, but that would be incorrect. If the age or geographic distribution is not perfectly uniform by gender, then we will get the wrong answer. Consider the following entirely made up example:

  • Female, 25, London - Average: 30,000 (10,000 people)
  • Female, 26, London - Average: 31,000 (11,000 people)

It's tempting to say that the average for the whole group is 30,500. In fact, it's 30,524 (because of the hidden variable: there are more people in the second group than the first).
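To make the arithmetic concrete, here's the weighted-average calculation as a minimal Python sketch, using the made-up figures above:

    # Weighted average: the group sizes matter, not just the two averages.
    groups = [(30000, 10000),  # (average income, number of people)
              (31000, 11000)]
    total_income = sum(avg * n for avg, n in groups)
    total_people = sum(n for _, n in groups)
    print(total_income / total_people)  # ~30,524 - not the naive 30,500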

You will often encounter this in marketing when presented with percentages. Suppose you have a campaign that made 200% ROI in month one and 250% ROI in month two. What's the ROI of the campaign to date?

Answer: anywhere in the range 200-250%. You have no idea where.
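To see why, here's a sketch with entirely hypothetical spend figures - change the ratio between the two months and the blended ROI moves anywhere in that range:

    # Blended ROI depends on how much was spent each month, which the
    # percentages alone don't tell you. Spend figures are made up.
    month1_spend, month1_roi = 10000, 2.00  # 200% ROI
    month2_spend, month2_roi = 1000, 2.50   # 250% ROI
    blended = ((month1_spend * month1_roi + month2_spend * month2_roi)
               / (month1_spend + month2_spend))
    print("{:.0%}".format(blended))  # ~205% - month one dominates the spend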

Try it out on this brainteaser (hat-tip @tomanthonyseo):

If I drive at 30mph for 60 miles, how fast do I have to drive the next 60 to average 60mph for the whole trip?

Correlation coefficients

Although the mathematical background can look scary, linear regression and correlation coefficients represent a relatively simple concept. The idea is to measure how closely related two variables are; think about trying to draw a "line of best fit" through an X-Y scatter chart of the two variables.

The summary of how it works is that it finds the line through the scatter chart that minimises the sum of the squared (vertical) distances of the points away from the line.

The great part is that you don't even need to dig into the mathematical details to use this technique. Excel has built-in functions to help you do it - check out this YouTube video showing how.
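If you'd rather script it than use Excel, here's a minimal equivalent in Python with numpy (the x and y data are made up for illustration):

    # Line of best fit and correlation coefficient with numpy.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # e.g. monthly ad spend
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])  # e.g. monthly conversions

    slope, intercept = np.polyfit(x, y, 1)  # degree-1 fit = a straight line
    r = np.corrcoef(x, y)[0, 1]             # Pearson correlation coefficient
    print("y = {:.2f}x + {:.2f}, r = {:.3f}".format(slope, intercept, r))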

Bayes

Thomas Bayes was a mathematician who lived in the early 1700s. The breakthrough he made was to come up with a way of analysing probability statements of the form:

"What's the probability of event A given that event B happened?"

Mathematicians write that as P(A|B).

Bayes discovered that this = P(A and B) / P(B)

In plain English, that means:

"The probability of both event A and B happening divided by the probability of B happening."

And also that P(A|B) = P(B|A) * P(A) / P(B)

Which means:

"The probability of B happening given A happened, times the probability of A happening, divided by the probability of B happening"

Why is this important? It's critical to understanding the results of all kinds of tests - ranging from medical trials to conversion rate experiments. Here's a challenge from this great explanation of Bayesian thinking:

"1% of women at age forty who participate in routine screening have breast cancer. 80% of women with breast cancer will get positive mammographies. 9.6% of women without breast cancer will also get positive mammographies. A woman in this age group had a positive mammography in a routine screening. What is the probability that she actually has breast cancer?"

If you want to dig deeper into the marketing implications, I really like this article.

O(n) and o(n)

One of the things I did during my maths degree was write really bad code. My lecturers suggested using either Pascal or C. C sounded like "real programming," so I chose that. It's incredibly easy to write horrible programs in C because you manage your own memory (reminding me of this programming joke).

When you think of programs failing, you tend to think of crashes or bugs that return the wrong answer. But one of the most common failings when you start hacking on real-world problems is writing programs that run forever and never give you an answer at all.

As we get easy access to more and more data, it's becoming ever easier to accidentally write programs that would take hours, days, weeks, or even longer to run.

Computer scientists use what is known as "big O notation" to describe the characteristics of how long an algorithm will take to run.

Suppose you are running over a data set of "n" entries. Big O notation is the computer scientists' way of describing how long the algorithm will run in terms of "n."

In very rough terms, O(n^2) for example means that as the size of the dataset grows, the algorithm run-time will grow more like the square of the size of the dataset. For example, an O(n) algorithm on 100 things might take 100 seconds, but an O(n^2) algorithm would take 100*100 = 10,000 seconds.

If you're interested in digging deeper into this concept, this is a really good primer.

At a basic level, if you are writing data analysis programs, what I'm really recommending here is that you spend some time thinking about how long your program will take to run expressed in terms of the size of the dataset. Watch out for things like nested loops or evaluations of arrays. This article shows some simple algorithms that grow in different ways as the data size grows.
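As a toy illustration, here's a Python sketch counting the steps taken by a single O(n) pass versus an O(n^2) nested loop over the same data:

    # Step counts for a single pass versus a nested loop.
    def linear_scan(items):      # O(n)
        steps = 0
        for _ in items:
            steps += 1
        return steps

    def all_pairs(items):        # O(n^2)
        steps = 0
        for a in items:
            for b in items:
                steps += 1
        return steps

    data = range(1000)
    print(linear_scan(data), all_pairs(data))  # 1,000 vs 1,000,000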

Nash equilibria

Using words like equilibria makes this sound scary, but it was explained in layman's terms in the film A Beautiful Mind.

"Games" are defined in all kinds of formal ways, but you can think of them as just being two people in competition, then:

"A Nash equilibrium occurs when both players can’t do any better by changing their strategies, given the likely response of their opponent."

The reason I include this bit of game theory is that it's critical to all kinds of business and marketing success; in particular, it's huge in pricing theory.
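As an illustration, here's a minimal Python sketch that finds a Nash equilibrium of a two-player game by brute force, using the classic prisoner's dilemma payoffs (not a pricing example, but the same logic applies):

    # payoffs[r][c] = (row player's payoff, column player's payoff)
    # Strategy 0 = cooperate, 1 = defect (prisoner's dilemma).
    payoffs = [[(-1, -1), (-3, 0)],
               [(0, -3), (-2, -2)]]

    def is_nash(r, c):
        # Neither player can do better by unilaterally switching strategy.
        row_ok = all(payoffs[r][c][0] >= payoffs[alt][c][0] for alt in range(2))
        col_ok = all(payoffs[r][c][1] >= payoffs[r][alt][1] for alt in range(2))
        return row_ok and col_ok

    for r in range(2):
        for c in range(2):
            if is_nash(r, c):
                print("Nash equilibrium:", (r, c), payoffs[r][c])  # (1, 1)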

If you want a more pop culture example of game theory, this is incredible:

Time series

Time series is the wonkish mathematical name for data on a timeline. The most common time series data in online marketing comes from analytics.

This branch of maths covers the tools and methodologies for analysing data that comes in this form. Much like the regression analysis functions in Excel, the nice thing with time series analysis is that there are software tools to apply the hard maths for you.

One of the most direct applications of time series analysis to marketing is decomposing analytics data into the different seasonality effects and real underlying trends. I covered how you do this using software called R in a presentation a few years ago - see slides 39+:
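If R isn't your thing, the same decomposition can be sketched in Python with statsmodels (the CSV filename and column names here are placeholders for your own analytics export):

    # Decompose daily visits into trend, seasonality, and residual.
    import pandas as pd
    from statsmodels.tsa.seasonal import seasonal_decompose

    visits = pd.read_csv("analytics_visits.csv",
                         index_col="date", parse_dates=True)["visits"]
    result = seasonal_decompose(visits, model="additive", period=7)  # weekly
    print(result.trend.dropna().head())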

Prime numbers/RSA

OK. I'm getting a little tenuous now. It's not so much that you actually need to know the maths behind factoring large numbers or the technical details of public key cryptography.

What I do think is useful to us as technical marketers is to have some idea of how HTTPS/SSL secure connections work. The best resources I know of for this are:

Markov chains

You might have come across the concept of Markov chains in relation to machine-generated content (this is a great overview). If you want to dive deep into the underlying maths, this is a great primer [PDF].

The general concept of Markov chains is an interesting one - the mathematical description is that a Markov chain is a sequence of random variables where each variable depends only on the previous one (or, more generally, previous "n").

Google Scholar has a bunch of results for the use of Markov chains in marketing.

It turns out that there are a bunch of great mathematical properties of Markov chains. By removing any possibility of the outcome of the next step being dependent on arbitrary inputs (allowing only the outcomes of the most recent entries in the sequence), we get results like conditions for stationary distributions [PDF]. A stationary distribution is a fixed probability distribution that the chain settles into - i.e. one that no longer depends on the previous elements in the sequence.
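To see a stationary distribution emerge, here's a small Python sketch that repeatedly applies a made-up two-state transition matrix - after enough steps, the starting state no longer matters:

    import numpy as np

    # P[i][j] = probability of moving from state i to state j.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])

    dist = np.array([1.0, 0.0])  # start entirely in state 0
    for _ in range(100):
        dist = dist @ P          # one step of the chain
    print(dist)                  # converges to ~[0.833, 0.167]

This leads me neatly into my final topic: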

Eigenvectors/Eigenvalues

OK. Now we're talking real maths. This is at least undergraduate stuff and quickly gets into graduate territory.

There is a branch of maths called linear algebra. It deals with matrix and vector computations (see MIT OpenCourseWare if you want to dig into the details).

To follow the rest of my analogy, all you really need to know is how to multiply a matrix and a vector.

The result of multiplying appropriate vectors and matrices is another vector. When that vector is a fixed (scalar) multiple of the original vector, the vector is called an "eigenvector" of the matrix and the scalar multiplier is called an "eigenvalue" of the matrix.

Why are we talking about matrices? And what do they have to do with stationary distributions of Markov chains?

Well, remember PageRank?

From a mathematical perspective, there are two models of PageRank:

  1. The random surfer model - where you imagine a web visitor who randomly clicks on outbound links (and randomly "jumps" to another arbitrary page with a fixed probability)
  2. The (dominant) eigenvector of the link matrix

You'll notice that the random surfer model is a Markov model (the probability of moving from page A to page B is dependent *only* on A).

It turns out that the eigenvector is actually the stationary distribution of the random surfer Markov chain.

And not only that. The random jump factor? Turns out that is necessary to (a) make sure that the Markov chain has a stationary distribution AND (b) make sure that the link matrix has an eigenvector.
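Here's the whole story in one small Python sketch: simulate the random surfer by power iteration over a tiny made-up three-page web, and the vector it converges to is the dominant eigenvector of the damped link matrix - i.e. PageRank:

    import numpy as np

    # links[i][j] = 1 if page i links to page j (tiny made-up web).
    links = np.array([[0, 1, 1],
                      [1, 0, 0],
                      [1, 1, 0]], dtype=float)

    d = 0.85                                 # random jump (damping) factor
    n = links.shape[0]
    transition = links / links.sum(axis=1, keepdims=True)
    google = d * transition + (1 - d) / n    # damped transition matrix

    rank = np.ones(n) / n
    for _ in range(100):
        rank = rank @ google                 # power iteration
    print(rank)  # the stationary distribution = PageRank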

Things like this are the things that make mathematicians excited.

I appreciate that this post has been something a bit different. Thanks for bearing with me. I'd love to hear your geek-out tips and tricks in the comments.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Previewing President Obama's Trip to Israel

The White House: Your Daily Snapshot for Monday, March 18, 2013

At 11:40 a.m. ET: President Obama makes a personnel announcement from the East Room. Watch on WhiteHouse.gov/Live.

Previewing President Obama's Trip to Israel

In the first foreign trip of his second term in office, President Obama will visit Israel, the West Bank, and Jordan. The trip is an important opportunity to meet with the new Israeli government and speak to the Israeli people -- as well as to meet with the Palestinian leadership and the King of Jordan.

We asked Ben Rhodes, Deputy National Security Advisor for Strategic Communications, to preview the upcoming trip -- and some of the issues the President will be discussing in his meetings.

Watch the video to learn more about the President's trip.

Ben Rhodes Previews President Obama's Trip to Israel, the West Bank, and Jordan

In Case You Missed It

Here are some of the top stories from the White House blog:

Weekly Address: Time to Create the Energy Security Trust
President Obama discusses the need to harness American energy in order to reduce our dependence on oil and make the United States a magnet for new jobs. He highlights his all-of-the-above approach to American energy -- including a proposal to establish an Energy Security Trust, which invests in research that will help shift our cars and trucks off of oil.

Weekly Wrap Up: "We Don’t Have a Moment to Waste"
Here’s a quick glimpse at what happened last week on WhiteHouse.gov.

President Obama Visits the Argonne National Research Lab to Talk About American Energy Security
President Obama highlighted his proposal to create an Energy Security Trust, which uses revenue generated by oil and gas development on federal lands to support new research and technology that will shift our cars and trucks off of oil for good.

Today's Schedule

All times are Eastern Standard Time (EST).

7:00 AM: The Vice President meets with Italian President Giorgio Napolitano

8:15 AM: The Vice President meets with Italian Prime Minister Mario Monti

10:15 AM: The President receives the Presidential Daily Briefing

10:40 AM: The Vice President meets with President Bronislaw Komorowski of Poland

11:00 AM: The President meets with senior advisors

11:40 AM: The President makes a personnel announcement WhiteHouse.gov/live

12:00 PM: The Vice President meets with President Tomislav Nikolic of Serbia

12:15 PM: Press Briefing by Press Secretary Jay Carney WhiteHouse.gov/live

4:40 PM: The President delivers remarks at a Women’s History Month Reception WhiteHouse.gov/live

WhiteHouse.gov/live indicates that the event will be live-streamed on WhiteHouse.gov/Live.


Your website on Google (March changes)


Google has changed the algorithm: act now to get high rankings

Google's recent algorithm updates have changed the way Google ranks web pages. Adjust your web pages now and get first page rankings on Google.

Watch the video

Guaranteed top 10 rankings for your website

Our popular optimization tool IBP will get your website on Google's first result page for any keyword. It has a success rate of more than 98% and we will give you your money back if you belong to the very few people who don't get top 10 rankings on Google with IBP.

How does it work in detail?

  • You choose the keyword for which you want to get top 10 rankings. IBP will analyze your web page and it will tell you in detail what you have to change.
  • You make the changes on your web page and your website gets on Google's first result page.
  • IBP works with Google's latest ranking algorithm and IBP only uses safe SEO methods that are approved by search engines.

That's it. No catch, no fine print. If you follow the advice of IBP's Top 10 Optimizer then your website will be listed on Google's first result page.

Analyze your website

A real top 10 ranking guarantee should be for the keywords of your choice and it should be for the regular, unpaid search results. That's the guarantee that you get with IBP.

Companies all over the world use our products to get better search engine rankings. We're a member of the Chamber of Commerce and a contributing member of the Organisation of Independent Software Developers.

Our products and our CEOs have been featured in computer magazines and investment magazines. If you use IBP to get high rankings on Google and other search engines, you use a proven solution from a reliable company.

Christmas offer: Order IBP today and save up to 24%.




Plugging the link leaks, part 1 – reclaim links you are throwing away

Plugging the link leaks, part 1 – reclaim links you are throwing away

Posted: 15 Mar 2013 09:51 AM PDT

In London hundreds of SEOs have gathered for LinkLove, and as it is a day of sharing tips on getting more links, we thought we would join in.

As the easy self-publishing or submission tactics fall by the wayside, link building has become a far more creative, and time-consuming, process.

But at SEOptimise, as well as building links through content, we also regularly boost clients’ link profiles without typing a word. There’s no asking for links, nor risking the wrath of Google's anti-spam team.

This is link reclamation – fixing existing links that point to broken or inefficiently redirected pages on your site.

As Ian Lurie pointed out in one of his excellent webinars last year, before worrying about various creative methods of generating links, "get the #@@!#@$ easy links" first. And link reclamation is just that – it might take a couple of hours to complete, but can be a boost for any campaign.

Tools

What you'll need for your link reclamation project:

Finding your broken links

Now, the quick-win version of this process is simply to put together a list of all the broken URLs on your site that have external links pointing to them, ready to put 301 redirects in place. There's nothing wrong with doing this, and it will certainly give you the boost of reclaiming your lost authority, but sometimes we need to know where all the broken links are.

This is so we can see which broken links we should redirect, and which we want to attempt to have fixed on the source URL. Plus, as an agency, it can be advantageous to be able to report all the links we have reclaimed.

So, let's put together as comprehensive a list of broken links pointing to our target site as possible.

Backlink data

Our first port of call is backlink data. Go to the tool of your choice and look up your site. We're using Open Site Explorer in the examples here, but Majestic and Ahrefs both also provide perfect data for this. Within the inbound links tab select links from "only external" pages and either "pages on this sub-domain" or "pages on this root domain", depending on the scope of your project.

Grabbing the backlink data from Open Site Explorer

There's a whole range of metrics we could use to investigate, but to keep things moving, delete all the columns except for URL, anchor text, page authority, domain authority, followable and target URL.

Doing this allows us to analyse our broken links by PA or DA, and see which are no-followed, helping us decide which links to 301, and which to reach out to have fixed to the correct URL. If you are not using OSE, then Majestic SEO and Ahrefs have their own importance metrics.

Now to find our broken links. Copy the entries in the target URL column, and paste them into a new spreadsheet. Use the remove duplicates feature within the data tab, and save as a .csv or .txt file.

Removing duplicate URLs in Excel

Fire up Screaming Frog, and select 'List' from the mode menu. Choose your file of URLs, and start crawling. Once the crawl is complete, select the Response Codes tab and filter to 'Client Error (4XX)'. You now have a complete list of URLs that external sites are linking to which don't exist on your server.

No URLs on the list? Congratulations! You have no broken links to fix, and can crack on with working on ways to generate fresh links. If, like most sites we've worked with, you have URLs here, export the list.

Exporting all pages resulting in a 404 error from Screaming Frog

Finding 302 redirects

Still in Screaming Frog, filter to 'Redirection (3xx)', and order the results by the 'Status Code' column. Are there any 302 redirects in there? If so, export this list, open it in Excel and make the data a table (Ctrl+T is the shortcut). Filter by Status Code to find the 302s, and copy the data. Open your exported list of URLs resulting in 404 errors, and paste your 302 data into the spreadsheet. You now have a complete list of linked-to pages we want to fix.

Getting clever

It's time to prune data again. Delete or hide every column until you are left with just the Address and Status Code columns. Once ready, select all the 404/302 data and copy. Go back to your spreadsheet with OSE data. You need to paste in the two columns, either to the right of the OSE data, or in a new sheet (however you prefer to work).

Now for the (relatively) clever bit. Add a column to the right of your OSE data, and call it 'status code', then turn all the OSE data into a table. Now we are going to use a VLOOKUP function in the new 'status code' column to have Excel tell us which of our OSE links match the 404 errors we found in Screaming Frog.

The code we used is =IFERROR(VLOOKUP(F:F,I:J,2,FALSE),""), with F:F specifying the Target URL column in the OSE data, and I:J the Address and Status Code columns respectively in the Screaming Frog data. (A big hat-tip to Joe and Tamsin for patiently helping me with Excel formulas!)

Alternatively use the Insert Function wizard in the Formulas tab to work through the process, though you will have to add the IFERROR part afterwards.

Our 'status code' column should now contain the code from the Screaming Frog data each time one of our external links points to a URL that returns a 404 or 302 code. Simply filter the 'status code' column by 404 and 400 to give you a complete list of broken URLs. You can then reorder this list by PA, DA or by which are followed.

You may also wish to add a ‘date fixed’ column, so you can record when the redirect or edit is in place, and the link starts passing its sweet, sweet authority to your target site.

You can also filter by 302, and instantly have a list of redirects to be changed to 301s, and all the links that suddenly pass all their potential link authority to show your client or boss. Not bad for a few minutes' work!
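(If you prefer scripting to spreadsheets, the same matching step can be done with a pandas merge. The filenames and column names below assume the OSE and Screaming Frog exports described above:)

    # Scripted equivalent of the VLOOKUP step: match backlink rows to
    # Screaming Frog status codes. Filenames/columns are placeholders.
    import pandas as pd

    ose = pd.read_csv("ose_backlinks.csv")
    frog = pd.read_csv("screaming_frog_responses.csv")

    merged = ose.merge(frog[["Address", "Status Code"]],
                       left_on="Target URL", right_on="Address", how="left")
    broken = merged[merged["Status Code"].isin([400, 404])]
    redirects = merged[merged["Status Code"] == 302]
    print(len(broken), "broken links;", len(redirects), "302s to fix")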

Two sources are better than one

So are we done? Not quite; many SEOs work on the premise that using more than one data source is prudent.

Once you have done this process, it's very quick to do the same again from an alternative source; in our example we might now use Ahrefs. Once we have all our 404s/302s from Ahrefs in a new tab in our spreadsheet, we can create a third tab to combine them with the 404s from OSE, using the remove duplicates tool once again.

Of course, the sources cannot share quality metrics – just URL, anchor text and target URL. However, the advantage of using multiple sources to find a greater number of broken links to fix is worthwhile, and we can still filter on individual sheets.
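(Again, for the script-minded: combining the two sources and de-duplicating is a couple of lines of pandas. Filenames here are placeholders:)

    # Merge the broken-link lists from both sources and de-duplicate.
    import pandas as pd

    ose_broken = pd.read_csv("ose_broken_links.csv")
    ahrefs_broken = pd.read_csv("ahrefs_broken_links.csv")

    combined = pd.concat([ose_broken, ahrefs_broken], ignore_index=True)
    combined = combined.drop_duplicates(subset=["URL", "Target URL"])
    combined.to_csv("all_broken_links.csv", index=False)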

Google Webmaster Tools crawl errors

To use every available source of external links leading to 404 errors, we need to use Google Webmaster Tools' 'Crawl Errors' report (found under Health in the menu).

Alas, this is where things become a little more frustrating. As no doubt many of you know, it is impossible to cleanly download a list of each 404 URL address and the links pointing to it, despite the information being available on screen. Plus GWT is not always as up-to-date as we would like. So, we have to use a workaround.

What you can download from GWT is all the broken URLs Google has found on your site. So, our first step is to download this list as a .csv file by selecting Health, then Crawl Errors in the left-hand navigation.

Finding external links resulting in 404 errors in Google Webmaster Tools

Select the 'Not found' links, and hit the download button. This file can then be imported using Screaming Frog's list mode, and all the reported broken URLs checked. Any URLs that are now returning 200 or 301 status codes should be removed from your list, and marked as 'fixed' within GWT.

We now have a smaller, more accurate list of the broken URLs on our site. Create a new tab in the spreadsheet with the broken links we found in our backlink tool, and create headings for URL, target URL and status code. Unfortunately, there's now some manual work involved; how much depends on how many 404 errors GWT is reporting.

  • Select each error in turn within GWT
  • Choose the 'Linked from' tab
  • Copy all the external URLs pointing to that URL
  • Paste these URLs in the URL column in your spreadsheet
  • Add the target URL you have just been checking to the target URL column

As you can see, if you have a lot of reported external links, this can quickly become quite a pain. One helpful shortcut I have found is the Link Clump extension for Chrome. This allows you to create keyboard and mouse action shortcuts for opening or copying multiple links. I set one for copying all URLs selected to the clipboard. This makes it relatively quick to grab all the URLs for each reported error and paste them into my spreadsheet.

There are plenty of other great extensions/add-ons that can help with this, such as Scraper for Chrome and Multi Links for Firefox. Please suggest any favourites you have in the comments below!

After a bit of leg-work, you will now have a list of all the source links, and their target URL. The final stage is to ensure that these external URLs still exist, and still link to our site. Doing this is a two-stage process, both stages using the same VLOOKUP method we used earlier.

Copy all the source URLs and paste them into a new spreadsheet, then save as a .csv file. Now go back to Screaming Frog, upload the list, and crawl all the URLs. First, go to Response Codes and filter for any redirects. If you have some, you need to export the list.

Open this list and copy the redirect destination URLs, then add these to your master list of URLs from GWT. Next use the same VLOOKUP methodology to remove any URLs that result in a 301 or 302 – we don't want them in our external link list as they no longer exist, but do want the redirect targets, in case our links are there!

Now go back to Screaming Frog and filter for any client errors (400/404s). If there are any, again export then use the VLOOKUP method to remove them from our list of external links from GWT.

The second step is to check they are still linking to you. Copy the edited column of URLs reported by GWT, and save to (yep, yet another) .csv file. Upload in Screaming Frog and go to the Configuration menu and select Custom to add a bespoke filter. Enter your domain, with or without subdomain depending on your project, and set to 'does not contain'.

Creating a custom filter in Screaming Frog

Crawl your URL list, then head to the Custom tab and filter to your bespoke filter. This then shows you all the URLs that no longer point to you. Export, copy into your main spreadsheet and VLOOKUP one last time and delete these links. You'll need to add some form of marker text in a second column so you can see which ones to delete, or use the Status column.

Side note: You may wish to keep a record of these to try and get your site back on them if still relevant – it may be they simply removed the link to you because it was a broken page. Being able to write to the site saying, "You used to link to us and we'd love to be featured once again", is a great reason to have to contact these sites.

Your final list

So, after a lot of editing, you have a list of broken external links reported by Google Webmaster Tools, plus the page they are linking to. Add these to your master list (the URLs from OSE and Ahrefs), de-duplicate and you have your final list of links to reclaim.

Using the individual sheets for each source you can check each link for importance, deciding which ones to try and have corrected, and which you will simply put a 301 redirect in place for. Of course, as we have recently learned, 301 redirects possibly pass all their authority, but many still prefer to have clean links wherever possible (as previous studies have shown some authority is lost).

So that's it. It might seem a little complex or time-consuming at first, but the process only takes a couple of hours, or less if Webmaster Tools hasn't reported too many errors. The rewards vary of course, but if you have an older domain, or one that went through a site migration without SEO assistance, there can be many broken links. We've found several hundred links for clients before doing this – worth getting for any site.

To make things a little easier (as this is a long post to follow!), we’ve put together a basic version you can access and copy for your own projects.

There’s plenty more that can be done of course. Another good use of time is finding the sites that linked to you at one point, but no longer do so, as excellently laid out here by Ethan Lloyd at Seer Interactive, and we’ll be bringing you more as well. Happy reclaiming!

© SEOptimise - Download our free business guide to blogging whitepaper and sign up for the SEOptimise monthly newsletter.

Related posts:

  1. Post Penguin Recovery: Link Removal Strategy for Back Link Profile Clean Ups
  2. The Ultimate Resource Guide for Link Builders from Distilled LinkLove 2012
  3. Investigating Panda & Duplicate Content Issues