Thursday, December 20, 2012

What Happened on December 13th?


Posted: 19 Dec 2012 06:43 PM PST

Posted by Dr. Pete

On the morning of December 14th, MozCast registered the largest 24-hour Google ranking flux on record since we started tracking data in early April. The temperature for Thursday, December 13th was 102.2° F (for reference, the original Penguin update was 93.1°):

[Chart: MozCast temperature of 102.2°F for December 13th]

This was especially striking since I had just rolled out a small fix for a problem in our computations that had been slightly overestimating temperatures on some days since the rollout of 7-result SERPs. SERPmetrics confirmed substantial levels of 24-hour flux, and webmaster chatter suggested that people were seeing major ranking and organic traffic changes.

Unfortunately, Google was unable to confirm an algorithm update. So, where does that leave us? It turns out that it’s not an easy question.

The Big Signals

A while back we launched a set of five top-view metrics to help provide an at-a-glance view of patterns across the entire set of rankings MozCast tracks. Only one of those metrics moved noticeably between December 13th and 14th – PMD Influence suffered a sizeable one-day drop. PMD Influence is the percentage of Top 10 results occupied by partial-match domains (PMDs). This includes hyphenated and non-hyphenated domains that contain the keyword phrase but are not an exact match. Here’s the 30-day view:

[Chart: PMD Influence, 30-day view]

PMD Influence dropped from 3.73% on 12/13 to 3.54% on 12/14 (about a 5.1% drop in 24 hours). While my gut says that drop wasn’t the full picture, it’s a good place to start. So, which sites lost out in this change?
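For readers curious how a metric like this could be computed, here is a minimal Python sketch based only on the definition above; the data shape and helper names are hypothetical, not MozCast's actual code:

```python
def is_partial_match(domain, query):
    """Rough PMD test: the domain contains the keyword phrase
    (hyphenated or not) but is not an exact match."""
    host = domain.lower()
    if host.startswith("www."):
        host = host[4:]
    name = host.split(".")[0].replace("-", "")   # drop the TLD and any hyphens
    phrase = query.lower().replace(" ", "")
    return phrase in name and name != phrase

def pmd_influence(serps):
    """Percentage of all tracked top-10 results occupied by PMDs.
    `serps` is assumed to map each query to its list of ranked domains."""
    total = sum(len(results) for results in serps.values())
    pmds = sum(is_partial_match(domain, query)
               for query, results in serps.items()
               for domain in results)
    return 100.0 * pmds / total
```

By that test, www.springcreekbarbeque.com would count as a PMD for “barbeque”, while barbeque.com itself would be an exact match and would not.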

Across the 1,000 SERPs tracked, this PMD drop represents a change of only 18 partial-match domains that fell out of the top ten. It’s a bit more complicated than that, though. There were actually 36 PMDs that fell out of the top ten, and 18 new PMDs that entered the top ten, for a net difference of 18. Analyzing these domains one-by-one can turn into a wild goose chase pretty quickly, so let’s look at a couple of situations where a keyword lost multiple PMDs.

One query that lost two PMDs was “barbeque”. On 12/13 the following PMDs ranked in the top ten:

  1. www.springcreekbarbeque.com
  2. www.qbarbeque.com
  3. www.barbequeman.com
  4. scbarbeque.com
  5. www.waltsbarbeque.com

The next day, domains (4) and (5) fell out of the top ten. Domain (5) had been floating near the #10 spot, so that may be a fluke. Interestingly, for just one day, Wikipedia’s barbecue page fell completely out of the top ten, after ranking in the #1 position consistently. We’ll explore that in the next section.

Here’s another example with multiple PMD losses – the keyword “joannes” had three PMDs ranking on 12/13:

  1. www.joannesbedandback.com
  2. www.joannesbb.com
  3. www.joannesgourmetpizza.com

The next day, only (1) remained. Again, (2) and (3) were taking up the tail end of the top ten, and in this case were bumped out by Yelp and Urbanspoon, so this change may be smaller than it initially looks.

One PMD that lost ranking caught my eye – a query for “gmaps”. On 12/14, the domain www.mgmaps.com fell out of the top ten. This turns out to be a shift from a 10-result SERP to a 7-result SERP, and the PMD was sitting at #8 prior to the shift. Interestingly, though, Google Maps, which had been sitting at #2, took the #1 spot and got site-links and a 7-result SERP. We’ll come back to this one.

Sorry - we’re not exactly making the situation clearer, are we? I want to illustrate just how complex the situation really is. I’ve come to believe that not even Google fully understands the dynamic system they’ve created. Ultimately, there were no clear patterns across the PMD changes, so let’s dive into a couple of specific situations.

A Wiki Situation

Wikipedia suffered a rare (albeit temporary) loss of their coveted #1 position for the query “barbeque”. Since Wikipedia holds the largest share of top-ten real estate in our data set, a major change to the site (such as a technical problem that caused temporary de-indexation) could cause very large-scale flux in the rankings. Luckily, we can run these numbers.

On 12/13, Wikipedia had a 4.56% top-ten share in our data set, which dropped to 4.41% on 12/14, for a net loss of 14 rankings. This may not sound like much, until you recall that this change is on par with the 18-ranking PMD shift (and Wikipedia is just one site). In some ways, this seems to be an anomaly of 12/13 more than 12/14, as Wikipedia held a 4.46% share on 12/12. Historically, the 12/14 numbers aren’t unheard of – Wikipedia had a 4.82% share back in June, for example.
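To make the share numbers concrete, here's a rough sketch (using the same hypothetical data shape as the earlier example) of how one site's slice of the tracked top-ten rankings might be tallied:

```python
def site_share(serps, site="wikipedia.org"):
    """Return the percentage and raw count of tracked top-10 rankings
    held by a single site, matching the domain and its subdomains."""
    total = sum(len(results) for results in serps.values())
    held = sum(domain == site or domain.endswith("." + site)
               for results in serps.values()
               for domain in results)
    return 100.0 * held / total, held

# Across roughly 10,000 tracked rankings, a drop from about 4.56% to 4.41%
# works out to losing on the order of 14-15 individual top-10 spots,
# the same scale as the 18-ranking PMD shift discussed above.
```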

I should also note that the Wikipedia page in question for the query “barbeque” was actually the “/Barbecue” (alternate spelling) page. It’s possible that a spell-check adjustment or other very minor code tweak could have had unexpected repercussions.

This does go to show, though, how a site as powerful as Wikipedia can definitely have an impact on the overall SERP landscape. Like the PMDs, I don’t think it’s the entire picture, but it is a piece of the puzzle.

The Curious Case

Let’s go back to another oddity in the PMD analysis – the query for “gmaps”. On the morning of 12/14, the official Google Maps site not only jumped from #2 to #1, but it got site-links and a 7-result SERP, pushing out three domains. It’s easy to jump to conclusions and assume Google is favoring their own products, except that two pieces of data make that unlikely here.

The first clue is that Google Maps returned to the #2 position on 12/15 (and a 10-result SERP). The second is that we know something big happened on 12/13 – Google Maps finally re-launched on Apple’s iOS 6. Here’s a headline and time-stamp from Forbes:

[Image: Forbes headline and timestamp from 12/13]

Obviously, this story had a ripple effect across 12/13, and probably had a huge impact on metrics (CTR, dwell time, etc.) related to Google Maps and the official site. While this doesn’t help our quest to find the source of the update, it is interesting to note that a major news item could not only change a ranking, but cause a 7/10 shift in results. My ongoing investigations indicate that 7-result SERPs are highly dynamic and automatically change based on factors that may include user metrics and QDF (“freshness”).

The Big Movers

Everything to this point came out of just one data point – the PMD shift. Let’s go back to the beginning and ask the other obvious question – which queries changed the most from 12/13 to 12/14? This turns out to be a tricky question, because some queries are just naturally higher-flux than others. Typically, I compare the 24-hour “temperature” for any given query to the 7-day average for that query, to get a ratio. This helps indicate which queries are unusually high-flux (a minimal sketch of this ratio calculation follows the list). For 12/14, here are ten unusually high-flux queries (with temperatures):

  1. “knockout roses” (181°)
  2. “condo rentals” (168°)
  3. “rosatis pizza” (161°)
  4. “aerosoles store locator” (158°)
  5. “bj wholesale hours” (151°)
  6. “party stores” (143°)
  7. “kitchen sinks” (137°)
  8. “millionaire matchmaker” (125°)
  9. “celiac disease diet” (119°)
  10. “garnishment” (115°)
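Here's the minimal sketch of that flux-ratio calculation promised above, assuming a hypothetical history of daily per-query temperatures; it's illustrative only, not MozCast's actual code:

```python
def unusually_high_flux(temps_by_query, day, top_n=10):
    """Rank queries by how turbulent they were on `day` relative to
    their own trailing 7-day average temperature (assumed window)."""
    ratios = {}
    for query, temps in temps_by_query.items():
        baseline = sum(temps[day - 7:day]) / 7.0
        if baseline > 0:
            ratios[query] = temps[day] / baseline
    ranked = sorted(ratios.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_n]
```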

Any one query is an anecdote – the web changes. What we’re looking for in the data is a calling card of sorts – a story that ties these queries together. Unfortunately, the patterns are all over the place. Our top mover (1) was just a case of an eHow page jumping up the rankings. Two of these queries (6 and 10) have no clear explanation other than multi-spot shifts. Query (8) seems to be a case of QDF and has high volatility outside of the 7-day window.

Four queries (2, 4, 5, and 9) showed shifts in domain diversity. For three of them, one domain went from a single spot in the top ten to multiple spots. For query (5), though, one domain lost spots (diversity increased). Our top-view metrics aren’t showing any big overall shifts in domain diversity, but there are always winners and losers day-to-day.

Query (3) was another case where Wikipedia dropped out of the top ten, and (7) saw an Amazon product page fall from #1 to #10. In the case of (3), Yelp moved up and went from one ranking in the top ten to two. In both cases, the big sites regained their positions on 12/15, which is certainly interesting. If we look at the MozCast “Big 10” data, though, Wikipedia was still #1 and Amazon #2 on 12/14, and the overall SERP share of the Big 10 didn’t move much.

The Bigger Mystery

So, where does all of this leave us? A handful of people were kind enough to send me evidence of search traffic losses on 12/14, but it’s very difficult to reconcile these specific cases against MozCast’s sampling of top ten SERPs. I can’t pinpoint any single factor here, but it seems clear that the amount of change was unusual, and it can’t be simply explained by any single event (this data was all recorded prior to the tragic events in Connecticut, for example).

It’s possible that Google made a small change – so small that they didn’t even consider it an “update” – that had unexpected repercussions. It’s possible that something non-algorithmic but still under Google’s control happened, such as processing a large chunk of disavow requests (we have no evidence of this – just covering the bases). It could be that a small set of highly influential sites, like Wikipedia, made large-scale changes. Or it could just be a massive coincidence (although my gut still says no on this one).

I’d welcome further data and discussion. We’re actively working to expand the MozCast data set, and the next version of it will include some enhancements, including a keyword set that’s cleanly divided across some major categories/verticals. We’ll also be working in the new year to automate some of the analysis tools, so that we can process large numbers of SERPs more quickly. We’re learning as we go, and I hope the exploration is useful.



Help Team University

Posted: 19 Dec 2012 02:26 AM PST

Posted by Nick Sayers

Here at Moz, we live for happy customers. We want to give our customers the best experience possible when using our product, interacting with our community, and solving problems if they arise. The way we wanted to get there was to give everyone in the company a chance to do customer support. To help us reach our goal, we developed Help Team University, a daylong crash course where everyone at the company does support for our customers. The concept may seem a little odd. How does training developers, HR, and even executives on the finer details of customer care actually help customers?

Think about it this way: put a Mozzer who directly influences features in our products on the phone with a customer who has been adversely affected by a bug, and it will provide a completely new perspective. By establishing a new window into customer support, we hope to make our products better from the ground up. We hope that HTU is a small step in the direction of being the most customer-driven SaaS company in the world!

Why HTU?

I believe that every single person at Moz is in customer support. We create a product for people to use and love. Every piece of the Moz process, from the E-team to Operations, deeply affects customers. HTU reinforces this connection and keeps the customer fresh in Mozzers’ minds. We like to think that the best customer support is a well-built product that is easy to use. Having the entire company interact with the people who use the product every day will help us get there, because feedback is best when it comes directly from you, the customer.

Help Team University gives customers a tangible voice in shaping the entire company. For instance, one of our engineers had a conversation with a customer about a bug that had been persisting for a few weeks. The engineer then opened his laptop and started fixing the bug right then and there. In addition to providing on-the-spot fixes, HTU is also valuable in giving Mozzers insight into how customers actually use our product. Moz is rapidly growing, and some of our newest additions may help develop our SEO crawler, but it's tough to know exactly which pieces of their crawl diagnostics reports customers use without talking directly to them. HTU provides members from all teams the opportunity to learn how people use our tools, which helps our analytics set evolve to be more useful to you!

Another Level of TAGFEE

The most important thing Moz has to offer the world is TAGFEE. With HTU, we are teaching another level of TAGFEE to everyone at Moz, beyond how we express TAGFEE internally. The great thing about TAGFEE is that it looks different every time it's used, depending on the team or situation. People sitting down with us for HTU are getting a lesson in how to treat a customer with a deep level of empathy. Every conversation we have with a customer is framed with, “Well, put yourself in their shoes.” Mozzers are consistently surprised by how empathetic and generous we are to customers experiencing bugs. This will hopefully translate to generosity and empathy deeply embedded in our product.

Showing Off

The Help Team loves showing off how we interact with customers (for example, we send out a weekly digest to the entire company that lets them know how happy our customers are and what bugs are weighing heavily on our community). HTU takes this a step further. We get to show the rest of the team the entire process of how we collect happiness metrics and bug reports. HTU gives the Help Team a chance to show the rest of the team how we keep thousands of customers happy with 5-6 people.

Rewards

The HTU process wouldn’t exist if it weren’t rewarding. Everyone on the Help Team rewards students by teaching them with optimism and humor. We like to make everyone laugh and keep their spirits up, even if there is a tool outage or another issue prompting a lot of customers to reach out. We also like to keep the HTU students grounded by focusing their attention on how important PRO is to our customers. In my opinion, that intimate connection could be the most rewarding takeaway from Help Team University. I think this is the reason people ask to come back and do HTU again, and why engineering leads frequently stop in to ask about any issues we’ve noticed, or even just to chat about customers. To top it off, we add an awesome HTU achievement badge (designed by Abe Schmidt) to their Moz profile page.

I am extremely lucky and proud that HTU continues to inspire Mozzers. The entire process has been enlightening for the Help Team and the rest of Moz. I think any company with a customer care team would benefit from having all hands help support their community. If your company does something similar or has thought about ways to be radical advocates for your customers, please share in the comments! Also, if you have any great ideas we can implement, please let us know! Oh, and don’t forget to thank everyone on our Help Team and Sarah Bird for giving HTU life.

[Image: Rand Fishkin helping customers]

"It was a really fun day, and I feel like it grounded me back in the help world, which I've always loved." - Rand

"I really liked HTU. Specifically, I liked getting a feel for what kinds of issues come up and what Help Team thinks about to maintain a quality experience for customers.  I also liked being able to tell people outside of SEOmoz about HTU. Friends were very impressed by a company that values their Help Team's job enough to allow engineers to get mentoring from the Help Team. " - Ethel

"It good." - Miranda


