Tuesday, March 18, 2014

Announcing Moz Local: Simultaneous Listing Management on All Major Aggregators for $49/Year

Posted: 18 Mar 2014 01:19 AM PDT

Posted by David-Mihm

One of the many things that appealed to me about joining forces with Moz 18 months ago was the empathy that every Mozzer has for business owners and marketers trying to keep up with the frenetic pace of change in local search. Although it's generally thought of as less competitive than a lot of other disciplines (like news, video, or e-commerce SEO), the prerequisite set of tasks for success in local search continues to grow.

In the shift from desktop to mobile, local search is fragmenting more than ever, and business listings are an increasingly critical foundation. NAP consistency (establishing a canonical Name, Address, and Phone Number for your business location) ranks among the top local search ranking factors year after year. All the link building and social media in the world won't help a business if Google can't trust its information and customers can't reach it.
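To make "consistent" concrete: two listings only count as the same NAP once trivial formatting differences are normalized away. Here's a toy illustration in Python (not how any listing service works internally, just the idea):

```python
import re

def normalize_nap(name, address, phone):
    """Reduce a Name/Address/Phone triple to a comparable canonical form.

    Deliberately simplistic: real listing data also needs street-type
    abbreviations, suite numbers, and more handled consistently.
    """
    def squash(s):
        return re.sub(r"\s+", " ", s).strip().lower()
    digits = re.sub(r"\D", "", phone)[-10:]  # keep the 10-digit US number
    return (squash(name), squash(address), digits)

# Two renderings of the same business match once normalized:
a = normalize_nap("Moe's Pizza", "123 Main St.", "(206) 555-0100")
b = normalize_nap("Moe's  Pizza", "123 main st.", "206.555.0100")
print(a == b)  # True
```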

Whether you're a small agency trying to serve dozens of mom-and-pops on a limited budget, or a large brand manager tasked with managing listings for hundreds of stores, the time it takes to ensure the accuracy and visibility of business information is overwhelming, to say nothing of the time it takes to correct errors, align categories, deal with PIN or postcard verifications, or add missing listings. And it's often prohibitively expensive.

So as we thought about how to evolve GetListed's original product, we decided to start by helping solve the fundamental pain point of local search: ensuring accurate, consistent business listing information on the most important sites on the web.

What does Moz Local do?

For a high-level overview, check out this video:

Our goal is to make Moz Local the most efficient option for location management, with an easy-to-use interface and an affordable price point.

In a nutshell, Moz Local allows you to upload a spreadsheet of all of your locations, which we then standardize and distribute to all five major U.S. data aggregators:

  • Infogroup
  • Neustar Localeze
  • Acxiom
  • Factual
  • Foursquare

and three important local directories:

  • Superpages
  • eLocal
  • Best of the Web Local

for $49/year per location.

After submitting your locations, we provide you with full reporting about the status of each listing (with links to those listings live on the web, where available). We'll also surface possible duplicate listings we discover across the ecosystem, provide you with the fastest path to correcting or closing those duplicates, and notify you of any unauthorized changes to your NAP that we come across in our local web crawl.

To dive into the product, visit Moz.com/local and download our CSV template. If you currently manage your locations at Google Places, though, you can get a head start by simply uploading that spreadsheet to Moz Local (we accept all the same field names and categories). Full documentation for the product is available here, and FAQs and a deeper description of how the product works are here.
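If you're assembling that spreadsheet by hand, a quick sanity check before uploading can save a failed import. Below is a minimal Python sketch; the column names are illustrative stand-ins rather than the official template headers (check them against the Moz Local CSV template or your Google Places export), so adjust to match.

```python
import csv

# Illustrative column names only -- verify against the actual Moz Local
# CSV template (which accepts Google Places field names).
REQUIRED = ["Name", "Address Line 1", "City", "State",
            "Postal Code", "Main Phone", "Categories"]

def check_locations(path):
    """Yield human-readable problems found in a locations CSV."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
        if missing:
            yield "missing columns: " + ", ".join(missing)
            return
        for lineno, row in enumerate(reader, start=2):  # line 1 is the header
            empty = [c for c in REQUIRED if not (row.get(c) or "").strip()]
            if empty:
                yield f"line {lineno}: empty {', '.join(empty)}"

for problem in check_locations("locations.csv"):
    print(problem)
```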

Key features

Upgraded Listing Details page (free to all Moz Community members)

The original single-location lookup functionality from GetListed is still available at moz.com/local/search, and you can also access these Listing Details from your Moz Local dashboard. As part of the Moz Local changeover, we've upgraded it with a much snazzier results page and a quicker visual indication of how a business is doing and where you should focus your efforts.

Category Research Tool (free to all Moz Community members)

One of my persistent headaches back when I was a full-time local search consultant was hunting down the right category, with its slight wording variations, on every single local search site as I submitted listings.

With that in mind, we designed the Moz Local Category Research Tool to be a huge time- and energy-saver. Start typing the keywords or industry your business is in, and we'll start refining the list of categories right before your eyes. Selecting a category will then show you how it maps to different search engines or directories when we publish your listing.

If there's a more specific category on a particular search engine that you'd rather submit for a given listing, simply add it to the Category Overrides field in your CSV spreadsheet.

Duplicate listing notifications

As I mentioned above, we provide reporting on possible duplicate listings in the ecosystem, and where possible, we present you a direct path to closing them. Right now you'll see a relatively tight set of possible duplicates, but going forward you'll see a wider possible set to help you clean up old addresses, changed business names, or unwanted tracking phone numbers.

Expanded Learning Center (free to all Moz Community members)

Huge thanks to Miriam Ellis for her assistance in compiling, updating, and editing this greatly expanded version of the GetListed Learning Center. We now offer 41 pages full of local marketing background and best practices. The top pages from the original Learning Center like the local search glossary, marketing priority questionnaire, and the local search ecosystems are all still available.

Features we're already working on

We've already gotten some terrific feedback from our Customer Advisory Board and other customers during a private beta period, and the product we're releasing today is much better as a result. Going forward, we're eager to hear from the Moz community which feature areas you'd like to see us expand into.

Features currently on our list include:

  • allowing for the editing of single locations in-app

  • building custom-branded and emailed reports

  • showing individual listing progress over time

  • adding additional search engine and data partners
    (if you're interested in a data partnership with Moz, please email Ryan Watson!)

I have a feeling it will be a common request, but at this point Moz Local supports U.S. business locations only. International versions of the product aren't on our near-term development roadmap.

Thanks all around

There are a lot of people to thank with such a big product release; it has definitely been a team effort:

  • the entire Local Engineering and Inbound Engineering teams here at Moz

  • the Marketing and Community teams, especially my "point person" for coordinating those efforts, Elizabeth Crouch

  • the Executive Team for giving us the leeway and the budget to build this product

  • Josh Mortenson, Elijah Tiegs, and Elizabeth Crouch for our video

  • Jackie Immel and Courtney Davis for their help in coordinating our beta period

  • our beta testers for their participation and patience!

  • the data aggregators and directories who have partnered with us

  • the users of GetListed who have given us so much great feedback over the years

I'm sure that's leaving dozens, if not hundreds, of people out, but I'm truly grateful for the support of everyone in the local search community over the years. As with many software endeavors, it's taken us a little longer to get here than we'd hoped, but we hope that you in the Moz community think it was worth the wait!

The formal press release announcing Moz Local can be found here.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

A Startling Case Study of Manual Penalties and Negative SEO

Posted: 17 Mar 2014 03:53 AM PDT

Posted by YonDotan

This January, I was at a talk at SMX Israel by John Mueller, Google’s Webmaster Trends Analyst, about how to recover from a manual penalty. The session’s moderator opened the talk by asking the hundreds of people seated in the room to raise their hands if they had ever been affected by, or had a client affected by, a manual penalty. Nearly the entire room raised their hands, myself included.

Setting the Plot

I am the head of SEO at yellowHEAD, an online marketing agency. One of our clients, whom we are very lucky to have, is a company called Ginger Software. Ginger has a set of context-sensitive grammar and spell check tools that can be integrated with e-mails, browsers, Microsoft Office, and more. When we began working with Ginger, they were in a great state from an SEO perspective. I won’t get into traffic specifics, but their site has an Alexa ranking of around 7,000.

Ginger was getting traffic from thousands of different keywords. They had links from news portals, review websites, forums, and social bookmarks: all part of a really great backlink profile. Ginger could be the subject of a whole separate case study on the benefits of a content strategy. They have put months of work into online tools, sections about spelling mistakes, grammar rules, and more. These things have attracted great traffic and links from around the world.

The Plot Thickens

Given the above, you can imagine our surprise when one day I found in my inbox the dreaded notice from Google that gingersoftware.com had received a site-wide manual penalty for unnatural inbound links. We quickly set up a call and went through the tooth-rattling ordeal of explaining to our client that they no longer even ranked for their own brand name. Organic traffic dropped by a whopping 94%, and that for a website that gets 66% of its traffic from organic Google search.

I’m not going to highlight where they got the penalty … because I think you can tell.

Full Disclosure

Before we go on any further with this case study, I should come clean. In the years of my working in SEO, I have shamelessly bought links, posted crappy blog and forum comments, and run programs that automatically build thousands of spam links. I have bought expired domains, created blog networks, and have ranked affiliate sites with every manner of blackhat technique.

With that off my chest, I will say with as clean a conscience as possible: we did absolutely nothing of the sort for Ginger. While everyone at yellowHEAD has experience with all manner of SEO tactics, as an agency we work with big brands whose presence we are categorically not willing to risk. Ginger is a true example of a site that has ranked well because of an extensive and well-thought-out content strategy, one driven by creating valuable content for users. When analyzing Ginger’s backlinks, we were amazed to see the kinds of links this strategy had earned. Take, for example, this forum link on the Texas Fishing Forums.

I was positive that this link would be a spam forum comment or something of the sort. Turns out that it’s a page on a fishing forum about Zebra Mussels. Someone got confused and called them Zebra Muscles; a veteran user corrected them by linking to Ginger’s page about muscle vs mussel.

The Plot Thickens… More.

As we dug deeper into Ginger’s backlinks, we quickly found the problem. Ginger had recently accrued a large number of extremely spammy links. Bear with me for a bit, because these links require some explanation. GingerSoftware.com was being linked to from random pages on dozens of different websites, in clearly spun articles about pornography, pharmaceuticals, gambling, and more. These pages linked to random marginal articles on Ginger’s website, like this page, always using the same few keywords: “occurred,” “subsequently,” and a few other similar words. The only thing these words had in common was that Ginger ranked in the top three for them in Google.

I had to blur most of the text from this page, as it was inappropriate.

Now, needless to say, even if we had been trying to rank Ginger’s site, let’s call it ‘unconventionally,’ we wouldn’t have done it by pointing articles about pornography at unimportant pages that were already ranking in the top three.

Now here’s where it gets REALLY interesting

Further investigation into these pages found the same exact articles on dozens of other websites, all linking to different websites using exactly the same keywords. For example:

Link to Wiktionary.org

Link to TheFreeDictionary.com

Link to Thesaurus.com

So, what the $#@!%!#$^ are these links?!

As I mentioned in my disclosure, I am no newcomer to link spam, so I happen to know a bit about what these links are. These articles were, first and foremost, not created by us or by anyone else at Ginger. Nor were they posted with Ginger Software, or any of the other websites linked in those articles, in mind. These articles were posted by spammers using programs that automatically build links (my guess is GSA Search Engine Ranker) in order to rank websites. Each one of these articles linked to some spam website (think something like the-best-diet-pills-green-coffee-beans-are-awesome . info or some nonsense like that) in addition to linking to Ginger.

These programs find places on the internet where they can automatically post articles with links. As a way to ‘trick’ Google into thinking the links are natural, they also include links to other big websites in good neighborhoods. Common targets for these kinds of links include Wikipedia, BBC, CNN, and other such websites.

Ginger was not the victim of negative SEO, but was simply caught in the crossfire of some spammers trying to promote their own websites.

We Had Doubts

Once we found these links, we honed our search to find all of them. We were able to do this using Ahrefs, which is a fantastic tool for any sort of link analysis. We organized all of the links to Ginger by anchor text and went after all of the ones with the aforementioned keywords. We removed as many of these links as possible, disavowed the rest, and filed a reconsideration request (more on that process below).
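For anyone facing a similar cleanup, here’s a rough sketch of that triage step in Python. The "Anchor" and "Referring Page URL" column names are placeholders (backlink export headers vary by tool and version), but the output does follow Google’s disavow file format: one domain: entry or URL per line, with # for comments.

```python
import csv
from urllib.parse import urlparse

# The anchor texts the spam campaign kept reusing.
SPAM_ANCHORS = {"occurred", "subsequently"}  # extend with the other keywords

def spam_domains(export_path):
    """Collect referring domains whose links use a known spam anchor.

    Column names are placeholders; match them to your backlink
    tool's actual export headers.
    """
    domains = set()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if (row.get("Anchor") or "").strip().lower() in SPAM_ANCHORS:
                domains.add(urlparse(row["Referring Page URL"]).netloc)
    return domains

# Google's disavow format: "#" comments, one "domain:" or URL per line.
with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# spam domains found via anchor-text triage\n")
    for domain in sorted(spam_domains("backlinks_export.csv")):
        out.write(f"domain:{domain}\n")
```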

As confident as we were on the face of it all, we had serious doubts. We knew how important it was for Ginger’s business to get past this penalty as quickly as possible, and we didn't want to get anything wrong. We couldn't find any other “bad links” besides these, but we kept thinking to ourselves, “There’s no way Google completely slapped a website over some spam links to these random pages.” There had to be more to it than that!

Ginger themselves handled this situation incredibly well. Where they could have yelled and gotten angry, they instead said, in a sentence: “OK, let’s fix this. How do we help?” With Ginger’s help, we mobilized dozens of people inside their company, trained them on finding bad links, manually reviewed over 40,000 links, contacted all domains that had spam links on them, disavowed everything we couldn't get removed, and submitted the request for reconsideration on December 17th, only five days after the site was penalized. The extreme sense of urgency came both from the importance of organic traffic to Ginger Software and from the upcoming Christmas and New Year’s holidays: we knew that everyone going on vacation would significantly increase the time it took to have the reconsideration request reviewed. You can find a very long and detailed explanation of the process we used to clean up Ginger's links here.

Despite the speed with which we were able to submit the request, it took nearly a month to hear back from Google. On January 15th, we received a message in Google Webmaster Tools that the penalty had been revoked. We, and the staff at Ginger, were ecstatic and spent the next few days glued to our ranking trackers and to Google Analytics to see what would happen. Rankings and traffic quickly began to rise and, as of the writing of this article, traffic is at about 82% of pre-penalty levels.

Lo and Behold: Rankings!

The (Very) Unofficial Response from Google

Getting over the manual penalty, in some ways, was almost as surprising as getting it. The fact that all we did was remove and disavow the negative SEO links and the penalty was removed indicates that, indeed, the penalty may have been caused entirely by those links.

Towards the end of the manual penalty session at SMX, I crept slowly towards the front of the room, and as soon as the talk was over, as unexpectedly as a manual penalty, I pounced on the speakers’ podium to talk to John Mueller before everyone else. I explained the situation with Ginger to him (in a much shorter version than this article) and asked whether Google was aware of this kind of thing and what they planned to do about it.

John responded with something along the lines of the following:

“You mean like when somebody creates spam links but also links to Wikipedia? … We have seen it happen before. Sometimes we can tell but sometimes it’s a little bit harder… but [if] you get a manual penalty from it you will know about it so you can just disavow the links.”

I have to say, I was pretty surprised by that response. While it wasn't exactly an admission of guilt, it wasn't a denial either. He basically said yes, it can happen, but if it does, you'll get a manual penalty, so you’ll know about it!

So What Does It All Mean?

One wonders if Google understands the impact a manual penalty can have on a business and if they truly accept the responsibility that comes along with handing out these kinds of punishments. Ginger, as a company, relies on search traffic as their main method of user acquisition and they are not unique in that sense. There are a few important takeaways here.

1.) CHECK YOUR BACKLINKS

No matter who you are, big or small, this is crucial. This kind of thing can happen, seemingly, to anyone. We have instituted a weekly backlink scan for Ginger Software in which we look through all of their new links from Webmaster Tools, Ahrefs, and Majestic SEO. If we find any more spam links (and we are still finding them), we try to remove them and add them to the disavow list. Time consuming? Yes. Critical? Yes.
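A lightweight way to run that weekly scan is to diff each new export against the referring URLs you’ve already reviewed, so only the new links need human eyes. A minimal sketch, again with a placeholder column name to adapt to your tool’s export:

```python
import csv

def referring_urls(export_path, url_column="Referring Page URL"):
    """Collect the set of referring page URLs from a backlink export CSV.

    The column name is a placeholder; match it to your tool's export.
    """
    with open(export_path, newline="", encoding="utf-8") as f:
        return {row[url_column] for row in csv.DictReader(f) if row.get(url_column)}

seen = referring_urls("last_week_export.csv")
current = referring_urls("this_week_export.csv")

# Only links that appeared since the last scan need manual review.
for url in sorted(current - seen):
    print(url)
```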

2.) Negative SEO is Alive and Real

It has been my thinking for a long time that links should not be able to hurt your website; at most, a link considered bad should be discounted. The current system is dangerous and too easy to game. With Ginger, it was obvious (to us, at least) that these links were no doing of their own. The links sat in absurd, lowest-quality places and pointed to unimportant, low-value pages of Ginger’s website. If this had actually been a negative SEO attack, imagine how easy it would be to make it look like the company’s own doing.

3.) Google is making themselves look REALLY bad.

The action that Google took in this case was far too drastic. The site didn’t receive a partial penalty, but rather a full-blown sitewide penalty. According to the keyword planner, for the top four branded terms for Ginger, there are 23,300 searches per month. In this case that became 23,300 searches per month where people could not find exactly what they were looking for.

Google has an amazing amount of work on their hands staying ahead of the spammers of the world, but they have also become the foundation of the business models of companies worldwide. To quote FDR and Spider-Man (who can argue with that???), “with great power comes great responsibility.” We can only hope that Google will heed these words; in the meantime, we will be happy with the fact that Ginger is back up and running.


