
Keyword-Driven Personas - Whiteboard Friday

Posted: 18 Apr 2013 06:12 PM PDT

Posted by RuthBurr

As inbound marketing is gaining traction, marketers in all inbound disciplines are realizing the importance of taking on keywords with a more holistic approach. It's time to start building your keywords into the bones of your site, rather than adding them once your site is already completely mapped out. 

In today's Whiteboard Friday, Ruth Burr discusses how you can use your keywords to drive personas, and ultimately affect your site mapping process for the better. Leave your thoughts and questions in the comments below! 

 

For your viewing pleasure, here's a still image of the whiteboard used in this week's video!

Still image of Whiteboard Friday - Ruth Burr - Keyword-Driven Personas

Update: Ruth referred to some code that Mike King of iAcquire put together that may help your site if integrated into your analytics. Give it a look!

Video Transcription

"Howdy, SEOmoz fans. My name's Ruth Burr. Welcome to another Whiteboard Friday. I'm the Lead SEO here at SEOmoz, and today I want to talk about using keywords to drive personas and ultimately your site mapping process.

One thing that we're really thinking a lot about as we move more and more toward an inbound marketing model, where there are multiple different people with multiple different functions all working together to have the best inbound marketing possible, is what we're doing with keywords and when we're adding keywords into the site. I know that we've all had the experience in years past where we would get a site or a piece of copy that was completely written and then just have to plug our keywords into that existing content wherever they would fit. You might have an entire site that's already completely mapped out, it's got a sitemap, it's got information architecture, and then you're supposed to go in and put in your keywords. I've found that that doesn't always create the best user experience, and it also isn't as effective as taking a more holistic approach.

So what I'm really hoping you guys will do with this is take it back to your UX and IA teams and really think about how you can build keywords more into the bones of the site.

One thing that Google is thinking a lot about that is really important for us to be thinking about as marketers as well is searcher intent. Search engines are spending tons of money and tons of time and tons of effort trying to figure out what people are searching for when they use a keyword. It behooves us as marketers to do the same thing because that way we can give people what they want when they tell us they want it, and that's the beauty of search engine marketing.

My example here is chocolate cookies, because I like to think about cookies. You might have somebody that's searching for the keyword "chocolate cookies," and maybe you own ChocolateCookies.com, a great domain. If that's the case, you don't really know what they want when they search for chocolate cookies. They could be looking to buy chocolate cookies. They could want to learn how to make chocolate cookies. They could want recipes. You might also have ingredients. Maybe in addition to cookies you sell ingredients for cookies. Maybe you have recipe content and sales content, and you want to know how to serve up each of those pieces of content in a way that's really going to serve the user. What you can start doing is really thinking about the search intent of each one of these keywords and building that into a traditional persona-based marketing model.
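To make that keyword-to-persona mapping concrete, here's a minimal sketch in TypeScript of keywords tagged with a searcher intent and assigned to a persona. Every keyword, intent label, and persona name in it is invented for illustration, in the spirit of the made-up-data caveat that follows.

```typescript
type Intent = "transactional" | "informational" | "navigational";

interface KeywordProfile {
  keyword: string;
  intent: Intent;
  persona: string; // which persona this keyword maps back to
}

const keywordMap: KeywordProfile[] = [
  { keyword: "chocolate cookie recipes", intent: "informational", persona: "baker" },
  { keyword: "buy chocolate cookies online", intent: "transactional", persona: "shopper" },
  { keyword: "bulk cookie delivery", intent: "transactional", persona: "corporate buyer" },
];

// Group keywords by persona so each persona's pages can be planned
// around its full keyword set rather than one keyword at a time.
const byPersona: Record<string, string[]> = {};
for (const k of keywordMap) {
  (byPersona[k.persona] = byPersona[k.persona] || []).push(k.keyword);
}

console.log(byPersona);
// { baker: [...], shopper: [...], "corporate buyer": [...] }
```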

This is my example model. All of these examples are made up. The data is not real. You cannot take this data and just go build ChocolateCookies.com. Well, you could, but results are not guaranteed. To reiterate: this data is made up.

In my ChocolateCookies.com example, we've got three different personas. We've mapped out who they are and what they want. Now we can actually assign keywords to them. Say you're trying to target people who want to make cookies. What they're looking for is recipes and ingredients. They are not looking to buy cookies. If somebody googles "chocolate cookie recipes" and they click through to your site and land on a page pushing them to buy cookies from you, that is a bad user experience. Those people are not going to buy cookies, and they're also going to bounce right back to the search results.

That is the kind of thing that search engines are tracking. How quickly did somebody return to the search results page from your site? Did they do it without taking an action? If so, that can be a signal that you're not serving up quality content. It's bad from a ranking-factors perspective, and it's also bad because that person did not give you money, and that's what we're trying to do: sell cookies.

So you really want to make sure that when this person is searching for these keywords, which you've mapped back to their persona, you're serving up chocolate cookie recipes. And if they're looking for ingredients, you're serving up ingredients. Then you're creating an entire experience. You're not just paying lip service, saying, "Oh, here's a recipe, now buy a bunch of stuff." You really are serving them that high-quality content that users love, that brings them back to the site again and again. If the recipe content is good enough, this baker might even share your content with their friends, and maybe even link to it from their blog that's all about making cookies. Wouldn't that be nice?

Then you might also have somebody who does not want to make cookies because they don't have that kind of time. They want to buy cookies. They just want to buy them and then eat them. It's a model that I practiced for years. So they're going to be looking to buy cookies online. They're not going to care about recipes at all. They're not going to care about ingredients at all. They're going to be much more purchase-driven and be looking at keywords around their favorite brands and looking for sales. These are the people that you can really incentivize with calls to action and trust signals, like free shipping, delivery, sales, coupons, join our mailing list, and things like that. You've now mapped these back, so again you're creating this entire experience and all of this content based around the fact that this person does not care about recipes at all, they just want to buy.

Then our third persona is somebody who's buying at the corporate level. Maybe they're an office manager, or at SEOmoz, Team Happy is constantly buying us goodies and snacks, and we love that. But this person is in charge of the cookie supply at their office. What, does your office not have cookies? I'm so sorry. Get some cookies.

So this guy, he doesn't care about recipes at all. He's not going to make cookies every day for 100 people. He wants to buy them, and he's not spending his own money. He's spending the company's money. So he's looking for things like a corporate discount or a bulk discount. Maybe he's catering a party. He needs same-day delivery. These are the things that are really going to be important to this person. Since you know that, you can create content that is solely targeted toward this one person, this one buyer. Especially if you have things like a corporate discount, this is the place to really show it off.

So you've got these three different personas, and they're taking three very different paths through the site and they're consuming the site in different ways, whether it's buying a bunch of stuff, buying one thing, consuming your content and buying ingredients, coming back. Each of these personas is experiencing your content in very different ways. Rather than just creating one site and popping in keywords all willy-nilly so that all of these people are having the same experience, you can start crafting unique user experiences for each of these people based on their paths through the site.

Great, except that all of that takes time and money. At most businesses, time is money in some way, and you may actually have to spend some real money on it too. One of the things that I really recommend doing during this part of the process is running some PPC campaigns around the keywords where you're trying to define user intent. If somebody's just searching on chocolate cookies, you might not know if they want to buy them, or make them, or what they want to do. So use PPC, run a little test, and see whether people respond better if you've got recipes, or free shipping, or whatever the different calls to action are for those more generic terms. Over time you can start to see what the majority of users' intent is and what they really respond to, and craft experiences for those more generic terms based around that. That's a really great way to use PPC as a little guinea pig test.
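Here's a rough sketch of what reading such a PPC intent test might look like once the numbers come back from your ad platform. The variant names and figures below are invented for illustration.

```typescript
interface AdVariant {
  name: string;        // the call-to-action angle being tested
  impressions: number;
  clicks: number;
  conversions: number;
}

// Two invented ad angles run against the same generic query, "chocolate cookies".
const variants: AdVariant[] = [
  { name: "recipes angle", impressions: 10000, clicks: 420, conversions: 13 },
  { name: "free shipping angle", impressions: 10000, clicks: 380, conversions: 34 },
];

for (const v of variants) {
  const ctr = (100 * v.clicks) / v.impressions;
  const cvr = (100 * v.conversions) / v.clicks;
  console.log(`${v.name}: CTR ${ctr.toFixed(2)}%, conversion ${cvr.toFixed(2)}%`);
}
// A markedly higher conversion rate on one angle suggests the dominant
// intent behind the generic term, which can then shape the landing page.
```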

Now here comes my favorite part, because it involves metrics. What you can do is go into Google Analytics, or whatever analytics tools you use, and start looking at these behaviors based on keywords. Once you've got your persona and you've got your keywords assigned to that persona, first of all, make sure that all of these keywords really do belong to the same persona. Make sure that users who enter on those keywords are taking similar paths through the site and executing similar actions. That's a great secondary indicator that all of these keywords do belong to this same persona.
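A hedged sketch of that sanity check, assuming you've exported per-keyword visit data from your analytics tool; the field names and sample rows below are invented.

```typescript
// One row per visit that entered the site on an organic keyword.
interface KeywordVisit {
  keyword: string;
  pageDepth: number;   // pages viewed during the visit
  converted: boolean;
}

// Invented sample data standing in for an analytics export.
const visits: KeywordVisit[] = [
  { keyword: "chocolate cookie recipes", pageDepth: 6, converted: false },
  { keyword: "easy chocolate cookie recipe", pageDepth: 5, converted: false },
  { keyword: "buy chocolate cookies", pageDepth: 2, converted: true },
];

function profileKeywords(data: KeywordVisit[], keywords: string[]) {
  const subset = data.filter(v => keywords.includes(v.keyword));
  const avgDepth = subset.reduce((sum, v) => sum + v.pageDepth, 0) / subset.length;
  const conversionRate = subset.filter(v => v.converted).length / subset.length;
  return { visits: subset.length, avgDepth, conversionRate };
}

// If these two "baker" keywords show similar depth and conversion profiles,
// that's the secondary indicator that they belong to the same persona.
console.log(profileKeywords(visits, [
  "chocolate cookie recipes",
  "easy chocolate cookie recipe",
]));
```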

Start looking at what they do. Maybe you get the most traffic from the baker, but you get the most revenue per order from the corporate guy. Maybe the shopper doesn't return as much, but she does convert at 2.4%. The baker spends the longest time on site, but maybe she doesn't buy as much. These are the things that you can start to look at and say, "Okay, so we know that the baker spends a lot of time on site, that's great. What can we do to encourage her to turn that into a purchase? How can we brand message to her in ways that make her feel more comfortable buying ingredients, or what can we do to incentivize her sharing this content which clearly she's consuming or loving?"
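As a tiny illustration of that kind of persona-level readout; all figures are invented, per the made-up-data caveat earlier.

```typescript
interface PersonaStats {
  persona: string;
  visits: number;
  orders: number;
  revenue: number; // in dollars
}

// Invented figures, in the spirit of "this data is made up".
const stats: PersonaStats[] = [
  { persona: "baker", visits: 12000, orders: 180, revenue: 2700 },
  { persona: "shopper", visits: 5000, orders: 120, revenue: 3600 },
  { persona: "corporate buyer", visits: 900, orders: 45, revenue: 9000 },
];

for (const p of stats) {
  const conversion = ((100 * p.orders) / p.visits).toFixed(1);
  const revenuePerOrder = (p.revenue / p.orders).toFixed(2);
  console.log(`${p.persona}: ${conversion}% conversion, $${revenuePerOrder} per order`);
}
// The corporate buyer sends the fewest visits but the highest revenue per
// order: exactly the kind of contrast that drives prioritization.
```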

The same thing with the corporate guy. If he's got the highest revenue per order, obviously we want more of this guy. We want to figure out what he wants, what he is doing, and what triggers we can use to get him to buy more or return to the site more often. You can start really testing, and that's great because it allows you, even before you've done any of that amazing tweaking and testing, to say, "Okay, which of these personas is the biggest mover of the needle? What could we be doing to encourage more of the behavior we want, fastest?" Then that'll help you prioritize, and it'll help you target your efforts and your budget.

Then if you want to go above and beyond and really get in there and be a little bit creepy, what you can do is actually link up your site to Facebook Open Graph so that people are opting in to a Facebook app when they're registering on your site. They're connecting with Facebook. So there is that opt-in. You don't just want to take people's information. Once you've done that, you can actually link up your Google Analytics code to your Facebook Open Graph data, and you can start getting real demographic data on the actual people who are using these keywords and coming to your site. Now in addition to knowing that the baker is 40% of searches, you know that she's 35 to 40, you know she's female, and you know she's a mom. For the corporate guy, you know that most of the time he works at a company of more than 100 people. So you can really start targeting these people based on their demographic information.
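For the mechanics, here's an independent, illustrative sketch of that opt-in flow; this is not the iAcquire code mentioned above. It assumes the Facebook JS SDK and Universal Analytics are already loaded on the page, and that custom dimensions 1 and 2 have been configured in the GA admin. Those dimension slots and the Open Graph fields used are assumptions, so treat this as a starting point for your devs rather than a drop-in.

```typescript
// Assumed to be loaded on the page already, outside this snippet.
declare const FB: any;                       // Facebook JS SDK
declare function ga(...args: any[]): void;   // Universal Analytics (analytics.js)

FB.login((response: any) => {
  if (!response.authResponse) return; // user declined the opt-in: collect nothing

  FB.api("/me", { fields: "gender,age_range" }, (me: any) => {
    // Custom dimensions 1 and 2 are assumed to be set up in the GA admin
    // as session-scoped "Gender" and "Age range" slots.
    if (me.gender) ga("set", "dimension1", me.gender);
    if (me.age_range && me.age_range.min) ga("set", "dimension2", String(me.age_range.min));
    ga("send", "event", "persona", "facebook-opt-in");
  });
}, { scope: "public_profile" });
```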

What you also learn then is who these people are that like you so much. They're coming to your site over and over. They're buying things from you, which is really what we're trying to do here. And you can start targeting more of those people in your own SEO efforts, in your own customer acquisition efforts. You're targeting them on social. You're reaching out to them for links. You're buying ads to put in front of them, and you have more confidence that you'll have a return on those ads because you already know these are the kind of people who like you.

So you have all of this information about keywords and about personas. Now you can take that back to your user experience team, to your information architects and say, "Hey, let's redo the sitemap and have it be based on these personas, based on these proven user behaviors that start with a keyword and end with a purchase, and let's build experiences for those keywords." Now instead of just saying, "Well, here's what I think. We've got like About Us, Contact Us, Products." You can really say, "These are three main personas, so in the header we should probably have cookie recipes, shop cookies, corporate discount," and know that even from page one on the site whenever one of your target people comes to the site, it's really easy for them to find the experience they're looking for, make their way through the site, and then buy something.

Mike King of iAcquire, who blogs at ipullrank.com, put together some code using Stack Overflow, which may or may not work on your site. Take it to your devs and see if they can make it work with your analytics. Every site is different. Your mileage may vary, but there is a link to it here at the bottom of the screen. There should be. It's invisible to me, but you can see it.

Now that you have this data, go to your UX people and show them the power of keyword-driven site mapping. Show them how SEO has so much to do with what they do, and not only will this project work for you, but in the future they'll be more likely to come back to you and say, "Hey, we're going to change the whole site, and we thought you should know before we do it." That's what you want.

That's it for Whiteboard Friday this week. Thanks for coming by you guys. See you next time."

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

The Difference Between Penguin and an Unnatural Links Penalty (and some info on Panda too)

Posted: 18 Apr 2013 05:32 AM PDT

Posted by Marie Haynes

Are you confused about the difference between Penguin and an Unnatural Links penalty? Not sure whether you should be disavowing your links? Wondering whether you should file for reconsideration? Well...you're not alone! I have spent a good amount of time answering questions and learning from others in the SEOmoz Q&A and I see a lot of site owners and even SEOs who are unsure about the answers to these questions.

Recently, a YouMoz article (which was promoted to the main blog) showed an image of the unnatural links warning the author's site received and then stated:

"We straight away knew that we had been hit by Google's Panda 3.9.1 update!"

Oh dear. An unnatural links warning is NOT an indication that you have been affected by Panda! Now, this article and the comments below it have some great information on unnatural links recovery, so I don't want to be too harsh on the author. My point in mentioning it, though, is that even SEOs who know a thing or two about Google penalties and algorithm changes can be confused on these matters.

A confession - I messed up too.

I am insanely obsessed with understanding Penguin, Unnatural Links Penalties and Panda. I really don't know why. But it all started because I made a mistake. I was part of an SEO forum discussion in which a site owner felt they had been affected by the Penguin algorithm. I told him to clean up his bad links and then file for reconsideration. A senior member of the forum rightfully corrected me and said that I was giving incorrect advice. And he was right! As I will discuss further on in this article, filing for reconsideration is not going to help a Penguin-hit site. I gave some bad advice, and I am grateful that I was corrected. What that correction did was make me realize that Penguin and Unnatural Links Penalties are confusing. A lot of SEOs, myself included at the time, had a lot to learn about these issues. I made a decision that day that I would learn everything I could about algorithm changes and Google penalties.

A Brief Description of Penguin, Unnatural Links and Panda

Before we start answering questions, here is some fundamental information about Penguin, Unnatural Links Penalties and Panda:

The Penguin Algorithm

On April 24, 2012, Google announced "Another Step to Reward High Quality Sites", an algorithm change aimed at fighting webspam. The algorithm change was first called "The Webspam Algorithm" but eventually came to go by the name "Penguin". This algorithm severely affected sites with widespread keyword stuffing and participation in link schemes. Matt Cutts, head of webspam at Google, eventually admitted on Twitter that links are "a primary area to monitor" when you have been affected by Penguin.

Matt Cutts Tweet

What most SEOs believe is that one of the primary causes of Penguin is sites creating easily-made links containing keywords as anchor text from low-quality places such as article marketing sites, bookmarks, and do-follow comments.

Unnatural Links Penalties

Unnatural links warning

These penalties are manual penalties that Google can place on sites when they determine that a site is widely attempting to manipulate the search engine results by creating links. These penalties are manual, as opposed to Penguin, which is algorithmic. So, what causes a site to be hit with an unnatural links warning?

Most webmasters believe that if someone files a spam report against you, then this will open up your site for a manual review. Some have speculated that Google monitors some of the more competitive niches such as "payday loans", "car insurance", casino sites, etc. and manually checks for unnatural links. No one knows for sure.

The Panda Algorithm

The Panda Algorithm was created by Google in an attempt to cause low-quality sites to be displayed much lower in the search results. When Panda first hit, it was an unnamed algorithm. Many called it the "Farmer update", as it seemed to be aimed at content farms that ranked well as a result of scraping content from other sites. Most SEOs believe that sites affected by Panda have issues with on-page quality, as opposed to the link-quality issues behind Penguin and Unnatural Links penalties. Sites that have been affected by Panda often have significant amounts of duplicated content (either on their own site or, more commonly, from other sites) and also thin content. Thin content usually means a page that consists of very few words. If a site contains a lot of duplicate and thin content, then Google sees little reason to show it prominently in its search results. An entire site can be severely demoted because of Panda even if only parts of the site have duplicate and thin content.
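As a toy illustration of the thin-content half of that (real Panda analysis needs much more, including duplicate detection across sites), here's a sketch that flags pages with very few words. The word-count cutoff is an assumption for illustration, not a Google number.

```typescript
interface Page {
  url: string;
  bodyText: string; // extracted page copy, e.g. from your own crawl
}

const THIN_WORD_THRESHOLD = 150; // assumed cutoff, not a Google figure

function findThinPages(pages: Page[]): Page[] {
  return pages.filter(
    p => p.bodyText.split(/\s+/).filter(Boolean).length < THIN_WORD_THRESHOLD
  );
}

const sample: Page[] = [
  { url: "/recipes/chocolate-chip", bodyText: "flour sugar butter ".repeat(100) }, // ~300 words
  { url: "/tag/cookies", bodyText: "Cookies. See all cookie posts." },             // 5 words
];

console.log(findThinPages(sample).map(p => p.url)); // ["/tag/cookies"]
```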

Now let's cover some of the points where people are the most confused about these issues.

What is the difference between Penguin and an Unnatural Links Penalty?

Both of these issues have to do with unnatural links. In both cases, the use of keywords as anchor text can be a factor. However, the main difference between the two is that Penguin is an algorithmic issue while Unnatural Links penalties are manual. A manual penalty is one that is levied by a human being, one site at a time. For example, a competitor could file a spam report on you, which could result in a Google webspam employee looking at your site. The employee could look at your backlinks and see that you have been engaging in practices that are considered link schemes. As such, they may decide to levy a manual penalty on your site.

Penguin is not levied one site at a time. Google has created an algorithm which is designed to programmatically find sites that have been engaging in unnatural link building tactics. When Penguin updates, if your site has been flagged as a site that is engaging in webspam, then your site will be affected on the date of the update. No human being is directly involved in determining whether your site is affected. As a point of interest, I have heard from some SEOs who have done testing and believe that Penguin can affect a site on any day and not just Penguin refresh days. So far, in the sites that I have seen, it seems that Penguin can only affect a site on a Penguin refresh day. The reality is that at this point no one knows for certain whether or not a site can be affected by Penguin on a date other than a Penguin refresh date.

Do Penguin, Unnatural Links and Panda affect the whole site or just part of the site?

Penguin: Penguin usually affects a site on a page and keyword level. Let's say that you have a page called example.com/greenwidgets/ and you have been building links to this page, all containing the anchor text "green widgets". If Penguin affected you, then this particular page would no longer rank well for "green widgets". Penguin generally does not affect an entire site. However, quite often sites that have been affected by Penguin have built many anchor-text links, possibly for many different keywords, all to the homepage. This can mean that the homepage will not rank for a number of terms.
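One way to spot that pattern in your own data is to measure what share of the links to a page use the exact same anchor text. A sketch, assuming a backlink export (e.g., a CSV from a link research tool); the field names and example links are invented.

```typescript
// One row per backlink from an exported link report.
interface Backlink {
  targetPage: string;
  anchorText: string;
}

function exactMatchShare(links: Backlink[], page: string, keyword: string): number {
  const toPage = links.filter(l => l.targetPage === page);
  if (toPage.length === 0) return 0;
  const exact = toPage.filter(
    l => l.anchorText.trim().toLowerCase() === keyword.toLowerCase()
  );
  return exact.length / toPage.length;
}

const links: Backlink[] = [
  { targetPage: "/greenwidgets/", anchorText: "green widgets" },
  { targetPage: "/greenwidgets/", anchorText: "green widgets" },
  { targetPage: "/greenwidgets/", anchorText: "Example.com" },
];

console.log(exactMatchShare(links, "/greenwidgets/", "green widgets"));
// prints 0.666..., i.e. about two-thirds exact-match anchors: a lopsided
// distribution of this kind is the footprint Penguin is believed to target
```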

Unnatural Links: A manual unnatural links penalty can affect the entire site, or just a page, or even just one keyword. Sometimes a site can be penalized and be totally removed from the Google index. Other times, the site can still be in the index but not be shown in the first 10 pages for any of its keywords. Or, sometimes the penalty will not be as severe and may only affect one or two keywords. Here is a quote from Matt Cutts regarding a site that was penalized on a keyword level:

Matt Cutts on Widgets

The site in this example would not be able to rank for the keywords that they had used as anchors for sites that embedded their widgets.

Panda: Panda can affect an entire site, or sometimes one section such as a news blog on the site. Panda does not tend to affect just single pages of a website. If you have a site that has some good content, but a lot of thin and duplicate content, then the Panda filter can cause the entire site to have trouble ranking, not just the thin and duplicate pages.

Should you file for reconsideration if you have been affected by Penguin, Unnatural Links or Panda?

Penguin: No. A reconsideration request is only meant for sites that have a manual warning. If you have a manual warning, then you will have a message in your Google Webmaster Tools (WMT) account. (See the image next to the section above on Unnatural Links.) If you have been affected by Penguin, then, because this is an algorithmic issue, having a Google employee review the site will not help.

Unnatural Links: Yes. If you have a manual warning in your WMT, then once you have done the work required to clean up the site (see below), you will need to file for reconsideration.

Panda: No. See Penguin. Panda is also an algorithmic change and a reconsideration request will not help you recover.

Should you be using the disavow tool if you have been affected by Penguin, Unnatural Links or Panda?

On October 16, 2012, Google released the disavow tool, which allows webmasters to essentially have Google add an invisible "nofollow" to certain links pointing to their site. Since the release of this tool, many webmasters have been asking questions in Q&A as well as other SEO forums, wondering if they should be disavowing their links. Many have become paranoid about their links and want to disavow everything that looks suspicious. I've seen people who wanted to disavow a great link because it was site-wide. I've seen others who wanted to disavow a pile of links even though they were already nofollowed. There is a lot of confusion around the use of the disavow tool. This is probably why the disavow tool comes with this disclaimer:

Disavow warning.

Penguin: Google vaguely suggests that the disavow tool could be useful for a Penguin-hit site. In their blog post about the disavow tool, they say the following:

"Q: Should I create a links file as a preventative measure even if I haven’t gotten a notification about unnatural links to my site?

A: If your site was affected by the Penguin algorithm update and you believe it might be because you built spammy or low-quality links to your site, you may want to look at your site's backlinks and disavow links that are the result of link schemes that violate Google's guidelines."

Most SEOs believe that if you have been affected by Penguin, then you should use the disavow tool to discount the unnatural links to your site. At the time of writing, Penguin has not refreshed since the disavow tool was released. (The tool was released October 16th and the last Penguin refresh was October 5th.) What this means is that we do not yet have any proof as to whether or not disavowing links will help a site recover from Penguin. Hopefully it will, but there may be other factors that need to be addressed as well, such as on-page issues like keyword stuffing.

Unnatural Links: Yes. This is what the disavow tool was made for. Google says, in regards to a manual unnatural links penalty, "If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page."
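The disavow file itself is just plain text: "#" comment lines, "domain:" entries for whole domains, and full URLs for individual links. Here is a small helper sketch for assembling one; the domains and URLs are placeholders, not real sites.

```typescript
// Builds the plain-text format the disavow tool accepts: "#" comments,
// "domain:" lines for whole domains, and full URLs for single links.
function buildDisavowFile(domains: string[], urls: string[]): string {
  return [
    "# Links we asked webmasters to remove but could not get taken down",
    ...domains.map(d => `domain:${d}`),
    ...urls,
  ].join("\n");
}

console.log(buildDisavowFile(
  ["spammy-article-directory.example"],
  ["http://bookmark-site.example/page-with-our-link.html"]
));
// # Links we asked webmasters to remove but could not get taken down
// domain:spammy-article-directory.example
// http://bookmark-site.example/page-with-our-link.html
```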

Panda: No. As Panda generally does not have anything to do with backlinks, disavowing links to your site is not likely to help.

Do you need to manually remove links?

Penguin: While removing links is probably a good idea, it is likely not necessary. Because Penguin is an algorithm, to recover you don't need to show a human being evidence that you have worked hard to remove links. Most SEOs who are experienced with Penguin issues believe that disavowing your problematic links will help and that physically removing the links is not necessary. With that being said, if the bad links are under your control and easy to remove, then it is a good idea to do so.

Unnatural Links: When trying to recover from a manual unnatural links penalty, it is not enough to just disavow the bad links. Google wants to see evidence that you have tried to get as many of the unnatural links removed as possible. When you file for reconsideration, one of the first things that the webspam team member does is check a number of the links that they have flagged as unnatural and see how many of them you have gotten physically removed. For the unnatural links that you are unable to get removed, because the webmaster didn't reply, or wanted a large sum of money, or for whatever other reason, you can disavow those links.

Removing an unnatural links penalty from a site can take a lot of work. If you are struggling to remove a penalty from your site, or if you are an SEO who would like to get involved in doing penalty removal work, I have documented everything that I do in order to get penalties removed in my book (see bio section for link).

Panda: No, it is not believed that any links need to be removed for sites affected by Panda.

When will you recover?

Penguin: Most SEOs believe that you will not be able to recover a Penguin-hit site until Penguin refreshes again. Google announced at SMX West that there would be a major Penguin update in 2013 but did not say when it would happen. There are some people who believe that they have seen Penguin-hit sites recover on a day other than a refresh day. There are also ways to recover a Penguin-hit site without waiting for a refresh. For example, if you had a "green widgets" page that had been affected by Penguin because you built anchor text using the phrase "green widgets", you could build a new page called "buying-green-widgets" and get new, good-quality links to that page and possibly rank again for this term. The original page would not rank, but the new one could. The problem with this is that getting new, good-quality links is difficult. Google wants you to earn links, not make them yourself.

I asked John Mueller, a Google employee, whether or not it is possible to recover a Penguin-hit site outside of a Penguin refresh, and here is what he said:

"+Marie Haynes theoretically, in an artificial situation where there’s only one algorithm (which is, in practice, never the case), if a site is affected by a specific algorithm, then the data for that algorithm needs to be updated before it would see changes. In practice, while some elements might be very strong depending on what was done in the past, there are always a lot of factors involved, so significantly improving the site will result in noticeable changes over time, as we recrawl & reindex the site and it’s dependencies, as well as reprocess the associated signals. So yes, you’d need to wait for the algorithm to update if it were the only thing involved, but in practice it’s never the only thing involved so you’re not limited to waiting.

Also keep in mind that for long-running processes (be it algorithm updates like this, or other higher-level elements in our algorithms), it’s never a good idea to limit yourself to small, incremental improvements; waiting to see if “it’s enough” can take a while, so I’d recommend working to take a very good look at the issues you’ve run across, and working to make very significant improvements that will be more than enough (which users will appreciate as well, so there’s that win too)."
 
A full discussion on ways to recover from Penguin is outside of the scope of this article.

Manual spam action revoked

Unnatural Links: Once you file for reconsideration, it will take anywhere from 3 to 14 days to hear back from Google. I have had it take as long as six weeks, but that was just after the disavow tool was released, and Google probably had a large backlog of sites to review. If you get the wonderful "manual spam action revoked" message, recovery can happen in a couple of days for some sites. Depending on how severe the penalty was, it can take significantly longer, such as several months.

There are some sites that can have a penalty revoked but not see any increase in rankings at all. This generally happens when a site has no good links to prop it up. If your site's backlink profile consisted of 99% self-made links and you have removed or disavowed almost all of those links, then you will need to get good, quality links to your site in order to rank again. Gone are the days of being able to rank well on poor-quality links.

Some sites can still appear to be penalized after their manual penalty is lifted if they are also under the effects of Penguin. In most cases, it is believed that the work that is done to recover from an unnatural links penalty will also get you out of Penguin trouble. However, you'll need to see a Penguin refresh in order to start ranking well again.

Panda: Again, a full discussion of Panda recovery is outside the scope of this article. Once you have done what is necessary to fix Panda issues such as duplication and thin content, many sites will recover with the next Panda refresh. However, I have seen some sites take several Panda refreshes to recover. In March 2013, Matt Cutts stated that Panda will no longer run as the large, discrete refreshes we have been used to; instead, it will be rolled into the regular algorithm on an ongoing basis. I expect this means that Panda-hit sites can now recover much sooner once the work is done.

Conclusion

The purpose of this article was to answer some of the regularly asked questions when it comes to differences between Penguin, Unnatural Links and Panda issues. I don't claim to have all of the answers though. I hope this article generates some good discussion and questions!


