Wednesday, 11 June 2014

Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird


Posted: 10 Jun 2014 05:16 PM PDT

Posted by MarieHaynes

If you're reading the Moz blog, then you probably have a decent understanding of Google and its algorithm changes. However, there is probably a good percentage of the Moz audience that is still confused about the effects that Panda, Penguin, and Hummingbird can have on your site. I did write a post last year about the main differences between Penguin and a manual unnatural links penalty, and if you haven't read that, it'll give you a good primer.

The point of this article is to explain very simply what each of these algorithms is meant to do. It is hopefully a good reference that you can point your clients to if you want to explain an algorithm change and not overwhelm them with technical details about 301s, canonicals, crawl errors, and other confusing SEO terminology.

What is an algorithm change?

First of all, let's start by discussing the Google algorithm. It's immensely complicated and continues to get more complicated as Google tries its best to provide searchers with the information that they need. When search engines were first created, early search marketers were able to easily find ways to make the search engine think that their client's site was the one that should rank well. In some cases it was as simple as putting in some code on the website called a meta keywords tag. The meta keywords tag would tell search engines what the page was about.
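For illustration, here is what a meta keywords tag looks like in a page's HTML (the keyword values here are invented, and note that Google has long since stopped using this tag for ranking):

    <head>
      <meta name="keywords" content="plumber, fix leaky faucet, unclog toilet">
    </head>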

As Google evolved, its engineers, who were primarily focused on making the search engine results as relevant to users as possible, continued to work on ways to stop people from cheating, and looked at other ways to show the most relevant pages at the top of their searches. The algorithm now looks at hundreds of different factors. There are some that we know are significant, such as having a good descriptive title (between the <title></title> tags in the code), and there are many that are the subject of speculation, such as whether or not Google +1s contribute to a site's rankings.

In the past, the Google algorithm would change very infrequently. If your site was sitting at #1 for a certain keyword, it was guaranteed to stay there until the next update, which might not happen for weeks or months. Then Google would push out another update, things would change, and the results would stay that way until the update after that. If you're interested in reading about how Google used to push updates out of its index, you may find this Webmaster World forum thread from 2002 interesting. (Many thanks to Paul Macnamara for explaining to me how algo changes used to work on Google in the past and pointing me to the Webmaster World thread.)

This all changed with the launch of "Caffeine" in 2010. Since Caffeine launched, the search engine results have been changing several times a day rather than every few weeks. Google makes over 600 changes to its algorithm in a year, and the vast majority of these are not announced. But when Google makes a really big change, they give it a name, usually make an announcement, and everyone in the SEO world goes crazy trying to figure out how to understand the changes and use them to their advantage.

Three of the biggest changes that have happened in the last few years are the Panda algorithm, the Penguin algorithm, and Hummingbird.

What is the Panda algorithm?

Panda first launched on February 23, 2011. It was a big deal. The purpose of Panda was to try to show high-quality sites higher in search results and demote sites that may be of lower quality. This algorithm change was unnamed when it first came out, and many of us called it the "Farmer" update as it seemed to affect content farms. (Content farms are sites that aggregate information from many sources, often stealing that information from other sites, in order to create large numbers of pages with the sole purpose of ranking well in Google for many different keywords.) However, it affected a very large number of sites. The algorithm change was eventually officially named after one of its creators, Navneet Panda.

When Panda first happened, a lot of SEOs in forums thought that this algorithm was targeting sites with unnatural backlink patterns. However, it turns out that links are most likely not a part of the Panda algorithm. It is all about on-site quality.

In most cases, sites that were affected by Panda were hit quite hard, but I have also seen sites that took only a slight loss on the date of a Panda update. Panda tends to be a site-wide issue, which means that it doesn't just demote certain pages of your site in the search engine results; instead, Google considers the entire site to be of lower quality. In some cases, though, Panda can affect just a section of a site, such as a news blog or one particular subdomain.

Whenever a Google employee is asked about what needs to be done to recover from Panda, they refer to a blog post by Google employee Amit Singhal that gives a checklist you can use to determine whether your site really is high quality or not. Here is the list:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don't get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you'd want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

Phew! That list is pretty overwhelming! These questions do not necessarily mean that Google tries to algorithmically figure out whether your articles are interesting or whether you have told both sides of a story. Rather, the questions are there because all of these factors can contribute to how real-life users would rate the quality of your site. No one really knows all of the factors that Google uses in determining the quality of your site through the eyes of Panda. Ultimately, though, the focus is on creating the best site possible for your users. It is also important that you give Google only your best content to index. There are a few factors that are widely accepted as important things to look at in regard to Panda:

Thin content

A "thin" page is a page that adds little or no value to someone who is reading it. It doesn't necessarily mean that a page has to be a certain number of words, but quite often, pages with very few words are not super-helpful. If you have a large number of pages on your site that contain just one or two sentences and those pages are all included in the Google index, then the Panda algorithm may determine that the majority of your indexed pages are of low quality.

Having the odd thin page is not going to cause you to run into Panda problems. But if a big enough portion of your site contains pages that are not helpful to users, then that is not good.

Duplicate content

There are several ways that duplicate content can cause your site to be viewed as a low-quality site by the Panda algorithm. The first is when a site has a large amount of content that is copied from other sources on the web. Let's say that you have a blog on your site and you populate that blog with articles that are taken from other sources. Google is pretty good at figuring out that you are not the creator of this content. If the algorithm can see that a large portion of your site is made up of content that exists on other sites, then this can cause Panda to look at you unfavorably.

You can also run into problems with duplicated content on your own site. One example would be a site that has a large number of products for sale. Perhaps each product has a separate page for each color variation and size, but all of these pages are essentially the same. If one product comes in 20 different colors and each of those comes in 6 different sizes, then you have 120 pages for the same product, all of which are almost identical. Now, imagine that you sell 4,000 products. This means that you've got almost half a million pages in the Google index when really 4,000 pages would suffice. In this type of situation, the fix is to use something called a canonical tag. Moz has a really good guide on using canonical tags here, and Dr. Pete has also written a great article on canonical tag use.
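To sketch how this works (the URLs here are made up for illustration), every near-duplicate variation page carries a canonical tag in its <head> that points to the one version you want in the index:

    <!-- Placed on /products/widget?color=red&size=large and all other variations -->
    <link rel="canonical" href="http://www.example.com/products/widget">

Google then consolidates the variations and shows the canonical URL in its results instead of the 120 near-identical copies.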

Low-quality content

When I write an article and publish it on one of my websites, the only type of information that I want to present to Google is information that is the absolute best of its kind. In the past, many SEOs gave advice to site owners saying that it was important to blog every day and make sure that you are always adding content for Google to index. But if what you are producing is not high-quality content, then you could be doing more harm than good. A lot of Amit Singhal's questions listed above ask whether the content on your site is valuable to readers. Let's say that I have an SEO blog, and every day I take a short blurb from each of the interesting SEO articles that I have read online and publish it as a blog post on my site. Is Google going to want to show searchers my summary of these articles, or would they rather show them the actual articles? Of course my summary is not going to be as valuable as the real thing! Now, let's say that I have done this every day for 4 years. My site now has well over 1,400 pages that contain information that is not unique and not as valuable as other sites on the same topics.

Here is another example. Let's say that I am a plumber. I've been told that I should blog regularly, so several times a week I write a 2-3 paragraph article on things like, "How to fix a leaky faucet" or "How to unclog a toilet." But, I'm busy and don't have much time to put into my website so each article I've written contains keywords in the title and a few times in the content, but the content is not in depth and is not that helpful to readers. If the majority of the pages on my site contain information that no one is engaging with, then this can be a sign of low quality in the eyes of the Panda algorithm.

There are other factors that probably play a role in the Panda algorithm. Glenn Gabe recently wrote an excellent article on his evaluation of sites affected by the most recent Panda update. His bullet-point list of things to improve upon when affected by Panda is extremely thorough.

How to recover from a Panda hit

Google refreshes the Panda algorithm approximately monthly. They used to announce each refresh, but now they only do so when there is a really big change to the algorithm. When Panda refreshes, Google takes a new look at each site on the web and determines whether or not it looks like a quality site against the criteria that Panda evaluates. If your site was adversely affected by Panda and you have made changes such as removing thin and duplicate content, then when Panda refreshes you should see that things improve. However, for some sites it can take a couple of Panda refreshes to see the full extent of the improvements, because it can sometimes take several months for Google to revisit all of your pages and recognize the changes that you have made.

Every now and then, instead of just refreshing the algorithm, Google does what they call an update. When an update happens, it means that Google has changed the criteria that they use to determine what is and isn't considered high quality. On May 20, 2014, Google did a major update which they called Panda 4.0, and it caused a lot of sites to see significant changes in their Panda-related rankings.

Not all Panda recoveries are dramatic. But if you have been affected by Panda and you work hard to make changes to your site, you really should see some improvement.

What is the Penguin algorithm?


The Penguin algorithm initially rolled out on April 24, 2012. The goal of Penguin is to reduce Google's trust in sites that have cheated by creating unnatural backlinks in order to gain an advantage in the Google results. While the primary focus of Penguin is on unnatural links, there are other factors that can affect a site in the eyes of Penguin as well. Links, though, are known to be by far the most important thing to look at.

Why are links important?

A link is like a vote for your site. If a well-respected site links to your site, then this is a recommendation for your site. If a small, unknown site links to you, then this vote is not going to count for as much as a vote from an authoritative site. Still, if you can get a large number of these small votes, they really can make a difference. This is why, in the past, SEOs would try to get as many links as they could from any possible source.

Another thing that is important in the Google algorithms is anchor text. Anchor text is the text that is underlined in a link. So, in this link to a great SEO blog, the anchor text would be "SEO blog." If Moz.com gets a number of sites linking to them using the anchor text "SEO blog," that is a hint to Google that people searching for "SEO blog" probably want to see sites like Moz in their search results.
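In the HTML, the anchor text is simply the visible text between the opening and closing link tags. A quick illustrative example:

    <a href="http://moz.com/blog">SEO blog</a>

Here, "SEO blog" is the anchor text that Google associates with the linked page.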

It's not hard to see how people could manipulate this part of the algorithm. Let's say that I am doing SEO for a landscaping company in Orlando. In the past, one of the ways that I could cheat the algorithm into thinking that my company should rank highly would be to create a bunch of self-made links and use anchor text in these links containing phrases like "Orlando landscaping company," "landscapers in Orlando," and "Orlando landscaping." While an authoritative link from a well-respected site is good, what people discovered is that creating a large number of links from low-quality sites was quite effective. As such, SEOs would create links from easy-to-get places like directory listings, self-made articles, and links in comments and forum posts.

While we don't know exactly what factors the Penguin algorithm looks at, what we do know is that this type of low-quality, self-made link is what the algorithm is trying to detect. In my mind, the Penguin algorithm is sort of like Google putting a "trust factor" on your links. I used to tell people that Penguin could affect a site on a page or even a keyword level, but Google employee John Mueller has said several times now that Penguin is a sitewide algorithm. This means that if the Penguin algorithm determines that a large number of the links to your site are untrustworthy, then this reduces Google's trust in your entire site, and the whole site will see a reduction in rankings.

While Penguin affected a lot of sites drastically, I have also seen many sites that saw only a small reduction in rankings. The difference, of course, depends on the amount of link manipulation that has been done.

How to recover from a Penguin hit

Penguin is a filter just like Panda. What that means is that the algorithm is re-run periodically, and sites are re-evaluated with each re-run. At this point it is not run very often at all. The last update was October 4, 2013, which means that we have currently been waiting eight months for a new Penguin update. In order to recover from Penguin, you need to identify the unnatural links pointing to your site and either remove them or, if you can't remove them, ask Google to no longer count them by using the disavow tool. Then, the next time that Penguin refreshes or updates, if you have done a good enough job of cleaning up your unnatural links, you will once again regain trust in Google's eyes. In some cases, it can take a couple of refreshes for a site to completely escape Penguin, because it can take up to six months for a site's disavow file to be completely processed.
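For reference, a disavow file is just a plain text file uploaded through Google's disavow tool. Lines beginning with # are comments, and each remaining line lists either a single URL or an entire domain (the sites below are invented examples):

    # Spammy directory links - owner did not respond to removal requests
    domain:spammy-directory-example.com
    # A single paid link we could not get removed
    http://article-site-example.com/some-page.html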

If you are not certain how to identify which links to your site are unnatural, here are some good resources for you:

The disavow tool is something that you probably should only be using if you really understand how it works. It is potentially possible for you to do more harm than good to your site if you disavow the wrong links. Here is some information on using the disavow tool:

It's important to note that when sites "recover" from Penguin, they often don't skyrocket back up to top rankings, as those previously high rankings were probably based on the power of links that are now considered unnatural. Here is some information on what to expect when you have recovered from a link-based penalty or algorithmic issue.

Also, the Penguin algorithm is not the same thing as a manual unnatural links penalty. You do not need to file a reconsideration request to recover from Penguin. You also do not need to document the work that you have done to get links removed, as no Google employee will be manually reviewing your work. As mentioned previously, here is more information on the difference between the Penguin algorithm and a manual unnatural links penalty.

What is Hummingbird?

Hummingbird is a completely different animal than Penguin or Panda. (Yeah, I know... that was a bad pun.) I commonly get people emailing me telling me that Hummingbird destroyed their rankings. I would say that in almost every case that I have evaluated, this was not true. Google made their announcement about Hummingbird on September 26, 2013. However, at that time, they announced that Hummingbird had already been live for about a month. If the Hummingbird algorithm was truly responsible for catastrophic ranking fluctuations, then we really should have seen an outcry from the SEO world about something drastic happening in August of 2013, and this did not happen. There did seem to be some type of fluctuation around August 21, as reported here on Search Engine Roundtable, but there were not many sites that reported huge ranking changes on that day.

If you think that Hummingbird affected you, it's not a bad idea to look at your traffic to see if you noticed a drop on October 4, 2013 which was actually a refresh of the Penguin algorithm. I believe that a lot of people who thought that they were affected by Hummingbird were actually affected by Penguin which happened just a week after Google made their announcement about Hummingbird.

There are some excellent articles on Hummingbird here and here. Hummingbird was a complete overhaul of the entire Google algorithm. As Danny Sullivan put it, if you consider the Google algorithm as an engine, Panda and Penguin are algorithm changes that were like putting a new part in the engine such as a filter or a fuel pump. But, Hummingbird wasn't just a new part; it was a completely new engine. That new engine still makes use of many of the old parts (such as Panda and Penguin) but a good amount of the engine is completely original.

The goal of the Hummingbird algorithm is for Google to better understand a user's query. Bill Slawski, who writes about Google patents, has a great example of this in his post here. He explains that when someone searches for "What is the best place to find and eat Chicago deep dish style pizza?", Hummingbird is able to discern that by "place" the user is likely interested in results that show "restaurants." There is speculation that these changes were necessary in order for Google's voice search to be more effective. When we're typing a search query, we might type "best Seattle SEO company," but when we're speaking a query (i.e. via Google Glass or Google Now) we're more likely to say something like, "Which firm in Seattle offers the best SEO services?" The point of Hummingbird is to better understand what users mean when they have queries like this.

So how do I recover or improve in the eyes of Hummingbird?

If you read the posts referenced above, the answer to this question is essentially to create content that answers users' queries rather than just trying to rank for a particular keyword. But really, this is what you should already be doing!

It appears that Google's goal with all of these algorithm changes (Panda, Penguin and Hummingbird) is to encourage webmasters to publish content that is the best of its kind. Google's goal is to deliver answers to people who are searching. If you can produce content that answers people's questions, then you're on the right track.

I know that that is a really vague answer when it comes to "recovering" from Hummingbird. Hummingbird really is different than Panda and Penguin. When a site has been demoted by the Panda or Penguin algorithm, it's because Google has lost some trust in the site's quality, whether it is on-site quality or the legitimacy of its backlinks. If you fix those quality issues you can regain the algorithm's trust and subsequently see improvements. But, if your site seems to be doing poorly since the launch of Hummingbird, then there really isn't a way to recover those keyword rankings that you once held. You can, however, get new traffic by finding ways to be more thorough and complete in what your website offers.

Do you have more questions?

My goal in writing this article was to have a resource to point people to when they had basic questions about Panda, Penguin and Hummingbird. Recently, when I published my penalty newsletter, I had a small business owner comment that it was very interesting but that most of it went over their head. I realized that many people outside of the SEO world are greatly affected by these algorithm changes, but don't have much information on why they have affected their website.

Do you have more questions about Panda, Penguin or Hummingbird? If so, I'd be happy to address them in the comments. I also would love for those of you who are experienced with dealing with websites affected by these issues to comment as well.



Help with (Not Provided): New Landing Pages Report in Moz Analytics

Posted: 10 Jun 2014 05:44 AM PDT

Posted by JayLeary

There's been a lot of talk about what (not provided) means for SEO practitioners, with Mozzers weighing in on broader implications and alternative strategies.

When it comes down to it, we're stuck with a frustrating predicament that's forcing us to change our process. Activities like keyword gap analysis, editorial brainstorming, basic rank tracking, and even SEM planning have become harder to execute.

More fundamentally, we're having a tough time answering simple questions like:

"Which keywords are sending traffic to my site(s)?"

"Which keyword opportunities am I not leveraging? Where are the gaps?"

So where do we go from here? Many of you have already changed your reporting strategies to reflect the new SEO reality. Focusing on topics, landing pages, and broader content strategies is a solid pivot, and we'll be working hard at Moz to support those efforts.

But we also recognize that keyword-level data remains an important signal for many SEOs.

With that in mind, we're happy to announce a new report in Moz Analytics that will help answer some of those (not provided) questions.

Quick Note: The new feature is only available to Moz Analytics campaigns connected to Google Analytics. If you're using the old version of our software (the SEOmoz PRO app) and want to take advantage, be sure to switch your campaigns over to Moz Analytics and connect GA.

The Landing Pages report

TL;DR We've grouped your tracked keywords by landing page and correlated them with a new metric, Estimated Traffic Share. Use the report to determine which keywords are your strongest traffic drivers.

In the new data view, your tracked keywords are grouped by landing page and correlated with both ranking position and visits.

The Estimated Traffic Share metric is our best guess at the percentage of visits each keyword contributes. The value is based on a combination of landing page traffic, keyword ranking position, estimated search volume, and SERP click-through rates.
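Conceptually, the estimate for each tracked keyword works something like this (a simplified sketch, not the exact calculation):

    estimated visits ≈ estimated search volume × expected CTR at the keyword's ranking position
    Estimated Traffic Share ≈ estimated visits ÷ total organic visits to the landing page

In other words, a high-volume phrase ranking #2 should account for a much bigger share of a page's visits than a low-volume phrase ranking #40.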

Let's look at a quick example from Rand's blog:

We know that Rand's evergreen post about stock options at startups received 170 organic visits. We also know it ranks decently for a couple of relevant phrases including "startup stock options."

Based on ranking position and search volume, however, it's a safe bet we're missing at least a few of the most important keyword targets. A peek at the Opportunities tab confirms our assumption.

Tracking phrases like "understanding stock starting a company" provides additional insight into the post's organic footprint and gives back some of the basic data (not provided) took away.

Sometimes you'll see the opposite: Estimated Traffic Shares that sum to a big chunk of the landing page total.

In those situations you can make an educated guess that the primary traffic drivers are being tracked.

Quick Note: This is obvious but worth stating: in order to get the most out of the new report you need to add tracked keywords to your Moz Analytics campaigns. Not only that, you'll probably want to add a healthy selection of terms to gain the most insight. For inspiration, take a peek at the Opportunities tab.

Back to the big picture

SEOs know how to adapt. The increase in (not provided) isn't the first time we've lost a valuable data source, and it's probably not the last. With the Landing Pages report and other Moz Analytics updates, we'll do our best to address the changing search landscape. 

Have a look at the latest release and share your thoughts. As always, if you have any insights or feedback feel free to shoot a message to our extra-helpful Help Team (help (at) moz.com) or sound off in the comments.

Not a Moz Pro subscriber? No problem! Take your 30-day free trial.

Thanks to our data source partners

Big thanks to the team at Grepwords and their amazing data for helping to make this update happen.



2 Years in SEO

Posted: 10 Jun 2014 05:55 AM PDT

I was sitting staring at my computer trying to decide what my next post should be about, when I got a notification from LinkedIn – "X congratulated you on your work anniversary!" Work anniversary? Then I realised: I've been working at WHITE for two years now and, consequently, have spent two years in the world of SEO. So, I thought this post would be a great opportunity to look back on what's happened to SEO during the last two years, what I've learnt along the way, and what advice I can give to new SEOs.

 

In the beginning…
I joined WHITE in June 2012, only a couple of months after the release of the original Penguin update, and about 16 months after Panda. Obviously, it was a challenging time for SEOs everywhere, and I was thrown in at the deep end.

My knowledge of SEO was pretty much non-existent when I joined, and I remember my first few days being a whirl of terms I didn't understand and tools I didn't know how to use. Meta descriptions? Title tags? Bounce rates? Penguins? Pandas? What was going on?

Luckily, the guys at WHITE were patient and helpful, and quickly brought me up to speed with everything SEO-related. Soon I was working away on content-focused SEO like a pro (or so I like to think!).

So what's changed since 2012? What's stayed constant? What have I learnt? What are the best pieces of advice I can pass on to new SEOs?

 

The big changes
One of the biggest changes that springs to mind is guest blogging (unsurprising I suppose, given I focus mostly on content). Back in 2012, everyone was doing it. Whether you were actually trying to produce quality content or just spamming people is a different matter, but every SEO was guest blogging in some shape or form.

Of course, in 2014, guest blogging is pretty much off the table. With Matt Cutts publicly denouncing it and discouraging SEOs from using it, guest blogging has been effectively killed off. (You can read more about this in a post I wrote earlier this year.) If you're an SEO still using guest blogging in 2014, you should really re-think your strategy.

This has led to most SEOs moving their priorities onto onsite content – a really important shift in my opinion! At WHITE, the importance of onsite content has always been stressed, but it's great to now see other SEOs championing it. Improving onsite content benefits everyone – clients, users, and SEO practitioners alike. The death of guest blogging really helped to spur this movement and I am grateful to Matt Cutts for giving me something solid to use to dissuade clients from guest blogging and push them into improving their own content.

What these changes have meant for SEO…

Standing out and being genuinely useful with website content has meant a much bigger push towards creative, user-focused SEO, which is another important change. Before Panda and Penguin, I feel SEO was very much focused on tricking Google rather than actually improving websites. Now that Google has caught on and happily rolls out continual updates to these major algorithms, SEOs are forced to focus on providing quality website experiences to users. I think this is a fantastic shift, which is really helping to improve websites and search engine results across every sector of the web.

And for me…

All of these changes have obviously placed a lot more emphasis on the role of content and, as a result, I've found a lot more work coming my way, and much greater demands on my skillset.

When I first started, being able to conduct appropriate keyword research, write effective title tags and meta descriptions, and create good-quality blog posts was the majority of my role. Now I conduct in-depth content audits; analyse client, competitor, and general sector content to discover new content gaps for clients to exploit; and spend a lot of time brainstorming to come up with something new and unique that will set our clients' websites apart.

I am always focused on the user and what they're looking to get out of each search query they make. I analyse how best to provide them with the information they need whilst keeping them engaged with the client's site and interested in their offering. It's much tougher than it used to be, but I love the challenge and am prouder than ever of the work that we produce. SEO standards have been forced upwards by Google, and the quality of our work has been raised substantially.

 

So, those are the big changes, but what are the constants? What is still essential, and what do I think all new SEOs should be aware of?

 

Some things never change…

Of course, SEO has many constants. As a content specialist, I will always stress the importance of thorough keyword research. It is at the core of everything we do. It is absolutely central to understanding what users are looking for, and it informs all content-related choices – from title tags and meta descriptions, to on-page copy, ads, new page creation, and more.

Keyword research…

I have a few tools that I will always recommend when conducting keyword research. Google Analytics and Webmaster Tools are essential starting points. Despite the rise of (not provided), you should always start with a good look at the keywords that are currently bringing people to your site, and Google gives great insight into this.

SEMrush is another fantastic tool, and it just keeps getting better. As well as giving you lots of information about your current organic keywords, it lets you take a good look at what competitors in your sector are ranking for and how you fare in comparison. The Domain vs Domain tool is a great way to find keywords that competitors are targeting that you're missing out on, and it gives a good overview of where you sit within your sector.

At WHITE, we also use Linkdex, which enables you to track a lot of different keywords and see how you're ranking for them, as well as analyse how your competitors are doing for them.

Technical knowledge…
Another area that hasn't changed is being able to produce solid technical work. Although this isn't my specialism, I have learnt a lot about the technical side of SEO over the last two years, and am now confident working on everything from link audits, to disavow files, to redirects, and more. Being competent across a range of SEO skills is something I would stress as crucial to anyone serious about a career in SEO. Having a specialism is great, but without a solid understanding of the other areas involved you can never really achieve your full potential. You need to understand all of the various factors involved in order to produce a really great website and user experience.

For any new SEOs, there are several great resources that will provide fantastic information on all of the essential areas involved. DistilledU is my favourite online SEO resource. It taught me all the basics when I first started, and is a brilliantly constructed and incredibly thorough learning resource; I would recommend it to anyone. Moz also has a great beginner's guide to SEO, which I found really helpful when I was starting out. And even now, with a lot more experience, their blog is a source I turn to regularly when trying to keep up to date with everything SEO-related.

 

In the end…

Overall, I'd have to say my first two years in SEO have been something of a rollercoaster. There have been a lot of changes and sudden swerves, but the core principles behind what we do have never really changed.

If you asked me what the point of SEO was when I first started, and if you asked me again now, I'd still give you the same answer. It's about improving the web for the users. It's about giving people the information they're searching for in the most useful, informative, and effective way possible. It's not about tricking Google into giving you the top rankings, and it's certainly not about using spammy techniques to boost your link profile.

The core values at WHITE have never changed; we've always believed in creating top quality websites that cater for users' needs. Luckily, whilst I've been working in SEO, Google has got better and better at recognising websites that deliver this, and rewarding them for it. As a result, I can only say that I'm lucky to work in an agency that has always understood these concepts and kept them at the top of its priorities.

As for what the future holds, I'm sure we'll see more updates and even new algorithms; Google is always learning and adjusting. I'm also sure that spammy tactics will be spotted faster and punished more severely. However, I'm also certain that Google's key focus on quality for the user will stay the same, and that will always be our priority too.

Here's to the next two years!

 
