A Step-by-Step Guide to Updating Your Website Without Destroying Your SEO
Posted: 22 Dec 2014 03:05 PM PST
Posted by Richard_Foulkes

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author's views are entirely his or her own and may not reflect the views of Moz, Inc.

The first thing any SEO thinks when a client says "I'm redesigning my website" is: what impact will this have on all my work? Often the client doesn't even consider telling their online marketing agency about the redesign until two days before launch. This resource covers how to run SEO checks on your test/development site to ensure the structure, URLs, page titles, meta descriptions and more all match up properly. It also serves as an SEO checklist, touching on things that are often forgotten when a website goes through a complete overhaul.

Why consider SEO in a redesign?

Why is it important to consider your SEO during a website's revamp? In short, you have a lot to lose. Let's say your site's doing great: rankings are strong, organic traffic is flowing and revenue is growing. Do you really want to undo all that hard work? I'm guessing not. By thinking strategically, however, you can use a redesign as an opportunity to improve a site's performance. That's what this client did: a steady increase in traffic followed (from the red circle), even during the re-indexing phase. If you do a redesign right, you won't lose any traffic or rankings; in fact, you'll gain them.

Below I outline steps that help you understand both the test site being built and your current site from an SEO viewpoint. This is vital when changing your website around, and I will cover how to make sure the web development agency keeps the important SEO work that's gone into your website.

Step 1 – Consider the SEO

The first thing you must do is think about SEO.
Too often clients don't stop to consider the SEO impact of changing their website. They chuck away valuable content from historical pages, or decide it would be a good idea to completely change every single URL without redirecting the old ones. This only happens because they misunderstand how Google et al. read a website and how URLs accrue credibility. It's no fault of their own, but it happens.

Step 2 – Crawl the existing site

Why do I need to crawl my site? If you don't know what your site's structure looks like now, you'll set yourself up for a massive fall. Grabbing the structure, meta data and URLs is vital to identifying exactly what is changing and why.

How to do it: Your SEO crawl will give you a roadmap of how your entire site is currently laid out. The best way to grab this data is to use a tool like Screaming Frog. Once you have the current site's meta data and structure, you will know how to match the new site up.

Step 3 – Audit the old site

Next, you need to audit the site. Free tools like Woorank will do the job, but I strongly advise you to get your hands dirty and do the work manually. There's nothing like getting into the nitty-gritty of your site to find any problem areas.

Why audit the site? You need to know what search engines like and don't like about your site. This helps you spot any problem areas, and lets you see which areas must be retained.

What am I looking for? Here are some of the things we check at Liberty. Sometimes it's worth checking more, but these are top-level checks. Using your Screaming Frog data, I advise checking the following:
You should also manually check for:
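To make the audit concrete, here is a minimal Python sketch of the kind of per-page data a crawler like Screaming Frog collects (page title, meta description, H1). It uses only the standard library and is illustrative only; the real tool gathers far more, and the sample HTML below is invented for the example.

```python
# Minimal sketch of per-page SEO data collection (title, meta description, h1).
# Illustrative only -- a tool like Screaming Frog does far more than this.
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Extracts the title, meta description and first h1 text from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1 = ""
        self._current = None  # tag whose text we are currently capturing

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")
        elif tag in ("title", "h1"):
            self._current = tag

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

def audit_page(html: str) -> dict:
    """Return the basic meta data for one page's HTML."""
    parser = MetaAudit()
    parser.feed(html)
    return {"title": parser.title.strip(),
            "description": parser.description.strip(),
            "h1": parser.h1.strip()}
```

Running `audit_page` over every page gives you the same title/description/H1 columns you would export from a crawler, ready to compare against the test site later.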
This data gives you a good understanding of what the website's doing well and where it can improve.

Step 4 – Noindex your test site

Why do we need to noindex? This stage is simple, yet it's the point where many redesigns go awry. If you're working on your test site, the last thing you want is for Google et al. to index it. If you've added great new content, it will get indexed, and when you launch the new site that content will carry no value because it will be duplicate.

How to noindex your test site: Your developers can noindex a site in two ways.

1 – Tick the noindex box in your site's CMS. If you have WordPress, for example, you simply check the box that reads: "Discourage search engines from indexing this site."
This adds the following code in the <head> of every page:

<meta name="robots" content="noindex, follow">
2 – Your second option is to block the site in the robots.txt file. This is a little trickier, which is why most CMSs offer the box-ticking option. If your CMS doesn't allow for it, you can put the following in your robots.txt file:

User-agent: *
Disallow: /

No CMS? You can manually insert the code if you have access to the header file, by implementing the noindex, follow code as above.

Step 5 – Crawl the test site

Why should I crawl the test site? You also need to understand how your test site is structured. Using a site crawler, crawl the test site again to see how it compares to your current site.

How to do it:
Download all of the HTML and save it as an Excel file named "Final Crawled Test Site". This will be the test crawl you'll check through later. Also hold onto the very first crawl we did of the test site ("Test Site Crawl"). At the end, you'll have these docs: the Current Site Crawl for Editing (from Step 2), the Test Site Crawl and the Final Crawled Test Site.
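Before diving into the spreadsheets, one comparison worth automating is "which pages exist on the current site but are missing from the test site?" Here is a hedged Python sketch of that check, comparing URL paths so the differing hostnames don't matter; the hostnames and URLs are placeholders.

```python
# Sketch: find pages on the current site that are missing from the redesign.
# Compares URL paths so current (example.com) and test (test.example.com)
# hostnames don't get in the way. All URLs here are illustrative.
from urllib.parse import urlparse

def missing_from_test(current_urls, test_urls):
    """Return paths present in the current-site crawl but absent from the test crawl."""
    current_paths = {urlparse(u).path for u in current_urls}
    test_paths = {urlparse(u).path for u in test_urls}
    return sorted(current_paths - test_paths)
```

Any path this returns is a page the redesign has dropped, and therefore a candidate for either rebuilding or a 301 redirect.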
Okay, you made it. Now you have the data in Excel format, and you can see what works on the test site and what doesn't. This lets you identify what's on the current site but missing from the test site.

Step 6 – Analyse your data

What we're looking for: Now that we've done all the crawls, open the spreadsheet called "Final Crawled Test Site" from Screaming Frog. You should see a lot of data. First, delete the top row, named "Internal HTML", then do the same for row 2 if it is blank. You should have these headings:
Some of these headings have the number "1" next to them, signifying that there is only one of that element on the page. If any of yours have a "2" next to them, the page contains more than one of that element. The elements that shouldn't have a "2" are as follows:
With all this, we'll begin identifying what changes need to be made. Go to the Status Code header, click the filter icon and select the 200 codes. This shows all the URLs that are working. You might see "Connection Timed Out" on some of these; this may simply be Screaming Frog timing out, so check them manually. If they work, just update the spreadsheet; if they don't, you've identified a problem. Let the developer know these are timing out; they should be able to identify a fix.

How to match up the data: I've told you how to test the data, but not what to do with all those crawls. The purpose of crawling your current and test sites in this way is to identify the meta data, structure and errors the test site currently has. First, apply a filter to the columns:
Locate the Level heading, right-click and sort from smallest to largest. Now segment all the data. I start with Page Titles (Title 1): take the first 7 columns on the spreadsheet, highlight them all, then copy and paste them onto another sheet within the same Excel spreadsheet, called "Page Titles". Do the same for "Meta Description", but this time pick the first 4 columns, then columns 8-10. Repeat this for each section to end up with the different sheets as follows:
This number of sheets may look like overkill, but in my experience working with smaller amounts of data is much easier than wrestling with one large, data-heavy spreadsheet.

Here's the best bit: Remember all the crawls we did before? We'll need to open Current Site Crawl for Editing. Filter the Level column first so it sorts from smallest to largest, then locate the following columns on this spreadsheet:
Highlight all the data in these three columns and copy them into your test site spreadsheet, onto the Page Titles sheet in the empty columns. Place those three columns apart from Title 1 Pixel Width. Now that you have the test site's Page Titles next to the current site's Page Titles, you can highlight the duplicates: highlight both Title 1 columns and go to Conditional Formatting > Highlight Cell Rules > Duplicate Values. This highlights everything that matches.

I have no shortcut for the rest. You'll need to manually move things around and get them in the right place. I go about this by looking at the Page Title 1 closest to the left (the one from the test site), then copying the text. Use the Find and Replace box (Ctrl+F) to search for the text, hit "Find Next" to go to the next match, grab the three relevant columns and stick them next to the text you copied. Then repeat. Sometimes nothing will match. When this happens, try doing this:
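If you'd rather script the matching pass than do it cell by cell, the same join can be sketched in Python: index the current site's rows by a normalised title, then look each test-site title up. The row fields and URLs below are illustrative, not part of any Screaming Frog export format.

```python
# Sketch of the "match test-site rows to current-site rows by title" step.
# Rows are dicts like {"url": ..., "title": ...}; fields/URLs are illustrative.
def match_by_title(test_rows, current_rows):
    """Pair each test-site row with the current-site row sharing its title.

    Unmatched test rows map to None and need the manual treatment
    described above (partial matches, changed titles, etc.).
    """
    by_title = {}
    for row in current_rows:
        # Normalise: stray whitespace and case differences often hide matches.
        by_title[row["title"].strip().lower()] = row
    return [(row, by_title.get(row["title"].strip().lower()))
            for row in test_rows]
```

The normalisation step mirrors the advice above: near-misses like a trailing space or different capitalisation are exactly what stop Excel's Duplicate Values rule from firing.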
These are a few things that may cause issues with matches, so check yours vigilantly.

Rinse and repeat: After you've done this process once, you'll need to rinse and repeat on the other sheets to match up all your Meta Descriptions, Canonical Tags, Word Counts, etc. Remember, the point of checking these areas is to ensure that any changes are good changes.

Once you've nailed all the 200 codes, you'll want to look at the 404s. Go to the Status Code header and select 404 on the filter icon again to find the URLs that aren't working (assuming you have 404s). This gives you a list of all the URLs that didn't work and, in theory, everything else that needs to be checked. You should only have 200 and 400 status codes, but sometimes you will have 500s or 300s that need further investigation.

404 time: If a URL returns a 404, the page doesn't exist, so we'll need to do one of two things: redirect the old URL to the most relevant page on the new site, or create the page on the test site.
Here's an example of a 404:
Look at the test server's URL. If you think it needs to redirect, highlight it in red. If you have to create a new URL, fill its cell with the relevant meta data and highlight it green. Don't forget what each colour means. You'll also need to highlight the corresponding URL that will redirect to the new version on the Current Site Crawl for Editing.

What to do with live URLs that aren't on your current site? These URLs are most likely new pages. Like any page on your website, they have to be optimised correctly. There are tons of guides to help you here (this visual guide is my favourite).

Now what? I'm glad you asked. You now have a fully comprehensive spreadsheet of everything needed to minimise the damage of moving a site. Work closely with the developers to get the changes you've recommended implemented. With the spreadsheets laid out in this way, you can simplify the data and give the developers just the bits they need, making their lives easier. Don't forget: when you redirect pages to a new site, you'll lose around 10%-30% of your link equity, but you're giving search engines the best opportunity to carry over your old site's strong reputation.

From this point onwards, I'll detail things that can go wrong, common problems, and important elements to check along the way to monitor the changes. Now that you've given the new URL structure and changes to your developers, you need to check they've got it right. You've been involved in several meetings discussing the strategy, proactively making sure you don't upset the rhythm and have a positive impact on the changes. Unfortunately, it doesn't stop there. You've more than likely been handing over changes periodically and testing as you go; now is a good opportunity to test everything again. Crawl the test site again, being vigilant in cross-referencing all the relevant meta data and ensuring the URLs match up. If they are even slightly off, change them.
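A first draft of the 301 redirect map can also be generated rather than typed out by hand. This hedged sketch pairs dead old URLs with new URLs that share the same final path segment (slug); the URLs and the slug-matching rule are illustrative, and a real migration needs a human pass over the resulting map before it goes anywhere near the server config.

```python
# Sketch: draft a 301 redirect map by matching old (404ing) URLs to new URLs
# on their final path segment. Illustrative URLs; review the output by hand.
def slug(url):
    """Final path segment of a URL, ignoring any trailing slash."""
    return url.rstrip("/").rsplit("/", 1)[-1]

def redirect_map(old_404_urls, new_urls):
    """Map each dead old URL to the new URL sharing its slug, else None."""
    new_by_slug = {slug(u): u for u in new_urls}
    return {old: new_by_slug.get(slug(old)) for old in old_404_urls}
```

Entries that come back as None are the red-highlighted rows from the spreadsheet step: pages with no obvious new home, which need either a hand-picked redirect target or a rebuilt page.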
One way you can check is to use Find and Replace in Excel: swap test.example.com for example.com, then crawl the resulting URLs with Screaming Frog. From now on, make it a habit to check these additional elements.

Step 7 – All the additional checks

Rank check: Why do you need to rank check? A rank check measures how the site performs for a host of keywords in the search engines. You'll use this data as a comparison for the newly launched site; if things change, you can react and identify the problems when you check the results. What to look out for: big movements. If a keyword jumps from page 1 to page 20, you may have a problem. Look out for any big or unusual movements by checking these things:
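The Excel find-and-replace hostname swap described above (test.example.com to example.com, before re-crawling) can be sketched in a few lines of Python; both hostnames here are the placeholders from the example.

```python
# Sketch of the find-and-replace step: rewrite test-site URLs onto the live
# hostname before re-crawling them. Hostnames are placeholders from the text.
def swap_host(urls, test_host="test.example.com", live_host="example.com"):
    """Return the URL list with the test hostname replaced by the live one."""
    return [u.replace(test_host, live_host, 1) for u in urls]
```

Feed the swapped list back into your crawler (Screaming Frog's list mode, for instance) to confirm every live URL resolves the way its test-site counterpart did.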
Content: Don't delete anything you don't have to delete. You might think your old blog posts aren't needed, but they all add to the credibility of your site; without them, you'll lose a chunk of value. Similarly, now is not the time to change your landing page content if you're currently enjoying decent rankings.

Analytics code: This is pretty self-explanatory. Make sure you place your analytics code back in the <head> section of the site, and check your ecommerce tracking, goals and event tracking if you currently have those in place.

Unblock the site: It's time to check that the new site is allowing search engines to index it. Whichever method you used to block the test site in Step 4, reverse it. Failure to do this could create big problems when you launch the new site.

Summary checklist: Here's the checklist I mentioned earlier. If you skipped to this, use it as a guide to help you do a redesign with SEO in mind. With this in your arsenal, you never need to fear a website redesign again.

Think about SEO from the start
Crawl the current site
Audit your existing site
Stop the test site from being indexed
Crawl the test site
Find and replace URLs
Crawl those swapped URLs
Check test site meta data on live URLs
Check 404s on test site
Map out 301s
Optimise all new pages
Check implementation
Do additional checks
Launch!

Common problems to look out for

Each scenario differs between websites, but this foundation approach helps you segment and break down the important meta data so you don't lose SEO value during a redesign. As with any project, there are common problems that SEOs, businesses and developers all come up against:
Additional resources

Once you've checked these elements, you are in a strong position to launch. It's still important to keep a close eye on the performance of the new site; sometimes a single line of code can upset the rhythm. Here are some additional resources to reinforce what we've covered here: Search Engine Journal – Website Redesign Disaster

One last thing… As with any changes to your website, it is important to monitor the situation. Use whatever tools you have available to keep a close eye on the following: your rankings, your organic traffic, and any crawl errors reported in Webmaster Tools.
These things will help you spot any problems. If you notice your rankings plummet, you can quickly investigate and make any needed changes; if Webmaster Tools reports errors when Google tries to crawl the site, you know to be proactive and explore the problem. Once you are confident there are no issues, loosen up a bit. You don't need to keep such a close eye on these things, and you can get on with promoting the site and carrying on with your growth and maintenance SEO work.

Give me some feedback: How do you approach a site move? Do you have any cast-iron techniques you recommend for maintaining strong rankings during a move? I'd love to hear from you.