Help! I Redesigned My Site and Organic Traffic Has Dropped!

Every few years (or months) you decide it’s time to redesign your website.  You run some usability tests (hopefully!), create some beautiful new templates, and pull the trigger on a design overhaul.  You’re thrilled at first – it’s a new and exciting look for your site, and your friends and family tell you how amazing it looks.  Then, a few days or weeks later, you check your analytics data and realize your organic traffic has dropped.  What now?

Even the best-laid plans for a site redesign may experience hiccups.  You’ll need to act swiftly once you find the problem, but don’t panic.  There are several key areas to check that will help you pinpoint where your issues are occurring.  Let’s start with the most obvious…

Webmaster Tools

If you don’t have your site registered with Google Webmaster Tools, you need to go do this immediately.  Webmaster Tools is a free service offered by Google that helps you monitor and maintain your site’s presence in Google Search results.  It allows webmasters to check indexing status and optimize the visibility of their websites.  Most importantly, it reports broken links on your site and provides statistics about how Google crawls and indexes it, along with any errors encountered along the way.

Redirects

Once logged into Webmaster Tools, go to Crawl and view the Crawl Errors report.  This will list any links where Google is experiencing an issue.  Any pages that are showing a 404 (Page Not Found) error should either be fixed or 301 redirected to the new page.  It’s important to fix these as quickly as possible, because if search engines are unable to find pages, they will eventually drop those pages from their index and they’ll disappear from search results.
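If you have a long list of old URLs, a quick script can make spot-checking them much faster.  Here’s a rough sketch in Python, assuming the third-party requests library (pip install requests) and a couple of placeholder URLs you’d swap for your own, that reports whether each old address returns a permanent redirect or a 404:

import requests

# Placeholder list of pre-redesign URLs; replace with your own old paths.
OLD_URLS = [
    "https://example.com/old-services-page",
    "https://example.com/old-about-page",
]

for url in OLD_URLS:
    # Ask for the page without following redirects so we can see the raw status.
    # Some servers treat HEAD differently from GET, so switch to requests.get
    # if the results look off.
    response = requests.head(url, allow_redirects=False, timeout=10)
    if response.status_code in (301, 308):
        print(f"{url} -> {response.headers.get('Location')} (permanent redirect)")
    elif response.status_code == 404:
        print(f"{url} returns 404 and needs a redirect")
    else:
        print(f"{url} returned {response.status_code}")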

HTML Improvements

Once you fix any broken links, the next place to look is under Search Appearance in the HTML Improvements reports.  These reports highlight problems with your Meta Descriptions and Title tags, including details such as titles that are too long or too short.  They also flag duplicate and non-informative tags and call out any non-indexable content.  This information allows you to make fast improvements to your site, so it’s a good idea to check these reports weekly.
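You can also run a quick check of your own while you wait for the reports to refresh.  The sketch below, written in Python with only the standard library, pulls a single placeholder page and prints its title and meta description lengths; the length limits are rough rules of thumb, not official cutoffs:

from html.parser import HTMLParser
import urllib.request

# Rough guideline lengths, not official limits from any search engine.
TITLE_MAX = 60
DESCRIPTION_MAX = 160

class HeadAudit(HTMLParser):
    """Collects the <title> text and meta description from a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

url = "https://example.com/"  # placeholder; use one of your own pages
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
audit = HeadAudit()
audit.feed(html)

print(f"Title ({len(audit.title)} chars, aim for under {TITLE_MAX}): {audit.title!r}")
if audit.description is None:
    print("No meta description found")
else:
    print(f"Meta description: {len(audit.description)} chars (aim for under {DESCRIPTION_MAX})")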

Robots.txt

Another area of your site you’ll want to check is your robots.txt file. The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.  Rules that are improperly written, or accidentally written too broadly, may prevent search engine crawlers from crawling and indexing your site’s content.

A great way to check this is, again, using Webmaster Tools.  Under the Crawl menu is a robots.txt Tester which will examine your robots.txt file and report any errors or warnings.  There is also an option to test a specific URL to see if the robots.txt is allowing Google to crawl the page.
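If you’d rather spot-check locally, Python’s standard library includes a robots.txt parser.  The short sketch below, using placeholder URLs, reports whether each page is crawlable by Googlebot under your current rules:

from urllib.robotparser import RobotFileParser

# Placeholder domain; point this at your own robots.txt.
robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()

PAGES_TO_CHECK = [
    "https://example.com/",
    "https://example.com/blog/new-landing-page/",
]

for page in PAGES_TO_CHECK:
    allowed = robots.can_fetch("Googlebot", page)
    print(f"{page}: {'allowed' if allowed else 'BLOCKED'} for Googlebot")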

In addition to robots.txt, make sure you verify the markup of your pages individually.  You may have meta tags in your templates that are blocking search engine crawlers.  For example:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

There are reasons why you may want to prevent a page from being crawled and indexed, but make sure tags like this only appear on pages you absolutely DO NOT want indexed.  Otherwise, your valuable keyword-rich content may never be found!
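If your redesign touched a lot of templates, it’s easy for a tag like this to sneak onto pages that should be indexed.  Here’s a small, admittedly crude sketch in Python (standard library only, placeholder URLs) that scans a handful of pages for robots meta tags so you can review them:

import re
import urllib.request

# Placeholder pages to audit; swap in the important pages from your own site.
PAGES = [
    "https://example.com/",
    "https://example.com/products/",
]

# Crude pattern: matches a meta tag whose name attribute is "robots".
ROBOTS_META = re.compile(r'<meta[^>]+name=["\']robots["\'][^>]*>', re.IGNORECASE)

for url in PAGES:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    matches = ROBOTS_META.findall(html)
    if matches:
        for tag in matches:
            print(f"{url}: {tag}")
    else:
        print(f"{url}: no robots meta tag found")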

Page Validation

If your redirects are working properly and your robots.txt isn’t blocking any important pages, the next area to investigate is your page validation.  Using the validation service provided by W3C at validator.w3.org, you can enter a URL, upload a file, or copy and paste your site’s markup and run a validation test.

The results of this test can help you find problems with your page’s semantic markup.  Fixing these issues can make your pages easier to read and interpret, which in turn makes it easier for search engine crawlers to index your content properly.
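The validator can also be queried programmatically.  The sketch below asks the W3C Nu HTML Checker to validate a placeholder URL and prints any errors; the doc and out=json query parameters reflect the checker’s web service API as I understand it, so double-check the current documentation before building anything on top of this:

import json
import urllib.parse
import urllib.request

PAGE = "https://example.com/"  # placeholder URL to validate

# Build a request against the Nu HTML Checker's JSON output.
endpoint = "https://validator.w3.org/nu/?" + urllib.parse.urlencode(
    {"doc": PAGE, "out": "json"}
)
request = urllib.request.Request(
    endpoint, headers={"User-Agent": "redesign-audit-script"}
)

with urllib.request.urlopen(request) as response:
    results = json.load(response)

# Print only the errors; the checker also returns warnings and info messages.
for message in results.get("messages", []):
    if message.get("type") == "error":
        print(f"line {message.get('lastLine')}: {message.get('message')}")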

While this tool is a great resource, it’s important to note that it’s not perfect.  Not all validation errors will cause crawler errors, and some validation errors may even be proper in certain contexts.  Use this tool in conjunction with the other tools mentioned and as a supplement to help you create semantically correct and accessible pages.

Conclusion

If you’re still having trouble tracking down issues, you may want to verify that Google hasn’t made any major algorithm changes.  MOZ keeps a complete list of Google’s Algorithm Change History at this link: http://moz.com/google-algorithm-change.  It’s worth bookmarking that page and checking it every so often.

I should also note that Bing has their own version of Webmaster Tools available here: http://www.bing.com/toolbox/webmaster.  Yahoo! is integrated with Bing’s version of Webmaster Tools as well.  While their version of the tool has been improving, I have found Google’s to be more helpful.  That said, it’s not a bad idea to use both of these resources to help provide reports on your site.

Don’t forget that many sites are seasonal as well, so it’s important to look at data year-over-year and not just month-to-month. Use the tools available and keep digging and you’ll eventually get to the root of your traffic loss.

What other tools do you use to help troubleshoot traffic loss?  Tell us in the comments below!