Q: I am planning to redesign my website. Is it a good idea? How can I update the website without destroying the SEO?
A: Even if your website is doing great in terms of traffic and revenue, a redesign can improve its performance further. However, it is important to understand both the test site you are building and the current site from an SEO perspective.
The first thing is to think about SEO. I have seen clients throw away valuable content from pages, or decide to change every single URL without redirecting the old ones.
Crawling the Existing Site
If you have no idea how your site is structured, you will be in deep trouble. Knowing the structure, URLs, and metadata lets you keep track of what is changing and why.
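In practice you would use a dedicated crawling tool for this, but to make the idea concrete, here is a minimal same-domain crawler sketched in Python with only the standard library. The `fetch` callable is an assumption of this sketch: it stands in for whatever HTTP client (or cached copy of the site) you actually use.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl of a single domain.

    `fetch` is any callable that takes a URL and returns its HTML,
    so a real HTTP client (or a stub in tests) can be plugged in.
    Returns the set of same-host URLs discovered.
    """
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        parser = LinkExtractor()
        parser.feed(fetch(url))
        for href in parser.links:
            absolute = urljoin(url, href)
            # Stay on the same host; skip URLs we already queued.
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen
```

The output of a crawl like this is the inventory of URLs you will later compare against the test site.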
Auditing the Old Site
A site audit lets you figure out what search engines like and don't like about the website. This way, you will be able to spot problem areas as well as the areas that should be retained.
I would advise checking for missing and duplicate H1 tags, missing and duplicate page titles, missing and duplicate meta descriptions, page titles over 512 pixels (likely to be truncated) or far shorter than that, canonical tags and canonicalization, broken internal and external links, meta descriptions over 923 pixels, and missing image alt text.
Apart from this, you can check the robots.txt file, XML sitemap, URL structure, site speed and performance, and the pages indexed by Google.
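As a rough illustration, the on-page part of such an audit can be scripted. This is a simplified sketch that checks a single page's HTML for a few of the issues listed above; it ignores pixel widths and cross-page duplicates, and is no substitute for a full audit tool.

```python
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Pulls out the elements the audit cares about."""
    def __init__(self):
        super().__init__()
        self.titles, self.h1s, self.metas = [], [], []
        self._in = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._in = tag
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.metas.append(attrs.get("content", ""))

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.titles.append(data.strip())
        elif self._in == "h1":
            self.h1s.append(data.strip())

def audit_page(html):
    """Returns a list of issue strings found on one page."""
    p = AuditParser()
    p.feed(html)
    issues = []
    if not p.titles:
        issues.append("missing page title")
    elif len(p.titles) > 1:
        issues.append("duplicate page titles")
    if not p.metas:
        issues.append("missing meta description")
    if not p.h1s:
        issues.append("missing H1")
    elif len(p.h1s) > 1:
        issues.append("multiple H1 tags")
    return issues
```

Running `audit_page` over every crawled URL gives you the problem list to work through before the move.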
Noindexing the Test Site
When you are working on a test site, you do not want Google to index it. Any new content you add there will get indexed, and when you launch the new website, that content will have lost its value because it will be seen as duplicate.
You can noindex the website by ticking the noindex box in the website's CMS or by blocking the site in the robots.txt file. Keep in mind that robots.txt blocking only stops crawlers from fetching pages and does not guarantee the pages stay out of the index, so the noindex tag is the safer of the two.
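As an illustration, both states can be verified programmatically. These are deliberately simplified sketches, not full implementations of the meta-robots or robots.txt standards:

```python
import re

def is_noindexed(html):
    """True if the page carries a robots noindex meta tag."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

def robots_blocks_all(robots_txt):
    """True if robots.txt disallows the whole site for all user agents.

    A small check: it looks for a `User-agent: *` group that
    contains `Disallow: /`.
    """
    agent_all = False
    for line in robots_txt.splitlines():
        line = line.split("#")[0].strip()  # drop comments
        if line.lower().startswith("user-agent:"):
            agent_all = line.split(":", 1)[1].strip() == "*"
        elif agent_all and line.lower().startswith("disallow:"):
            if line.split(":", 1)[1].strip() == "/":
                return True
    return False
```

Checks like these are worth running both before launch (the test site should be blocked) and after launch (the live site should not be).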
Crawling the Test Site
Crawling the site will help you in understanding how the site is structured. Use a site crawler to perform this step.
Analyzing the Data
Once you have crawled the site, you will have the content, status code, Title 1, Title 1 Length, Meta Description 1, H1-1, H2-1, Canonical Link Element 1, and some other elements for each URL.
All the crawled data can be exported to Excel. Apply a filter on the status code column: any URL returning something other than a 200 code is not working. Report those issues to the developer and get them fixed.
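The filtering step can also be done in a few lines of code. This sketch assumes the crawl export has "Address" and "Status Code" columns; the actual column names vary by tool.

```python
import csv
import io

def broken_urls(crawl_csv):
    """Given crawl-export CSV text with 'Address' and 'Status Code'
    columns, return the URLs whose status is anything other than 200."""
    reader = csv.DictReader(io.StringIO(crawl_csv))
    return [row["Address"] for row in reader
            if row["Status Code"].strip() != "200"]
```

The resulting list is exactly what you hand to the developer.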
Matching up the Data
After that, crawl your current and test sites to identify the structure, the metadata, and the errors currently on the test site. Place all the page titles, meta descriptions, and other elements from both sites side by side, as this will help you highlight the duplicates.
Match up all the meta descriptions, word counts, canonicals, etc. to ensure that any changes on the test site are good changes and serve the purpose.
After you have dealt with the 200 codes, apply the filter for 404s and you will again see the URLs that aren't working.
A 404 error means the page doesn't exist on the test server. So you need to either create that URL on the test server or redirect the old URL to a new URL on the test server. Looking at the test server's URL structure tells you whether a redirect is enough or a new URL needs to be created.
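The matching exercise boils down to set comparisons between the two URL inventories. A minimal sketch:

```python
def plan_redirects(old_urls, new_urls):
    """Splits old URLs into those that survive unchanged and those
    that need a 301 redirect target chosen (or the page recreated),
    and reports brand-new URLs that exist only on the test site."""
    old, new = set(old_urls), set(new_urls)
    return {
        "unchanged": sorted(old & new),       # same URL on both sites
        "needs_redirect": sorted(old - new),  # old URL with no new home yet
        "new_pages": sorted(new - old),       # exists only on the test site
    }
```

Each URL in `needs_redirect` then gets a target picked by hand, which is the judgment call no script can make for you.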
What should you do with the test-site URLs that are not on the current site?
These will most likely be new pages, and like any other page, they have to be optimized properly.
Once you have a comprehensive spreadsheet of everything that needs to be done to minimize the damage of the site move, get those recommendations implemented. Remember that when you redirect pages to a new website, you lose some link equity, often estimated at 10-30%. However, you are giving search engines a chance to bring over the strong reputation of the old website.
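If the site happens to run on Apache (an assumption here; nginx uses a different syntax), the redirect mapping from the spreadsheet could be rendered into .htaccess-style rules like this:

```python
def apache_redirects(mapping):
    """Renders one 301 redirect line per old-to-new pair, in the
    mod_alias form used in an Apache .htaccess file."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )
```

Generating the rules from the spreadsheet instead of typing them by hand keeps the redirect file and the audit document in sync.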
Once the developer has implemented the changes, crawl the site again to ensure all the URLs match up and the metadata is relevant.
Checking the Rankings
Run a rank check to see how the site performs for a host of keywords on the search engine. Compare the rankings of both websites so that any problem can be identified and resolved.
While performing the check, if a keyword has dropped from page 1 to page 8 or lower, there could be a problem. In that situation, make sure the analytics code has been placed back in the <head> section of the website.
Also, if you don't have to delete something, just don't delete it. There could be blog posts you think aren't required but that might be adding credibility to the website.
Unblocking the Website
Check that the new website is allowing search engines to index it. Reverse the steps you took earlier to block the test site.
Every website's scenario will be different; however, this foundational approach will help you segment and break down the metadata so that you don't lose SEO value.