Website Redesign: Technical SEO Questions You Need To Ask

I have a friend who I've known for almost 10 years. Smart fellow. He used to work in web design, and even dabbled in development. He now has his own business in photography, and relies on his website and local SEO to help find new client leads. However, a major overhaul in the architecture of his site earlier this summer found him in desperate need of his SEO friends from long ago. 

His new site was beautiful, clean, well-organized and the mobile experience was superb. Things were tagged properly. However, there was a problem. As soon as he updated the site and notified Google of the change, they stopped indexing him. Yikes!

It took quite a bit of time and investigation before he realized he had accidentally left a Noindex tag in the code - a leftover from a template he had used. 

(Cue the sigh of relief at discovering he wasn't being blacklisted by the engine!)

He fixed the problem and resubmitted his site. While things definitely started to pick back up for him, the loss in traffic for a few months baffled him and wreaked havoc on his leads. 

This is a good example, and a reminder, that a constant effort is needed to ensure all SEO needs are being addressed when working on a site redesign. 

Even the most experienced at this type of work can get so caught up in pushing the redesign across the finish line that they overlook critical details that can make or break performance post-launch. 

Launching a new site gives a brand the opportunity to ensure that every technical SEO issue has been addressed, and that every possible technical SEO tactic has been considered for use. Below are the questions you should ask yourself or your team as you begin to work out the Technical SEO plans for the redesign. 

In The Planning Stage

Is there an opportunity to optimize the URL structure? Using the targeted keyword phrase in the URL can have a high impact, so it is worth evaluating each page's URL and, where possible, changing it to include that page's keyword phrase.
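
As a sketch of what this looks like in practice, here is a minimal Python helper (the function name and example phrase are hypothetical) that turns a keyword phrase into a clean, hyphenated URL slug:

```python
import re

def slugify(phrase):
    """Convert a keyword phrase into a lowercase, URL-safe slug."""
    slug = phrase.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to hyphens
    return slug.strip("-")

# A generic URL like /portfolio/page123 could instead become:
# "/portfolio/" + slugify("Wedding Photography Portfolio")
# -> /portfolio/wedding-photography-portfolio
```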

What is my "Redirection Plan"? Do I have a plan for ensuring every old URL either resolves or points to a new, logical page? This one is essential, especially if the old URL structure is changing. It can also be the most time-consuming, especially if you are combining two or more sites into one - which happens more often than you might think. Placing a 301 "permanent" redirect from each old URL to its new counterpart ensures search engines understand that the location of the content has permanently changed, and passes along most of the ranking signals the old URL had earned.
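
On an Apache server, a redirection plan often ends up as a set of 301 rules like the sketch below; all of the paths are hypothetical placeholders, and other servers have equivalents (nginx, for example, uses `return 301` directives):

```apache
# Hypothetical one-to-one mappings - replace with your own old/new URL pairs.
Redirect 301 /galleries/weddings.html /portfolio/weddings/
Redirect 301 /about-me.html /about/

# Pattern-based rule for a restructured directory:
RedirectMatch 301 ^/blog/[0-9]+/(.*)$ /articles/$1
```

Building the full old-to-new mapping in a spreadsheet first, then generating rules from it, keeps this manageable when hundreds of URLs are changing.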

Is there a chance for me to utilize Schema markup? Using Schema has become a standard practice in SEO. It allows you to call out specific information about a page to the search engines, helping them categorize the page's content and making it more likely that they will reward you with richer, more prominent listings.
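
As a sketch, Schema's JSON-LD format lets a local business - say, a photography studio like my friend's - describe itself directly in the page; every value below is a hypothetical placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Photography Studio",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Anytown",
    "addressRegion": "CA"
  }
}
</script>
```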

How do we want to tackle on-page optimization? Are we willing to dedicate time and resources to researching and uncovering the best ways to optimize the Title / Meta Description / Image / Alt tags for every page? The answer to this question should be: "Yes. We will be dedicating resources to ensuring on-page tactics are well thought out and meet the needs of the audience, after we've spent some time researching what would be best for them." Understand that on-page optimization won't be finished once you launch the site. It will be important to revisit what new search trends have emerged over time, and to re-optimize pages toward those trends and the needs of the audience. 
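
For a single page, the tags in question amount to a handful of lines of HTML; the text below is a hypothetical illustration, not a template to copy:

```html
<head>
  <title>Wedding Photography in Anytown | Example Studio</title>
  <meta name="description"
        content="Wedding photography serving Anytown and the surrounding
                 area. View portfolios and book a consultation.">
</head>
<!-- ...and in the body, descriptive alt text on images: -->
<img src="/img/ceremony.jpg"
     alt="Bride and groom exchanging vows at an outdoor ceremony">
```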

In The Testing & QA Stage, Pre & Post-Launch

Pre-launch: What did a crawl of the site uncover? Performing a crawl of the site - utilizing a crawl simulator such as Screaming Frog or even Xenu (a free tool) - can uncover broken links and improper redirects, and can verify which pages utilize the various tags (Title, Image, Meta, Headers) that build out their content themes. Additionally, a crawl will tell you whether every page can be found by an engine.
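
The core of what those tools do is simple: extract every link from a page, fetch each one, and repeat. A minimal sketch of the link-extraction step in Python (class and function names are hypothetical), using only the standard library:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from anchor tags - the heart of any site crawler."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all anchor hrefs found in an HTML document, in order."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links
```

A full crawler would fetch each discovered URL (for example with `urllib.request`), record its status code to catch broken links and redirect chains, and re-queue any internal links it has not yet visited.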

Pre-launch: Inside the Robots.txt file, have we blocked only what we don't want crawled? Sites will often block entire directories from a search engine crawler when they truly only want a few pages within that directory blocked. Any SEO partner should be checking, pre-launch, what has been put into the file and whether it is necessary. 
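
A robots.txt along these lines blocks only the specific paths that should stay out, rather than whole directories; the paths shown are hypothetical examples:

```
# Block only what should never be crawled - not entire sections
# that also contain indexable pages.
User-agent: *
Disallow: /admin/
Disallow: /search/results
Allow: /search/

Sitemap: https://www.example.com/sitemap.xml
```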

Pre-launch: Have we applied proper usage of the Noindex tag on pages we want kept out of the search engines' indexes? Taking into consideration what happened to my friend, it is an easy thing to overlook! If you're going to use Noindex tags to keep an engine from including a page's content in its index, make sure the choice makes sense and that the tag is applied only to the pages necessary. 
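
The tag itself is a single line in a page's head, which is exactly why it is so easy to carry over from a template unnoticed:

```html
<!-- Keeps this one page out of the index while still letting
     crawlers follow its links. Verify it appears ONLY on pages
     you intend to exclude. -->
<meta name="robots" content="noindex, follow">
```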

Does every page seem to be utilizing a unique Title tag? This can be uncovered by running a spider simulator through the site, and it is often the first thing evaluated when assessing SEO performance. If multiple pages use exact duplicates of the same text, you are effectively telling the search engine that each page is no different from the others carrying that tag. Other tags are important too, but checking that Title tags are appropriately unique and targeted, pre-launch, is critical. 
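
Given a crawl export of URL-to-Title pairs, flagging duplicates is a few lines of Python; this is a hypothetical helper, not part of any crawler's API:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by Title tag and return only titles shared by
    more than one page.

    `pages` is a mapping of URL -> Title, e.g. exported from a crawl.
    Titles are compared case-insensitively, ignoring surrounding whitespace.
    """
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}
```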

Post-launch: Crawl the site again - Did anything break? Re-crawling the site after you have launched it allows you to see and quickly mitigate any issues that may not have been noticeable in the Pre-launch crawl. 

Post-launch: How do I submit it to Google? Is there anything additional I need to do to ensure any drop in attention from Google will be small and easily fixable? Do both our traditional sitemap and our XML sitemap work properly? Using Search Console (formerly Webmaster Tools) allows you to submit your site to Google, which prompts the engine to visit it. A traditional HTML sitemap can also communicate the architecture of the site to Google and the other engines, while an XML sitemap lists every URL individually so engines can discover each page directly. Submitting the site and using both types of sitemaps ensures that the search engines can crawl through the site and discover content on their own, and that they can also see every page you want them to on an individual basis. 
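
An XML sitemap is simple enough to generate from a list of URLs; here is a minimal Python sketch (the function name is hypothetical) using the standard library, producing the basic structure defined by the sitemaps.org protocol:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of absolute URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")
```

A real sitemap would usually add optional fields such as `<lastmod>` per URL, but the `<loc>` entries above are the only required part.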

Bottom Line

Any SEO partner (agency or in-house) should be working with you to make sure all of these questions are considered as you go through the redesign process, and that all possible technical opportunities have been addressed before the new site is live. 
