Whether you are well versed in SEO or new to it, checklists help you or your clients hone optimization work by reviewing both simple and complex issues with a methodical approach. You may also be asked to do a “quick” audit of a site, either for prospecting or as a favor, and a simple yet comprehensive checklist can improve how quickly you gather the pertinent information to assess a site’s SEO status.
Here are the 5 items to check in short form:
1. URL issues (including URL canonicalization & 301 redirection). No matter which preferred homepage URL you settle on, it’s important to consolidate all versions or iterations of the homepage into one. This includes additional domains that need to be forwarded to the main domain. Although this doesn’t seem like a big issue, each little bit can add up to a detrimental whole.
Variations of the homepage URL can include:
- http://www.site.com/index.html
- http://www.site.com/index.htm
- http://www.site.com/index.php
- http://www.site.com/default.html
- http://www.site.com/default.php
- http://www.site.com/home
- https://www.site.com/* (where the asterisk indicates multiples of the URLs listed above in secure protocol URL versions)
These URLs all need to be pointed to the main site or chosen iteration of the homepage, such as http://www.site.com. Redirects can be added in Apache “.htaccess” files or within your web server’s configuration files. An example of a redirect condition and rule set follows:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule (.*) http://www.site.com/$1 [R=301,L]
Note: This of course assumes that mod_rewrite is enabled. Also check your specific Apache version and web server documentation for the relevant configuration attributes or declarations.
Be sure to take a quick gander at a sample of pages, including the homepage, category-level pages, and some detail pages, to locate redirect chains and/or 302 redirects (in the form of JavaScript, PHP, or meta refresh redirects), which most likely should be converted to 301 redirects to take full advantage of SEO metric flow through the website.
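As a rough illustration, the hop-by-hop output of a crawler (or a series of header checks) can be scanned programmatically for chains and non-301 hops. This is a minimal sketch only; the hop data, URLs, and function name are hypothetical stand-ins for what you would actually observe:

```python
# Sketch: flag redirect chains and non-301 hops from observed redirect data.
# Each entry is (source_url, status_code, target_url); the data below is
# hypothetical, standing in for what a crawler or header check would report.

def audit_redirects(hops):
    """Return a list of human-readable warnings about the redirect hops."""
    warnings = []
    targets = {src: dst for src, _, dst in hops}
    for src, status, dst in hops:
        if status != 301:
            warnings.append(f"{src} uses a {status} redirect; prefer 301")
        if dst in targets:  # the target itself redirects again: a chain
            warnings.append(f"{src} -> {dst} -> {targets[dst]} is a redirect chain")
    return warnings

observed = [
    ("http://site.com/", 301, "http://www.site.com/"),
    ("http://www.site.com/index.html", 302, "http://www.site.com/home"),
    ("http://www.site.com/home", 301, "http://www.site.com/"),
]

for warning in audit_redirects(observed):
    print(warning)
```

In this sample data, the second hop would be flagged twice: once for being a 302, and once for chaining through /home instead of pointing straight at the homepage.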
2. Unique Titles & Meta Tags. Issue a site:www.site.com command to get a listing of all indexed pages, which lets you spot title variations or patterns that indicate duplicate content issues. Even a simple check of what is returned from Google, Yahoo! or MSN/Live gives you an idea of how unique (or not) pages are, given their title and description elements. Again, a sampling of pages may help determine the level of complexity of the issue for that specific site.
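If you already have page HTML on hand (from a crawl or a scrape of the index listing), duplicate titles can be tallied in a few lines. A minimal sketch, with hypothetical page content standing in for fetched HTML:

```python
# Sketch: find pages sharing identical <title> elements.
# The HTML snippets below are hypothetical stand-ins for fetched pages.
import re
from collections import defaultdict

def extract_title(html):
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

pages = {
    "http://www.site.com/doodads": "<html><title>Doodads | Site</title></html>",
    "http://www.site.com/widgets": "<html><title>Widgets | Site</title></html>",
    "http://www.site.com/do-dads": "<html><title>Doodads | Site</title></html>",
}

by_title = defaultdict(list)
for url, html in pages.items():
    by_title[extract_title(html)].append(url)

duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
for title, urls in duplicates.items():
    print(f"Duplicate title {title!r} on: {', '.join(sorted(urls))}")
```

The same grouping approach works for meta descriptions; just swap the pattern being extracted.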
3. Search Engine Friendly URLs. Looking at the mix of URLs that exist on the site, whether hand-coded or CMS-generated, should help determine if more work is required to move parameter-laden or nondescript page URLs to a keyword-rich and effective URL schema. Parameters aren’t as problematic as they used to be, but the preference is still to use “static” URL schemas without parameters (if possible). This allows for better organization of content, an easier and more enjoyable experience for users, and keyword-rich URL structures for search engines. Definitely a win-win-win.
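A quick triage of a URL list can separate the parameterized and opaque URLs from readable, keyword-rich ones. This is a crude heuristic sketch, not a definitive classifier, and the example URLs are hypothetical:

```python
# Sketch: rough triage of a site's URLs into "static" keyword-rich paths
# versus parameterized or opaque ones. The URLs are hypothetical examples.
from urllib.parse import urlparse

def is_search_friendly(url):
    parsed = urlparse(url)
    if parsed.query:  # ?id=123&cat=4 style parameters
        return False
    last = parsed.path.rstrip("/").rsplit("/", 1)[-1]
    # crude heuristic: a readable slug rather than a bare numeric id
    return bool(last) and not last.isdigit()

urls = [
    "http://www.site.com/widgets/blue-widget",
    "http://www.site.com/product.php?id=123&cat=4",
    "http://www.site.com/category/42",
]
for u in urls:
    print(u, "->", "friendly" if is_search_friendly(u) else "needs rework")
```

Running a site’s full URL inventory through a check like this gives a fast sense of how much rework the URL schema needs.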
4. Webmaster Tools and Registrations. Get your websites registered with Google, Yahoo!, MSN/Live and Ask.com (if you’ve got an XML sitemap – and you should). Try naming your XML sitemap something not so obvious, like “global.xml,” to avoid competitor snooping, and submit the sitemap to all of the tools. I find the meta tag verification method easier, but you can also choose to verify by placing files on your website, which each of the webmaster programs walks you through – this is down to personal preference. If you do not have an automated XML sitemap, try using a tool like Coffeecup software’s Google Site Mapper or any of the other free tools available online to generate one. Just be aware that not all of these tools work well, and certain sites may cause programs to loop, generating a mass of useless duplicates…
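If the free generators let you down, a bare-bones sitemap can be built from a URL list with no special tooling. A minimal sketch using only the standard library, with hypothetical URLs:

```python
# Sketch: generate a minimal XML sitemap from a list of URLs using only
# the standard library. The URLs here are hypothetical examples.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    "http://www.site.com/",
    "http://www.site.com/widgets/blue-widget",
])
print(xml_out)
```

Write the result to whatever filename you chose (e.g., the “global.xml” trick above) and submit it through each engine’s webmaster tools.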
5. Duplicate Content Check. This can be done manually or as part of issuing a site:www.site.com command in the major search engines to see which pages may be indexed with the same content under multiple URLs. Check also for multiple iterations of category and detail page levels so you avoid having both http://www.site.com/doodads and http://www.site.com/do-dads. This can potentially cause dozens of iterations of the same content per URL, which is a big problem for rankings due to cannibalization and the search engines choosing among the duplicates for you. Using the canonical link element (rel="canonical") on the http://www.site.com/do-dads URL can help alleviate this issue by indicating your preferred URL to the engines. Be advised that this is a suggestion to the search engines and is not guaranteed in any way, yet all three (Google, Yahoo! and MSN/Live) claim to abide by the tag.
Much more can and will be added to a complete checklist, but this should certainly help identify some of the major issues and support a more effective SEO strategy. Other things that can help you analyze a situation for SEO viability or problem solving include checking inbound links to determine link types and popular areas of the site, checking IP blocks for neighboring websites to determine relationships and perceived tie-ins, and searching for inline coding issues which can hinder proper spidering.