
Apr 15

Complete SEO analysis checklist 2012

This document covers an essential topic for every aspiring SEO specialist. The idea is to update it regularly and to extend it with your contributions.

Automated SEO reports

This is the easiest solution you have at hand. Here is a list of free automated SEO tools that can produce interesting reports on the SEO value of the site you are analyzing.

http://www.quarkbase.com/

http://marketing.grader.com/

http://www.seoworkers.com/tools/analyzer.html

http://www.cubestat.com/

http://www.statbrain.com/

http://www.jonasjohn.de/test-tool/

http://spydermate.com/ (the apps for Android and iPhone are freeware)


The purpose of this post, however, is to guide you through making your own SEO analysis.

The first step is to understand what kind of website we have to analyze. If you have access to its statistics, perfect. Otherwise, never mind: you can still work out the nature of the site, its structure, and a rough idea of its ranking and traffic.

Website Crawling Verification

 


You can start the job by crawling the website to be analyzed. There are many tools available. Some of the more interesting ones are Screaming Frog SEO Spider Tool (http://www.screamingfrog.co.uk/seo-spider/) and IIS SEO Toolkit (http://www.iis.net/download/seotoolkit). One of my favorites is Xenu Link Sleuth (http://home.snafu.de/tilman/xenulink.html). Finally, I would also suggest Crawl Test from Seomoz (http://pro.seomoz.org/tools/crawl-test). Seomoz offers several SEO tools, though you have to pay if you want to use them regularly.

You have to check the following ON-Page SEO factors:

  • Website architecture. A flat site architecture is preferable to a deeply nested one. You can have a look at the following article to better understand the issue: http://www.seomoz.org/blog/site-architecture-for-seo.
  • Link equity. Verify that each page of the website has a good number of incoming internal links. Consider that Google has no problem reaching the 4th level, which is to say that each page of your website should be reachable from any other page of the site in no more than 4 clicks. If this is not the case, consider rebuilding the structure of the website.
  • Link errors. Verify whether there are any broken links, i.e. linked pages that answer with an error (500, 400, 404, …); see the sketch after this list.
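
If you prefer to script the broken-link check yourself, here is a minimal sketch in Python using the requests library; the URLs are placeholders, and in practice you would feed it the list of links exported from your crawler.

    # Minimal broken-link check: print every URL that answers with an error.
    # The URLs below are placeholders for the list exported from your crawler.
    import requests

    urls_to_check = [
        "http://www.domain.tld/",
        "http://www.domain.tld/products/",
        "http://www.domain.tld/old-page.html",
    ]

    for url in urls_to_check:
        try:
            # A HEAD request is usually enough to read the status code
            response = requests.head(url, allow_redirects=True, timeout=10)
            if response.status_code >= 400:
                print(response.status_code, url)
        except requests.RequestException as error:
            print("FAILED", url, "-", error)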

Site Indexation Verification

Verify the indexation status of the website on the Google search engine.

You have to check the following SEO factors:

  • Indexed URLs. Simply run the following query on Google: site:namedomain.tld
    Google will return the list of URLs it has indexed for your website. If the number of indexed URLs differs significantly from the number of URLs you registered with your crawling tools, the Google spider may be having trouble crawling the entire website, and you should try to understand why (see the sketch after this list).
  • Home Page verification. Search Google for the exact words you have in the “title” tag of the Home Page, or the first part of it. If the website doesn’t rank in the first position of the SERP (Search Engine Results Pages), you can bet there’s an SEO problem on the website.
  • Obsolete Indexed Pages. If old pages are indexed in Google, you have to redirect them permanently to the new pages they should represent. You could also remove them through Google Webmaster Tools, but only if the resource returns a 404 error.
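
To compare what your crawler found with what Google has indexed, a rough sketch like the following can help; it assumes you have saved both lists into two plain-text files, one URL per line (the file names are just examples).

    # Compare the URLs found by the crawler with the URLs Google has indexed.
    # Both files are assumed to contain one URL per line.
    def load_urls(path):
        with open(path) as handle:
            return {line.strip() for line in handle if line.strip()}

    crawled = load_urls("crawled_urls.txt")   # exported from Xenu / Screaming Frog
    indexed = load_urls("indexed_urls.txt")   # collected from the site: query

    print("Crawled but not indexed:")
    for url in sorted(crawled - indexed):
        print("  " + url)

    print("Indexed but not found by the crawler:")
    for url in sorted(indexed - crawled):
        print("  " + url)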

Content Verification

You have to check the following ON-Page SEO factors:

  • Duplicated Content. If you have access to the web statistics of the website, this is easier. Otherwise you can use Google Webmaster Tools or Seomoz to verify whether the website has duplicated content. If you don’t have access to these tools (which may be the case if you are analyzing a competitor of your website), you will have to verify it by performing different Google queries on the services and products the website offers. There are a lot of tools that can make the job easier; just check this post: http://slodive.com/web-development/10-top-tools-checking-duplicate-content/

  • Canonicalization Problems. The job is to verify whether a rel="canonical" is needed or not. The question usually concerns the Home Page, which might respond as domain.tld/, domain.tld/index.php, domain.tld/index.html or domain.tld/index.asp. You should have a single Home Page. If you can’t redirect index.asp to /, that is the case where you need to canonicalize the Home Page by placing this code inside the <head> tag, as follows:
    <link rel="canonical" href="http://www.domain.tld/"/>
  • Pagination Problems. If your website has many products or services, it will probably paginate them across several pages. In those cases it’s also probable that the website will have duplicated content or duplicated page titles. You have to play with rel="next" and rel="prev". You can have a look at the following post: http://googlewebmastercentral.blogspot.it/2011/09/pagination-with-relnext-and-relprev.html
  • Duplicated Title of Page. Duplicated page titles are discouraged by Google and the other search engines. To check whether the website has duplicated page titles you can use Google Webmaster Tools, Seomoz or just Xenu Link Sleuth. Even though Xenu was designed for Windows, you can also run it on Linux using Wine. Once you have spidered the website, just sort the pages by “title” and it will be easy to see which pages share the same title (see the sketch after this list).
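
If you don’t want to depend on Xenu or Seomoz for the duplicated titles, here is a minimal sketch of the same check; the URL list is a placeholder for your crawler’s export, and the requests library is assumed to be installed.

    # Group pages by their <title> and report the titles used more than once.
    from collections import defaultdict
    import re
    import requests

    urls = [
        "http://www.domain.tld/",
        "http://www.domain.tld/products/",
        "http://www.domain.tld/products/page-2/",
    ]

    pages_by_title = defaultdict(list)
    for url in urls:
        html = requests.get(url, timeout=10).text
        match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        title = match.group(1).strip() if match else "(missing title)"
        pages_by_title[title].append(url)

    for title, pages in pages_by_title.items():
        if len(pages) > 1:
            print("Duplicated title:", title)
            for page in pages:
                print("  " + page)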

 

Extra SEO Verification

An important factor that is not strictly SEO, but closely related to it, is understanding the audience of the website. You can use different tools:


Google Adwords (free)
https://adwords.google.com
Google Trends (free)
http://www.google.com/trends/
Alexa (free) http://www.alexa.com
Quantcast (free) http://www.quantcast.com
Compete (from $199 a month) http://www.compete.com


These tools give you important details on the demographics of the website visitors, the most popular keywords used to reach the website and much other data you can use to choose, update or change the keywords you want the website to rank first for.

Once we have determined the structure of the website, its indexation in the search engines (basically Google), the possible presence of navigation problems and the audience of the website, we are ready for the next step of the SEO analysis.

 

 

 

 

Domain On-Page SEO factors

 

  • Age. An older domain generally means better search engine ranking.
  • Keyword Representation. The domain itself should contain the most significant keywords of the website. The domain name should be short so it is easily remembered, and it shouldn’t contain underscores or special characters.

 

Domain Off-Page SEO factors

 

  • Page Rank. It’s the value assigned by Google to each page of the website. The root domain Page Rank can be considered the value of the domain itself. Page Rank samples: Flagsonline.it PR4, Bizonweb.it PR5, Seomoz.org PR6, ShinyStat.com PR7.

  • Alexa Global Rank. The Alexa rank is calculated using a combination of average daily visitors to the website and pageviews on the same website over the past 3 months. The site with the highest combination of visitors and pageviews is ranked 1st. It’s useful to evaluate this rank in combination with some competitors of the website to better understand its value. (http://www.alexa.com)
  • Compete Rank. Compete offers an estimation of the traffic of the website and it represents this value with the Compete Rank (http://siteanalytics.compete.com/ )
  • MozRank, Page Authority and Domain Authority. The domain mozRank is a global link popularity metric, calculated similarly to Google’s PageRank, on a log scale from 1 to 10. Page Authority and Domain Authority are two metrics calculated through Open Site Explorer by Seomoz; they are represented on a log scale from 1 to 100. You can get these values for free by checking your website here: http://www.check-domains.com/website-analysis/website-analyzer.php
  • External backlinks. You can use http://www.majesticseo.com/ to calculate the incoming links to your website. You should report these data: external backlinks, referring domains, referring IP addresses and class C subnets.
    Consider that if you have thousands of incoming links but they all come from a couple of class C subnets, it is probable that all the domains linking to your website belong to you, so the value of those incoming links is insignificant (see the sketch after this list). You could also check the Followed Linking Root Domains, that is to say the number of domains with at least one followed link to any page on the root domain (ref. seomoz.org).
    You can also add Alexa Sites Linking In as an important ranking factor to be analyzed. The Google query “link:domain.tld” is no longer a good SEO reference.

  • Dmoz presence. One of the first steps we took 10 years ago when we built a website was to submit it to the Dmoz Directory. The presence of the website in DMOZ is still considered an SEO factor.
  • Social network references. There’s a checklist you could build around the social network environment. You should check whether the website is present and mentioned on the following social networks:
    – Facebook

    – Twitter

    – Google Plus

    – Linkedin
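
As a complement to the class C subnets point above, the following rough sketch groups the referring IP addresses (for example, the list exported from MajesticSEO) by their /24 subnet, so that links coming from the same network stand out; the file name is just an example.

    # Count referring IP addresses per "class C" (/24) subnet.
    # referring_ips.txt is assumed to contain one IP address per line.
    from collections import Counter

    with open("referring_ips.txt") as handle:
        ips = [line.strip() for line in handle if line.strip()]

    subnets = Counter(".".join(ip.split(".")[:3]) for ip in ips)

    # Many backlinks concentrated in very few subnets usually means the
    # linking domains belong to the same owner and carry little weight.
    for subnet, count in subnets.most_common():
        print("%s.x  ->  %d referring IPs" % (subnet, count))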

     

On-Page SEO factors

 

  • URL. The URL of each page should be search engine friendly and must contain the most significant keywords of that page.
  • Title tag. The title should contain the keywords we promote, written in a discursive manner; it can’t just be a list of the page’s keywords. As Google indicates in the “Google SEO Report Card”: “A descriptive title and description meta tag can help a result pop out better to search users.” The reference is here: http://www.google.com/webmasters/docs/google-seo-report-card.pdf
  • Description meta tag. The description should be descriptive and contain the keywords we promote inside the page. The current standard is that the description should be no longer than 156 characters, including spaces and commas.
  • H1, H2 and H3 tags. Each page of the website should present these tags, they should be ordered (h1, h2, h3) and they should contain the keywords of the page they represent.
  • Keyword density. It’s the density of the keywords inside the body of the page. A good ranking could be determined by a good density (5–20% of the total words inside the body); see the sketch after this list.
  • Bold Keywords and Strong Keywords. A keyword inside the tag <strong> or <b> is a heavier keyword, more visible to search engines.
  • Keyword prominence. A keyword in the upper left side of a page carries more weight than the same keyword placed in the lower right side of the page. Prominence is how early the keyword appears inside the page.
  • Anchor text. The anchor text should be keyword representative. If you have a link to the SEO section the anchor text should be “SEO section”.
  • Title of Link. The link title attribute should be keyword representative. You can have a look at this post to understand its importance:
    http://www.seomoz.org/ugc/link-tilte-attribute-and-its-seo-benefit
  • Image Alt and Image Title. Alt text is an alternative information source for people who have disabled images in their browsers or who simply cannot “see” the images. It should describe what the image is about and get those visitors interested in seeing it. The alt text contributes to ranking in Google Images search, so it’s a relevant ranking factor. The image title should provide additional information. It should be relevant, short, catchy and concise, and it helps to promote the keywords of the page (be careful, as it might be flagged as spam if the description and the content of the image have nothing to do with each other, i.e. keyword stuffing).
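
For the keyword density point above, a rough sketch like the following gives a quick estimate; the URL and the keyword are placeholders, and the HTML is stripped with a crude regular expression that is good enough for an indication (a real tool would use a proper parser).

    # Estimate the density of a keyword inside the visible text of a page.
    import re
    import requests

    url = "http://www.domain.tld/"
    keyword = "flags"

    html = requests.get(url, timeout=10).text
    # Drop scripts, styles and the remaining tags (crude but sufficient here)
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html,
                  flags=re.IGNORECASE | re.DOTALL)
    text = re.sub(r"<[^>]+>", " ", text)
    words = re.findall(r"[a-z0-9]+", text.lower())

    occurrences = words.count(keyword.lower())
    density = 100.0 * occurrences / len(words) if words else 0.0
    print("%d occurrences over %d words: %.1f%% density"
          % (occurrences, len(words), density))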

You can check many of the SEO factors I describe above using this simple page:

http://www.seocentro.com/tools/search-engines/metatag-analyzer.html
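
If you prefer to run the same kind of check locally, here is a rough sketch that extracts the title, the description meta tag and the h1–h3 headings from a page and flags a description longer than 156 characters; the URL is a placeholder and the regular expressions assume reasonably standard markup.

    # Extract title, meta description and h1-h3 headings from a page.
    import re
    import requests

    url = "http://www.domain.tld/"
    html = requests.get(url, timeout=10).text

    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    description = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
        html, re.I | re.S)
    headings = re.findall(r"<(h[1-3])[^>]*>(.*?)</\1>", html, re.I | re.S)

    print("Title:", title.group(1).strip() if title else "MISSING")
    if description:
        text = description.group(1).strip()
        print("Description (%d characters): %s" % (len(text), text))
        if len(text) > 156:
            print("  -> longer than the 156-character limit")
    else:
        print("Description: MISSING")
    for tag, content in headings:
        print(tag.lower() + ":", re.sub(r"<[^>]+>", "", content).strip())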

 

The last step is a more advanced SEO analysis; it covers other aspects you can improve in order to rank better.

 

Usability

 

  • Website Usability Review. You should check whether the website is usable and whether you can navigate it easily even with JavaScript disabled.

  • Form Usability. Each form of the website should be usable even with JavaScript disabled.
  • Navigation Analysis. Each section of the website should be reachable in no more than 4 clicks; otherwise we have a navigation problem to face.

 

Crawl Ability

 

  • Robots.txt. On the root of the domain you should have a robots.txt file. This file tells search engines which pages they may index and which they may not. It might look like this:
    User-Agent: *
    Disallow:
    Sitemap: http://www.domain.tld/sitemap.xml
  • Sitemap. The sitemap lets the search engines know, all at once, the URLs we want indexed. Each search engine has its own page to submit the sitemap: Google – https://www.google.com/webmasters/tools/ ; Bing – http://www.bing.com/toolbox/webmaster/ (see the sketch after this list).
  • Crawl verification. You can check how crawlable the website is using a text browser (Lynx) or this online SEO browser:
    http://www.domaintools.com/seo-browser/
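
If the website has no sitemap yet, a basic one can be generated from the crawler’s URL list with a few lines; the URLs below are placeholders, and the output follows the standard sitemaps.org format.

    # Build a minimal sitemap.xml from a list of URLs.
    from xml.sax.saxutils import escape

    urls = [
        "http://www.domain.tld/",
        "http://www.domain.tld/products/",
        "http://www.domain.tld/contacts/",
    ]

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")

    with open("sitemap.xml", "w") as handle:
        handle.write("\n".join(lines))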

 

 

Text Quality

 

  • You can use automated tools to verify the quality of your texts. Specifically, you can check two metrics: Lexical Density and Gunning Fog. The Gunning-Fog index ranges from 4 (easy) to 20 (hard to read), while the Lexical Density index represents the complexity factor and is expressed as a percentage (normally from 20% to 60%). See the sketch below.
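
The two metrics can also be approximated with a short script; the syllable counter and the stop-word list below are crude simplifications, so treat the result as an indication rather than an exact score.

    # Rough Gunning-Fog index and lexical density for a plain-text sample.
    import re

    text = ("Put here the text you want to evaluate. "
            "It should be at least a few sentences long to make sense.")

    STOPWORDS = {"the", "a", "an", "and", "or", "but", "of", "to", "in", "on",
                 "is", "are", "it", "this", "that", "you", "we", "be", "at"}

    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())

    def syllables(word):
        # crude heuristic: count groups of consecutive vowels
        return max(1, len(re.findall(r"[aeiouy]+", word)))

    complex_words = [w for w in words if syllables(w) >= 3]

    gunning_fog = 0.4 * (len(words) / float(len(sentences))
                         + 100.0 * len(complex_words) / len(words))
    lexical_density = 100.0 * sum(1 for w in words if w not in STOPWORDS) / len(words)

    print("Gunning-Fog index: %.1f" % gunning_fog)
    print("Lexical density:   %.0f%%" % lexical_density)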

 

Page Speed

 

 


All you need is the Page Speed tool by Google (Make your web site faster).

https://developers.google.com/pagespeed/

 

The most significant points are:

 

  • Browser caching. Setting an expiry date in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk.
  • Compress resources. Compressing resources with gzip or deflate can reduce the number of bytes sent over the network.
  • Compress images. Properly formatting and compressing images can save many bytes of data. Each image should be compressed and should specify height and width values. Properly sizing images can save many bytes of data.
  • CSS sprites. Combining images into as few files as possible using CSS sprites reduces the number of round-trips and delays in downloading other resources.
  • Minify and defer JavaScript. Compacting JavaScript code can save many bytes of data and speed up downloading, parsing, and execution time. By deferring parsing of unneeded JavaScript until it needs to be executed, you can reduce the initial load time of your page (basically, you could place most of your JavaScript code just before the closing body tag).
  • Asynchronous resources. Fetching resources asynchronously prevents those resources from blocking the page load (static JavaScript code, Facebook and Twitter widgets, etc.).
  • Consistent URLs. It’s important to serve a resource from a unique URL, to eliminate duplicate download bytes and additional RTTs.
  • Minify HTML. Compacting HTML code, including any inline JavaScript and CSS contained in it, can save many bytes of data and speed up downloading, parsing, and execution time.
  • Optimize the order of styles and scripts. Correctly ordering external stylesheets and external and inline scripts enables better parallelization of downloads and speeds up browser rendering time.
  • Specify a cache validator. By specifying a cache validator – a Last-Modified or ETag header – you ensure that the validity of cached resources can efficiently be determined.
  • Minify CSS. Compacting CSS code can save many bytes of data and speed up downloading, parsing, and execution time.
  • Specify charset early. Specifying a character set early for your HTML documents allows the browser to begin executing scripts immediately.
  • Specify a Vary: Accept-Encoding header. This instructs proxy servers to cache two versions of the resource: one compressed and one uncompressed. It helps avoid issues with public proxies that do not detect the presence of a Content-Encoding header properly (see the sketch after this list).
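
Several of the points above (compression, browser caching, cache validators and the Vary header) can be verified directly on the HTTP headers of a resource; here is a rough sketch, with a placeholder URL, using the requests library.

    # Inspect the response headers of a static resource for caching
    # and compression hints.
    import requests

    url = "http://www.domain.tld/style.css"
    response = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"},
                            timeout=10)
    h = response.headers

    print("Content-Encoding:", h.get("Content-Encoding", "none (not compressed)"))
    print("Cache-Control:   ", h.get("Cache-Control", "missing"))
    print("Expires:         ", h.get("Expires", "missing"))
    print("Validator:       ", h.get("ETag") or h.get("Last-Modified") or "missing")
    print("Vary:            ", h.get("Vary", "missing"))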

 

 


Once you have worked through all these steps you should know the strengths and weaknesses of your website, or of your competitor’s, and you should be ready to fight to rank first. If you are still having trouble ranking, just leave a comment on this post.

 

If you don’t agree with what I wrote, I’m here to listen to your comments.

 

 


6 comments


  1. santosh

    nice article on SEO

  2. Paul

    Great post. Using this to a) train staff and b) checklist for an audit. One very small recommendation would be a print option (unless I’m missing it)…

  3. Alysa

    I am actually glad to read this weblog posts which consists
    of lots of valuable information, thanks for providing such data.

  4. Chinna Botla

    Hello Alessandro,

    Its a amazing article with tremendous tips to raise up rankings. really like it. Alessandro could you please give us a complete article on site page speed optimization activities. any way here you given as a part of this article, but if it is possible please provide complete theory on “How to speed up the web page loading time”

    Thanks in advance,

    Best Regards,
    Chinna Botla

  5. Franchesca

    My family all the time say that I
    am killing my time here at web, but I know I am getting experience all the time by reading thes good content.

  6. Chetan

    Very good instructions. I guess after the recent update from google Anchor title tag need give one more thought. The link you provided from SEO Moz site is from 2007, please update if you have any latest link
