May 01

Pinterest Analytics: three Pinterest Analytics tools

Let’s talk about measuring your success and influence on Pinterest. If you are a web marketer, or simply the owner of a brand investing in Pinterest, you probably need to know which audience you are gaining, which images perform better than others and, finally, how popular you are on Pinterest.

three Pinterest Analytics Tools


You are in luck: there are at least three tools you can use to check your Pinterest presence and its performance. Let’s talk about Pinreach, Pinerly and Pinpuff, three tools developed for Pinterest analytics.

Before analyzing these tools, I suggest you first read my old post on Pinterest: http://afo.li/pi
I would also suggest reading carefully Colby Almond’s post on the SEOmoz blog: http://afo.li/pinseomoz

Colby Almond’s post shows you every secret of the best way to post your pins on Pinterest. The Bible says there is a time to pray and a time to love; there is also a time to pin.

 

 

 

Pinreach

 

Pinreach Popular Pins


Pinreach is your Pinterest influence and analytics tool. Once you log in, the right side of your dashboard shows a nice table with your avatar, your PinReach Score and a summary of all your data: pins, repins, likes, followers, following, comments and boards. As PinReach puts it, “the PinReach score is a combination of various social activities you complete while using Pinterest”; its value is calculated by PinReach. The best score I found is 82, belonging to Jane Wang (http://pinterest.com/janew), who has 2,880,841 followers and 15,823 pins at the time of writing. On the left side you have four interesting tabs: Analytics, Boards, Pins and Influential Followers.

 

 

 

The Analytics tab is quite graphical, showing your trend. You can look at your personal score, the popularity of your best 5 pins (the repins they received) and the popularity of your boards. Clicking on the Boards tab, you can inspect the data of each board you have (repins, followers, pins, likes and comments). The Pins tab, instead, displays the top 10 pins that reached the greatest number of repins. Finally, the Influential Followers tab shows the 6 most influential Pinterest users currently following you.

 

Above your dashboard you also have different links to browse: Highest Reach, Trending Pins, Trending Members, Recently Checked Scores and Score Details. It’s useful to know the most trending members and pins on Pinterest: you can evaluate which people to follow and which kinds of pins have a better chance of gaining popularity.

 

You should also know that PinReach is launching PinReach PRO, a suite of valuable tools for marketers. I hope to get the chance to try the beta version soon.

 

Pinerly

 

Pinerly analytics


Pinerly is the right tool for testing your Pinterest activity, a good Pinterest analytics tool. The right side of the dashboard presents a little table with the following data: Followers, Following, Likes, Pins and Boards. The tabs are: Campaign, Pinalytics, Suggested and Follow.
From the Campaign tab you can prepare your pin: upload the image or just take it from a website. Once you do this, you can set the landing URL and the description of your pin. The next step is Pinalytics: there you can look at the pins you created through Pinerly and their results. For each pin the analytics show: the clicks received, the likes received, the repins received and the audience you reached (the number of Pinterest users that may have seen your pin through all the repins).

 

One of my most repinned images


At the moment my Pinterest performance shows a click-through rate among my followers that varies from 0.8% to 4.0%, depending on how interesting and original the image of the pin is. The number of repins may reach 10% if the image is significant, and the audience can be multiplied by 12. So if I want 100 visits each time I post a pin, I have to count on a number of followers that varies from 2,500 to 9,000. I’ll give you a call when I have that many followers on Pinterest.
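The arithmetic above can be sketched in a few lines of Python: the followers you need are roughly the target number of clicks divided by your click-through rate (2,500 corresponds to a 4% CTR; 9,000 corresponds to a CTR of about 1.1%).

```python
# Rough estimate of the followers needed to reach a target number of
# clicks per pin, given an observed click-through rate (CTR).
def followers_needed(target_clicks, ctr):
    """Followers required so that target_clicks ~= followers * ctr."""
    return round(target_clicks / ctr)

print(followers_needed(100, 0.04))   # 4.0% CTR -> 2500 followers
print(followers_needed(100, 0.011))  # 1.1% CTR -> ~9091 followers
```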

 

 

The Suggested tab just displays a particular image that you could pin. Finally, the Follow tab is where you can see the most influential Pinterest users you can follow for each topic (Cuisine, Travel, Design, Photography, Fashion, Décor, Graphics, Kids, Entertainment, etc.).

 

 

Pinpuff

 

Pinpuff


Pinpuff is a particular Pinterest analytics tool. The right side of the dashboard displays your avatar and your Pinfluence Score (mine is 52). Below the score you can look at your three top boards on Pinterest, and below that are the “Quick Stats”: Followers, Following, Pins, Boards, Likes, Liked and Repins. On the left side of the dashboard you have the value of your pins and referral traffic: my pin is worth $0.27. Then you have other data. My Reach Score is 33.1, and Pinpuff says: “You have above average following on Pinterest and can easily be improved further.” My Activity Score is 73.6: “Great!! You are very active on Pinterest – perfect choice for Brands on Pinterest.” My Virality Score is 47.4: “Lovely! Your pins are doing great job and people are really liking them and repining.” Below, I can look at my first 20 boards with the number of Followers, Pins, Repins and Likes gained.

 

Now it’s time to make your choice. Try them and see whether you need them.


Apr 15

Complete SEO analysis checklist 2012

This document covers a necessary topic for every aspiring SEO specialist. The idea is to update it regularly and to integrate it with your contributions.

Automated SEO reports

This is the easiest solution you have at hand. Here is a list of free automated SEO tools that can produce interesting reports confirming the value of the SEO work on the site you are analyzing.

http://www.quarkbase.com/

http://marketing.grader.com/

http://www.seoworkers.com/tools/analyzer.html

http://www.cubestat.com/

http://www.statbrain.com/

http://www.jonasjohn.de/test-tool/

http://spydermate.com/ (the apps for Android and iPhone are freeware)


The purpose of this post is to help you make your own SEO analysis.

The first step is to understand what kind of website we have to analyze. If you have access to its statistics, perfect. Otherwise, never mind: you can still determine the site’s structure, its likely ranking and its traffic.

Website Crawling Verification

 

Website Crawling Verification


You can start the job by crawling the website to be analyzed. There are many tools available. Some of the more interesting ones are Screaming Frog SEO Spider Tool (http://www.screamingfrog.co.uk/seo-spider/) and IIS SEO Toolkit (http://www.iis.net/download/seotoolkit). One of the ones I appreciate most is Xenu Link Sleuth (http://home.snafu.de/tilman/xenulink.html). Finally, I would also suggest Crawl Test from SEOmoz (http://pro.seomoz.org/tools/crawl-test). SEOmoz offers different SEO tools, even if you have to pay to use them regularly.

You have to check the following ON-Page SEO factors:

  • Website architecture. A flat site architecture is preferable to a deeply nested one. You can read the following article to better understand the question: http://www.seomoz.org/blog/site-architecture-for-seo.
  • Link equity. Verify that each page of the website has a good number of incoming internal links. Consider that Google has no problem reaching the 4th level, which is to say that each page of your website should be reachable from any other page of the site in no more than 4 clicks. If this is not the case, consider rebuilding the structure of the website.
  • Link errors. Verify whether you have any broken links, i.e. whether a linked page answers with an error (500, 400, 404 …).
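As a minimal sketch of the link-error check (the crawlers above do this at scale), the snippet below asks for each URL’s HTTP status with a HEAD request and flags 4xx/5xx answers; the URL list is assumed to come from your crawl export.

```python
import urllib.request
import urllib.error

def link_status(url, timeout=10):
    """Return the HTTP status code answered for url (200, 404, 500, ...)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def is_broken(status):
    """Treat any 4xx/5xx answer as a broken link."""
    return status >= 400
```

Run `link_status` over every URL found by the spider and report those for which `is_broken` is true.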

Site Indexation Verification

Verify the status of Indexation on Google Search Engine.

You have to check the following SEO factors:

  • Indexed URLs. Simply run the following query on Google: site:namedomain.tld
    Google will return the list of URLs it has indexed from your website. If the number of indexed URLs is quite different from the URLs you registered using your crawling tools, there might be a problem preventing the Google spider from crawling the entire website, and you should try to understand the reasons.
  • Home Page verification. Search in Google for the exact words you have in the “title” tag of the home page, or the first part of it. If the website doesn’t rank in the first position of the SERP (Search Engine Results Pages), you can bet there’s an SEO problem on the website.
  • Obsolete Indexed Pages. If old pages are indexed in Google, you have to redirect them permanently to the new pages they should represent. You could also remove them from Google Webmaster Tools, but only if the resource returns a 404 error.

Content Verification

You have to check the following ON-Page SEO factors:

  • Duplicated Content. If you have access to the web statistics of the website, it will be easier. Otherwise you could use Google Webmaster Tools or SEOmoz to verify whether the website has duplicated content. If you don’t have access to these tools (which may be the case if you are analyzing a competitor of your website), you’ll have to verify it by performing different Google queries on the services and products offered by the website. There are a lot of tools you can use to make the job easier; just check this post: http://slodive.com/web-development/10-top-tools-checking-duplicate-content/

  • Canonicalization Problems. The job is to verify whether a rel="canonical" is needed or not. The question usually concerns the home page, which might respond as domain.tld/, domain.tld/index.php, domain.tld/index.html or domain.tld/index.asp. You should have a single home page. If you can’t redirect index.asp to /, that’s the case where you need to canonicalize the home page, placing this code inside the head tags, as follows:
    <link rel="canonical" href="http://www.domain.tld/"/>
  • Pagination Problems. If your website has many products or services, it’s probable that it will have some sort of pagination for them. In those cases it’s also probable that the website will have duplicated content or duplicated page titles. You have to play with rel="prev" and rel="next". You can have a look at the following post: http://googlewebmastercentral.blogspot.it/2011/09/pagination-with-relnext-and-relprev.html
  • Duplicated Page Titles. Duplicated page titles are deprecated by Google and the other search engines. To check whether the website has duplicated page titles you can use Google Webmaster Tools, SEOmoz or just Xenu Link Sleuth. Even though Xenu was designed for Windows, you can also run it on Linux using Wine. Once you have spidered the website, just sort the pages by “title” and it will be easy to see which pages share the same title.
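The duplicated-titles check is easy to script as well: group the crawled pages by title, exactly as you would by sorting the Xenu export. A sketch, assuming you have (url, title) pairs from your crawl:

```python
from collections import defaultdict

def duplicated_titles(pages):
    """pages: iterable of (url, title) pairs.
    Returns {normalized_title: [urls]} for titles used by more than one page."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [
    ("/", "Flags Online"),
    ("/flags", "Flags Online"),   # duplicated title
    ("/contact", "Contact us"),
]
print(duplicated_titles(pages))  # {'flags online': ['/', '/flags']}
```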

 

Extra SEO Verification

An important factor that is not strictly SEO, but related to it, is understanding the audience of the website. You can use different tools:


Google Adwords (free)
https://adwords.google.com
Google Trends (free)
http://www.google.com/trends/
Alexa (free) http://www.alexa.com
Quantcast (free) http://www.quantcast.com
Compete (from 199$ a month) http://www.compete.com

SEO analysis checklist

SEO analysis checklist

These tools give you important details on the demographics of the website’s visitors, the most popular keywords used to reach the website, and much other data you can use to choose, update or change the keywords you want the website to rank first for.

Once we have determined the structure of the website, its indexation in the search engines (basically Google), the possible presence of navigation problems and the audience of the website, we are ready for the next step of the SEO analysis.

 

 

 

 

Domain On-Page SEO factors

 

  • Age. An older domain means better search engine ranking.
  • Keyword representation. The domain itself should contain the most significant keywords of the website. The domain name should be short so it is easily remembered, and it shouldn’t contain underscores or special characters.

 

Domain Off-Page SEO factors

 

  • Page Rank. It’s the value Google determines for each page of the website. The root domain’s Page Rank can be considered the value of the domain itself. Page Rank samples: Flagsonline.it PR4, Bizonweb.it PR5, Seomoz.org PR6, ShinyStat.com PR7.

  • Alexa Global Rank. The Alexa rank is calculated using a combination of average daily visitors to the website and pageviews on the same website over the past 3 months. The site with the highest combination of visitors and pageviews is ranked 1st. It’s useful to evaluate this rank in combination with some competitors of the website to better understand its value. (http://www.alexa.com)
  • Compete Rank. Compete offers an estimation of the traffic of the website and it represents this value with the Compete Rank (http://siteanalytics.compete.com/ )
  • MozRank, Page Authority and Domain Authority. The domain MozRank is a global link popularity metric, calculated similarly to Google’s PageRank on a log scale from 1 to 10. Page Authority and Domain Authority are two metrics calculated through Open Site Explorer by SEOmoz; they are represented on a log scale from 1 to 100. You can get these values for free by checking your website here: http://www.check-domains.com/website-analysis/website-analyzer.php
  • External backlinks. You can use http://www.majesticseo.com/ to calculate the incoming links to your website. You should report these data:
    – External Backlinks
    – Referring Domains
    – Referring IP addresses
    – Class C subnets

    Consider that if you have thousands of incoming links, but all the links come from a couple of class C subnets, it’s probable that all the domains linking your website belong to you … so the value of the incoming links is insignificant. You could also check the Followed Linking Root Domains, that is to say the number of domains with at least one followed link to any page on the root domain (ref. seomoz.org).
    You can also add Alexa Sites Linking In as an important ranking factor to analyze. The Google query “link:domain.tld” is no longer a good SEO reference.

  • Dmoz presence. One of the first steps we took 10 years ago when building a website was to submit it to the DMOZ directory. The presence of the website in DMOZ is still considered an SEO factor.
  • Social network references. There’s a checklist you could create for the social network environment. You should check whether the website is present and mentioned on the following social networks:
    – Facebook
    – Twitter
    – Google Plus
    – Linkedin

On-Page SEO factors

 

  • URL. The URL of each page should be search engine friendly, and it must contain the most significant keywords of that page.
  • Title tag. The title should contain the keywords we promote, in a discursive manner; it can’t just be a list of the keywords of the page. As Google indicates in the “Google SEO Report Card”: “A descriptive title and description meta tag can help a result pop out better to search users.” The reference is here: http://www.google.com/webmasters/docs/google-seo-report-card.pdf
  • Description meta tag. The description should be descriptive and contain the keywords we promote in the page. The current standard says the description should be no longer than 156 characters, including spaces and commas.
  • H1, H2 and H3 tags. Each page of the website should present these tags, they should be ordered (h1, h2, h3), and they should contain the keywords for the page they represent.
  • Keyword density. It’s the density of the keyword inside the body of the page. A good ranking can be helped by a good density (5 – 20% of the total words inside the body).
  • Bold and strong keywords. A keyword inside a <strong> or <b> tag carries more weight and is more visible to search engines.
  • Keyword prominence. A keyword in the upper left area of a page carries more weight than the same keyword placed in the lower right area. Prominence is how early the keyword appears in the page.
  • Anchor text. The anchor text should be keyword representative. If you have a link to the SEO section, the anchor text should be “SEO section”.
  • Link title. The link title should be keyword representative. You can read this post to understand its importance:
    http://www.seomoz.org/ugc/link-tilte-attribute-and-its-seo-benefit
  • Image alt and image title. Alt text is an alternative information source for people who have disabled images in their browsers and for those who simply cannot see the images. It should describe what the image is about and get visitors interested in seeing it. The alt text contributes to ranking in Google Images search, so it’s a relevant ranking factor. The image title should provide additional information. It should be relevant, short, catchy and concise, and it helps to promote the keywords of the page (be careful, as it might be flagged as spam if the description and the content of the image have nothing to do with each other, i.e. keyword stuffing).
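Several of the on-page rules above (a non-empty title, a description under 156 characters, a single H1) can be checked with the standard library alone. A minimal sketch; the checks and thresholds mirror the list above, nothing more:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collect the <title>, the description meta tag and the <h1> count."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def report(html):
    c = OnPageChecker()
    c.feed(html)
    return {
        "has_title": bool(c.title.strip()),
        "description_ok": 0 < len(c.description) <= 156,
        "single_h1": c.h1_count == 1,
    }
```

Feed `report` the HTML of each crawled page and flag any page where one of the three checks is False.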

You can check many of the SEO factors described above using this simple page:

http://www.seocentro.com/tools/search-engines/metatag-analyzer.html

 

The last step of the SEO analysis is an advanced analysis, referring to other aspects you can improve to rank better.

 

Usability

 

  • Website usability review. You should check whether the website is usable and whether you can easily navigate it even with JavaScript disabled.
  • Form usability. Each form of the website should be usable even with JavaScript disabled.
  • Navigation analysis. Each section of the website should be reachable in no more than 4 clicks; otherwise we have a navigation problem to face.

 

Crawl Ability

 

  • Robots.txt. In the root of the domain you should have a robots.txt file. This file tells search engines which pages they can index and which ones they cannot. It might look like this:

    User-Agent: *
    Disallow:
    Sitemap: http://www.domain.tld/sitemap.xml

  • Sitemap. The sitemap lets the search engine know all the URLs we want indexed at once. Each search engine has its own page for submitting the sitemap:
    Google – https://www.google.com/webmasters/tools/
    Bing – http://www.bing.com/toolbox/webmaster/
  • Crawl verification. You can check how crawlable the website is using a text browser (Lynx) or this JavaScript browser:
    http://www.domaintools.com/seo-browser/
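Python’s standard library can also verify a robots.txt against given URLs. In this sketch the file content is inlined for clarity; against a live site you would call `rp.set_url(...)` and `rp.read()` instead (domain.tld is the same placeholder used above):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /admin/
Sitemap: http://www.domain.tld/sitemap.xml
""".splitlines())

print(rp.can_fetch("*", "http://www.domain.tld/products/"))    # True
print(rp.can_fetch("*", "http://www.domain.tld/admin/login"))  # False
```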

 

 

Text Quality

 

  • You can use automated tools to verify the quality of your texts. Specifically, you can check two metrics: Lexical Density and Gunning Fog. The Gunning Fog index may vary from 4 (easy) to 20 (hard to read), while the Lexical Density index represents the complexity factor and is a percentage (normally from 20 to 60%).
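For reference, a small Gunning Fog estimator. The classic formula is 0.4 × (words per sentence + 100 × complex words per word), where “complex” means three or more syllables, here roughly approximated by counting vowel groups:

```python
import re

def syllables(word):
    """Crude syllable count: number of vowel groups, at least 1."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text):
    """Gunning Fog estimate: 0.4 * (words/sentence + 100 * complex/words)."""
    words = re.findall(r"[A-Za-z]+", text)
    if not words:
        return 0.0
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    complex_words = [w for w in words if syllables(w) >= 3]
    return 0.4 * (len(words) / sentences + 100 * len(complex_words) / len(words))
```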

 

Page Speed

 

 

Page speed by Google


All you need is to use Page Speed by Google (make your website faster).

https://developers.google.com/pagespeed/

 

The most significant points are:

 

  • Browser caching. Setting an expiry date in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk.
  • Compress resources. Compressing resources with gzip or deflate can reduce the number of bytes sent over the network.
  • Compressed images. Properly formatting and compressing images can save many bytes of data. Each image should be compressed and should specify height and width values; properly sizing images can save many bytes of data.
  • Sprite CSS. Combining images into as few files as possible using CSS sprites reduces the number of round-trips and delays in downloading other resources.
  • Minify and defer JavaScript. Compacting JavaScript code can save many bytes of data and speed up downloading, parsing, and execution time. By deferring the parsing of unneeded JavaScript until it needs to be executed, you can reduce the initial load time of your page (basically, you could place most of your JavaScript code right before the body tag is closed).
  • Asynchronous Resources. Fetching resources asynchronously prevents those resources from blocking the page load (javascript static code, facebook and twitter widgets etc..).
  • Consistent URLs. It’s important to serve a resource from a unique URL, to eliminate duplicate download bytes and additional RTTs.
  • Minify HTML. Compacting HTML code, including any inline JavaScript and CSS contained in it, can save many bytes of data and speed up downloading, parsing, and execution time.
  • Optimize the order of styles and scripts. Correctly ordering external stylesheets and external and inline scripts enables better parallelization of downloads and speeds up browser rendering time.
  • Specify a cache validator. By specifying a cache validator – a Last-Modified or ETag header – you ensure that the validity of cached resources can efficiently be determined.
  • Minify CSS. Compacting CSS code can save many bytes of data and speed up downloading, parsing, and execution time.
  • Specify charset early. Specifying a character set early for your HTML documents allows the browser to begin executing scripts immediately.
  • Specify a Vary: Accept-Encoding header. This instructs proxy servers to cache two versions of the resource: one compressed, and one uncompressed. It helps avoid issues with public proxies that do not properly detect the presence of a Content-Encoding header.
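To get a feel for the compression point, the snippet below gzips a repetitive HTML payload, the same transformation mod_deflate or a gzip filter applies on the server (the sample markup is made up):

```python
import gzip

html = (b"<html><head><title>Flags</title></head><body>"
        + b"<p>flag</p>" * 500
        + b"</body></html>")
compressed = gzip.compress(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes "
      f"({len(compressed) / len(html):.0%} of the original)")
```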

 

 

Complete SEO analysis checklist 2012


Once you have worked through all these steps, you should know the strengths and weaknesses of your website, or of your competitor’s, and you should be ready to fight to rank first. If you are still having trouble ranking, just comment on this post.

 

If you don’t agree with what I wrote, I’m here to listen to your comments.

 

 


Mar 30

Twitter tips: increase reputation, trustworthiness and credibility

Recently a study was carried out by Microsoft and Carnegie Mellon University on users’ perceptions of tweet credibility (microblogging and Twitter)*. The report indicates important steps to increase reputation, trustworthiness and credibility on Twitter: strategies that tweet authors can use to enhance their credibility with their current and future readers.


Increase reputation, trustworthiness and credibility on Twitter


Twitter users have two different approaches to getting a piece of information: follow another user they consider interesting for the content he/she produces, or search the Twitter search engine for the info they are looking for. The fact that 1.6 billion queries are generated on the Twitter search engine (ref. Siegler, M.G. “At 1.6 Billion Queries Per Day, Twitter Finally Aims To Make Search Personally Relevant.” TechCrunch, June 1, 2011) is clear evidence that both approaches are significant.

The survey reveals the sources Twitter users adopt to get information:

  • Reading tweets from users they followed
  • Conducting searches on search.twitter.com (84%)
  • Clicking trending topics on the Twitter homepage (84%)
  • Searching for tweets using Bing’s and Google’s social search functionality (72%)
  • Encountering tweets mixed into the results of general Web searches (81%)

The survey shows that the ability of Twitter users to determine reputation, trustworthiness and credibility is largely limited to features visible at a glance:

  • user picture
  • user name
  • tweet content

The credibility factors are reported here (maximum impact: 5.00):

Feature Credibility Impact
author is someone you’ve heard of 4.33
contains URL you clicked thru to 4.33
account has verification seal 4.32
author often tweets on topic 4.14
many tweets w/ similar content 4.11
personal photo as user image 4.10
author often mentioned/retweeted 4.09
is a RT from someone you trust 4.08
username is related to topic 4.07
author location near topic 4.07
author bio suggests topic expertise 4.06
is a retweet 4.06
author has many followers 4.05
verified author topic expertise 4.04
is a reply 4.01
author is someone you follow 4.00
posted recently 3.59
near top of search result list 3.58
contains complete URL 3.57
author tweets frequently 3.52
contains a URL 3.50
contains hashtags 3.48
author location near you 3.43
customized Twitter homepage 3.41
contains shortened URL 3.39
logo as user image 3.37
author is following many users 3.30
default user image 3.27
cartoon/avatar as user image 3.22
non-standard grammar/punctuation 3.11

 

The survey shows that the credibility of an author on a given topic is determined by such factors as the Twitter homepage bio, history of on-topic tweeting, pages outside of Twitter, or the location relevant to the topic of the tweet.

 

The author’s reputation, on the other hand, seems to be determined by the number of followers the user has, the number of retweets he/she receives, the mentions he/she gets, and whether the author has an official Twitter account verification seal.

 

The author’s credibility is also greatly influenced by his/her picture. Pictures of real males and females are preferred to the default avatar (the Twitter egg). The reason is simple: default avatars may easily represent fake users.

 

The credibility of the content seems to be granted by the following factors:

 

  • URL leading to a high-quality site
  • Presence of other tweets conveying similar information
  • The tweet is a retweet

 

Considering the Microsoft survey, the steps to achieve good reputation, trustworthiness and credibility can be determined.

 

  • Update your profile with a complete biography on the topics you are going to tweet about.
  • Upload your personal image (if you are not a company).
  • Don’t follow the whole planet; it is not credible that you can really follow 20,000 users.
  • Tweet regularly.
  • Try to engage the community: more followers = more trustworthiness.
  • Tweet URLs if possible.
  • Tweet using hashtags.

 

Simple or not? Anyway, this is the way to increase reputation, trustworthiness and credibility on Twitter: the path to success!

Happy tweeting, guys and girls!

The original paper is available here:

http://research.microsoft.com/pubs/155374/tweet_credibility_cscw2012.pdf

 


Mar 11

Keyword effectiveness index (KEI): take it with caution

All success in Search Engine Marketing begins with keywords, says Robin Nobles, Co-Director of Training at Search Engine Workshops. The analysis of the keyword effectiveness index is the reason for this post.

keyword effectiveness index


Sumantra Roy (1st Search Ranking) invented the equation used to calculate KEI. The keyword effectiveness index (KEI) compares the number of searches for a keyword with the number of search results that include that particular keyword, in order to determine which keywords are most “effective” for your SEO activity. In other words, the KEI tells us which keywords have less competition but, at the same time, good traffic (visits).

You can easily discover the keyword effectiveness index for your keywords just by using the Google Keyword Tool (https://adwords.google.com/select/KeywordToolExternal) to get the Monthly Global Searches (Searches), and Google’s search engine to determine the number of competing pages (CompetitionResults).

The most common formula may be written as follows:

KEI=Searches²/CompetitionResults

or if you prefer:

KEI=(Monthly Searches/30)²/CompetitionResults

< 0.001 = Poor keyword

0.001-0.010 = Good Keyword

0.010-0.100+ = Excellent Keyword
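The formula and the thresholds above translate directly into code; a small helper, using the monthly-searches variant of the formula:

```python
def kei(monthly_searches, competition_results):
    """KEI = (monthly searches / 30)^2 / competing results."""
    return (monthly_searches / 30) ** 2 / competition_results

def rate(k):
    """Classify a KEI value using the thresholds above."""
    if k < 0.001:
        return "Poor"
    if k < 0.010:
        return "Good"
    return "Excellent"

print(rate(kei(9900, 1_000_000)))  # (330^2) / 1e6 = 0.1089 -> Excellent
```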

This index comes with several weak points for web marketing practitioners.

Here they are.

Google Keyword Tool reliability

As Rand Fishkin recently pointed out (http://www.seomoz.org/blog/be-careful-using-adwords-for-keyword-research), GKT is not reliable. I invite you to read his post carefully but, summarizing it, Fishkin shows that if you don’t enter the most significant keywords in GKT, Google will never suggest all the most significant keywords. That is to say: if the list of keywords you enter does not include significant keywords, Google will never suggest the most significant ones.

You can partially correct this problem by combining different tools to obtain a significant and complete list of keywords to analyze. I would also suggest using Google Trends:

http://www.google.com/insights/search/

Google Trends can help you determine which of your keywords are the most trendy.

Localization

Even if you use different tools, they will never suggest “localized keywords”. If your business is in Verona (the city of Romeo and Juliet), you should always consider that the chance of converting on a localized keyword is greater than on any non-localized keyword. If I need a flag, it’s more probable that I’ll buy it where I live, if I find a good producer in my city, instead of searching far away.

So when it’s time to determine which keywords you want to promote for your website, always think of the local market; it’s as important as the global one. The Internet is a global market, but people still prefer to buy at home when they have the chance.

Websites are different

seo keywords


Ranking first for “flags” with a new website is different from ranking first with a 10-year-old website. Websites have different values, so it is important to consider the value of the website before determining the keywords of your campaign. Nowadays a common proxy for the value of a website is Google’s index, the Page Rank.

The competition results are a neutral index, but competing with thousands of ordinary websites is quite different from competing with ten websites with PR 8. We should try to integrate this value into the common formula, both for the website we promote and for the competitors we have for each keyword.

Improve your SEO approach on the keyword effectiveness index

Start listening

Try to speak with the customer care team, if the website you are going to promote and rank has one. Nobody knows the company better than the customer care operators. They know the problems, they know what the customers want, and this is the key to determining a valuable list of keywords to use for your analysis.

Be creative

There are many different approaches to an improved version of the Keyword Effectiveness Index. I would suggest some:

  • The revisited KEI version of Web Site Advantage
    It considers the Page Rank of the website in the formula.
    http://seo-website-designer.com/KeyWord-Analysis-KEI
  • The revisited KEI version of Ivano De Biasi
    It’s in Italian, but you can still understand it. The formula includes a “relevance” index: how important the keyword is for the website.
    http://www.ivanodibiasi.com/come-scegliere-le-keyword-da-posizionare.htm
  • Your own KEI
    You can build your own index. Here is a suggestion you can work on. “PRComptrs” might be the average Page Rank of the 10 websites in the first 10 positions for the analyzed keyword.

KEI=YourPR*(Monthly Searches/30)²/(CompetitionResults*PRComptrs)
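As a sketch, the custom index above in code, reading the formula as YourPR × (Monthly Searches/30)² divided by (CompetitionResults × PRComptrs), so that a stronger competitor average lowers the score:

```python
def custom_kei(your_pr, monthly_searches, competition_results, avg_competitor_pr):
    """Custom KEI weighted by your PageRank and the average PageRank
    of the top-10 competitors for the keyword (PRComptrs)."""
    return your_pr * (monthly_searches / 30) ** 2 / (competition_results * avg_competitor_pr)
```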

 

You are the only limit to your keywords

There are no difficult keywords and easy keywords; there are prepared web marketers and unprepared ones. If you are prepared, you can rank well for any keyword you wish. Obviously, the keyword effectiveness index can help you improve your SEO techniques.

