Saturday, January 8, 2011

Google Adwords Tips That Enable You to Profit From Google More Effectively

Anyone can profit from Google - it’s just that not everyone knows how to. Yes, you can make money online when you utilize Google Adwords for your business or for another person’s business, but to earn from Adwords, you should be aware of several Google Adwords tips first. You should have a clear idea of Adwords’ key principles, because if you just take everything in a happy-go-lucky manner, you’ll only end up with low conversions or, worse, no conversions at all.

If you wish to make Google’s Adwords program work for you, bear these tips in mind and apply them:

Choose The Right Key Phrases/Keywords

One tip that will help you profit from Google is to ensure that you choose the appropriate key phrases or keywords for your campaign. You can use Google’s keyword research tool to find out which words relate to your business or your niche, and which of those relevant keywords have relatively little competition. Selecting the right words enables you to make money online because the appropriate phrases/words work to attract targeted customers.

For example, if your product is an anti-dandruff shampoo, you can utilize keywords/phrases such as get rid of dandruff, anti dandruff treatment, and the like, so that people looking for an effective shampoo for removing dandruff will be able to find your web pages. This is one of the must-do Google Adwords tips that can make you earn from Adwords. It is also best to place yourself in the shoes of your potential customers and ask yourself which keywords you would be likely to type into the search bar if you were searching for a specific product or service.

Come Up With An Appropriate And Creative Title

Another strategy that can up your chances to profit from Google is to create a title that’s catchy or interesting and is related to your products/services. Remember that the title is one of the first things people see when searching; thus, if you aim to make money online more easily, do not forget to choose the right title. Another of these valuable Google Adwords tips is to use bigger font sizes for your titles, and to use font colors that make titles more visible in the eyes of searchers.

For example, if you’re selling organic dog food, you can earn from Adwords more easily by including the phrase ‘organic dog food’ in the title line, instead of simply using the phrase ‘dog food’. Phrases such as ‘all natural dog food’, ‘healthy food for dogs’ and so on are also good suggestions for this example, and definitely better than just the plain phrase ‘dog food’.

Proper Keyword Placement Is Also Essential

The placement of your keywords in your Adwords campaign also affects how easily you’ll profit from Google. Aside from placing the key phrase or keyword in the title, you should also place keywords/key phrases within the introductory text, the body of the text, and the conclusion. Using a 1-3% keyword density is recommended; using more than that could make your ads look like spam. Another of these Google Adwords tips is to highlight your main keyword/phrase to make the campaign appear ‘stronger’ in the eyes of targeted people. Following this tip will truly aid you in your goal to earn from Adwords.

The tips mentioned above can be your stepping stones to making money online with the use of Google’s Adwords program. Trying out all of these strategies is highly recommended if you really yearn to profit from Google. Not only will these recommendations help you increase targeted web site traffic - they will also help you improve your sales conversion rate. To learn more about how you can earn money from Google, check out this e-book by Chris Carpenter: Google Cash. It not only contains Google Adwords tips, but also discusses where you can find products to sell online, the best ways to advertise products over the Web, and, of course, how you can get more cash from Google.
By Annita Brixen

AdWords Success Secrets

With AdWords, Google has created one of the most effective, easily measurable, and targeted forms of advertising and marketing in history. By using Google AdWords, you can access millions of people in seconds, getting the attention of an appropriate audience, in order to sell your products or services to visitors from all across the world.

However, in order to be successful with Google AdWords – maximizing your results along with minimizing your spend – here are a number of simple steps that you can take.

Identify a List of Effective Keywords

One of the basic steps is to identify a list of the most effective keywords to be used in your AdWords ad. This is particularly important, since bidding on the keywords that everybody else is bidding on will leave you with very costly clicks.

Use Keyword Search Tools

You can use any keyword search tools that are available online in order to determine the keywords that do not have too much competition and yet have lots of searches. AdWords Analyzer software can also be used for this purpose.

In fact, by bidding on enough low-cost keywords, you can do as well as, or even better than, if you had bid on an expensive and popular keyword. Plus, you can set a budget limit in order to keep control of the total monthly expenditure. In addition, you can specify certain words to exclude, in order to limit the number of clicks from people hunting for some related product or service that you do not offer.

Refine the Keywords

If you have budget constraints you could cut costs by refining the keyword, aiming at sub-topics of the main topic. For instance, if your topic is diamonds, you could target the sub-topic "Diamond Jewelry". If you include the keyword phrase within quotes, your ad will only be displayed when that exact phrase is searched for.

Target the Correct Audience

Targeting the right audience is an obvious tactic. You can do this quite simply by choosing the language and countries that you are aiming for. For instance, you could leave out all countries where the major percentage of the population cannot understand English.

Perform a Split Test

One of the things you need to do is simultaneous testing of two or more ads, which is known as a split test. This will help you to find out which ad produces a higher click through ratio. Once this is determined, you could go about tweaking the weaker ad, turning it into a higher performing ad. This process should be done continuously in order to keep your ads at the highest click through ratio possible.

Track the Return-On-Investment

Although Google tracks the click-through ratio of every ad, it does not actually track the conversion ratio. To track the conversion ratio you will need to use a special tracking link. Attaching an affiliate tracking system is one way of doing this. This will help you monitor more reliably whether each of your ads is producing a return on investment.
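
As a rough sketch of that idea (the page and parameter names here are purely illustrative, not part of any particular tracking system), you could give each ad its own tagged destination URL and then log the tag on your order or sign-up confirmation page:

http://www.example.com/landing.htm?source=adwords&ad=headline-test-a
http://www.example.com/landing.htm?source=adwords&ad=headline-test-b

Any sale recorded with ad=headline-test-b can then be credited to that specific ad, which is what lets you compare return-on-investment per ad rather than just clicks.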

Include Targeted Keywords

Targeted keywords should always be included in the headline as well as the description of the ad. Google highlights keywords that are searched for by bolding them in the ad, which helps to attract the attention of the viewer. This is the reason that ads with keywords generally perform better than the ones that are without.

Spell out the Benefits

Since ad space is limited, go straight to the point by spelling out a couple of major benefits in your ad, such as: lose weight; make more money; get healthier.

Use Attention Grabbers

The headline of your ad should begin with a word that is attention grabbing such as: Sale; New; Free; and so on. However, see that you heed the editorial guidelines of Google.

Use Power Words

Don’t forget to use power words like: free shipping; free; limited offer; sale; cheap; at last; learn; fact; discover; enhance; tips; limited time; special. Plus, it is always a good idea to include call-to-action phrases, such as: Download free trial now; Buy Today = Save 70%, etc. Make sure that the phrases you use are specific to your business or the phrase may be rejected by Google.

Sell Your USP

Your ad needs to spell out why your product or service is different from, or better than, your competitors’.

Link to the Relevant Landing Page

If any of your ads is for a specific service or product, you need to have a landing page for it. The landing page should have all the useful, appropriate and relevant information so that there are higher chances of converting the potential customer. Well designed landing pages usually convert a higher percentage of visitors than if they are simply sent to the home page.

Remove Common Words

As there is a limitation of space, common words should be removed as much as possible. Examples of these words are: of; it; on; in; an; a. In fact, get rid of any word that is not absolutely necessary in the ad. Ensure making every word count.

Deter Hunters of Freebies

Include the price of the service or product at the end of the ad. This will help you reduce your average cost of customer acquisition and improve your overall conversion ratio. Plus, it will also let your potential customer know what to expect. While doing this may lower your click-through ratio, it increases the likelihood of capturing potential customers. In any case, freebie hunters rarely, if ever, become paying customers.
By Rita Putatunda

Adsense Follies: 6 Ways To Get Banned From Google Adsense And Lose Tons of Revenue

Google’s adsense program is hands-down one of the best ways to monetize your website. Adsense is good for advertisers, good for consumers and good for webmasters. But just as there are many honest webmasters producing quality content and playing straight with the Adsense program, there are also unscrupulous individuals that attempt to deceive advertisers, Google and the public.

Sometimes the intent is not pure fraud (although it often is) but rather a desire to make money quickly. These people are not thinking about the ethical or social harm that they are causing or the consequences of getting banned. Do not engage in the following tactics unless you want to get banned from Google Adsense and destroy the best potential source of revenue your website could have.

1. Play with color and font to hide text. The purpose of this is to engage in spammy keyword stuffing that Google search bots will see but humans will not. Trying to game the system this way is an excellent way to get yourself banned.

2. Engage in "page cloaking," i.e., set up scripts so that your website shows one set of content to visitors and an entirely different set of content to Google search bots.

3. Try to make adsense revenue with doorway pages. A doorway page is a page of hastily assembled, low-grade content focused on a single keyword. Sometimes groups of doorway pages are linked together to mimic an authority site. But the content is of no value to consumers, and the entire purpose of the enterprise is to rank high in the search engines and generate adsense revenue from consumers who are unfortunate enough to find the sites.

4. Create multiple domains having the exact same content. Or make multiple pages on one domain having duplicate content. I do not know how to break this to you, but Google knows when you are doing this. It does not believe this duplication serves any benefit to consumers and where it is done to increase streams of revenue from the adsense program it considers it a form of abuse.

5. Format and manipulate Adsense ads in an attempt to deceive visitors. For instance, change the font and formatting of ads to make it seem as though they are internal links to your website. Visitors will click on the ads thinking they are going to pages of additional content that you have provided for them. Place your ads at the very top of the page prior to any of your content, or in such a way that it seems that these ads are part of the content of your web page.

6. Try to deceive Google with click fraud. Sit around at home in your pajamas and click on your own adsense ads to generate commissions for yourself. Ask all of your friends to click on your adsense ads. Post on your blog encouraging people to visit your product site and click on your adsense ads. Outsource to a team of people that are devoted to clicking on your adsense ads. You get the picture. You are going to get banned in a hot minute.

Google Adsense is an extraordinary program for bringing webmasters, advertisers and consumers together. It is an invaluable source of revenue for your sites. But this system cannot work without integrity. If you engage in any of the tactics identified above, you are attempting to undermine the integrity of the Adsense program and trying to deceive consumers. You will get yourself banned from Adsense and lose a tremendous source of revenue for your website.

By Sylvie Strong

Make Money Online Through Google Adsense

If you are looking for ways to make money online through your website, you should look into opportunities through Google’s Adsense program. Such an opportunity is definitely a good option for generating an income online.

With Google, the biggest search engine on the internet right now, you can join their program known as Google Adsense and make use of this opportunity to make money online. This program is basically meant for website owners who are willing to display text links or image banners on their website and earn some money when someone clicks on those ads.

If you do have a website, then, you should not miss this opportunity to earn an extra income by signing up with the Google's Adsense program. To know more about how you could earn an income online through this program, below are some effective ways you can do so.

Making An Interesting Website To Attract Visitors

When you have a website, you should start with some interesting topics that you yourself are interested in, so that visitors will want to return to your site. With more people going to your site, the chances of clicks on the ads will be higher, enabling you to earn more income this way. On top of providing a good and interesting website to your visitors, there are a few points you need to take note of with the Google Adsense Program.

Be Aware Of The TOS (Terms Of Service) Of The Program

First, you need to be aware of the TOS of the program in order to safeguard your Adsense account and prevent it from being banned for any violation of the program. This is especially important if you want to make this a long-term income, as you do not want your account banned after so much effort has been put into attracting visitors just as you start to earn money. So, read the TOS carefully after you sign up, follow their rules closely and do not break them.

Play Around With The Ads Placement

After you have signed up with the program, you need to play around with the ad placement and display colors. Normally, ads placed in the top part of your website tend to get more clicks, and hence you can make more money online this way. However, every website design needs to be tested by yourself; stick with the layout that gets the most clicks.

Similarly, use Adsense units in strategic spots on your site or blog. Usually, the topmost portion of the web page is the ideal area for posting ads. However, other locations and corners can also be turned into good areas depending on the layout of the web page.

Using The Tools Available

There are many tools available in the Google Adsense Program. A good one is using channels to track the performance of your ads. By using channels, you can track which web page or ad placement earns the most clicks, giving you a way to track performance and make any necessary tweaks to optimize your Adsense earnings.

Keep Updating Your Content

Updating your site’s content is very important so that your visitors want to return to your site again for more. If your site is stagnant and not updated much, people will not want to return, and hence the chances of more clicks on the ads will be lower, reducing your income.

Using the Google Adsense program to make money online is definitely a good start if you have a website. Join today and start making money!

If you want to know more of how you can Make Money Online with Google Adsense Program, do not hesitate to visit this site by clicking on the link Make Money right away.
By Alan Lim

Increasing Google Pagerank

So you've heard about Google Pagerank and you know what it is. Now that you know exactly what it is, I am sure you want to increase your pagerank so you can get better rankings in the SERPs and get more website traffic. Search Engine Optimisation is the art of gaining a better pagerank through the knowledge and experience of an SEO expert. It covers all facets of what it takes to get a website ranking well in the SERPs, from onsite changes like titles and meta tags to offsite factors like linking. Since your website probably has a low pagerank and you would like to do something to increase it, this article covers some hints and tips that will aid you in your efforts.

Many so-called SEO experts have brought to the SEO marketplace tools that claim to assist you in increasing your website's pagerank. However, it should be noted that the majority of these tools are useless and really do not assist you at all. What will increase Google pagerank is applying some of the techniques listed below to your site. Using these proven methods is a guaranteed way of boosting your website's pagerank, and they are all completely free SEO techniques. So you should ask yourself: why spend money to increase pagerank when we’ll tell you how to do it for free?

Increasing pagerank

As we said, there are several SEO techniques to increase a web site’s Google pagerank. Even if you only follow one of the techniques below, you can expect to see a positive difference in your website’s pagerank.

Technique #1: Content

Despite the fact that SEO experts constantly talk about the importance of content, most webmasters ignore their advice to make sure their sites have quality content. It’s really crazy, particularly when you consider the fact that Google looks at content when determining pagerank. Thus, one of the greatest techniques for improving pagerank is simply to make sure your site has quality content. This content should ALWAYS be relevant to the topic of the site. The content should not be too long (greater than 2,500 words a page) or too short, should be informative, and should include the right amount of keywords (a keyword density of 2-5%). If your content has all of this, you should be in a very good position to increase your pagerank.
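
To put those figures into concrete numbers: at the suggested 2-5% keyword density, a 1,000-word page would mention its keyword roughly 20-50 times, while a 400-word page would mention it only about 8-20 times, so the target count scales with the length of the page.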

Technique #2: Back links

The major determining factor in Google’s pagerank is back links. These are incoming links from one site to your site. What these links tell Google is that your site must have good content on it, or else other sites would not be linking to it. Now you might think that all that matters is the links, right? Not exactly. Google has gotten very smart and accurate with pagerank, and so they can tell bad links (i.e. Links from link farms, irrelevant sites etc.) from the good links (links in articles, links from relevant sites etc.). The most effective technique in increasing pagerank is to gather good back links. The next two techniques on our list deal with this.

Technique #3: Forums and Blogs

Google is known for quickly indexing forums and blogs. Literally the fastest way to gain a good back link is to simply post a link to your site on a forum or a blog. Most forums consider outright linking to be spam, so you’ll have to make sure you do this in the right way. Try joining forums relevant to your site’s topic and put a link to your site in your signature, if it’s allowed. Also, put a link to your site in your profile. Then post 1-10 times. Your posts should contribute something to the forum so that they are not edited or, worse, removed.

As for using blogs to gain more links, you can try to create a blog of your own and casually put a link to your site on it. You can also try commenting on other relevant blogs and including a link to your site in these comments. Once again, you need to be cautious in how you do this—you don’t want to make comments that are outright promoting your site.

Technique #4: Articles

A popular way of gaining a better Google pagerank is to write articles. These articles are based on your site's topic. For instance, if you run a photography site, you may write an article on digital photography. At the end of the article, you might include something like "for more on digital photography, visit insert site.com". Article back links are very effective because they are considered credible by Google. One thing to remember is not to mention your site too many times; one or two mentions per article is fine. You can submit your articles to AssociatedContent.com and ezines, or you can use article submission services or Article Submitter software.

Conclusion

You don’t have to spend money to increase pagerank. All you need to do is just practice some of the techniques we’ve outlined here and you’ll be well on your way to a better Google pagerank.

References: Kanga Internet, based in Melbourne, Australia, are experts in Website Design, Web & Internet Marketing and Online Marketing.
By Chris Diprose

Why sitemaps are important

There are many optimisation tips and tricks that help websites achieve rankings in search engines, and one factor that is often undervalued is the sitemap. A sitemap is simply a map of your site: usually a single page that displays the structure of your site, its sections and so on, with links to all pages within the website itself. A sitemap makes the navigation of your site easier, and keeping it updated can benefit both your site visitors and search engines. Sitemaps are also an important way of communicating with search engines, as they tell search engines which parts of your site to include for indexing. It’s like giving search engines directions.

Sitemaps have always been recommended for use on websites, and now with the adoption of sitemaps by search engines, they have become even more important. If you are interested in sitemaps from an optimisation point of view, they need to be created in a specific format, such as XML.
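
As a rough illustration of that format (the URLs, dates and values below are only placeholders), an XML sitemap following the common sitemaps.org layout looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-01-08</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/services.htm</loc>
    <lastmod>2010-12-01</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>

Each <url> entry describes one page; the lastmod, changefreq and priority tags are optional hints that tell search engines when the page last changed and how important it is relative to the rest of the site.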

Why Use a Sitemap?

Using sitemaps has many benefits such as easier navigation and better visibility by search engines, and the opportunity to inform search engines immediately about any changes on your site. Of course search engines probably won’t immediately index your changed pages, but the changes will be indexed faster.

If your site is new, or if you have a significant number of new (or recently updated pages), then using a sitemap can be crucial to your success. Although you can still go without a sitemap, it is likely that soon sitemaps will become the standard way of submitting a site to search engines. Though it is certain that spiders will continue to index the Web and sitemaps will not make the standard crawling procedures obsolete, it is logical to say that the importance of sitemaps will continue to increase.

Sitemaps also help in classifying your site content, and as search engines improve their sitemap indexing algorithms, it is expected that more and more sites will be indexed faster via sitemaps.

Are Sitemaps hard to create and submit?

Jade Creative has extensive experience generating and submitting Sitemaps for our clients’ sites, whether it be for search engine optimisation purposes, or to improve the navigation of your website for your visitors.

The basic process is this – create the sitemap, upload the sitemap to the clients’ website and then notify Google about the alteration.

Sitemaps can be as simple or complex as you desire, and we at Jade Creative are happy to assist and generate a Sitemap for you that enhances your website for your visitors and search engines.

Why backlinks are important

If you've read anything about, or studied, search engine optimisation (SEO), you've probably come across the term "backlink" at least once. For those of you new to SEO, you may be wondering what a backlink is, and why they are important. Backlinks have become so important for SEO that they are a major factor in how your website ranks in search engines.

What exactly are backlinks?

Backlinks are links that are directed towards your website and are also known as inbound links. The number of backlinks your website has is an indication to search engines of its popularity. Some search engines, such as Google, will give more credit to websites that have a good number of quality backlinks.

When search engines rank your website and its relevance to a keyword, they consider the number of quality inbound links to your site. So it is not just about getting inbound links, it is also about the quality of the inbound links.

How does a search engine determine the quality of a link?

When inbound links to your site come from other sites, and those sites have content related to your site, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.
What is reciprocal linking?

Another reason to build quality backlinks is to attract visitors to your website. Gone are the days of 'build a website and they will come'. You need to let everyone know that your website exists. One way is to use reciprocal linking, which is essentially a link exchange: one webmaster places a link on their website that points to another website, and vice versa.
So to sum up...

Building quality backlinks is extremely important to search engine optimisation, and because of their importance, it should be very high on your priority list in your SEO efforts.

Keep your content fresh

Why keep your content updated? Because doing this regularly shows your customers that your business is evolving, innovative and interesting. Perhaps the most important point is that if you do not update your content then why would a visitor come back? Fresh content means more visitors, simple as that!


How often will you update your content?

The best place to start is 'how often will you update your website?' To understand this you need to ask yourself:
  • Do you offer special offers online (e.g. Christmas offers, monthly offers, etc)?
  • Do you compete on price? If you do then you must keep prices current.
  • Do you promote new products to your customers? If so then perhaps a 'featured product' or 'new product' section is a good option for your website.
  • Do you offer your customers packages and if so how often do these change?
  • Will your customers benefit from written articles, a blog or news list?

Who, What and When?

Armed with the answers to the above, we move on to consider the following:
  • Timing - will updates occur daily, weekly, monthly or quarterly?
  • Pages - will all pages need to be updated, or just a select few (i.e. special offers, new, homepage, etc)?
  • Responsibility - who will be responsible for making the changes, and who will provide and/or write the content?
  • Images - will images need to be updated as well? Who will provide the images, will you need a photographer or can your suppliers provide them to you?
  • Video/Audio - will your website display any audio or video content and if so who will provide these to you? You will also need to ensure that the files provided are compatible with your website.
  • Style - will every page follow the same design template or will different sections require different templates?

Create a plan and stick to it

Once you have collected the details above you then need to create a plan based on the update frequency, pages to be updated, timeframes and deadlines as well as who is responsible. A plan is worthless if it is not followed, so the most important point in this article is follow your plan.

The Weight of Google PageRank in Ranking - By Andrew Gerhart

As Google continues to reign as the top search engine referral source, we all continue to check our PageRank (Download the Google Toolbar to view a web page's PageRank score). SEOs and webmasters scramble to increase their PageRank to a respectable score, anything above a 5, as we continue to analyze the workings of Google's PageRank calculation and its weight within the ranking algorithm. It is not a mystery that PageRank has a fairly large role in Google's ranking, but how much weight? Are there other variables that outweigh PageRank? Below we will discuss how much weight is given to PageRank, if PageRank alone is enough to rank, and more.
The Google algorithm takes many things into account when ranking web pages. Some of the variables for ranking in Google are:
  • Page Title
  • Link text
  • PageRank
  • Heading Tags
  • ALT Tags
  • Domain names
  • Filenames
  • Directory names
  • Link popularity
  • Keyword density
The weight that PageRank is assigned within Google's algorithm has always been debated, and will continue to be as the algorithm changes from month to month. Google does not stick with one algorithm like other search engines. Instead, Google switches between a few different algorithms from month to month.
One of the main reasons that the weight assigned to PageRank is debated is because it is not uncommon to see a website with a lower PageRank outranking a website with a higher PageRank. When we see this, it means three things: 1) PageRank is not the most important factor in Google's rankings algorithm, 2) a properly optimized website with an average PageRank can outrank a non-optimized website with a high PageRank, and 3) the on-page variables play a larger role in Google ranking than most tend to think.
We have seen evidence of web pages with a PageRank of 4 and 5 ranking higher than web pages with a PageRank of 8. The website with the PageRank of 5 was highly optimized and utilized all of the variables that Google takes into account when calculating rank. The website with a PageRank of 8, although carrying some of the SEO basics, was not highly optimized.
This is what happens to many corporate websites. The website ends up being awarded a high PageRank as a result of the large number of websites linking to it, but it ends up being outranked as a result of the lack of search engine optimization. When a website has a PageRank of 8 or above, it will not take a high level of optimization to obtain top rankings within Google.
PageRank is not the end-all-be-all of ranking within Google. One of the reasons is that PageRank is easily manipulated. For example, recently we are seeing companies selling links from websites with high PageRank. For this reason and others we realize that the PageRank calculation is not perfect. Therefore, Google cannot rely on a flawed concept for their algorithm - having a website's score play a role in its ranking is fine, but the score should not be the basis of this calculation.
If you design or optimize your website properly, remembering all of the on-page variables and criteria, you can set yourself up for success. Once you have completed this task and your site has entered the ranks, you will realize if you need to increase your PageRank to battle in Google's search engine results pages.
September 11, 200

Search engine rankings tips

Most website users do not look past the first 3 pages of search results in a search engine, so your goal is to be ranked as highly as possible: at least within the first 30 results and ideally the top 10. Search Engine Optimisation (also known as Website Optimisation) is the process of maximising a site's ranking potential by using multiple key techniques.

Although search engines frequently change the way in which they rank sites, in order to avoid being manipulated by marketers and damaging the engines' integrity, some elements will always play a factor in the ranking process.

Keywords

This is crucial to your site's optimisation, and is not as easy as it sounds. The objective is to specify a list of relevant and popular phrases that relate specifically to your site and that are also being searched on by your website's target audience.

A keyword is a term used by a person searching through a search engine to find a website that best matches what they are searching for. Identifying keywords should be done as early as possible, and generally identifying key phrases is better practice than single keywords, as this makes for a more targeted search result.

Link Popularity

This refers to the number of websites that link to your web pages; search engines measure the quantity as well as the quality of the websites that link to your website. Many search engines use this as an important factor in determining the search engine ranking of a website as well as your web pages' PageRank.

The Age of a Domain Name

The age of a domain name is another factor that many search engines measure. An older domain gives the appearance of longevity and can therefore gain higher relevancy in some search engines (for example, Google). This is driven by spam sites, which tend to appear and then drop off quickly.

URL Web Page Names

These are the names you call each of your web pages and are used when creating links to web pages within your website. The use of keywords in these names helps with your site's rankings.
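
For instance, a page of grooming advice is better saved as dog-grooming-tips.htm (the file name here is only an example) than as page1.htm, since the keywords then appear both in the page's URL and in every link pointing to it.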

Content

The use of keywords within the content of your web pages is one of the best ways to obtain high search engine rankings. The objective is to place keywords in places that matter the most to search engines. Each web page in your site should focus on different keywords, making the pages more targeted as opposed to being general. With this in mind, the main focus of the content is to appeal to the visitor first and search engine second.

Web Page Optimization

Selecting the right keyword phrases for each page of your web site is the first step towards ranking high in the Search Engines. However, it is only the first step.

In addition to selecting targeted keyword phrases, you must also strategically optimize your page including:

  • META description tag
  • META keyword tag
  • Title
  • Image ALT tags
  • Heading tags
  • Image names
  • Hyperlinks
  • Body text

<META>

The META description tag will contain a description of your site. This description will be visible in some of the Search Engines when your site is returned in a search.

Your description should include your site's most important keyword phrases.

Example:

<META name="DESCRIPTION" content="Providing dog grooming tips, supplies and training.">
The META Keywords tag will contain a list of your keyword phrases separated by commas. Your primary keyword phrase should be first, followed by one or two secondary keyword phrases.

Example:

<META name="KEYWORDS" content="dog grooming tips, dog grooming supplies, dog grooming training">

<IMG ALT>

An image Alt tag follows your graphic address or URL in your HTML code. These words will be displayed in place of your graphics through an older browser or when your visitors have their graphics turned off.

To fully optimize your graphics, insert your keyword phrases within the Alt tags of your graphics. At a bare minimum, make sure you use enough images to display all of your keyword phrases. Remember...your primary keyword phrase should always come first.

Example Images (Notice the images are named using the three primary keyword phrases):

dog_grooming_tips.jpg
dog_grooming_supplies.jpg
dog_grooming_training.jpg


Example:

<IMG SRC="dog_grooming_tips.jpg" WIDTH="80" HEIGHT="105" ALT="dog grooming tips">
<IMG SRC="dog_grooming_supplies.jpg" WIDTH="80" HEIGHT="105" ALT="dog grooming supplies">
<IMG SRC="dog_grooming_training.jpg" WIDTH="80" HEIGHT="105" ALT="dog grooming training">


<TITLE> Tag

Just as you must place your keyword phrases within your META description and keyword tags, you must also use your primary keyword phrase as your web page title. Nothing more should appear between the <TITLE> and </TITLE> tags except your primary keyword phrase.
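
Following that rule with the primary keyword phrase used in the examples above, the title tag would simply read:

<TITLE>Dog Grooming Tips</TITLE>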

<H?> Tag

Heading tags are used to separate topics and range from <H1> being very large and bold to <H6> which is very small and bold. Some Search Engines place relevance on text displayed within the heading tags. Top priority is placed on the highest listed heading tag.

Your keyword phrases should each be used as a heading for sections within your web page (placed in the same order as your keyword phrases within your META keywords tag) and placed within an appropriate heading tag. These headings should be followed by some descriptive text.

Your headings should look something like this:

<H?>Dog Grooming Tips</H?>

Your descriptive content containing your keyword phrase.

<H?>Dog Grooming Supplies</H?>

Your descriptive content containing your keyword phrase.

<H?>Dog Grooming Training</H?>

Your descriptive content containing your keyword phrase.

<A HREF> Tag

When creating links on your web page, your links should be displayed together with a small image in front of each link. This image might be a graphic bullet, arrow, or whatever you'd like. These images will not only enhance your web page, but they will also enable you to place your keyword phrases within the Alt tags.

When you begin creating your links, make sure the page name, image name and page description text all contain your keyword phrases.

Your HTML code might look something like this:

<img src="dog_grooming_tips.gif" alt="dog grooming tips">
<a href="dog_grooming_tips.htm">Dog Grooming Tips</a>

<img src="dog_grooming_supplies.gif" alt="dog grooming supplies">
<a href="dog_grooming_supplies.htm">Dog Grooming Supplies</a>

<img src="dog_grooming_training.gif" alt="dog grooming training">
<a href="dog_grooming_training.htm">Dog Grooming Training</a>


<BODY>Text</BODY>

Optimizing your text is another important step towards ranking higher in the Search Engines. Your web page should contain plenty of text, including each of your keywords and keyword phrases used in different variations. If the keyword phrases you've listed within your META tags aren't found within your text, the Search Engines will simply ignore them.

Search Engine Submissions

Once you've optimized your web pages and uploaded them to your server, your next step will be to submit your main pages to the Search Engines. However, don't submit your pages to Google. Your pages will rank much higher if you allow this Search Engine to find your pages on its own.

You may want to consider creating a site map for your site and submit this page to Google instead. A site map is a page that outlines how your pages are set up and linked together. If you design a site map with links to all of your pages, the Search Engine robots can easily spider and index them.

Taking the time to optimize each of your web pages is the most important step you can take towards ranking high in the Search Engines and driving more traffic to your web site.

Copyright © Shelley Lowery

About the Author:

Shelley Lowery is the author of the acclaimed web design course, Web Design Mastery. http://www.webdesignmastery.com And, Ebook Starter - Give Your Ebooks the look and feel of a REAL book. http://www.ebookstarter.com

Article source:

TwoSpots web design

8 Ways to Build a Really Bad Web Site for Search Engines

Some web sites receive hundreds or thousands of unique visitors a day, whilst others only get a handful or none. The reason is often that the web designer or Webmaster has built the site in one 'really bad way' or another. This can end up hindering the potential success of the web site. If you want to make sure your site is not a 'traffic flop' then here are some simple rules to follow.

Bad Move 1: Build the site using a frameset.

Framesets may save designers time but are bad news for search engines. They can struggle to follow links into the web site or read text on the page unless you use a noframes tag effectively. In addition to this, if an engine does keep a cache of a site with frames, it will often pick up the individual frames/pages and not the complete frameset. The downside of this is that you may lose your navigation for many of your pages, which is likely to turn visitors off. Whilst one or two partial fixes to framesets are out there, it's still no wonder that many web site promoters still cry 'Please, No Frames'. For more information on why framesets can cause problems visit http://www.html-faq.com/htmlframes/
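
If you are stuck with a frameset, one partial fix along those lines is a noframes block that gives search engines (and frame-less browsers) some text and crawlable links. A minimal sketch, with purely illustrative page names:

<frameset cols="20%,80%">
  <frame src="menu.htm">
  <frame src="content.htm">
  <noframes>
    <body>
      <p>A short text description of the site that search engines can read.</p>
      <p><a href="menu.htm">Menu</a> | <a href="content.htm">Content</a></p>
    </body>
  </noframes>
</frameset>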

Bad Move 2: Build the site purely in flash.

Flash intros and web sites can be visually stunning, but at the same time they can be rather limiting when it comes to search engines. If your main site is all one Flash movie, it will typically play in just one HTML page. Some search engines simply can't read Flash, and so to them your web site is just one relatively empty HTML page. If your competitor's web site has 15 or 20 pages in HTML talking about their goods and services, then they will have a big advantage over you. If you must have a Flash site for graphical reasons, it would be wise to budget for a separate HTML web site alongside the Flash so your site content can be read and indexed by search engines.

Bad Move 3: Decide that graphics are more important than words

Be careful. As great as some images can be, try not to let designers convince you that you don't need copy on your web site or that a few lines is enough. Only very occasionally is there ever an excuse to fill your web site with graphics at the expense of text. If the graphics look great, then match them with great copy. Sales copy is important to tell your audience why your goods and services matter. Search engines also like to index plenty of useful copy. 250-500 words is a sensible starting guide for most pages, or ½ of the amount you would place in a brochure. Text copy is important and always will be, so make sure your web site has some!

Bad Move 4: Leaving out the Meta tags

This is a bad move, as Meta Tags are important to search engines. Clear and concise title tags should be written for every page, reflecting what it contains. Avoid writing things like 'Home' or 'Welcome' as they are fairly meaningless. If your page is selling blue widgets then get 'blue widgets' in the title and keep the title to 10 words or less.
In addition to this, create a well-written, objective Meta description for each page, and list your Meta Keywords. These keywords should also reflect the content on your web page. Leaving these three things out, or doing them badly, can be disastrous. The impact of Meta tags on rankings may vary from engine to engine, but without them your pages could be ignored. Most HTML editors allow you to easily insert Meta tags into your web page, and it only takes a few moments to add them to a page. So there are no excuses. Make sure you have good Title tags, Meta Description and Meta Keyword tags on your pages today!
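
Sticking with the blue widgets example, and with purely illustrative wording, those three tags together might look like this:

<title>Blue Widgets - Buy Quality Blue Widgets Online</title>
<meta name="description" content="Suppliers of quality blue widgets in a range of sizes, with fast delivery and a full fitting service.">
<meta name="keywords" content="blue widgets, buy blue widgets, blue widget suppliers">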

Bad Move 5: Use lots of JavaScript

Search engines have a few problems understanding JavaScript in pages. At the best of times they struggle with it; at worst they may even ignore it. On its own, it can be an unreliable way to build web site navigation. If you must use JavaScript for your navigation, make sure you have some alternative way to reach your pages, such as HTML text links at the bottom of the page. If you have a large amount of JavaScript, think about linking to it as a separate JavaScript file.
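
A simple way to cover both points, sketched here with illustrative file names, is to keep the script in an external file and repeat the navigation as plain HTML text links at the foot of the page:

<script type="text/javascript" src="menu.js"></script>
...
<p><a href="index.htm">Home</a> | <a href="products.htm">Products</a> | <a href="contact.htm">Contact</a></p>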

Bad Move 6: Don't have sites linking to yours

Unless you want your web site hidden from the outside world, you want to be found, right? Well, search engines ideally find a new web site by following a link from another site. This happens when people link to you, and this kind of linking underpins the Internet. By having sites link to you, search engine crawlers will find your web site and you never need to submit it to the likes of Google. It is still the case that web sites that rank highly on Google and Yahoo for relatively competitive key phrases often have scores, if not hundreds, of web sites linking to them. Google helps to explain the importance of linking here: http://www.google.com/technology/ . So 'think links' and be sure to get your site listed in some quality web directories as part of the process.

Bad Move 7: Focus on submitting your web site to thousands of search engines and forget the ones that matter

Now and then you will see some companies attempting to promote this idea - often by using Bad Move 8! It is true that there are thousands of search engines, but the highest volume of traffic comes from fewer than 10 major search engines. The logic of 'If I get one hit a day off each of the 1,000 mini search engines I'll get 1,000 visitors a day' unfortunately does not ring true in practice. It is a fact that a huge volume of search engine traffic comes from a small handful of search engines: most notably Google, Yahoo and MSN.

Bad Move 8: Using unsolicited/ Spam email

It may seem obvious that this is not the best way to promote your business, but it is always worth stating. If you're in any doubt, simply ask yourself: 'Do I like getting spam emails?' It's common sense to avoid using techniques that annoy people or damage your brand. Using unsolicited email campaigns could result in complaints, and at worst your ISP could ban you. If you want a successful email campaign it is advisable to target genuine opted-in newsletter subscribers and to always offer an opt-out button in every email.

About The Author

Copyright Gareth Davies 2005. You are free to reprint this article with both disclaimer and copyright notice intact. Gareth Davies is a web promotion consultant working for GSINC Ltd based in North London, UK. For feedback on how to build better sites for search engines email Gareth via garethskettyATyahoo.co.uk or visit http://www.garethsketty.com .

Article source:

TwoSpots web design

Choosing your Meta-Keywords

Introduction

This article assumes you already know what a 'meta-tag keyword' is and know a little about its importance to search engines. In this article I will attempt to explain the art of choosing the most appropriate and best-performing keywords for your web pages.
As you should already know, keywords contained within your website's meta tags are extremely important in allowing search engines to determine the content of your web pages. In order to make sure that these keywords are bringing your site up within Search Engine Results Pages (SERPS) and driving visitors to your site, the most important factors in determining your keywords are:
  • Relevance.
  • Choosing keywords that people actually search for.
  • Choosing keywords without too much competition.

Relevance

All your keywords should ALWAYS be relevant to the content within the page they describe. Adding keywords to your site just because they are commonly searched for words is not recommended. Not only will it frustrate visitors who are looking for other information, but it may well get your site black-listed from search engine rankings.
Highly relevant keywords will attract visitors who are actually interested in the products and services your website offers. At the end of the day, it is better to attract fewer visitors who actually have an interest in your website than it is to attract more visitors who leave immediately.

Choosing Keywords people search for

Although your keywords should all be relevant, sometimes it is best not to be too specific.
For example, I once discovered a new fossil (honest!); it was new to science, so I named it, wrote a paper on it and had it published. I didn't ever build a web page dedicated to it, but if I had, the most relevant keyword would have been the fossil's name (Trypanites fosteryeomani). You might therefore think that it would be sensible to use this as one of my most important keywords... However, that would (at least to start with) have been wrong. No one else has ever heard of this fossil, so it is very unlikely that anyone would ever type its name into a search engine. And sure enough, a quick check shows that during Dec 2004 there wasn't a single search for this term within a particular, popular search engine.
I would therefore need to be more generic with my choice of keywords. The fossil itself was a trace fossil of a worm from the Jurassic, so keywords/phrases such as 'fossil', 'trace fossil' or 'worm trace fossil' may be more successful.
There are several tools available that allow you to check the number of times a particular word or phrase has been searched for. It is important to choose keywords that are regularly searched for and these tools can help in this decision. It is also worth including common mis-spellings of your most relevant keywords as your competitors may not have thought of this when choosing their keywords.

Choosing keywords without too much competition

The section above may lead you to believe that choosing very generic keywords is your best bet as they are often searched for. However, if you get too generic in your choice of keywords then you will be competing with many more websites for the top spots in the SERPS. If we go back to our fossil example we can see what I mean. A quick search in Google brings up the following numbers of results:
  • Trypanites fosteryeomani - 1 result (something I once wrote in a forum!)
  • Jurassic Worm Trace Fossil - 4,320 results
  • Trace Fossil - 407,000 results
  • Fossil - 9,120,000 results
As you would expect, the more generic we get, the more results we get. It can be seen then that choosing the best keywords is a matter of balancing the number of times the keywords are searched for against the number of other sites competing for rankings with those keywords. The best keywords will be those that are searched for often but have few competing sites (assuming the keywords are relevant to your content).
I find that it is best to have a balance between the generic and specific keywords relating to your web page and using key-phrases is a useful way of achieving this. In this way the entire key-phrase can be specific to your particular page, but the individual words within it are fairly generic.
e.g. Affordable Website Design Wales (4 generic keywords to create a specific key-phrase)
To summarise, choosing keywords is an essential part of producing a successful website. Your keywords need to be highly relevant to the content of your page and specific enough to reduce competition. They also need to contain some generic keywords that are often searched for. As always, the single most important factor is relevancy and good content to go with the keywords.

About The Author

Alan Cole runs http://www.pixelwave.co.uk , a one-person web design studio. His aim is to provide cost effective website design production and maintenance by offering professional web solutions that stand out from the crowd.

Article source:

TwoSpots web design

Writing Effective ALT Text For Images

Anyone who knows anything about web accessibility knows that images need alternative, or ALT, text assigned to them. This is because screen readers can't understand images, but rather read aloud the alternative text assigned to them. In Internet Explorer we can see this ALT text simply by mousing over the image and looking at the yellow tooltip that appears. Other browsers (correctly) don't do this. The HTML for inserting ALT text is:

<img src="image.gif" alt="Description of the image">

But surely there can't be a skill to writing ALT text for images? You just pop a description in there and you're good to go, right? Well, kind of. Sure, it's not rocket science, but there are a few guidelines you need to follow...

Spacer images and missing ALT text

Spacer images should always be assigned null ALT text, or alt="". This way most screen readers will completely ignore the image and won't even announce its presence. Spacer images are invisible images that pretty much all websites use. The purpose of them is, as the name suggests, to create space on the page. Sometimes it's not possible to create the visual display you need, so you can stick an image in (specifying its height and width) and voilà, you have the extra space you need.
Not everyone uses this null ALT text for spacer images. Some websites stick in alt="spacer image". Imagine how annoying this can be for a screen reader user, especially when you have ten of them in a row. A screen reader would say, 'Image, spacer image' ten times in a row (screen readers usually say the word 'Image' before reading out its ALT text) - now that isn't helpful!
Other web developers simply leave out the ALT attribute for spacer images (and perhaps other images). In this case, most screen readers will read out the filename, which could be 'newsite/images/onepixelspacer.gif'. A screen reader would announce this image as 'Image, newsite slash images slash one pixel spacer dot gif'. Imagine what this would sound like if there were ten of these in a row!
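
In other words, a one-pixel spacer image (the file name and dimensions here are just an example) should be marked up with an empty ALT attribute:

<img src="onepixelspacer.gif" width="10" height="1" alt="">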

Bullets and icons

Bullets and icons should be treated in much the same way as spacer images, so should be assigned null alternative text, or alt="". Think about a list of items with a fancy bullet preceding each item. If the ALT text 'Bullet' is assigned to each image, then 'Image, bullet' will be read aloud by screen readers before each list item, making it take that bit longer to work through the list.
Icons, usually used to complement links, should also be assigned alt="". Many websites, which place the icon next to the link text, use the link text as the ALT text of the icon. Screen readers would first announce this ALT text and then the link text, so would say the link twice, which obviously isn't necessary.
(Ideally, bullets and icons should be called up as background images through the CSS document - this would remove them from the HTML document completely and therefore remove the need for any ALT description.)
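
Where the icon does stay in the HTML, a minimal example (file names are illustrative) combines an empty ALT attribute on the image with the real text in the link:

<img src="arrow.gif" alt=""> <a href="services.htm">Services</a>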

Decorative images

Decorative images too should be assigned null alternative text, or alt="". If an image is pure eye candy then there's no need for a screen reader user to even know it's there and being informed of its presence simply adds to the noise pollution.
Conversely, you could argue that the images on your site create a brand identity and by hiding them from screen reader users you're denying this group of users the same experience. Accessibility experts tend to favour the former argument, but there certainly is a valid case for the latter too.

Navigation & text embedded within images

Navigation menus that require fancy text have no choice but to embed the text within an image. In this situation, the ALT text shouldn't be used to expand on the image. Under no circumstances should the ALT text say, 'Read all about our fantastic services, designed to help you in everything you do'. If the menu item says 'Services' then the ALT text should also say 'Services'. ALT text should always describe the content of the image and should repeat the text word-for-word. If you want to expand on the navigation, such as in this example, you can use the title attribute.
The same applies for any other text embedded within an image. The ALT text should simply repeat, word-for-word, the text contained within that image.
(Unless the font being used is especially unique it's often unnecessary to embed text within images - advanced navigation and background effects can now be achieved with CSS.)
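
So a 'Services' menu image, with the optional extra wording carried in the title attribute rather than the ALT text, might look like this (the file name is illustrative):

<img src="nav_services.gif" alt="Services" title="Read all about our services">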

Company logo

Websites tend to vary in how they apply ALT text to logos. Some say 'Company name', others 'Company name logo', and others describe the function of the image (usually a link back to the homepage) with 'Back to home'. Remember, ALT text should always describe the content of the image, so the first example, alt="Company name", is probably the best. If the logo is a link back to the homepage then this can be effectively communicated through the title attribute.
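
Put together, a linked company logo handled this way might look something like the following (names are illustrative):

<a href="index.htm" title="Back to home"><img src="logo.gif" alt="Company name"></a>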

Conclusion

Writing effective ALT text isn't too difficult. If it's a decorative image then null alternative text, or alt="" should usually be used - never, ever omit the ALT attribute. If the image contains text then the ALT text should simply repeat this text, word-for-word. Remember, ALT text should describe the content of the image and nothing more.
Do also be sure to keep ALT text as short and succinct as possible. Listening to a web page with a screen reader takes a lot longer than traditional methods, so don't make the surfing experience painful for screen reader users with bloated and unnecessary ALT text.
About The Author
This article was written by Trenton Moss. He's crazy about web usability and accessibility - so crazy that he went and started his own web usability and accessibility consultancy ( Webcredible - http://www.webcredible.co.uk ) to help make the Internet a better place for everyone.

Article source:

TwoSpots web design

The Google Sandbox Explained

Introduction

The Google Sandbox is a metaphorical term to explain why most new websites have very poor rankings in Google Search Engine Results Pages (SERPS). Very few people know for sure if the 'sandbox' actually exists, but it seems to be a filter added to the Google algorithms sometime around March 2004.

What is the function of the Google Sandbox?

The generally accepted principle behind the Google Sandbox is that it enables Google to filter out 'Flash-in-the-Pan' websites from those that offer good quality, up-to-date content. It is in Google's interest to ensure that the results it displays to its users in the SERPs lead to highly relevant, up-to-date, useful websites. Relevancy is key to the search engine's success, so it will take all the steps it can to ensure the relevancy of its search results. Filtering out new websites and monitoring them may allow Google to provide more accurate results within the real SERPs.

Identifying the Sandbox

How do you know if your website is in the sandbox? Most new websites under newly registered domains will get relegated to the sandbox once Google knows about the site. Google will find the site by following an Inbound Link (IBL) from another site that the Googlebot crawls. You will then see the website in the normal Google SERPs if you search for the actual domain name, but the website is unlikely to be listed for any of its keywords. Google also won't show signs of any other websites linking to your website, nor will it display pages related to yours. In addition, Google won't list any pages other than your Home (index) page.
If you track your website's visitor stats whilst it is in the Google Sandbox, you will see that the Googlebot comes crawling fairly regularly and that it does crawl, and therefore catalogue, all of your pages. Google does know that your pages exist and knows what they contain, but doesn't list them in the main SERPs. In other words, because your website is new, it is under probation in the 'sandbox'.

How Long Will My website Stay in the Google Sandbox?

It is difficult to say how long a website will stay in the Google Sandbox, as this seems to depend on the types of keywords it will be competing for in the real SERPs. It can be up to 6-8 months, and the only way to get out of the sandbox is to wait.
The Google Sandbox isn't all bad news. If your site contains good quality relevant material it will find its way out of the sandbox and will get the rankings it deserves in the Google SERPS. I even have some theories that may mean that your time in the sandbox can be used wisely to actually improve your final rankings.
You also shouldn't forget that Google isn't the only search engine out there. MSN is widely used and, with the might of Microsoft behind it, is likely to become a bigger player in the search engine world in the near future. At the time of writing, MSN and other search engines such as Yahoo! and AltaVista don't have a 'sandbox' filter, so your new website will be listed quickly and should start driving traffic to your site.
About The Author
Alan Cole runs http://www.pixelwave.co.uk , a one-person web design studio. His aim is to provide cost effective website design production and maintenance by offering professional web solutions that stand out from the crowd.

Article source:

TwoSpots web design

Link Popularity

Why It's The Best Investment You Can Do For Your Business


More and more search engines rank your web pages based on the number of links that point to your web site (link popularity). Google uses link popularity as its most important factor in ranking sites. HotBot, AltaVista, MSN, Inktomi, and others also use link popularity in their formulas.
In the near future every major search engine will use link popularity, so developing and maintaining a good link exchange campaign is essential to the success of your business. Also, finding the right partners to exchange links with is equally important, as becoming a member of a link farm can be devastating to your long-term search goals.
Your exchange link partner should be reputable in the industry, and should provide links back to you on pages that have a high page rank.
The end goal is to have the greatest number of websites pointing to you saying on the link itself, "Your Company - Offering (insert your key phrases here)".
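In HTML terms, that means your key phrases sit in the anchor text itself - for example (the domain below is just a placeholder):
<a href="http://www.domain.com/">Your Company - Offering (your key phrases here)</a>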
For a good ranking on Google, Inktomi and AltaVista, you need good links that point to your site. If you develop a strong content-oriented site you can persuade thousands of other webmasters to link to you or to trade links with you. If you want to do it yourself, all you need is the right content, the right email letter, and a keen eye for finding sites that will link to your site.
The key is to develop content people want to link to and then get out there and make the contacts by visiting people's sites, sending out personalized emails to webmasters of sites you have visited, and networking in discussion groups.
You can also make posts in forums related to your industry to increase your link popularity. Just do a search on any search engine with your industry + forums or discussions and you should be able to find a few discussion boards where you can make posts.
Make sure that you find a way to participate on the discussion as opposed to just posting an ad that's not related to what people are talking about.
Another way that you can greatly increase the link popularity of your website is by publishing articles. There are literally thousands of ezine and newsletter publishers that would love to publish your article on their ezine.
Again, if you search for ezines related to your industry you should find plenty of places where you can submit an article and see it bring lots of targeted traffic. Make sure you place your resource box at the bottom of the article and require publishers to keep it there.

About The Author

Marcio Dias is an established Web Marketing Specialist who has taken dozens of sites from obscurity to millions of visitors per year in as little as a few months. Visit his main site at: http://www.webprofitsecrets.co?s=ea

Article source:

TwoSpots web design

Importance of Keyword Research for Search Engine Optimization

Keyword Research is by far the most important aspect in any Search Engine Optimization initiative. This article shall discuss the important aspects of keyword research process.
Keyword Phrase Research is a process of selecting the most 'optimum performance' keyword phrases that can help visitors find your site. You may have spent days and months on fine-tuning your web pages for a better ranking with the major search engines, yet it will all amount to a big waste if the right keyword phrases are not targeted. It's like not being able to reach your destination even after running your best race because you started out on the wrong road. Even if you achieve high search engine rankings, you may not get relevant traffic if you select the wrong keywords. Therefore, the foremost step in any SEO campaign is identifying your target audience and researching what keyword phrases they might be searching in the search engines to locate a site like yours.
For any marketing strategy to succeed, it is critical to know your audience and the means to reach them. A certain focus is required, which could be location, region or country specific; it could equally be business, trade, service or product specific, since we are talking about a specific audience. For instance, a dentist practicing in a particular town would most likely target people living in the same region instead of targeting the entire country, just as a patient searching for a dentist would search for one in their own area. Focusing on the region would bring the dentist targeted visitors, not just wasted traffic.

Facts, Not Gut Feel

A common pitfall is to start the website optimization exercise with a set of 'gut-feel' keyword phrases. Site owners often come up with 'common sense looking' key phrases which, though they look obvious, may not match the ones your buyers are using as their search terms. Very often, being from within the trade narrows the vision and you tend to assume that trade-specific terms are easily understood and popularly used. Not so. You need to think out of the box.
Doing Keyword Research invariably means departing from one's gut-feel and going by the facts. 'Facts are sacred' in website optimization as they provide the exact data of what people are actually searching for, thus saving you from starting on a wild goose chase. As mentioned earlier, targeting the wrong key phrases might get you a good ranking for keywords that have few or no search requests or just get you irrelevant junk traffic. So, how does one get the facts and the data regarding a particular search term? There are several online keyword research tools like Wordtracker and Overture, which offer data pertaining to your search term. Relying on search tools to analyze keyword phrase data helps you to get a grip on your target audience.

Keyword Research Process

The Keyword Research process involves the following important steps:
1. Discovering Keywords
2. Analyzing Keywords
3. Selecting Keywords
4. Deploying Keywords
The Discovering phase should focus on identifying as many keywords as possible that are related to your website and target audience.
The Analysis phase involves adding information about existing competition, PageRank-based limitations and the potential for ranking.
The Selection process involves objective, measurement-based shortlisting of keywords, keeping the site focus and target audience within the limitations analyzed.
The Deploying phase is about making optimum use of your selected keywords in your website copy, HTML code and tags.

Step 1: Discovering Keyword Phrases

Starting out Keywords

The nature of the keyword research tools that help you identify various keywords is such that they need initial starting-out keywords. It is important that you identify 15 good starting-out keywords, tightly focused on your business. You can brainstorm with your colleagues and clients to identify the search phrases that are most likely to get qualified traffic to your site. This brainstorming session is intended to surface the keyword phrases most relevant to the product or service you offer. An initial list of 15-20 keywords can be compiled at this stage, which can be generic in nature. For wider keyword coverage, you will get better results with one- or two-word keywords rather than longer phrases.

Using Wordtracker, Overture to Collect a Corpus of Keywords

Keyword Research tools like Wordtracker and Overture can be used to expand on your initial list of starting out keywords. These tools allow you to find the number of searches being made on a particular term and also look for all related terms that include your search term.
Wordtracker and Overture are the two most widely used services that give you the ability to research and find out what people are actually searching for on the major search engines along with information on how popular a search term was in the last 30 or 60 days.
Wordtracker
Wordtracker is a fee-based service that allows you to look up popular keyword phrases. Wordtracker is the most popular tool in the SEO industry, as it offers a good search term database and makes searching for keyword-related information easy.
The database is constantly updated, with the oldest data being removed and replaced with the latest information every week. As of writing this article, Wordtracker offers access to a database of 324 million entries.
Although Wordtracker is a paid service for regular use, it offers the benefit of a one-day free trial, which can be used to complete your keyword research if you are fairly organized.
Competition Searches can also be made at Wordtracker for your short-listed terms. A Competition Search allows you to ascertain how many web pages exist for these search terms on different search engines. This helps you determine your chances of ranking with a particular search engine for your search term. Ideally, a lower number of indexed pages in a search engine means a better ranking chance for your search term.
Overture
The Overture Search Term Suggestion tool, although intended to offer keyword popularity information to PPC (Pay Per Click) advertisers, works fairly well for carrying out your keyword research. Enter your main search terms and in response Overture lists all other popular search terms that contain that particular term or phrase, along with the popularity count. Overture lists the search terms in order of their popularity, giving a numerical count for the past month.
Being a free tool, Overture is quite useful. However, it has a few downsides and its results cannot be totally relied upon. They are good to be used as a guideline. Some of the drawbacks of Overture are listed below:
  1. Overture does not make a distinction between singular and plural search terms. Therefore, it can lead you to assume that a keyword is popular in the singular, though it could actually be the plural you should target.
  2. Related phrases are often collapsed into narrow term listings.
  3. The different variations of a word are stemmed as one.
  4. Mis-spellings and punctuation are either ignored or stemmed, resulting in skewed data.

Google Adwords: Keyword Suggestion Tool
Search engines like Google offer some keyword recommendation tools, which can be used as a guide. However, Google does not indicate the popularity count of each search phrase.

Coverage of Keyword Phrases

While selecting your key phrases, make an effort to cover each aspect of your service or product that could be searched on a stand-alone basis by users. Each page on your website could be dealing with a different topic (product or service); therefore, the keywords for each page would be different too, and you should try to place unique and relevant key phrases on each page.
For instance, it would not suffice to optimize a dentist's site only for the terms 'dentist' or 'dental surgeon', as people are likely to make specific searches on related terms like cosmetic surgery, root canal treatment or RCT, dental crowns, tooth extractions, cavities etc. Hence, it is important that all these keywords are covered.

Step 2: Analyzing Keyword Phrases

The second step is analyzing your keyword phrases. This involves analyzing the competition for the search terms, i.e. how many pages are indexed in a search engine like Google. This gives you an idea of how many pages are competing for the top spot. While most of the time the optimized sites rank high, un-optimized sites very often show up highly ranked due to the complex nature of the search engine algorithms. It is therefore wise to be conscious of the extent of competition.
PageRank
The potential for ranking with a particular keyword depends on several factors. The on-page factors address your optimization efforts, while off-page factors like PageRank affect ranking in a major way. While it is beyond the scope of this article to discuss the effect and weight of all the factors, your site's PageRank is an important parameter in keyword analysis. The rule of thumb is that the higher your site's PageRank, the better the chance of ranking with highly competitive keywords. Choose your keywords keeping your site's PageRank in mind. For more information, read our detailed article on the Google PageRank Algorithm.
Adding Information and Measuring Competition for Keywords
After you have made your search on Wordtracker or Overture, and short-listed your key phrases, you need to collect information about their competition as described above. Wordtracker offers an in-built tool to find the number of pages for your keywords indexed in Google.
Alternatively, you can check the competition for your keywords at: search-engine-keyword-competition.php
In order to make this task more organized, we recommend making use of Excel Sheets to feed in all the data you have compiled. This will allow you to make comparisons between various search terms and at the same time, do away with all the unrelated junk terms that are irrelevant to your site and also the ones that exceed your site's PR requirement.

Step 3: Selection of Keyword Phrases

If you have followed the above process of Discovering and Analyzing keywords, you probably have between 50 and 100 keyword phrases that qualify as 'high-yield' given your site's current PageRank - valuable keyword phrases that most accurately describe specific qualities of your website. Though generic keywords are generally searched for more, ranking with generic key phrases often brings a mediocre Return on Traffic.
Focused Key Phrases
As you may have realized by now, focused key phrases not only give your site a better chance to rank, they also bring highly targeted traffic, resulting in higher traffic-to-sales conversion ratios. Since you have used a process which ensures that you are working with popular keywords, ranking with any of the short-listed keywords assures you of incoming traffic. Amongst the 100-odd keywords you have short-listed, keep your eyes on the ones that have a tight focus on what your site is offering, within the parameters of ranking possibilities and competitiveness.
Final Selection of Keyword Phrases
For making your final selection of the 15-20 most relevant keywords, we recommend working through the following method:
   Eliminate the keywords that would be difficult to rank for, given your site's current PageRank and the level of competition for each term.
   Eliminate very low-popularity keywords, where the search traffic for the terms is negligible.
   Out of the remaining keywords that qualify for ranking with your site's current PR, select the most popular keywords which have the right focus for your site.

Step 4: Deploying Keyword Phrases

After you have made a final selection of 15-20 of your most relevant and important keywords, you need to use them on different parts of your website.
It is not advisable to use all your keywords on all the pages. A good way to use the selected keywords on your site is to divide them into 5-7 groups of keywords. Each group should contain closely related keywords forming a theme. Identify the various sections/parts of your website that closely match the keyword group themes and optimize your pages using these themes.
Keywords can be used in several portions/ codes of your web pages. The important aspects are listed below:
  1. Keywords Per Page
    We recommend using your keywords 2-4 times per page. Over-repetition of keywords on a page might be interpreted as spam by the search engines. Make sure you don't put all your keywords on one page.
  2. Where to use the Keywords?
    Your important keywords can be used in the following places (a small worked example pulling these placements together appears at the end of this step):
    Title Tag: A Title Tag of about 90 characters, inclusive of your most important keywords, is good enough. For instance, the Title Tag of this article would be written as 'Keyword Research: Techniques for keyword research, keyword analysis & keyword optimization'. Read our detailed article on Title Tag Optimization.
    Meta Description Tag: You should write a Meta Description Tag of about 250 characters with your important keywords. Be careful about over-repetition. For instance, the Description Tag of this article would be written as: 'Keyword research described: Learn all about keyword research, keyword analysis, keyword optimization and marketing keyword research.' Read our detailed article on Meta Description Tag Optimization.
    Headlines: Keyword-rich headlines and sub-headings placed in <H1> and <H2> tags not only give a brief preview of what follows in the body copy, but also pass on the most relevant information on the page or in the subsequent paragraph. For instance, the heading of this article reads: Keyword Research for Search Engine Optimization; followed by a sub-heading: Importance of Keyword Research for Search Engine Optimization.
    Body Text: Headlines in <H1> and <H2> Tags followed by keyword rich body text can improve your ranking chances.
    Alt Text / Alt Tag: Important keywords can be used in an image's alternative text - the text specified in the image's alt attribute, which some browsers also display when you bring your mouse over the image.
    Title Attributes: Title attributes are short descriptions for hyperlinks, telling users about the content of the landing page. Make use of your important keywords here.
    Anchor Text: The words that appear within the hyperlinked portion of the text are called Anchor Text. Keywords appearing in the Anchor Text have a high relevance weight in Search Engines.
    File Name: Your page's file name or URL can also make use of the most important keyword. For example, the file name for this article is:
    keyword-research-article.htm
    Table Summary: The table summary in HTML code describes the content of each HTML table. Though not visible to users, it is important for people with disabilities and for text-to-speech applications. It is a useful place to use your keyword phrases.
  3. Copy Changes
    Given below are a few tips for writing keyword-rich copy. For detailed information, refer to our article on Search Engine Copywriting. Keeping your keyword phrases in mind, you might need to make certain changes in the copy of your web page. Therefore, instead of just writing (taking the dentist example) 'This procedure involves...', one should try to include the keywords by writing 'Root Canal Treatment involves...'.
    It is important to include both the singular and plural versions of your keywords as most search engines return different sets of results for plural and singular forms of the same word.
    Major search engines like Google are not case sensitive. Therefore, headlines and sub-headings can be written in upper or lower case or a combination of both, as it makes no difference in your search engine rankings.
    Spelling variations of the same word are another thing you should look out for. Try to include all the different forms of a word that can be spelled differently or are commonly mis-spelt. For instance, you can write, 'Abscessed Tooth (commonly mis-spelt as absessed tooth) is ...', thus smartly covering the mis-spelling without giving the user the feeling that the site owner does not know the correct spelling.
  4. Avoiding Pitfalls

    Don't get swayed by the keyword density formula many SEO experts profess; it's over-hyped. Though you may find a number of articles written on the importance of keyword density, we don't think it is mission critical. Write your language naturally, working keywords in where they fit, such as substituting a keyword for pronouns like 'it' and 'this' where it reads naturally.

    The Title or the Description of your web page should not read like a thesaurus or a collection of keywords, but should be descriptive and enticing enough to make a user click on it and read further. It is important to note that our findings on user behavior have shown that a site ranking 5th in the search engines can sometimes draw more clicks than a poorly titled site ranking 1st in the SERPs. For instance, instead of simply writing 'wisdom tooth extraction', a title that reads 'painless wisdom tooth extraction', which includes your USP, would entice more users to click on your entry.
Place your keywords throughout the page, rather than just at the top, but be very careful about over-repetition, as search engines would consider that spamming.
The whole idea behind the keyword research exercise is traffic optimization, not traffic maximization. In a nutshell, good keyword research helps in bringing qualified traffic to your site that leads to high sales conversions, by being focused and targeting specific search terms.
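To pull the placement advice together, here is a small, purely hypothetical page for the dentist example used in this article - the file name (root-canal-treatment.htm), the image and the link target are invented, and the copy is only a sketch:
<html>
<head>
<title>Root Canal Treatment (RCT) - Painless Root Canal Treatment for an Abscessed Tooth</title>
<meta name="description" content="Root canal treatment explained: what an abscessed tooth is and how painless root canal treatment (RCT) works.">
</head>
<body>
<h1>Root Canal Treatment</h1>
<h2>Painless Treatment for an Abscessed Tooth</h2>
<p>Root Canal Treatment involves removing the infected pulp from an
<a href="abscessed-tooth.htm" title="What is an abscessed tooth?">abscessed tooth</a>...</p>
<img src="rct-xray.gif" alt="X-ray taken before root canal treatment">
</body>
</html>
Each of the placements listed in this step - Title Tag, Meta Description, headings, body copy, ALT text, title attribute, anchor text and file name - is used once, without over-repetition.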
Related Reading:
 Search Engine Copywriting
Choosing your Meta-Keywords
Title Tag and Meta Description Tag Optimization
Google PageRank Algorithm Explained
About the Author: Harjot Kaleka is an SEO Copywriter at http://www.redalkemi.com/search-engine-optimization-seo/keyword-research-article.php#, a leading Search Engine Optimization services company. She has a Masters degree in Mass Communications and Copywriting.
© Copyright 2004, RedAlkemi

Article source:

TwoSpots web design

Google's PageRank Explained

What is PageRank?

PageRank is a numeric value that represents how important a page is on the web. Google figures that when one page links to another page, it is effectively casting a vote for the other page. The more votes that are cast for a page, the more important the page must be. Also, the importance of the page that is casting the vote determines how important the vote itself is. Google calculates a page's importance from the votes cast for it. How important each vote is is taken into account when a page's PageRank is calculated.
PageRank is Google's way of deciding a page's importance. It matters because it is one of the factors that determines a page's ranking in the search results. It isn't the only factor that Google uses to rank pages, but it is an important one.
From here on in, we'll occasionally refer to PageRank as "PR".
Notes:
Not all links are counted by Google. For instance, they filter out links from known link farms. Some links can cause a site to be penalized by Google. They rightly figure that webmasters cannot control which sites link to their sites, but they can control which sites they link out to. For this reason, links into a site cannot harm the site, but links from a site can be harmful if they link to penalized sites. So be careful which sites you link to. If a site has PR0, it is usually a penalty, and it would be unwise to link to it.

How is PageRank calculated?

To calculate the PageRank for a page, all of its inbound links are taken into account. These are links from within the site and links from outside the site.
PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))
That's the equation that calculates a page's PageRank. It's the original one that was published when PageRank was being developed, and it is probable that Google uses a variation of it but they aren't telling us what it is. It doesn't matter though, as this equation is good enough.
In the equation, 't1' to 'tn' are pages linking to page A, 'C' is the number of outbound links that a page has, and 'd' is a damping factor, usually set to 0.85.
We can think of it in a simpler way:-
a page's PageRank = 0.15 + 0.85 * (a "share" of the PageRank of every page that links to it)
"share" = the linking page's PageRank divided by the number of outbound links on the page.
A page "votes" an amount of PageRank onto each page that it links to. The amount of PageRank that it has to vote with is a little less than its own PageRank value (its own value * 0.85). This value is shared equally between all the pages that it links to.
From this, we could conclude that a link from a page with PR4 and 5 outbound links is worth more than a link from a page with PR8 and 100 outbound links. The PageRank of a page that links to yours is important but the number of links on that page is also important. The more links there are on a page, the less PageRank value your page will receive from it.
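Putting rough numbers to that, using the published equation: the PR4 page with 5 outbound links passes a share of 0.85 x 4/5 = 0.68 to each page it links to, while the PR8 page with 100 outbound links passes only 0.85 x 8/100 = 0.068 per link - a tenth as much - if the toolbar values are taken at face value.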
If the PageRank value differences between PR1, PR2,.....PR10 were equal then that conclusion would hold up, but many people believe that the values between PR1 and PR10 (the maximum) are set on a logarithmic scale, and there is very good reason for believing it. Nobody outside Google knows for sure one way or the other, but the chances are high that the scale is logarithmic, or similar. If so, it means that it takes a lot more additional PageRank for a page to move up to the next PageRank level than it did to move up from the previous PageRank level. The result is that it reverses the previous conclusion, so that a link from a PR8 page that has lots of outbound links is worth more than a link from a PR4 page that has only a few outbound links.
Whichever scale Google uses, we can be sure of one thing. A link from another site increases our site's PageRank. Just remember to avoid links from link farms.
Note that when a page votes its PageRank value to other pages, its own PageRank is not reduced by the value that it is voting. The page doing the voting doesn't give away its PageRank and end up with nothing. It isn't a transfer of PageRank. It is simply a vote according to the page's PageRank value. It's like a shareholders meeting where each shareholder votes according to the number of shares held, but the shares themselves aren't given away. Even so, pages do lose some PageRank indirectly, as we'll see later.
Ok so far? Good. Now we'll look at how the calculations are actually done.
For a page's calculation, its existing PageRank (if it has any) is abandoned completely and a fresh calculation is done where the page relies solely on the PageRank "voted" for it by its current inbound links, which may have changed since the last time the page's PageRank was calculated.
The equation shows clearly how a page's PageRank is arrived at. But what isn't immediately obvious is that it can't work if the calculation is done just once. Suppose we have 2 pages, A and B, which link to each other, and neither have any other links of any kind. This is what happens:-
Step 1: Calculate page A's PageRank from the value of its inbound links
Page A now has a new PageRank value. The calculation used the value of the inbound link from page B. But page B has an inbound link (from page A) and its new PageRank value hasn't been worked out yet, so page A's new PageRank value is based on inaccurate data and can't be accurate.
Step 2: Calculate page B's PageRank from the value of its inbound links
Page B now has a new PageRank value, but it can't be accurate because the calculation used the new PageRank value of the inbound link from page A, which is inaccurate.
It's a Catch 22 situation. We can't work out A's PageRank until we know B's PageRank, and we can't work out B's PageRank until we know A's PageRank.
Now that both pages have newly calculated PageRank values, can't we just run the calculations again to arrive at accurate values? No. We can run the calculations again using the new values and the results will be more accurate, but we will always be using inaccurate values for the calculations, so the results will always be inaccurate.
The problem is overcome by repeating the calculations many times. Each time produces slightly more accurate values. In fact, total accuracy can never be achieved because the calculations are always based on inaccurate values. 40 to 50 iterations are sufficient to reach a point where any further iterations wouldn't produce enough of a change to the values to matter. This is precisely what Google does at each update, and it's the reason why the updates take so long.
One thing to bear in mind is that the results we get from the calculations are proportions. The figures must then be set against a scale (known only to Google) to arrive at each page's actual PageRank. Even so, we can use the calculations to channel the PageRank within a site around its pages so that certain pages receive a higher proportion of it than others.
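As a rough illustration - this is not Google's code, just the published equation applied repeatedly with d = 0.85 - the following small page calculates the PageRanks for a three-page site in which page A links to B and C, and B and C each link back to A (the same structure used in Example 3 later on). Opening it in a browser shows the values converging to roughly A = 1.459 and B = C = 0.770 after 100 iterations:
<script>
var links = { A: ["B", "C"], B: ["A"], C: ["A"] };  // the pages each page links OUT to
var d = 0.85;                                        // the damping factor 'd' from the equation
var pr = { A: 1, B: 1, C: 1 };                       // start each page at PR1
for (var i = 0; i < 100; i++) {                      // repeat the calculation many times
  var next = { A: 1 - d, B: 1 - d, C: 1 - d };       // the (1-d) part of the equation
  for (var page in links) {
    var share = d * pr[page] / links[page].length;   // this page's vote, split equally
    for (var j = 0; j < links[page].length; j++) {
      next[links[page][j]] += share;
    }
  }
  pr = next;                                         // every page recalculated from the previous pass
}
document.write("A=" + pr.A.toFixed(6) + ", B=" + pr.B.toFixed(6) + ", C=" + pr.C.toFixed(6));
</script>
Changing the 'links' object lets you try out the other link structures discussed below.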


NOTE:
You may come across explanations of PageRank where the same equation is stated but the result of each iteration of the calculation is added to the page's existing PageRank. The new value (result + existing PageRank) is then used when sharing PageRank with other pages. These explanations are wrong for the following reasons:-
1. They quote the same, published equation - but then change it
from PR(A) = (1-d) + d(......) to PR(A) = PR(A) + (1-d) + d(......)
It isn't correct, and it isn't necessary.
2. We will be looking at how to organize links so that certain pages end up with a larger proportion of the PageRank than others. Adding to the page's existing PageRank through the iterations produces different proportions than when the equation is used as published. Since the addition is not a part of the published equation, the results are wrong and the proportioning isn't accurate.
According to the published equation, the page being calculated starts from scratch at each iteration. It relies solely on its inbound links. The 'add to the existing PageRank' idea doesn't do that, so its results are necessarily wrong.

Internal linking

Fact: A website has a maximum amount of PageRank that is distributed between its pages by internal links.
The maximum PageRank in a site equals the number of pages in the site * 1. The maximum is increased by inbound links from other sites and decreased by outbound links to other sites. We are talking about the overall PageRank in the site and not the PageRank of any individual page. You don't have to take my word for it. You can reach the same conclusion by using a pencil and paper and the equation.
Fact: The maximum amount of PageRank in a site increases as the number of pages in the site increases.
The more pages that a site has, the more PageRank it has. Again, by using a pencil and paper and the equation, you can come to the same conclusion. Bear in mind that the only pages that count are the ones that Google knows about.
Fact: By linking poorly, it is possible to fail to reach the site's maximum PageRank, but it is not possible to exceed it.
Poor internal linkages can cause a site to fall short of its maximum but no kind of internal link structure can cause a site to exceed it. The only way to increase the maximum is to add more inbound links and/or increase the number of pages in the site.
Cautions: Whilst I thoroughly recommend creating and adding new pages to increase a site's total PageRank so that it can be channeled to specific pages, there are certain types of pages that should not be added. These are pages that are all identical or very nearly identical and are known as cookie-cutters. Google considers them to be spam and they can trigger an alarm that causes the pages, and possibly the entire site, to be penalized. Pages full of good content are a must.
What can we do with this 'overall' PageRank?
We are going to look at some example calculations to see how a site's PageRank can be manipulated, but before doing that, I need to point out that a page will be included in the Google index only if one or more pages on the web link to it. That's according to Google. If a page is not in the Google index, any links from it can't be included in the calculations.
Let's consider a 3 page site (pages A, B and C) with no links coming in from the outside. We will allocate each page an initial PageRank of 1, although it makes no difference whether we start each page with 1, 0 or 99. Apart from a few millionths of a PageRank point, after many iterations the end result is always the same. Starting with 1 requires fewer iterations for the PageRanks to converge to a suitable result than when starting with 0 or any other number. You can use a pencil and paper to follow this.
The site's maximum PageRank is the amount of PageRank in the site. In this case, we have 3 pages so the site's maximum is 3.
At the moment, none of the pages link to any other pages and none link to them. If you make the calculation once for each page, you'll find that each of them ends up with a PageRank of 0.15. No matter how many iterations you run, each page's PageRank remains at 0.15. The total PageRank in the site = 0.45, whereas it could be 3. The site is seriously wasting most of its potential PageRank.


Example 1
Now begin again with each page being allocated PR1. Link page A to page B and run the calculations for each page. We end up with:-
Page A = 0.15
Page B = 1
Page C = 0.15

Page A has "voted" for page B and, as a result, page B's PageRank has increased. This is looking good for page B, but it's only 1 iteration - we haven't taken account of the Catch 22 situation. Look at what happens to the figures after more iterations:-
After 100 iterations the figures are:-
Page A = 0.15
Page B = 0.2775
Page C = 0.15

It still looks good for page B but nowhere near as good as it did. These figures are more realistic. The total PageRank in the site is now 0.5775 - slightly better but still only a fraction of what it could be.
NOTE:
Technically, these particular results are incorrect because of the special treatment that Google gives to dangling links, but they serve to demonstrate the simple calculation.


Example 2
Try this linkage. Link all pages to all pages. Each page starts with PR1 again. This produces:-
Page A = 1
Page B = 1
Page C = 1

Now we've achieved the maximum. No matter how many iterations are run, each page always ends up with PR1. The same results occur by linking in a loop. E.g. A to B, B to C and C to A.
This has demonstrated that, by poor linking, it is quite easy to waste PageRank and by good linking, we can achieve a site's full potential. But we don't particularly want all the site's pages to have an equal share. We want one or more pages to have a larger share at the expense of others. The kinds of pages that we might want to have the larger shares are the index page, hub pages and pages that are optimized for certain search terms. We have only 3 pages, so we'll channel the PageRank to the index page - page A. It will serve to show the idea of channeling.


Example 3
Now try this. Link page A to both B and C. Also link pages B and C to A. Starting with PR1 all round, after 1 iteration the results are:-
Page A = 1.85
Page B = 0.575
Page C = 0.575

and after 100 iterations, the results are:-
Page A = 1.459459
Page B = 0.7702703
Page C = 0.7702703

In both cases the total PageRank in the site is 3 (the maximum) so none is being wasted. Also in both cases you can see that page A has a much larger proportion of the PageRank than the other 2 pages. This is because pages B and C are passing PageRank to A and not to any other pages. We have channeled a large proportion of the site's PageRank to where we wanted it.


Example 4
Finally, keep the previous links and add a link from page C to page B. Start again with PR1 all round. After 1 iteration:-
Page A = 1.425
Page B = 1
Page C = 0.575

By comparison to the 1 iteration figures in the previous example, page A has lost some PageRank, page B has gained some and page C stayed the same. Page C now shares its "vote" between A and B; previously A received all of it. That's why page A has lost out and why page B has gained. And after 100 iterations:-
Page A = 1.298245
Page B = 0.9999999
Page C = 0.7017543

When the dust has settled, page C has lost a little PageRank because, having now shared its vote between A and B instead of giving it all to A, A has less to give to C in the A-->C link. So adding an extra link from a page causes the page to lose PageRank indirectly if any of the pages that it links to return the link. If the pages that it links to don't return the link, then no PageRank loss would have occurred. To make it more complicated, if the link is returned even indirectly (via a page that links to a page that links to a page etc), the page will lose a little PageRank. This isn't really important with internal links, but it does matter when linking to pages outside the site.


Example 5: new pages
Adding new pages to a site is an important way of increasing a site's total PageRank because each new page will add an average of 1 to the total. Once the new pages have been added, their new PageRank can be channeled to the important pages. The following walkthrough demonstrates this; you can reproduce the figures with a pencil and paper and the equation.
Let's add 3 new pages to Example 3. On their own, the new pages don't do anything for us yet: the small increase in the site's total, and the new pages' PageRank of 0.15 each, are unrealistic as we shall see. So let's link them into the site.
Link each of the new pages to the important page, page A. Notice that the Total PageRank has doubled, from 3 (without the new pages) to 6. Notice also that page A's PageRank has almost doubled.
There is one thing wrong with this model. The new pages are orphans. They wouldn't get into Google's index, so they wouldn't add any PageRank to the site and they wouldn't pass any PageRank to page A. They each need to be linked to from at least one other page. If page A is the important page, the best page to put the links on is, surprisingly, page A. You can play around with the links but, from page A's point of view, there isn't a better place for them.
It is not a good idea for one page to link to a large number of pages so, if you are adding many new pages, spread the links around. The chances are that there is more than one important page in a site, so it is usually suitable to spread the links to and from the new pages. You can experiment with mini-models of a site, using a pencil and paper and the equation, to find the links that produce the best results for its important pages.


Examples summary
You can see that, by organising the internal links, it is possible to channel a site's PageRank to selected pages. Internal links can be arranged to suit a site's PageRank needs, but it is only useful if Google knows about the pages, so do try to ensure that Google spiders them.


Questions
When a page has several links to another page, are all the links counted?
E.g. if page A links once to page B and 3 times to page C, does page C receive 3/4 of page A's shareable PageRank?
The PageRank concept is that a page casts votes for one or more other pages. Nothing is said in the original PageRank document about a page casting more than one vote for a single page. The idea seems to be against the PageRank concept and would certainly be open to manipulation by unrealistically proportioning votes for target pages. E.g. if an outbound link, or a link to an unimportant page, is necessary, add a bunch of links to an important page to minimize the effect.
Since we are unlikely to get a definitive answer from Google, it is reasonable to assume that a page can cast only one vote for another page, and that additional votes for the same page are not counted.
When a page links to itself, is the link counted?
Again, the concept is that pages cast votes for other pages. Nothing is said in the original document about pages casting votes for themselves. The idea seems to be against the concept and, also, it would be another way to manipulate the results. So, for those reasons, it is reasonable to assume that a page can't vote for itself, and that such links are not counted.

Dangling links

pagerank, page rank "Dangling links are simply links that point to any page with no outgoing links. They affect the model because it is not clear where their weight should be distributed, and there are a large number of them. Often these dangling links are simply pages that we have not downloaded yet..........Because dangling links do not affect the ranking of any other page directly, we simply remove them from the system until all the PageRanks are calculated. After all the PageRanks are calculated they can be added back in without affecting things significantly." - extract from the original PageRank paper by Google's founders, Sergey Brin and Lawrence Page.A dangling link is a link to a page that has no links going from it, or a link to a page that Google hasn't indexed. In both cases Google removes the links shortly after the start of the calculations and reinstates them shortly before the calculations are finished. In this way, their effect on the PageRank of other pages in minimal.
The results shown in Example 1 are wrong because page B has no links going from it, so the link from page A to page B is dangling and would be removed from the calculations. The results of the calculations would then show all three pages as having 0.15.
It may suit site functionality to link to pages that have no links going from them without losing any PageRank from the other pages, but it would be a waste of potential PageRank. Take a look at this example. The site's potential is 5 because it has 5 pages, but without page E linked in, the site only has 4.15.
Link page A to page E and recalculate. Notice that the site's total has gone down very significantly. But, because the new link is dangling and would be removed from the calculations, we can ignore the new total and assume the previous 4.15 to be true. That's the effect of functionally useful, dangling links in the site. There's no overall PageRank loss.
However, some of the site's potential total is still being wasted, so link page E back to page A and recalculate. Now we have the maximum PageRank that is possible with 5 pages. Nothing is being wasted.
Although it may be functionally good to link to pages within the site without those pages linking out again, it is bad for PageRank. It is pointless wasting PageRank unnecessarily, so always make sure that every page in the site links out to at least one other page in the site.

Inbound links

Inbound links (links into the site from the outside) are one way to increase a site's total PageRank. The other is to add more pages. Where the links come from doesn't matter. Google recognizes that a webmaster has no control over other sites linking into a site, and so sites are not penalized because of where the links come from. There is an exception to this rule but it is rare and doesn't concern this article. It isn't something that a webmaster can accidentally do.
The linking page's PageRank is important, but so is the number of links going from that page. For instance, if you are the only link from a page that has a lowly PR2, you will receive an injection of 0.15 + 0.85(2/1) = 1.85 into your site, whereas a link from a PR8 page that has another 99 links from it will increase your site's PageRank by 0.15 + 0.85(8/100) = 0.218. Clearly, the PR2 link is much better - or is it? See the earlier discussion of the probable logarithmic scale for a reason why this is not the case.
Once the PageRank is injected into your site, the calculations are done again and each page's PageRank is changed. Depending on the internal link structure, some pages' PageRank is increased, some are unchanged but no pages lose any PageRank.
It is beneficial to have the inbound links coming to the pages to which you are channeling your PageRank. A PageRank injection to any other page will be spread around the site through the internal links. The important pages will receive an increase, but not as much of an increase as when they are linked to directly. The page that receives the inbound link makes the biggest gain.
It is easy to think of our site as being a small, self-contained network of pages. When we do the PageRank calculations we are dealing with our small network. If we make a link to another site, we lose some of our network's PageRank, and if we receive a link, our network's PageRank is added to. But it isn't like that. For the PageRank calculations, there is only one network - every page that Google has in its index. Each iteration of the calculation is done on the entire network and not on individual websites.
Because the entire network is interlinked, and every link and every page plays its part in each iteration of the calculations, it is impossible for us to calculate the effect of inbound links to our site with any realistic accuracy.

Outbound links

Outbound links are a drain on a site's total PageRank. They leak PageRank. To counter the drain, try to ensure that the links are reciprocated. Because of the PageRank of the pages at each end of an external link, and the number of links out from those pages, reciprocal links can gain or lose PageRank. You need to take care when choosing where to exchange links.
When PageRank leaks from a site via a link to another site, all the pages in the internal link structure are affected. (This doesn't always show after just 1 iteration). The page that you link out from makes a difference to which pages suffer the most loss. Without a program to perform the calculations on specific link structures, it is difficult to decide on the right page to link out from, but the generalization is to link from the one with the lowest PageRank.
Many websites need to contain some outbound links that are nothing to do with PageRank. Unfortunately, all 'normal' outbound links leak PageRank. But there are 'abnormal' ways of linking to other sites that don't result in leaks. PageRank is leaked when Google recognizes a link to another site. The answer is to use links that Google doesn't recognize or count. These include form actions and links contained in javascript code.
Form actions
A form's 'action' attribute does not need to be the url of a form parsing script. It can point to any html page on any site. Try it.
Example:
<form name="myform" action="http://www.domain.com/somepage.html">
<a rel="nofollow" href="javascript:document.myform.submit()">Click here</a>
To be really sneaky, the action attribute could be in some javascript code rather than in the form tag, and the javascript code could be loaded from a 'js' file stored in a directory that is barred to Google's spider by the robots.txt file.
Javascript
Example: <a rel="nofollow" href="javascript:goto('wherever')">Click here</a>
Like the form action, it is sneaky to load the javascript code, which contains the urls, from a separate 'js' file, and sneakier still if the file is stored in a directory that is barred to googlebot by the robots.txt file.
The "rel" attribute
As of 18th January 2005, Google, together with other search engines, is recognising a new attribute to the anchor tag. The attribute is "rel", and it is used as follows:-
<a rel="nofollow" href="http://www.domain.com/somepage.html" rel="nofollow">link text</a>

The attribute tells Google to ignore the link completely. The link won't help the target page's PageRank, and it won't help its rankings. It is as though the link doesn't exist. With this attribute, there is no longer any need for javascript, forms, or any other method of hiding links from Google.


So how much additional PageRank do we need to move up the toolbar?

First, let me explain in more detail why the values shown in the Google toolbar are not the actual PageRank figures. According to the equation, and to the creators of Google, the billions of pages on the web average out to a PageRank of 1.0 per page. So the total PageRank on the web is equal to the number of pages on the web * 1, which equals a lot of PageRank spread around the web.

The Google toolbar range is from 1 to 10. (They sometimes show 0, but that figure isn't believed to be a PageRank calculation result). What Google does is divide the full range of actual PageRanks on the web into 10 parts - each part is represented by a value as shown in the toolbar. So the toolbar values only show what part of the overall range a page's PageRank is in, and not the actual PageRank itself. The numbers in the toolbar are just labels.
Whether or not the overall range is divided into 10 equal parts is a matter for debate - Google aren't saying. But because it is much harder to move up a toolbar point at the higher end than it is at the lower end, many people (including me) believe that the divisions are based on a logarithmic scale, or something very similar, rather than the equal divisions of a linear scale.
Let's assume that it is a logarithmic, base 10 scale, and that it takes 10 properly linked new pages to move a site's important page up 1 toolbar point. It will take 100 new pages to move it up another point, 1000 new pages to move it up one more, 10,000 to the next, and so on. That's why moving up at the lower end is much easier than at the higher end.
In reality, the base is unlikely to be 10. Some people think it is around the 5 or 6 mark, and maybe even less. Even so, it still gets progressively harder to move up a toolbar point at the higher end of the scale.
Note that as the number of pages on the web increases, so does the total PageRank on the web, and as the total PageRank increases, the positions of the divisions in the overall scale must change. As a result, some pages drop a toolbar point for no 'apparent' reason. If the page's actual PageRank was only just above a division in the scale, the addition of new pages to the web would cause the division to move up slightly and the page would end up just below the division. Google's index is always increasing and they re-evaluate each of the pages on more or less a monthly basis. It's known as the "Google dance". When the dance is over, some pages will have dropped a toolbar point. A number of new pages might be all that is needed to get the point back after the next dance.
The toolbar value is a good indicator of a page's PageRank but it only indicates that a page is in a certain range of the overall scale. One PR5 page could be just above the PR5 division and another PR5 page could be just below the PR6 division - almost a whole division (toolbar point) between them.

Tips

Domain names and Filenames

To a spider, www.domain.com/, domain.com/, www.domain.com/index.html and domain.com/index.html are different urls and, therefore, different pages. Surfers arrive at the site's home page whichever of the urls are used, but spiders see them as individual urls, and it makes a difference when working out the PageRank. It is better to standardize the url you use for the site's home page. Otherwise each url can end up with a different PageRank, whereas all of it should have gone to just one url.
If you think about it, how can a spider know the filename of the page that it gets back when requesting www.domain.com/ ? It can't. The filename could be index.html, index.htm, index.php, default.html, etc. The spider doesn't know. If you link to index.html within the site, the spider could compare the 2 pages but that seems unlikely. So they are 2 urls and each receives PageRank from inbound links. Standardizing the home page's url ensures that the PageRank it is due isn't shared with ghost urls.
Example: Go to the TwoSpots site - how's that for a nice piece of link text ;). Notice that the url in the browser's address bar contains "www.". If you have the Google Toolbar installed, you will see that the page has PR5; otherwise you can check it with an online PageRank checker. Now remove the "www." part of the url and get the page again. This time it has PR1, and yet they are the same page. Actually, the PageRank is for the unseen frameset page.
When this article was first written, the non-www URL had PR4 due to using different versions of the link URLs within the site. It had the effect of sharing the page's PageRank between the 2 pages (the 2 versions) and, therefore, between the 2 sites. That's not the best way to do it. Since then, I've tidied up the internal linkages and got the non-www version down to PR1 so that the PageRank within the site mostly stays in the "www." version, but there must be a site somewhere that links to it without the "www." that's causing the PR1.
Imagine the page, www.domain.com/index.html. The index page contains links to several relative urls; e.g. products.html and details.html. The spider sees those urls as www.domain.com/products.html and www.domain.com/details.html. Now let's add an absolute url for another page, only this time we'll leave out the "www." part - domain.com/anotherpage.html. This page links back to the index.html page, so the spider sees the index page as domain.com/index.html. Although it's the same index page as the first one, to a spider it is a different page because it's on a different domain. Now look what happens. Each of the relative urls on the index page is also different because it belongs to the domain.com/ domain. Consequently, the link structure is wasting the site's potential PageRank by spreading it between ghost pages.
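A simple precaution, then, is to write every internal link to the home page (and any absolute urls to other pages) in one standardized form, for example:
<a href="http://www.domain.com/">Home</a>
<a href="http://www.domain.com/products.html">Products</a>
<a href="http://www.domain.com/details.html">Details</a>
rather than mixing www.domain.com/, domain.com/ and index.html forms of the same url.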


Adding new pages
There is a possible negative effect of adding new pages. Take a perfectly normal site. It has some inbound links from other sites and its pages have some PageRank. Then a new page is added to the site and is linked to from one or more of the existing pages. The new page will, of course, acquire PageRank from the site's existing pages. The effect is that, whilst the total PageRank in the site is increased, one or more of the existing pages will suffer a PageRank loss due to the new page making gains. Up to a point, the more new pages that are added, the greater is the loss to the existing pages. With large sites, this effect is unlikely to be noticed but, with smaller ones, it probably would be.
So, although adding new pages does increase the total PageRank within the site, some of the site's pages will lose PageRank as a result. The answer is to link new pages in such a way within the site that the important pages don't suffer, or to add sufficient new pages to make up for the effect (that can sometimes mean adding a large number of new pages), or better still, to get some more inbound links.

Miscellaneous

The Google toolbar
If you have the Google toolbar installed in your browser, you will be used to seeing each page's PageRank as you browse the web. But all isn't always as it seems. Many pages that Google displays the PageRank for haven't been indexed in Google and certainly don't have any PageRank in their own right. What is happening is that one or more pages on the site have been indexed and a PageRank has been calculated. The PageRank figure for the site's pages that haven't been indexed is allocated on the fly - just for your toolbar. The PageRank itself doesn't exist.
It's important to know this so that you can avoid exchanging links with pages that really don't have any PageRank of their own. Before making exchanges, search for the page on Google to make sure that it is indexed.
Sub-directories
Some people believe that Google drops a page's PageRank by a value of 1 for each sub-directory level below the root directory. E.g. if the value of pages in the root directory is generally around 4, then pages in the next directory level down will be generally around 3, and so on down the levels. Other people (including me) don't accept that at all. Either way, because some spiders tend to avoid deep sub-directories, it is generally considered to be beneficial to keep directory structures shallow (directories one or two levels below the root).
ODP and Yahoo!
It used to be thought that Google gave a Pagerank boost to sites that are listed in the Yahoo! and ODP (a.k.a. DMOZ) directories, but these days general opinion is that they don't. There is certainly a PageRank gain for sites that are listed in those directories, but the reason for it is now thought to be this:-
Google spiders the directories just like any other site and their pages have decent PageRank and so they are good inbound links to have. In the case of the ODP, Google's directory is a copy of the ODP directory. Each time that sites are added and dropped from the ODP, they are added and dropped from Google's directory when they next update it. The entry in Google's directory is yet another good, PageRank boosting, inbound link. Also, the ODP data is used for searches on a myriad of websites - more inbound links!
Listings in the ODP are free but, because sites are reviewed by hand, it can take quite a long time to get in. The sooner a working site is submitted, the better. For tips on submitting to DMOZ, see this DMOZ article.

Article source:

TwoSpots web design