The Importance of Referrer Logs

Sunday, August 30, 2009
Referrer logging allows web servers and websites to identify where their visitors are coming from, whether for promotional or security purposes. The referrer is simply the URL of the previous web page from which a link to your site was followed, so it tells you which search engine a visitor used to find your site and whether a customer arrived from a ‘linked site’.

By default, most hosting accounts don’t include referrer logs, but you may be able to subscribe to them for an extra monthly fee. If your web host does not provide a graphic report of your log files, you can still view the raw referrer logs for your website by logging into the host server with free or low-cost FTP software.

The log file sits on your web server and can be downloaded to your computer. You can then run it through a log analysis tool to produce a graphic report from your log files, so that they are easier to understand.

Even without a dedicated tool, you can open the raw log files in Word, WordPerfect, WordPad or any plain-text editor. This information is crucial to your business and marketing plans, so it is not advisable to neglect it.

In addition to identifying the search engine or linked site from which your visitor arrived, referrer logs can also tell you what keywords or keyword phrases your client used when searching.
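
If you are comfortable with a little scripting, you can pull both pieces of information out of a raw log yourself. Here is a minimal Python sketch, assuming the common Apache ‘combined’ log format and a file named access.log; the keyword parameter names (‘q’, ‘p’, ‘query’) are assumptions that vary by search engine, so check your own host’s log format before relying on it.

```python
# A minimal sketch of referrer-log analysis, assuming the Apache/NCSA
# "combined" log format, where the referrer is the second-to-last quoted
# field on each line. The filename and parameter names are assumptions.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

referrers = Counter()
keywords = Counter()

# Combined format: ... "request" status bytes "referrer" "user-agent"
line_re = re.compile(r'"([^"]*)" "[^"]*"$')

with open("access.log") as log:
    for line in log:
        match = line_re.search(line.strip())
        if not match:
            continue
        referrer = match.group(1)
        if referrer in ("", "-"):
            continue  # direct visit, or referrer suppressed by the browser
        parsed = urlparse(referrer)
        referrers[parsed.netloc] += 1
        # Many search engines put the search phrase in a "q" or "p" parameter.
        query = parse_qs(parsed.query)
        for param in ("q", "p", "query"):
            for phrase in query.get(param, []):
                keywords[phrase.lower()] += 1

print("Top referring sites:", referrers.most_common(10))
print("Top search phrases:", keywords.most_common(10))
```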

As referrer information can sometimes violate privacy, some browsers allow the user to disable the sending of referrer information. Proxy and firewall software can also filter out referrer information, to avoid leaking the location of private websites. This can cause other problems, because some servers block parts of their site from browsers that don’t send the right referrer information, in an attempt to prevent deep linking or unauthorized use of bandwidth. Some proxy software sends the top-level address of the target site itself as the referrer, which prevents these problems while still not divulging the user’s last-visited site.

Since the referrer can easily be spoofed or faked, however, it is of limited use in this regard except on a casual basis.

Keyword Density

Saturday, August 29, 2009
Keyword density is a measure of the number of times the selected keyword appears on a web page. But mind you, keywords shouldn’t be overused; they should appear just often enough, and at important places.

If you repeat your keywords with every other word on every line, then your site will probably be rejected as an artificial site or spam site. Keyword density is always expressed as a percentage of the total word content on a given web page.

Suppose you have 100 words on your web page (not including the HTML code used to write the page), and you use a certain keyword five times in the content. The keyword density is obtained by dividing the number of times the keyword occurs by the total number of words on the page: here, 5 divided by 100 = 0.05. Because keyword density is expressed as a percentage of the total word count, multiply by 100, that is 0.05 x 100 = 5%.

The accepted standard for keyword density is between 3% and 5%; that is enough to get recognized by the search engines, and you should never exceed it.

Remember that this rule applies to every page on your site. It also applies not just to one keyword but to each set of keywords that relates to a different product or service. The keyword density should always be between 3% and 5%.

Simple steps to check the density:
  • Copy and paste the content from an individual web page into a word-processing software program like Word or Word Perfect.
  • Go to the ‘Edit’ menu and click ‘Select All’. Now go to the ‘Tools’ menu and select ‘Word Count’. Write down the total number of words in the page.
  • Now select the ‘Find’ function on the ‘Edit’ menu. Go to the ‘Replace’ tab and type in the keyword you want to find. ‘Replace’ that word with the same word, so you don’t change the text.
  • When you complete the replace function, the system will provide a count of the words you replaced. That gives the number of times you have used the keyword in that page.
  • Using the total word count for the page and the total number of keywords you can now calculate the keyword density.
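
If you’d rather not do the counting by hand, the same calculation is easy to script. Here is a minimal Python sketch of the steps above; the sample text and the keyword ‘hosting’ are only illustrations, and it handles single-word keywords only.

```python
# A minimal sketch of the keyword-density calculation described above.
# Run it on the visible page text (HTML stripped out), as the article says.
import re

def keyword_density(text, keyword):
    """Return keyword density as a percentage of the total word count."""
    words = re.findall(r"[A-Za-z0-9']+", text.lower())
    if not words:
        return 0.0
    keyword_count = words.count(keyword.lower())
    return keyword_count / len(words) * 100

# Illustrative sample: 100 words, with "hosting" appearing 5 times.
sample = ("web hosting " * 5 + "filler word " * 45).strip()
print(f"{keyword_density(sample, 'hosting'):.1f}%")  # prints 5.0%
```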

Try it, and good luck!


How Do Search Engines Work - Web Crawlers

Friday, August 28, 2009
It is the search engines that finally bring your website to the notice of prospective customers. Hence it pays to know how these search engines actually work and how they present information to the customer initiating a search.

There are basically two types of search engines: the first uses robots called crawlers or spiders, while the second relies on human-compiled directories. This post looks at the first type.

Search engines use spiders to index websites. When you submit your website pages to a search engine by completing its required submission page, the search engine spider will index your entire site. A ‘spider’ is an automated program run by the search engine system. The spider visits a web site, reads the content on the actual site, reads the site’s Meta tags, and follows the links that the site connects to. The spider then returns all that information to a central depository, where the data is indexed. It will visit each link you have on your website and index those sites as well. Some spiders will only index a certain number of pages on your site, so don’t create a site with 500 pages!
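
To make the idea concrete, here is a toy Python sketch of what a spider does: fetch a page, collect its links, and queue them for visiting. This is only an illustration; real spiders also honor robots.txt, throttle their requests and work at a vastly larger scale, and the start URL below is just a placeholder.

```python
# A toy crawler: fetch a page, record it as "indexed", and queue its links.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue  # unreachable page or unsupported link type
        parser = LinkParser()
        parser.feed(html)
        print(f"Indexed {url} ({len(parser.links)} links found)")
        # Resolve relative links against the current page and follow them.
        queue.extend(urljoin(url, link) for link in parser.links)

crawl("http://example.com/")  # placeholder start page
```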

The spider will periodically return to the sites to check for any information that has changed. The frequency with which this happens is determined by the moderators of the search engine.

The index a spider builds is almost like a book: it contains a table of contents, the actual content, and the links and references for all the websites it finds during its search. A spider may index up to a million pages a day.

Examples: Excite, Lycos, AltaVista and Google.

When you ask a search engine to locate information, it is actually searching through the index which it has created and not actually searching the Web. Different search engines produce different rankings because not every search engine uses the same algorithm to search through the indices.

One of the things a search engine algorithm scans for is the frequency and location of keywords on a web page, though it can also detect artificial keyword stuffing, or spamdexing. The algorithms also analyze the way that pages link to other pages on the Web. By checking how pages link to each other, an engine can determine what a page is about: if the keywords of the linked pages are similar to the keywords on the original page, the pages are likely related.
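
Google’s PageRank is the best-known example of this kind of link analysis. Here is a heavily simplified Python sketch of a PageRank-style calculation over an invented three-page link graph; real engines combine link scores with many other signals.

```python
# A simplified PageRank-style power iteration, to illustrate how link
# analysis can rank pages. The tiny link graph below is invented.
def pagerank(graph, damping=0.85, iterations=50):
    ranks = {page: 1.0 / len(graph) for page in graph}
    for _ in range(iterations):
        new_ranks = {}
        for page in graph:
            # Each page passes its rank, split evenly, to the pages it links to.
            incoming = sum(ranks[p] / len(graph[p])
                           for p in graph if page in graph[p])
            new_ranks[page] = (1 - damping) / len(graph) + damping * incoming
        ranks = new_ranks
    return ranks

links = {  # page -> pages it links to
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "about"],
}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```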


Consider all the things that matter, as discussed above, if you want to increase your page ranking. Work closely with the crawler. Good luck!

The Importance of Search Engines For Your Blog

Thursday, August 27, 2009
It is the search engines that finally bring your website to the notice of prospective customers. When a topic is typed into a search box, the search engine almost instantly sifts through the millions of pages it has indexed and presents you with the ones that match your topic. The matches are also ranked, so that the most relevant ones come first.

Remember that a prospective customer will probably only look at the first 2-3 listings in the search results. So it does matter where your website appears in the search engine ranking.

Further, these customers all use one of the top 6-7 search engines, and those search engines attract more visitors to websites than anything else. So in the end it all depends on which search engines your customers use and how those engines rank your site.

Keywords play a more important role here than any expensive online or offline advertising of your website.

Surveys have found that when customers want to find a website for information or to buy a product or service, they find the site in one of the following ways:-
  • Most often, they find the site through a search engine.
  • Next, they find the site by clicking on a link from another website or page that relates to the topic they are interested in.
  • Occasionally, they find a site by hearing about it from a friend or reading about it in an article.
Thus it’s obvious that the most popular way to find a site, the search engine, accounts for more than 90% of online users. In other words, only 10% of the people looking for a website will use methods other than search engines.

All search engines employ a ranking algorithm, and one of the main rules in a ranking algorithm is to check the location and frequency of keywords on a web page. Don’t forget that algorithms also give weight to link popularity (the number of web pages linking to your site). Optimizing your site for high search engine rankings really does work when performed by a qualified, experienced search engine optimization consultant, but only if you have the money to afford the expert.
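
As a toy illustration of ‘location and frequency’, here is a Python sketch that scores a page by counting keyword occurrences, weighting hits in the title more heavily than hits in the body. The weight is arbitrary and the real algorithms are far more elaborate (and secret).

```python
# A toy "location and frequency" score: keyword hits in the title count
# more than hits in the body. The weight of 3.0 is an arbitrary choice.
def location_score(title, body, keyword, title_weight=3.0):
    keyword = keyword.lower()
    title_hits = title.lower().split().count(keyword)
    body_hits = body.lower().split().count(keyword)
    return title_weight * title_hits + body_hits

print(location_score("Cheap Web Hosting",
                     "Our hosting plans start at five dollars a month.",
                     "hosting"))  # prints 4.0 (3.0 from title + 1 from body)
```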

With better knowledge of search engines and how they work, you can also do it on your own.

Tips to Get Repeat Web Traffic

Wednesday, August 19, 2009
Here are a few tips on how to get repeat web traffic to your site or blog:
  1. Update the pages on your website frequently. Stagnant sites are dropped by some search engines. You can even put a ‘last updated’ date stamp on the page (see the sketch after this list).
  2. Offer additional value on your website. For affiliates and partners you can place links to their sites and products and ask them to do the same for you. You can also advertise their books or videos, if these products relate to your industry and are not in competition with your own product.
  3. You can allow customers to ‘opt in’ to get discounts and special offers. Place a link on your site to invite customers to ‘opt in’ to get a monthly newsletter or valuable coupons.
  4. Add a link to your primary page with a script reading ‘Bookmark this site or add it to your Favorites’.
  5. Add a ‘Recommend this site to a Friend’ link so that a visitor can email your website link, with a prewritten title such as “Thought you might be interested in this”, just by clicking on it.
  6. Brand your website so that visitors always know they are on your site. Use consistent colors, logos and slogans and always provide a ‘Contact Us’ link on each page.
  7. Create an ‘Our Policies’ page that clearly defines your philosophy and principles in dealing with your customers. Post your privacy policy as well, so that clients know they are secure when they visit your site.
  8. Create a FAQ page that addresses most of the doubts and clarifications likely to be raised about your product or your company. This helps resolve most customers’ doubts on their first visit to your site.
  9. Ensure that each page on your website has appropriate titles and keywords, so that your customer can find their way back to your site if they lose the bookmark.
  10. Never spam a client who has opted in for newsletters with unsolicited emails. If they later decide they want to ‘opt out’ of the mailings, be sure to honor their request and take them off the mailing list. They may still come back if they like your products, but they will certainly not come back if you continue to flood their inbox with mail they no longer wish to receive.
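
For tip 1, here is a minimal Python sketch of one way to stamp pages with a ‘last updated’ date at publish time, using each file’s modification time. The directory names and the placeholder comment are assumptions for this example.

```python
# A minimal publish-time sketch: copy each HTML page to an output folder,
# replacing a placeholder comment with the file's last-modified date.
# "public_html", "published" and "<!-- LAST-UPDATED -->" are assumptions.
import datetime
import os

SITE_DIR = "public_html"  # assumed location of your source pages
OUT_DIR = "published"     # assumed output folder for stamped copies

os.makedirs(OUT_DIR, exist_ok=True)
for name in os.listdir(SITE_DIR):
    if not name.endswith(".html"):
        continue
    path = os.path.join(SITE_DIR, name)
    updated = datetime.date.fromtimestamp(os.path.getmtime(path))
    with open(path) as f:
        html = f.read()
    stamped = html.replace("<!-- LAST-UPDATED -->",
                           f"Last updated: {updated:%B %d, %Y}")
    with open(os.path.join(OUT_DIR, name), "w") as f:
        f.write(stamped)
```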