Webmaster SEO Tools


The World of Duplicate Content - Use of a Filter

By: Aaron Brooks

The World Wide Web is like a marathon in which websites compete to reach the finish line first; in this case, the finish line is a higher ranking. In this race for supremacy, it is important to avoid duplicate content and the penalties that come with it.

To keep their indexes efficient, search engines are armed with content filters that remove duplicate content from the pages they index. The most hurtful penalty is lower rankings.

Unfortunately, these filters catch not only rogues but genuine web pages too. Webmasters need to understand how the filters work and what action to take to avoid being filtered out.

When a search engine sends out its spiders, the filters sieve out:

• Websites that feature identical content, and sites where the webmaster includes many copies or versions of a page to cheat the search engines. Filters are also extremely sensitive to "doorway" pages.

• Content masked by different packaging. Known as "scraped content," this duplication of pages with few or no relevant changes falls prey to filters.

• Product descriptions featured by e-commerce sites. Most e-commerce sites publish the manufacturer's description alongside a product; that same description then appears on countless other e-commerce sites and falls victim to filters.

• Articles distributed widely over the net. While some engines are programmed to find an article's origin, others may not be able to trace it.

• Pages that are not duplicates but contain the same core material written by different people.

To get the better of the filters, you need to:

• Use a tool like the Similar Page Checker (http://www.webconfs.com/similar-page-checker.php) to ensure that the pages on your site are not mirroring content from elsewhere. If other URLs have similar or identical content, the tool will reveal them so you can make changes to your pages.

• Be vigilant and know who has "helped" themselves to your content. Using www.copyscape.com, you can determine which websites have stolen or copied your work.

• Even if you use distributed content, add a commentary or make changes to the page that focus on its relevance to your site. By making any content your own you make it unique, which will help ensure the pages are not filtered by search engines.

• Even if you run an e-commerce site, write product descriptions that are distinctively yours, not run of the mill.
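To see roughly what a similarity check like the Similar Page Checker does, here is a minimal Python sketch. It is an illustration only, not the tool's actual algorithm: it breaks each text into overlapping five-word "shingles" (the shingle size is an assumption) and measures the Jaccard overlap of the two shingle sets.

```python
# Minimal sketch of a duplicate-content similarity check (not the
# Similar Page Checker's actual algorithm). The 5-word shingle size
# is an arbitrary assumption.
import re

def shingles(text, size=5):
    """Break text into overlapping word sequences ("shingles")."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def similarity(text_a, text_b, size=5):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a, size), shingles(text_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page1 = "Our widgets are the finest widgets money can buy today."
page2 = "Our widgets are the finest widgets money can buy online."
print(round(similarity(page1, page2), 2))
```

A score near 1.0 means the pages are close to duplicates; changing more than a word or two at the end drops the score quickly, which is the behavior you want when testing rewritten content.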

Learn as much as you can about duplicate content and its dangers. Read the issues discussed at the SES 2006 New York session and other forums. Remember, most search engines and directories (Google, Yahoo, the Open Directory Project) do not want to be flooded with duplicate content and web pages.

Jake Baillie, President of TrueLocal, listed the most common duplicate content mistakes as: circular navigation; printer-friendly pages; inconsistent linking; product-only pages; transparent serving domains; and bad cloaking.

It is important for sites to get high ranking through fair and not foul means.

Free SEO keyword analyzer tool!

By: Steve Bis

This article is about a free website with a great text analysis tool that is very useful for anyone in the internet marketing game, or for anyone who optimizes their website for search engine rankings. The site's address is www.textalyser.net. This tool really does the job when it comes to analyzing either all the text on a website or just a portion of your choice, giving you all sorts of useful information about the keywords used on the site.

When you first enter the site, you can either paste a portion of text into the designated area or type in the URL of the website you would like analyzed. Then you choose your analysis options, such as the minimum characters per word, whether to ignore numbers, and a few more. After you choose your options, simply click "analyze the text" and you get back a complete, very detailed analysis.

At the very top of the analysis it shows some basic text information: the total word count, the number of different words used, the sentence count, and a readability index ranging from easy to hard. That last function comes in handy, because you definitely want the text of your page to be easy for the viewer to read.
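Textalyser does not document which readability formula it uses, so as a stand-in here is a sketch of one well-known measure, the Automated Readability Index (ARI), where higher scores indicate harder text:

```python
# Sketch of a simple readability score. Textalyser's exact formula is
# not documented; the Automated Readability Index (ARI) is used here
# as a stand-in. Higher scores mean harder-to-read text.
import re

def ari(text):
    """Automated Readability Index: based on characters per word
    and words per sentence."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z0-9']+", text)
    chars = sum(len(w) for w in words)
    if not words or not sentences:
        return 0.0
    return 4.71 * (chars / len(words)) + 0.5 * (len(words) / len(sentences)) - 21.43

easy = "The cat sat. The dog ran. We like pets."
print(round(ari(easy), 1))
```

Short words and short sentences drive the score down, which matches the intuition behind the tool's easy-to-hard scale.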

The next, and perhaps most important, feature shows the occurrences and frequency of the top keywords on your page. It ranks them from the number one word down to whichever cutoff you set in the options before analyzing the text. This feature is very useful to SEOs, since it lists the top words on a site and the density/frequency at which they appear. For example, if you were targeting a certain keyword density for a particular keyword on your site, this would let you figure out whether to add or remove instances of that word to reach the density you need for the search engines to list you for that keyword.
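The density figure described above is straightforward to compute yourself. This minimal sketch counts occurrences of each word and divides by the total word count (the word-matching pattern and minimum word length are assumptions, not Textalyser's exact rules):

```python
# Sketch of a keyword density report: occurrences of each word
# divided by the total word count, as a percentage.
import re
from collections import Counter

def keyword_density(text, top=5, min_length=3):
    """Return (word, count, density%) tuples for the most frequent words."""
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if len(w) >= min_length]
    counts = Counter(words)
    total = len(words)
    return [(w, c, round(100 * c / total, 1)) for w, c in counts.most_common(top)]

sample = "seo tools help with seo because seo tools measure keyword density"
for word, count, pct in keyword_density(sample, top=3):
    print(word, count, f"{pct}%")
```

The `min_length` filter mirrors the tool's "minimum characters per word" option, dropping short filler words before counting.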

It tells you not only your top-ranking keywords but also the top word phrases, ranging from two-word to five-word phrases. It gives a count of how many times each phrase was used and shows its frequency relative to the rest of the text on the page.
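The phrase report can be approximated with a simple n-gram counter. This sketch counts every two- to five-word phrase and keeps those that appear more than once (the repeat threshold is an assumption for readability, not the tool's behavior):

```python
# Sketch of a top-phrases report: count every 2- to 5-word phrase
# and keep those that appear more than once.
import re
from collections import Counter

def top_phrases(text, min_len=2, max_len=5):
    """Return (phrase, count) pairs for repeated word phrases."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for n in range(min_len, max_len + 1):
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return [(p, c) for p, c in counts.most_common() if c > 1]

text = "free seo tools are great because free seo tools save time"
for phrase, count in top_phrases(text):
    print(count, phrase)
```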

Anyone in the internet marketing field, especially marketers who optimize their websites for search engine traffic, can make great use of this free tool. I personally find it very useful for keyword research, which is essential for any search engine optimization campaign.

10 Costly Search Engine Mistakes to Avoid

By: Brad Eden

If you have a website, then you already know the importance of traffic. Traffic is to internet marketing what location is to real estate. It's the only thing that really matters. If you cannot generate targeted visitors to your site, you will not make any sales.

Usually the owner or designer of the website is the person designated to drive traffic to the site. The chief ingredient in generating traffic is the search engine. Of course, you can use advertising, but that will cost you. Using the search engines to generate targeted traffic (visitors interested in your product) is the least expensive method known.

Unfortunately, many website owners do not understand the importance of search engine visibility, which is what leads to traffic. They place more importance on producing a "pretty" website. Not that this is bad, but it is secondary to search engine placement. Hopefully, the following list of common mistakes made by many website owners will help you generate more targeted traffic to your site...after all, isn't that what you want?

1. Not using keywords effectively.
This is probably one of the most critical areas of site design. Choose the right keywords and potential customers will find your site; use the wrong ones and your site will see little, if any, traffic.

2. Repeating the same keywords.
When you use the same keywords over and over again (called keyword stacking), the search engines may downgrade or skip the page or site.

3. Robbing pages from other websites.
How many times have you heard or read that "this is the Internet, so it's OK" to take icons and text from other websites to use on your own? Don't do it. It's one thing to learn from others who have been there and another to outright copy their work. The search engines are very smart and usually detect page duplication, and they may even prevent you from ever being listed.

4. Using keywords that are not related to your website.
Many unethical website owners try to gain search engine visibility by using keywords that have nothing to do with their website. They place unrelated keywords (such as "sex," the name of a known celebrity, the hot search topic of the day, etc.) inside a meta tag for a page, even though the keyword has nothing to do with the page topic. Since the keyword is popular, they think it will boost their visibility. Search engines consider this technique spam, and it may cause the page (or sometimes the whole site) to be removed from their listings.

5. Keyword stuffing.
Somewhat like the keyword stacking listed above, this means assigning multiple keywords to the description of a graphic or layer on your website via the "alt=" HTML attribute. If the search engines find that this text does not really describe the graphic or layer, it will be considered spam.

6. Relying on hidden text.
You might be inclined to think that if you cannot see it, it can't hurt. Wrong. Do not try to hide your keywords or keyword phrases by making them invisible. For example, some unethical designers may set the keywords to the same color as the background of the web page, making them invisible.

7. Relying on tiny text.
This is another version of the item above (relying on hidden text). Do not try to hide your keywords or keyword phrases by setting their text size so small that they can barely be seen.

8. Assuming all search engines are the same.
Many people assume that each search engine plays by the same rules. This is not so: each has its own rules and can change them whenever it likes. Make it a point to learn what each major search engine requires for high visibility.

9. Using free web hosting.
Do not use free web hosting if you are serious about increasing site traffic through search engine visibility. Search engines will often exclude content hosted on free hosts.

10. Forgetting to check for missing web page elements.
Make sure to check every page of your website for completeness: broken links, missing graphics, and so on. There are sites on the web that will do this for free.
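As an illustration of self-auditing for the alt-text keyword stuffing described in point 5, this sketch flags images whose alt attribute is unusually long or repeats a word many times. The thresholds are arbitrary assumptions for the example, not actual search engine rules:

```python
# Sketch: flag <img> tags whose alt text looks keyword-stuffed
# (too many words, or the same word repeated too often). The
# thresholds are arbitrary assumptions, not search engine rules.
from collections import Counter
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self, max_words=12, max_repeats=2):
        super().__init__()
        self.max_words, self.max_repeats = max_words, max_repeats
        self.flagged = []  # alt strings that look like stuffing

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        alt = dict(attrs).get("alt") or ""
        words = alt.lower().split()
        if len(words) > self.max_words or any(
            c > self.max_repeats for c in Counter(words).values()
        ):
            self.flagged.append(alt)

checker = AltChecker()
checker.feed('<img src="a.png" alt="cheap shoes cheap shoes cheap shoes buy cheap shoes">')
print(checker.flagged)
```

Running this over your own pages before a crawler does is a cheap way to catch descriptions that drifted from describing the image into stacking keywords.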

These are just a few of the methods and techniques you should avoid. Do not give in to the temptation that these shortcuts will work for you; they will do more harm than good for your website.

Not only will you spend weeks of wasted effort, you may have your site banned from the search engines forever. Invest a little time to learn the proper techniques for increasing search engine visibility and your net traffic will increase.