Find Duplicate Content Using Free Tools
An essential part of maintaining a website is making sure it is free of duplicate content. The quality and uniqueness of a site's content can play a big role in its popularity. At present, there are many tools available on the Internet that anyone can use for free to check for duplicate content.
One of the sites popular with many users is Copyscape. It is a free tool that lets you paste your text into a box; in return, the tool checks for other websites with similar text.
Another free tool is Xenu's Link Sleuth, a downloadable program for finding broken links. It also reports title tags, formats, sizes, and URLs, which can be exported to Excel files and sorted to check for duplicates. A similar tool, Yahoo Site Explorer, works the same way; the only difference is that Yahoo Site Explorer does not detect broken links.
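As a sketch of the sorting step described above: once a crawl has been exported to a spreadsheet or CSV, duplicate titles can be flagged with a few lines of code. The file name and the column names `url` and `title` below are assumptions for illustration, not the actual export format of any of the tools mentioned.

```python
# Sketch: flag pages that share a title tag in a crawl export.
# Assumes a CSV with columns "url" and "title" (hypothetical names;
# adjust them to match whatever your crawler actually exports).
import csv
from collections import defaultdict

def find_duplicate_titles(rows):
    """Group URLs by normalized title; return titles used by more than one URL."""
    by_title = defaultdict(list)
    for row in rows:
        by_title[row["title"].strip().lower()].append(row["url"])
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}

if __name__ == "__main__":
    # "crawl_export.csv" is a placeholder path for your own export file.
    with open("crawl_export.csv", newline="", encoding="utf-8") as f:
        duplicates = find_duplicate_titles(csv.DictReader(f))
    for title, urls in duplicates.items():
        print(f"{title!r} appears on {len(urls)} pages:")
        for url in urls:
            print(f"  {url}")
```

Normalizing case and whitespace before grouping catches near-identical titles that a plain spreadsheet sort might list apart.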
Google also offers tools for checking duplicate content. Google Webmaster Tools can be accessed through the main Google page (google.com/webmasters). Go to the Diagnostics page and choose HTML suggestions, then click on duplicate title tags to reach the download link for the table. Apart from these free tools, there are also many reliable paid duplicate-checking tools available online.
There are many types of duplicate content that one needs to be aware of. Various elements of a website need to be considered, as there are millions of websites on the World Wide Web.
Here are some of the elements that need to be checked for duplication:
1. Title tags. Many websites reuse the same title tag over and over throughout the entire site. This is a form of duplication and should be avoided. In addition, with millions of websites published on the web, title tags often get duplicated across different sites.
2. Dynamic URLs. Since the content of dynamic pages changes depending on the database driving the site's results, the chances of duplicate content are greater.
3. Meta descriptions. Every page needs a meta description, a summary of the page's content. Because the topics discussed on one website often overlap with those on other web pages, meta descriptions also tend to get duplicated.
4. Product descriptions. Reseller websites often take their product descriptions straight from the original manufacturer. Since there can be many resellers for the same products online, these descriptions are often duplicated.
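To illustrate how the first and third elements above can be checked directly, here is a minimal sketch that pulls the title tag and meta description out of a page's HTML using only the Python standard library, so two pages can be compared. Real-world pages are messier, and a production check would likely use a more robust HTML parser; this is an illustration, not any tool's actual method.

```python
# Sketch: extract the <title> and meta description from raw HTML so
# pages can be compared for the duplicated elements listed above.
from html.parser import HTMLParser

class HeadExtractor(HTMLParser):
    """Collects the page title and the content of <meta name="description">."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attr = dict(attrs)
            if attr.get("name", "").lower() == "description":
                self.description = attr.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def extract_head(html):
    parser = HeadExtractor()
    parser.feed(html)
    return parser.title.strip(), parser.description.strip()

if __name__ == "__main__":
    # Two made-up pages for demonstration.
    page_a = '<html><head><title>Widgets</title><meta name="description" content="Buy widgets"></head></html>'
    page_b = '<html><head><title>Widgets</title><meta name="description" content="Widget specs"></head></html>'
    title_a, desc_a = extract_head(page_a)
    title_b, desc_b = extract_head(page_b)
    if title_a == title_b:
        print("Duplicate title:", title_a)
    if desc_a == desc_b:
        print("Duplicate meta description:", desc_a)
```

Running the extractor over every page of a site and grouping the results would surface exactly the kind of repeated titles and descriptions the list above warns about.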
For any SEO professional, making sure that the websites they market are unique and free from duplicates helps a lot. The site's popularity and ranking are often at stake. There are also the copyright infringement issues that are quite common on the World Wide Web at the moment. Make sure that you check your content for any duplication.
I use Google Alerts to keep me aware of content that may be similar to mine. Article sites often encourage republishing in entirety with links back to the original article. Does the link-back clear the article site of plagiarism in those instances, and is it worth the re-publisher using the second-hand content?
For example, a number of articles about patio door innovation (mine and others) were republished together on a blog about home improvement, presumably so that the depth of information would draw search engines and people to the blog, letting the blog owner earn some money with Google AdSense.
Also, you’re right about titles and descriptions being duplicated. Many websites are left that way by the web design company, and some templated CMS sites don’t seem to let the customer or SEO specialist make those changes.