If Google is banishing your content to the “supplemental index” gulag, you might just have this problem … and need this solution!
Great article.

Each time I consider this problem, I keep thinking that we are fixing something that isn’t broken. Google is the problem here, not the website owners. Google is essentially missing important information on the web that its bots have mistakenly dumped into a supplemental file (it reminds me of an “I’ll file it later” stack that never gets filed).

If Google doesn’t get a better grip on this, its search results will continue to degrade, and eventually the market will offer up some company that can do the job more effectively (again).

It would also seem that Google already has some of the technology to handle this scenario in Google News. Instead of penalizing a site for perceived duplicate content, Google should offer an alternative for these listings, such as a link in its results reading ‘All 1,534 pages of similar content’.

If I am searching the web and I find the version of the content that Google deems to be correct, it’s possible, and often likely, that Google is wrong. I’d much rather have the ability to drill into a list of other sites with similar content and find the source (possibly the primary source, which isn’t as search-engine friendly).

From a webmaster’s perspective this is a great article; from a web researcher’s perspective it shows Google’s failure to get things right and opens the door for competition. Isn’t it about time someone knocked off the king of the hill in search? It’s been a few years now…

