Posted by Tom Foremski - March 3, 2007
Search engines say they use complex algorithms to help users find exactly what they want. Google's "I'm Feeling Lucky" button (does anybody actually use it?), sitting right below the search box, implies exactly that.
The legions of top PhDs working for the search engines publish oodles of scientific papers on complex mathematical concepts related to search.
It all looks very impressive, but it seems to have more to do with feeding the mythology surrounding search--that it is very complex and scientific--than with the reality of how search actually gets done.
From my vantage point as an online publisher, it is clear that search is increasingly "people-powered" rather than machine-powered. There are millions of people helping the searchbots find information.
Here are some examples and gripes:
- Many publishers make sure their headlines catch the attention of the search engines rather than the attention of readers. The same is true for content: editors increasingly optimize it for the search engines rather than for the readers.
- Why should I have to tag my content, and tag it in the specific formats that Technorati and other search engines recommend? Aren't they supposed to do that?
- Google relies on a tremendous amount of user-helped search. Websites are encouraged to create sitemaps and leave the XML file on their server so that the GOOGbot can find its way around.
- The search engines ask website owners to mask off the parts of their sites that aren't relevant, such as comment sections, using nofollow and noindex tags.
- Websites are encouraged to upload their content into the Google Base database. Nice--Google doesn't even need to send out a robot to index the site.
- Every time I publish something, I send out notification "pings" to dozens of search engines and aggregators. Again, they don't have to send out their robots to check if there is new content.
- Google asks users to create collections of sites within specific topics so that other users can use them to find specific types of information.
- The popularity of blogs is partly based on the fact that they gather lots of relevant links around a particular subject. Blogs are clear examples of people-powered search services.
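For anyone who hasn't had the pleasure: the sitemap chore mentioned above means hand-crafting an XML file like this one and leaving it on your server for the crawler to find (a minimal sketch of the sitemaps.org format; the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: a hand-made map of the site, left at the server
     root so the searchbot knows where everything is -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-story.html</loc>
    <lastmod>2007-03-03</lastmod>
  </url>
</urlset>
```

One entry per page--which is to say, the publisher does the indexing groundwork, not the robot.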
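And the "pings" I mentioned are little XML-RPC calls a publisher fires off to the search engines and aggregators after every post. A sketch of what one looks like, using Python's standard xmlrpc library (the blog name and URLs here are just examples, and the ping server in the comment is one of dozens):

```python
# Build the weblogUpdates.ping request that notifies a search engine
# or aggregator that a blog has new content.
import xmlrpc.client


def build_ping(blog_name: str, blog_url: str) -> str:
    """Return the XML-RPC request body for a weblogUpdates.ping call."""
    return xmlrpc.client.dumps(
        (blog_name, blog_url), methodname="weblogUpdates.ping"
    )


payload = build_ping(
    "Silicon Valley Watcher", "http://www.siliconvalleywatcher.com/"
)
# To actually notify a service, you would POST this payload to its ping
# endpoint, e.g. http://rpc.pingomatic.com/ -- once per service, per post.
print(payload)
```

Multiply that by dozens of services and every post ever published, and you get a sense of how much of the "finding" is outsourced to the publishers.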
And there are many more examples. If the search engines are so great at doing what they do, then how come we have to do all of the above?
I resent the fact that I have to create all this content describing my content--the search engines should be creating this "metadata."
I just want to write stuff, and leave it up to the search engines to find it, classify it, index it, and do all the other things their mythology suggests that they do.
In the world of enterprise search, companies such as FAST, Vivisimo, and Autonomy have to find information without the benefit of such aids. Corporate documents have no PageRank, no tags, and not much metadata of any kind.
Yet in consumer search it seems as if nothing would be found without a huge amount of help from millions of people every day. Why is it that we have to help the search engines do a job they are supposed to be doing by themselves?
I wonder about the productivity cost to society from all this human labor--work that is supposed to be done by robots.
It's as if these searchbots are blind, and we have to lead them patiently along the street and point things out to them, while they tap away at the world with white canes.
Part 2: Search seems to be broken...