In the last issue, we talked about an amazing white paper ( http://www.langa.com/newsletters/2001/2001-01-15.htm#7 ) that claimed most web search engines only scratch the surface of what's available online. The authors believe they have a way to dig out information from the "deep web": content that lives in online databases and the like rather than in the normal web pages that standard search engines catalog. It's an interesting claim, and thought-provoking reading.
Reader Jerry Shallenberger's interest was piqued enough that he actually decided to put the claims to the test, with fascinating results!
I found the article at Bright Planet interesting. I downloaded the evaluation copy of Lexibot and gave it a try. For comparison, I ran the same searches on Google (my personal favorite). Surprisingly, in two separate searches Google returned more relevant hits than Lexibot, and it ran much faster: Lexibot would search for perhaps half an hour, while Google finished in a second or so. It strikes me that there may be a "deep web," but either it doesn't hold much of value, or Lexibot doesn't do too well searching it. FYI, the terms I searched were 'sleep apnea' and 'computer forensics'. Lexibot produced 299 hits for 'sleep apnea' while Google produced 144,000. For 'computer forensics', Lexibot produced 493 hits, while Google found 7,230. Plus, Google allows you to further refine the search within the original hits; I didn't see this capability in Lexibot. As for me, I'll continue to use Google, and save my money.
So, even if the claims about the "deep web" are true, all that additional information may not mean much in practical terms. <g>