Popular search engines (GYM: Google, Yahoo, Microsoft) help consumers find web pages very fast using keyword-based indexing: they retrieve all pages relevant to the given keywords and present them in order of relevance. A major factor in judging relevance is the 'popularity' of a page, determined by the link structure of the web. This link structure plays a central role in PageRank and similar algorithms.
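The link-based popularity idea can be sketched in a few lines. This is a minimal power-iteration version of the PageRank recurrence, not the engines' actual implementations; the three-page link graph and the damping factor value are illustrative assumptions.

```python
DAMPING = 0.85  # damping factor commonly used with PageRank

def pagerank(links, iterations=50):
    """Power-iteration PageRank over a dict {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - DAMPING) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # dangling page: spread its rank evenly over all pages
                for p in pages:
                    new_rank[p] += DAMPING * rank[page] / n
            else:
                share = DAMPING * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C both link to B, B links back to A.
web = {"A": ["B"], "B": ["A"], "C": ["B"]}
ranks = pagerank(web)
# B receives links from two pages, so it ends up with the highest rank.
assert max(ranks, key=ranks.get) == "B"
assert abs(sum(ranks.values()) - 1.0) < 1e-6
```

The point of the sketch is that a page's score depends only on who links to it, not on any notion of the author's expertise or trustworthiness, which is exactly the gap the talk addresses.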
Is it possible to develop an approach for determining relevance based on expertise or trust? Claude Vogel's answer to this question is a BIG YES. Claude Vogel and Paul Gardner gave a talk with the same title in the Next Generation Search Systems seminar series related to my class.
Claude is the CTO of Convera, which developed the Excalibur system and licenses it to people for exploring the web to find information. His search approach has more depth than GYM because it clearly tries to bring in elements of vertical search that many other players in the field are pursuing. The major difference is that, unlike most vertical search systems, his system prepares a complete index, then at query time uses an ontology and other tools to get the answers, organize them into facets relevant to the query, and present results for each facet. This process is recursive, so facets are created within facets. If you are looking for 'mercury', the system will create facets corresponding to the planet, the heavy metal, and people with that name. It may first present results only for the planet and tell you that other facets may be of interest.
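The query-time faceting step described above can be sketched as follows. This is a toy illustration, not Convera's actual ontology or code: the mini ontology, the keyword-overlap matching rule, and the sample hits are all invented for the example.

```python
from collections import defaultdict

# Hypothetical mini ontology: terms that assign a "mercury" hit to a facet.
ONTOLOGY = {
    "planet":      {"orbit", "solar", "astronomy"},
    "heavy metal": {"element", "toxic", "thermometer"},
    "person":      {"singer", "freddie", "biography"},
}

def facet_results(hits):
    """Group query hits into facets by matching ontology terms."""
    facets = defaultdict(list)
    for hit in hits:
        words = set(hit.lower().split())
        for facet, terms in ONTOLOGY.items():
            if words & terms:  # any shared term assigns the hit to the facet
                facets[facet].append(hit)
    return dict(facets)

hits = [
    "mercury is the smallest planet in the solar system",
    "mercury exposure from a broken thermometer",
    "freddie mercury biography",
]
facets = facet_results(hits)
# Each hit lands in its own facet; a UI could show one facet first
# and note that the others exist.
assert set(facets) == {"planet", "heavy metal", "person"}
```

In the real system this grouping would presumably be applied recursively, running the same kind of classification inside each facet to produce sub-facets.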
This system is operational and is very interesting. Its exploration is much deeper and hence much slower. Clearly, this approach is designed for people who want to analyze rather than just be informed.