dclara | 12 years ago
Usually, the router search engine queries data from the second-tier websites to get high-quality results without needing the other websites' algorithms. But there is another problem: how do you know which websites to query for an arbitrary keyword? For example, when a user searches for "cookie" on your search engine, where do you send the query? How do you know whether they are looking for the food or the browser cookie?
wslh | 12 years ago
Regarding how you know where to route a query: it is an issue, but not a major one in this case. The article doesn't propose a two-tiered search for every website. A two-tiered search over just the top 100 sites is enough to challenge Google (the main point of the article), and making 100 searches and filtering the results in the 2nd tier isn't difficult.
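To make the fan-out idea concrete, here is a minimal sketch of sending one query to every site backend in parallel and merging the results in the second tier. The site names, stub backends, and scores are all hypothetical placeholders; a real router would call each site's own search API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-site search backends (stand-ins for real site search APIs).
# Each returns a list of (url, relevance_score) pairs for a query.
SITE_BACKENDS = {
    "recipes.example": lambda q: [("recipes.example/choc-chip", 0.9)] if "cookie" in q else [],
    "webdocs.example": lambda q: [("webdocs.example/http-cookies", 0.7)] if "cookie" in q else [],
}

def fan_out_search(query, backends, top_k=10):
    """Send the query to every backend in parallel, then merge and
    re-rank the combined results in the second tier."""
    with ThreadPoolExecutor(max_workers=len(backends)) as pool:
        result_lists = pool.map(lambda fn: fn(query), backends.values())
    merged = [hit for hits in result_lists for hit in hits]
    merged.sort(key=lambda hit: hit[1], reverse=True)  # rank by score
    return merged[:top_k]

print(fan_out_search("cookie", SITE_BACKENDS))
```

With ~100 backends the same pattern applies; the hard part in practice is normalizing scores across sites, which this sketch glosses over by assuming comparable relevance scores.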