I think you guys must have had a really good experience searching on Google. No wonder, they have a very good algorithm, but even that is susceptible to spam, so you can't rely on it 100%. That was the question that made me think Google's algorithm is more complex than it seems. And last night I got that light from my thinking bulb. When Google first introduced Google Sitemaps and Google Analytics, no one would have suspected them to have any consequences for search. But now I realize that assumption is perhaps wrong. So here's my hypothesis on how Google makes it happen.
Well, every webmaster now goes to Google Sitemaps by default to get his site into the index. There he gets information like which of his pages are indexed, when each page was indexed, which searches picked up his pages, and at what rank. It also records which searches brought visitors to his site. Great, now you know who visited your site for which result. But then Google knows too.

From here Google Analytics takes over. It records how long the user stayed before moving on, and that too determines how useful your page was. Very simple, but effective most of the time. Google Analytics would also let the search engine know how long a typical visitor stays on the site. So you get a higher ranking the more visitors you have and the longer they spend on your site. A spam site would never get visitors to stay more than 5 seconds, and if most of your visitors leave within a really short period, your ranking takes a good hit.

A very simple solution indeed, all while giving webmasters the feeling of being served well. Ironically, it does that too well. Google Sitemaps and Google Analytics are both such good tools that they leave the competition miles behind. Great going, Google.
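Just to make the hypothesis concrete, here is a toy sketch of how such a dwell-time signal could work. This is purely my own illustration, not Google's actual algorithm; every name and number here (`adjusted_rank`, the 5-second threshold, the penalty and boost factors) is made up for the example.

```python
import math

def adjusted_rank(base_score, visits, avg_dwell_seconds,
                  spam_threshold=5.0, penalty=0.5, boost_cap=2.0):
    """Hypothetical ranking tweak: nudge a page's base score up
    for many long visits, and down when visitors bounce instantly.
    All thresholds and factors are illustrative, not Google's."""
    if avg_dwell_seconds < spam_threshold:
        # Most visitors leave within ~5 seconds: likely spam signal.
        return base_score * penalty
    # Boost grows slowly (logarithmically) with visits and dwell
    # time, capped so no page can inflate its score without limit.
    boost = 1.0 + min(boost_cap - 1.0,
                      0.1 * math.log1p(visits) * math.log1p(avg_dwell_seconds))
    return base_score * boost
```

For example, a page with a base score of 10 whose visitors bounce in 2 seconds would drop to 5, while the same page with 100 visitors averaging a minute each would hit the capped maximum of 20.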