
How Google Might Rank Webpages In Its New Search Engine

In a previous article I briefly discussed how search engines rank webpages using DCG. Now it is time to look at the new metric that Google will be using in its new search engine to rank webpages. As I mentioned before, it is only one of several factors that will affect your ranking, but it is worth understanding well, because it will play a vital role in how your pages are ranked. Though it is one of several factors, it is one of the main ones you should take into account while developing a website or blog.

As we saw in the previous article, search engines have so far been using a metric called DCG (Discounted Cumulative Gain) to evaluate webpage rankings, and a serious study of the subject showed that one of its core assumptions is wrong. If you have not gone through that article yet, make sure you do before you proceed, as it will help you understand the whole idea better.

Coming back to the topic: the DCG metric was proven to rest on a faulty assumption. According to DCG, the usefulness of the document at rank “i” is independent of the usefulness of the documents ranked above it. Normally that sounds true, but it need not be, and it was in fact shown to be wrong.
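To see the independence assumption concretely, here is a minimal sketch of DCG using the common gain formula (2^rel − 1) / log2(rank + 1), with illustrative 0–4 relevance grades (these numbers are my own, not Google's):

```python
import math

def dcg(relevances):
    # Discounted Cumulative Gain: each document contributes a gain based
    # only on its own relevance grade and its rank position -- the
    # documents ranked above it play no role in its contribution.
    return sum((2 ** rel - 1) / math.log2(rank + 2)
               for rank, rel in enumerate(relevances))
```

Note how swapping the relevance of the second document never changes what the first document adds to the score; that per-position independence is exactly the assumption the new metric drops.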

Let me give an example similar to the one pointed out in the patent paper. Assume you are searching Google for a particular piece of information, say the query “how to rank high in search engines”. Google returns a set of webpages; now consider two cases for how it could present the results.

Case 1: Google gives you one page with the most accurate information on the topic, and the remaining 19 pages are not relevant.

Case 2: Google gives you 20 pages that are all reasonably relevant.

So which one do you prefer? Normally we would go for Case 2, but after performing a series of experiments, search engines actually prefer Case 1. The reason they point out is this: “the likelihood that a user examines the document at rank ‘i’ depends on how satisfied the user was with the previously ranked documents.” This is exactly the metric the new Google search engine will be built on: ERR (Expected Reciprocal Rank).
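Here is a minimal sketch of ERR as it is commonly defined, again with illustrative 0–4 relevance grades of my own choosing. Unlike DCG, a document's contribution is weighted by the probability that the user got that far without already being satisfied:

```python
import math

def err(grades, g_max=4):
    # Expected Reciprocal Rank: the chance a user even examines rank r
    # depends on how satisfying every earlier result was.
    p_reached = 1.0   # probability the user reaches this rank unsatisfied
    score = 0.0
    for rank, g in enumerate(grades, start=1):
        r = (2 ** g - 1) / 2 ** g_max   # chance this result satisfies the user
        score += p_reached * r / rank
        p_reached *= (1 - r)            # user continues only if unsatisfied
    return score

# Case 1: one near-perfect page, then 19 irrelevant ones.
case1 = err([4] + [0] * 19)
# Case 2: twenty uniformly decent pages.
case2 = err([3] * 20)
```

Under these (hypothetical) grades, Case 1 scores higher than Case 2, which matches the experimental preference described above: one page that fully satisfies the user beats twenty that each satisfy partially.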

Getting back to our example: when I did a search for “how to rank high in search engines”, these were the top three results I got in Google.

In the first place was a webpage from the “high-search-ranking” site, followed by one from “webpronews”, and then a third result.

Now suppose a user of the new search engine makes a similar query and clicks through to the “high-search-ranking” page to check for the information, but does not find what he needs. He goes back, opens the next result, spends more time there, and does not bounce back to the results page. Google will then assume that the result in position two provides more relevant information for this particular query, and it will alter its ranking accordingly. In that case, the “webpronews” page might well rank number one, pushing the “high-search-ranking” page down to second or third position.

But you should know that search engines rank webpages based on many factors, so this will play a critical role only to some extent. When Matt Cutts mentioned that you will not see much of a difference when the new search engine rolls out next year, he meant only that: you won't see much difference initially. As days progress, though, if you have junk pages ranking high for a competitive search term, you should know that their rankings might go down. So make sure you give users more valuable content; when you do, you may chase out your competition and rank higher. This is really good from the user's point of view, as users of Google will start to find more relevant webpages ranking higher, and more and more people will move to Google to look for information. But it may be bad for webmasters who have less valuable pages ranking high for now, because as the new search engine comes into play, their pages might well go down.