An intriguing thread I found on various forums discusses whether clicks on organic search results influence rankings. I used to say "no" overwhelmingly, mainly because this method is easily manipulated. In fact, it is even easier to do than stuffing keywords into meta tags, and less obvious too: a webmaster can ask a friend elsewhere, or an MSN or Yahoo! Messenger contact, to search for specific queries and click the prescribed URL if it appears in the results.
In Google AdWords’ Quality Score, one of the elements that influences pay-per-click rankings is click-through rate (CTR), defined as the ratio of the number of times an ad is clicked to the total number of impressions. By identifying which ad copies attract visitors through CTR measurements, search engines can conclude that those ads are more relevant to certain search phrases even if their ad position is low. Yahoo! Search Marketing is doing the same thing, while abandoning the predatory bidding procedure in which only the top bidders were guaranteed top ad placements. To me, this is analogous to link citations, which are perceived to influence search rankings and are a major factor in Google’s search algorithm. And even if the Quality Score formula changes, the pattern remains more or less the same.
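To make the idea concrete, here is a minimal sketch of the CTR formula above and of how a quality-aware auction might combine it with a bid. The record fields and the bid × CTR weighting are illustrative assumptions, not Google's actual Quality Score formula.

```python
def click_through_rate(clicks, impressions):
    """CTR = clicks / impressions, as defined above."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# Hypothetical ads: A bids less but earns far more clicks per impression.
ads = [
    {"ad": "A", "clicks": 50, "impressions": 1000, "bid": 2.00},
    {"ad": "B", "clicks": 10, "impressions": 1000, "bid": 3.50},
]

# A quality-aware auction might rank by bid * CTR rather than bid alone,
# so the more relevant ad (A) can outrank the higher bidder (B).
ranked = sorted(
    ads,
    key=lambda a: a["bid"] * click_through_rate(a["clicks"], a["impressions"]),
    reverse=True,
)
print([a["ad"] for a in ranked])  # ['A', 'B']
```

Under a pure top-bidder model, B would win; once CTR enters the score (A: 2.00 × 0.05 = 0.10 vs. B: 3.50 × 0.01 = 0.035), relevance beats the raw bid.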
I believe that search algorithms guided by the behavior of unbiased users will greatly improve results. After all, it’s the users who will be looking at the results when searching for something, so their experience must also be taken into account when improving future results for the same search queries.
While it can be hypothesized that click-through rate is a ranking factor, other signals can be considered as well. Bounce rate can help validate a website’s claim that its content reflects what visitors see in the search results: the page title and description. A high bounce rate suggests that the site in question did not live up to the expectation set in the organic results. Since Google Analytics is a commonly used free web analytics tool, this type of metric can be determined fairly easily.
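A quick sketch of the bounce-rate metric described above; the session-as-list-of-pageviews format is an assumption for illustration, not the Google Analytics data model.

```python
def bounce_rate(sessions):
    """A bounce is a session that viewed exactly one page."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return bounces / len(sessions)

# Hypothetical sessions: each inner list is one visitor's pageview path.
sessions = [
    ["/landing"],                          # bounce: left after one page
    ["/landing", "/pricing", "/signup"],
    ["/landing"],                          # bounce
    ["/blog", "/landing"],
]
print(f"bounce rate: {bounce_rate(sessions):.0%}")  # bounce rate: 50%
```

A site whose search snippet oversells its content would show up here as a high ratio of single-page sessions.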
Other metrics found in Google Analytics that can be used to gauge page importance:
- Page Views – How many pages did a visitor end up looking at on the website? Does a high number of page views indicate better interaction between the user and the site? This measurement is biased toward sites that have multiple pages, or sites that are properly tagged (assuming they run in Flash and their events are properly marked for tracking).
- Time Spent on Site – Does spending more time on the site mean it is relevant and should be given more merit by ranking it higher? Again, this is a metric that could be badly abused. But I would guess that, just as with the percentage of click fraud on pay-per-click networks, these numbers are small, and data from legitimate (read: unadulterated) visits could still be reliable. Besides, search engines have developed click fraud detection to filter out questionable data.
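The kind of filtering mentioned above could look something like this minimal sketch: collapse repeated clicks from the same source on the same URL within a short window. The record shape and the 30-second threshold are assumptions for illustration, not any search engine's actual fraud detection.

```python
from datetime import datetime, timedelta

def filter_suspicious_clicks(clicks, window=timedelta(seconds=30)):
    """Keep at most one click per (ip, url) pair within the time window."""
    last_seen = {}
    kept = []
    for ts, ip, url in sorted(clicks):
        key = (ip, url)
        if key in last_seen and ts - last_seen[key] < window:
            continue  # likely the same user hammering the same result
        last_seen[key] = ts
        kept.append((ts, ip, url))
    return kept

t0 = datetime(2008, 1, 1, 12, 0, 0)
clicks = [
    (t0, "1.2.3.4", "example.com"),
    (t0 + timedelta(seconds=5), "1.2.3.4", "example.com"),   # dropped: repeat
    (t0 + timedelta(seconds=40), "1.2.3.4", "example.com"),  # kept: outside window
    (t0 + timedelta(seconds=5), "5.6.7.8", "example.com"),   # kept: new source
]
print(len(filter_suspicious_clicks(clicks)))  # 3
```

Real systems would weigh many more signals (user history, geography, behavior patterns), but even this crude dedup illustrates how obvious manipulation can be stripped before clicks feed a ranking signal.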
This does not necessarily mean sites would need to install Google Analytics tracking to be evaluated. Google may have its own measurement methods if it ever becomes interested in pursuing this. Whether or not those methods cross the boundary of intruding on individual privacy (search history) is the question. But personally, I don’t care.
Users are roughly six times more likely to click on organic results than on paid ads, which appear on the right panel of search results and, for some queries, above the organic results. This is due to 1) the prominence of organic results; 2) the large area this section covers; and 3) the fact that it typically sits in the most visible part of the page. Improving the quality of organic search results will considerably widen the gap between organic and paid search click-throughs, and the effect could either drive down cost per click or drive away advertisers through poor PPC campaign performance.
To conclude, I feel that while there is some validity in considering user clicks as an influencer of future search results, search engine algorithms must be able to effectively distinguish and filter out statistical noise in order to implement this hypothesis.