Has Google Shown Their Hand?

Yesterday morning, I did the same thing I always do: check my company email, talk to the salesmen, scour SEOmoz for new info, and check on my site’s analytics for the previous day.

Man, did I get a shock…the average time on site had been reduced to less than one-third of what it had been.

I panicked, screamed, shaved my head, and burned a picture of Matt Cutts in effigy. Afterwards, I calmed down.

Now, of course Matt Cutts (probably) has nothing to do with Google Analytics. In fact, a fellow with the far less sexy name of Brett Crosby is in charge of blowing your website to pieces and reassembling the bits.

In a blog post yesterday, he casually mentioned that they were changing the way they calculate time spent on site:

We recently introduced a new way of calculating “Average Time on Site” that removed visitors who “bounce” from your website (people who hit one page of your site and then leave). This updated calculation attempted to give you a better idea of how long engaged visitors spend on your website. However, many of you prefer the original calculation: the total time on site for all visits divided by the total number of visits. So today we are changing it back.
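
To make the two calculations concrete, here is a minimal sketch with invented visit numbers (not from Google's post). The key detail is that Analytics records a bounce as zero seconds, since it cannot time a single-page visit, so dropping bounces only shrinks the denominator:

    // Invented visit data: a bounce is a one-page visit, which
    // Analytics records as 0 seconds (it can't time a lone pageview).
    interface Visit {
      seconds: number;
      bounced: boolean;
    }

    const visits: Visit[] = [
      { seconds: 0, bounced: true },
      { seconds: 0, bounced: true },
      { seconds: 360, bounced: false },
    ];

    const totalSeconds = visits.reduce((sum, v) => sum + v.seconds, 0);

    // The restored calculation: total time divided by ALL visits.
    const avgAllVisits = totalSeconds / visits.length; // 120 seconds

    // The short-lived calculation: bounces dropped from the denominator.
    const engaged = visits.filter((v) => !v.bounced);
    const avgEngaged = totalSeconds / engaged.length; // 360 seconds

With these made-up numbers, reverting from the engaged-only average to the all-visits average cuts the reported figure to exactly one-third, the same order of drop I saw in my own reports.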

You have to wonder if the previous calculation mimicked the way Big G actually looks at your website in terms of the algorithm. By excluding bounces from the calculation, they reduce the chance of competitor sabotage, and they can see how long the people who actually find your content relevant stick around.

Since Nielsen has announced that they no longer consider page views a relevant metric, you have to believe that Google has adopted the same view. And if page views are irrelevant, one could argue that bounce rates are as well, since a bounce is defined purely in terms of page views: a visit that registers exactly one.

In these days of AJAX-heavy sites, it is theoretically possible for a person to visit a single web page, have all the information they need fed to them dynamically, and then exit from that same page, thereby artificially inflating the site’s bounce rate.
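
One workaround, sketched below under the assumption that you are running Google’s classic urchin.js tracker, is to log a “virtual” pageview each time dynamic content loads, so an engaged single-page visit stops registering as a bounce. urchinTracker is the real call provided by that snippet; the handler name and the /virtual/ path scheme are my own invention for illustration:

    // urchinTracker is supplied by Google's urchin.js tracking script;
    // declared here so the sketch stands alone.
    declare function urchinTracker(page?: string): void;

    // Hypothetical hook: call this whenever AJAX swaps in new content,
    // logging a virtual pageview so the visit is no longer a one-page
    // "bounce" in the reports.
    function onDynamicContentLoaded(sectionName: string): void {
      urchinTracker('/virtual/' + sectionName);
    }

    // e.g. after an XMLHttpRequest fills in a "pricing" panel:
    onDynamicContentLoaded('pricing');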

It is my view that if bounce rates were ever a factor, they have been severely discounted relative to actual time on site. If they have not, then any future Google Dance that discounts them could bring big dividends to sites with large amounts of dynamic content.

I’m aware that some of what I am saying sounds like a paradox: how can you measure the relevance of a site whose content is largely unspiderable?

My guess is that Google flags sites showing both a high bounce rate and a high time on site for manual review. In all probability, they have developed a sub-algo to take care of this for them.
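
Expressed as code, that guess might look something like the sketch below; to be clear, every threshold and field name here is pure invention on my part:

    // Purely speculative sketch of such a sub-algo; the thresholds
    // and field names are invented for illustration.
    interface SiteStats {
      bounceRate: number;     // fraction of one-page visits, 0 to 1
      avgTimeOnSite: number;  // seconds, under the bounce-excluding calc
    }

    function flagForManualReview(stats: SiteStats): boolean {
      // A high bounce rate paired with long visits is the signature
      // of an AJAX-heavy site serving everything from one page.
      return stats.bounceRate > 0.8 && stats.avgTimeOnSite > 120;
    }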

Again, this is just a theory. So please be kind as you tear me apart limb-by-tender-limb.
