Site Performance: New Labs Feature in Google’s Webmaster Tools

by Canonical SEO on December 3, 2009

Google has added a useful new feature to the Labs section of its Webmaster Tools (WMT) called Site Performance. If you haven’t already done so, I would suggest that you sign up for a Google Webmaster Tools account and go through the Google site verification process for your web site(s).

Once you have an account, log into WMT, click on your site on the WMT home page to go to that site’s dashboard, and expand +Labs in the left navigation to reveal all of the Labs utilities. You should see the new Site Performance feature listed as an option. Click on Site Performance in the left navigation and the new Site Performance page will appear as follows:

Site performance in Labs left navigation of Google WMT

Why would Google include site performance metrics in WMT?

I think it is quite obvious why they have added this. Google wants webmasters to improve load times because doing so is not only good for the user experience, but also good for Google’s crawler. The less time Googlebot spends waiting on a response to each page request when crawling the web, the more pages it can crawl in any given period of time.

On November 12th at the 2009 Pubcon in Las Vegas, Matt Cutts announced that Google was seriously considering adding page load times as a ranking factor. This may also be a way of preparing webmasters for a Caffeine update that considers load times. Personally, I think that page load time as a site performance metric is likely already part of the Caffeine update, and Cutts was simply preparing webmasters by saying they were considering it.

What does this site performance feature in Google’s WMT include?

The Site Performance page reveals performance statistics for your site based on the page load times Google has recorded for it. It includes several tidbits of information that can be used to improve the performance, and thus the user experience, of your site.

Feature 1: Site performance overview

The first section of the page is a performance overview of the site. It shows the average load time for pages on your site, with a note of when the data was last updated, and indicates the percentage of all sites on the web that are faster than yours. This section also includes a trend graph of average load times over the last several months:

Site performance overview in Labs of Google WMT

Obviously, I need to consider caching my WordPress pages. LOL
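
If you want to spot-check load times yourself between data refreshes, a quick timing loop gives a rough number. Below is a minimal Python sketch; the URLs are placeholders for your own pages, and it only times the raw HTML download (not images, CSS, or scripts), so it will read lower than Google’s figure.

import time
import urllib.request

def time_fetch(url):
    """Time a single page fetch, including the full body download."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        resp.read()  # pull down the whole response body
    return time.monotonic() - start

# Placeholder URLs -- substitute pages from your own site.
for url in ["http://example.com/", "http://example.com/archives/"]:
    print("%s: %.2fs" % (url, time_fetch(url)))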

Feature 2: Example pages and corresponding load times

The Labs Site Performance page includes a small sample of pages with their actual load times. You can sort the list by URL or by Load Time. This particular feature didn’t strike me as especially useful. The section appears as follows:

Site Performance example pages in Labs of Google WMT

NOTE: One thing of interest I noticed: I have my /wp-admin/ folder disallowed for all user agents in my robots.txt file, as you can see here:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/

Yet two of the three URLs in this section are located in my /wp-admin/ folder. So how can Google get load times for those pages if it is obeying my robots.txt?
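
For what it’s worth, Python’s standard-library robots.txt parser agrees that those rules should block Googlebot. A minimal sanity check (example.com stands in for my actual domain, and the paths are illustrative):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Illustrative paths: the first two should be blocked by the rules above.
for path in ["/wp-admin/options.php", "/wp-content/plugins/x.js", "/2009/12/"]:
    url = "http://example.com" + path
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)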

Feature 3: Page speed suggestions

This is probably the most useful section of the new Site Performance page in Google’s Webmaster Tools. This section includes an even larger sample of pages from the site, with suggestions for decreasing the load times of those pages.

These suggestions are based on Google’s new Page Speed tool (http://code.google.com/speed/page-speed/). Of course, all of the URLs on my site have the same suggestions since it is a blog and all pages are rendered using the same theme – Thesis from DIYThemes.com. The section appears as follows:

Site Performance page speed suggestions in Labs of Google WMT

Each URL can then be expanded to see specific examples of potential problems and suggestions of how to fix them.

NOTE: One thing I find interesting is that Google complains about the DNS lookup required to access its own Google Analytics JavaScript code. I guess I should drop it. ;)
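
Two of the suggestions Page Speed commonly makes, enabling gzip compression and setting browser-cache headers, can be spot-checked from the response headers alone. A rough Python sketch (example.com is a placeholder):

import urllib.request

# Ask for gzip explicitly; we only inspect the headers, so the
# (possibly compressed) body is never decoded.
req = urllib.request.Request("http://example.com/",
                             headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    headers = resp.headers
    print("Content-Encoding:", headers.get("Content-Encoding", "(none: compression off?)"))
    print("Cache-Control:   ", headers.get("Cache-Control", "(none)"))
    print("Expires:         ", headers.get("Expires", "(none: no caching hint)"))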

Feature 4: Page Speed plug-in download link

There is a link at the bottom of the Site Performance page where you can install the Google Page Speed add-on. Note that the add-on requires Firefox.

Summary of the site performance feature

This new Labs feature seems like it should be useful, especially to novice webmasters who might not know that load times are important to the user experience, and who might not know how to identify and correct the problems behind slow load times.

Though not tremendously useful to me at the moment, it does provide some insight into what Googlebot is encountering when crawling your site. The tool will likely be tweaked to provide more useful information over time.

Comments

dan November 8, 2010 at 10:50 am

We have found this very useful; however, the administration areas of clients’ websites are also being pulled into these statistics, even when using robots.txt to prohibit crawling. The administration areas of sites are often optimised for usability and not speed. Does anyone know how to prevent this?

Robert Geczi November 16, 2010 at 4:46 am

It’s great that you pointed this out (snippets), as many times I surf the net and see very odd and out-of-place snippets. Recently, I discovered the WordPress SEO plugin, which shows you what the snippet will look like on Google, right there in your post editor.

Canonical SEO November 19, 2010 at 12:20 pm

Honestly, I would not rely on a plugin to tell you what the snippet will look like in Google. It’s impossible for a plugin to tell you this because the snippet will change based on the search phrases. And unless the meta description is used, the plugin has no way of knowing exactly what will be displayed. The exact algorithm Google uses to “construct” a snippet from content on the page is not known.

If ALL words in the search phrase appear in the meta description, then the meta description will “typically” be shown as the snippet. But if the search phrase contains words that do NOT appear in the meta description, then neither you nor any plugin has any clue what will be displayed as the snippet.
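
In rough code terms, the rule of thumb is something like this (a sketch of the observed behavior only, not Google’s actual algorithm):

def meta_description_likely_shown(search_phrase, meta_description):
    # Heuristic: if every query word appears in the meta description,
    # Google typically shows the description as the snippet; otherwise
    # the snippet is anyone's guess. Punctuation handling is naive.
    desc_words = set(meta_description.lower().split())
    return all(word in desc_words for word in search_phrase.lower().split())

print(meta_description_likely_shown("site performance",
      "Site performance metrics in Google Webmaster Tools explained"))  # True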

Canonical SEO November 19, 2010 at 12:28 pm

I’ve never known Google to ignore a robots.txt disallow for any reason, so if you’re using a robots.txt Disallow: directive to block your admin screens, I’d have to guess the Disallow: is incorrect and not actually blocking the admin folder/page(s). Now if you’re using the robots meta element instead, then they WILL crawl the page… they have to do so in order to find the meta element in the page source so that they can react accordingly. So in that case, they WILL have statistics on how fast those admin pages load.

Make sure you have a robots.txt Disallow directive to block admin folders/pages (NOT a robots meta element) if you want them to NOT crawl your admin screens.
