Infographics and data visualisations seem to be very popular at the moment, and for a while I’ve been keeping an eye on visual.ly as they have some great examples of both. One of the good things about the visual.ly infographics is that there is some scope to customise them. For example, there is one about the ‘Life of a hashtag’ that you can customise, and several others around Facebook and Twitter that you can use.
I picked up on Twitter the other week that they had just brought out a Google Analytics infographic. That immediately got my interest, as we make a lot of use of GA. You just point it at your site through your Google Analytics account and then get a weekly email, ‘Your weekly insights’, created dynamically from your Google Analytics data.
It’s a very neat idea and quite a useful promotional tool for giving people a quick snapshot of what is going on. You get Pageviews over the past three weeks, the trends for New and Returning Visitors, and reports on Pages per visit and Time on site and how they have changed in the past week.
It’s also quite useful for social media traffic, showing how Facebook and Twitter traffic has changed over the past week. As these are channels you often want quick feedback on, it’s a nice visual way of showing what difference a particular activity might have made.
Obviously, as a free tool there’s a limit to the customisation you can do. It might be nice to have visits or unique visitors to measure change in use of the site, or your top referrals, or the pages that have been used most frequently. The time period is something that possibly makes it less useful for me, in that I’m more likely to want to compare against the previous month (or even this month last year). But no doubt visual.ly would build a custom version for you if you wanted something particular.
But as a freely available tool it’s a useful thing to have. The infographic is nicely presented and gives a visually appealing presentation of analytics data that can often be difficult to present to audiences who don’t necessarily understand the intricacies of web analytics.
The Google Analytics Visual.ly infographic is at https://create.visual.ly/graphic/google-analytics/
Having recently read Matthew Reidsma’s blog post on how the fold metaphor in web design doesn’t really exist, I was intrigued to see that the latest version of Google’s In-Page Analytics has introduced a ‘fold’ feature to show how much web page activity takes place below a certain point on the page. The ‘fold’ idea is connected to a design concept which essentially says that people only look at what is immediately in front of them on a web page and don’t scroll up and down the screen.
In the latest version of Google Analytics In-Page Analytics you get an orange line that slides up and down the page to show how much activity takes place below that line. Because of the way that Analytics handles traffic to external links by adding the traffic figures together, it isn’t all that accurate a tool, but I find it interesting that Google saw the need to introduce this sort of feature. Making the feature slide up and down suggests the thinking was that you could use it as a tool to plan where to put the most important content. But I’m not convinced it is all that useful, as the tool only moves up and down vertically, not from left to right. And critically for me, it doesn’t really represent how your users viewed your content. To make the tool work, I think I’d want to segment the users by people using a particular resolution and then look at the In-Page Analytics for that segment only. I need to do some investigation to see if segmenting people by screen resolution is feasible.
Thinking about screen resolutions made me check back in the Google Analytics data to see what screen resolutions people use to access one of our sites. While nearly 60% are using just four different screen resolutions from 1024 pixels wide upwards, there have been a total of 1,326 different screen resolutions in just three months. That seems to me an astonishing number, but it’s probably a reflection of two things. Firstly, we are getting more people using mobile devices, both phones and tablets. Secondly, I think it reflects the fact that our latest site has been designed to cope with a wide variety of screen resolutions (largely as a design feature to allow it to work on phones and tablets), so if users resize their screen to pretty much any resolution they want, the content should reflow reasonably well.
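To get a feel for how concentrated the resolution data actually is, the exported report can be tallied with a few lines of script. A minimal sketch, assuming hypothetical column headings and invented visit counts (these are not our real figures):

```python
import csv
import io

# Hypothetical extract of a GA "Screen Resolution" report exported as CSV;
# a real export would have many more rows and possibly different headings.
sample = """Screen Resolution,Visits
1024x768,5200
1280x800,3900
1366x768,2700
1280x1024,2100
320x480,600
768x1024,450
other,8050
"""

rows = list(csv.DictReader(io.StringIO(sample)))
total = sum(int(r["Visits"]) for r in rows)
top_four = sum(int(r["Visits"]) for r in rows[:4])
print(f"Top four resolutions account for {top_four / total:.0%} of visits")
```

With a real export you would read the downloaded file rather than an inline string; the same tally tells you how far down the long tail of resolutions your design needs to cope.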
As we’ve worked our way through the various stages of the library website project we’ve used a number of different tools and techniques. These have included tools to find out what works on the current site, what users think, to plan how the content should be arranged and to engage with users and staff. As we draw towards the launch of the site it seems like a good time to reflect on those tools, how we have used them and what they have told us.
We’ve been using Google Analytics www.google.com/analytics for some time and in many ways it provides the foundation for our work around the website. So it can be used to identify basic things such as which pages are being used in your current site (and which pages aren’t) and the paths that people use through your site. It can tell you where people are coming from to visit your site – so we know that a large number of our users come from our institutional VLE, which has informed our decisions about some of the terminology we use in the new site – we’re using Library resources to describe our ‘stuff’ to be consistent with the VLE. Analytics gives us a vast amount of data and interpreting that data is key to any redesign project.
Card sort exercises
It’s a pretty well established technique to use card sorting exercises to help with developing the information architecture of the site e.g. http://en.wikipedia.org/wiki/Card_sorting As an early part of the design work we carried out this type of exercise with a group of library staff to try to get an idea of a sensible information architecture for the new site. We ended up using it very much as a starting point rather than a finished article as we were keen to test it with users to validate it. On reflection it does seem to be hard to get people to visualise how the website information architecture will translate to a navigation system in a real website.
Flipcharts are an almost obligatory component of workshops, often found in combination with post-it notes and card-sort exercises. Even in these digital times they still seem an inevitable element when a group of people get together to plan something.
Once we had come up with a prototype information architecture we wanted to test it with users to see if it made sense to them. There are a few tools out there that allow you to set up quick tests for users to complete. Essentially they ask users to navigate through a website structure to find a particular page, testing whether your information architecture makes sense to them. We went for a tool called Plainframe http://plainframe.com It costs a small amount of money but had the advantage for us that the pricing was based on the number of tests you ran rather than being time-limited. We were able to offer the test to a group of users and it was certainly useful to see how they reacted to the site; it has led to some tweaks in the IA.
We decided fairly early on that we wanted to find some different ways to engage with users. One of the techniques we used was to run a quick poll on the library website to ask students about features they’d want to see, particularly around ‘induction-type’ content. Positioned prominently on the website homepage, it got a really good reaction, with several hundred responses that greatly helped with defining the content for this section.
The key document for the project was a specification. This set out the audience for the site, the page layouts, the information architecture and navigation, and was created as an output from the workshops, surveys and exercises. The intention was that it would be the focus for discussions about what went into the site and end up as a tool we would use to get agreement over how the site was to be created. It probably didn’t work quite as well as we’d expected: we found it difficult to get people to engage with the document and visualise what it would look like when turned into a real website, and we ended up having to make changes to things like the IA once the site structure started to take shape.
In an ideal world, with a project of this nature you want a functional specification that is created and agreed before any development work starts. In reality it is difficult to do that when you move to a new platform like Drupal, as you don’t always know at the start exactly what is and isn’t possible. Users often need to see some form of prototype to be able to decide what they want, and a paper prototype (whilst useful) isn’t always enough.
One of the starting points for us was the results (and particularly the detail) from the Library Survey that was carried out in 2010. Although the results were good it did identify some particular problems with search and accessing library resources.
We’ve also been conscious throughout the project that there is a big issue around terminology (something that libraries seem to have a particular problem with). Users seem to struggle with library terminology, so we ran further surveys using a tool called SurveyMonkey (www.surveymonkey.com) to find out users’ preferences on the information architecture and terminology of the site. We will also use SurveyMonkey to capture some structured feedback from users once the site goes into the beta and launch stages. We find it really useful for running surveys and use it extensively to get feedback from users: it lets you design a series of questions and then collect, download and analyse the results easily.
Wireframes are one of the main techniques used to plan out what your website is going to look like when it is built. We’ve made extensive use of them for the home page and sub-pages within the new site. I think they are essential for visualising what the site will look like, but I am aware that some people find that hard to do from a wireframe and want to see something much more like a prototype website.
W is for … Workshops
We found that we used workshops extensively in the project. In the initial stages it was to help with user requirements and information architecture. We’ve made a lot of use of them to help with the work around arranging the subject categories and subject resources. They can be quite time-consuming to setup, run and particularly to analyse the results, but they have the distinct advantage of being a great way of getting people engaged with the project and creating new ideas.
A comment from someone in a meeting last week, that mobile traffic was now 10% of all traffic on one of the websites, sent me off to Google Analytics to check the latest position with our main website. We’ve certainly seen big year-on-year growth in mobile use: 2010 saw eight times the number of visits from mobile devices than 2009, and this year it looks like doubling again. But mobile use is still around 2% of visits rather than 10%.
Although we do have a mobile website version, it hasn’t been promoted heavily. And even though the site automatically detects mobile devices and directs users to a mobile interface, it considers iPhones and iPads to be sufficiently internet-capable to be directed to the standard website interface rather than a cut-down version.
Digging a bit deeper into the analytics shows that iPad usage is now 50% of what Google Analytics classes as ‘mobile’ use (up from 38% last year). Based on the first two months of this year, iPad usage looks to be up three times, while non-iPad mobile use looks to be increasing by about 20%. Whilst we are working on a new mobile version of the Drupal website, we aren’t planning an iPad app version.
What intrigues me is whether iPads really are mobile devices as far as websites are concerned. The Safari browser is perfectly functional (Flash inabilities notwithstanding), and although some sites direct you to mobile versions (or, like Google Docs, give you the option), it’s a purpose-built internet browsing machine. This year there are dozens of tablet-type devices being launched with a variety of operating systems. iPads already seem to be coming up as the ‘mobile’ device most likely to be using our website, although internal use plays a part in that. So it implies for me that we need to be a bit more selective in how we define mobile use (and maybe so should Google Analytics) and split the mobile category into tablet use of the full website and mobile use of the mobile version.
In the middle of last year we changed the terminology of one of the main navigation sections on the website. The Help and Support section contains a large amount of the content of the website. There were several reasons behind the change: the tab was getting quite low levels of usage, and informal feedback seemed to show that users were a little confused about the purpose of the section. So we changed it to Help, which benchmarking against other similar sites suggested was the most common term in use. Ideally we would have A/B tested the two versions, but we tend to stay away from running different versions as they can cause their own support headaches. So our question is: what difference has changing the terminology made to usage of this section?
We settled on using four sets of data from Google Analytics and looked at the last six months of 2010 compared with the equivalent period of 2009. The four pieces of data we decided to use were:
- the percentage of clicks on the Help and Support/Help tab on the home page – using the beta In-Page Analytics tool against the home page;
- the percentage of total site page views represented by the page views of the Help/Help and Support home page – by comparing the site page views with a filtered version of Top Content to look at the Help home page;
- the page views of the whole Help/Help and Support section as a percentage of the whole site page views – using a similar filter that includes all the help content; and finally
- the percentage of users of the help home page that come from the home page – using the Navigation Summary in the Content section.
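The third measure, for instance, is just the Help section’s pageviews divided by the whole site’s pageviews for each period. A sketch with invented numbers (the real counts aren’t given in this post):

```python
# Hypothetical pageview counts for the Help section and the whole site,
# Jul-Dec 2009 vs Jul-Dec 2010; the figures are made up for illustration.
site_views = {"2009": 410_000, "2010": 452_000}
help_views = {"2009": 36_900, "2010": 33_900}

for year in ("2009", "2010"):
    share = help_views[year] / site_views[year]
    print(f"{year}: Help section is {share:.1%} of site pageviews")
```

Comparing the two percentages (rather than the raw pageview counts) is what lets you see a genuine drop even when overall site traffic has grown between the two periods.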
Looking at those four pieces of data gave us some results that showed that across all those measures use was down by small amounts. The percentage change varied a bit between each of the measures but there was a distinct reduction in the users accessing that section.
Oddly there doesn’t seem to be any evidence that people are finding it harder to find the help they need – we don’t seem to have more telephone enquiries or people saying they can’t find the help they need on the website. So it’s some more evidence to fit into our redevelopment of the website.
One of the problems with the data is that we can’t rule out a change in behaviour that would have happened anyway and has nothing to do with the website terminology change. On reflection we should have A/B tested the change, so that the data for both versions was collected at the same time. And we need to think some more about how to apply what analytics is telling us.
Starting with Google Analytics mapping
We had a request a couple of days ago to pull together a map that showed which countries visitors to the library website were coming from. On the face of it – that’s reasonably straightforward as you can use the Map Overlay feature in Google Analytics to show an Intensity Map of where visitors come from.
But when we looked at it in more detail we realised there isn’t all that much flexibility in what the map looks like. Although it works fine online, if you are going to use the image as a JPEG you lose the features you get from the mouseovers. In our case there’s a lot of UK usage, but usage outside the UK soon reduces to quite small numbers, which only show up as very light shading on the map. There is an option to show cities, which appear as dots, but there’s limited flexibility in customising the map: you don’t seem to be able to have markers for the countries, for example. So we started to look for alternative ways of mapping the data to see if we could get a better format.
What else can you do?
Fortunately it is quite easy to extract the data from Google Analytics in a way that can be used by other tools. The first step is to export the data in CSV format. Once the data is in CSV format you can edit the file, so we took out some of the data that Google Analytics includes – such as Average Time on Site and Bounce Rate – that we didn’t need for this map. Then we started looking at a few other tools for mapping the data – Google Fusion Tables and Google Maps to start with – to see what they would look like and whether they allow you to do anything you can’t do in Google Analytics.
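The column-stripping step can be done by hand in a spreadsheet, but it is equally easy to script. A small sketch, assuming hypothetical column headings of the kind a GA geographic export contains:

```python
import csv
import io

# Hypothetical GA geographic export; real headings and figures will differ.
exported = """Country,Visits,Avg. Time on Site,Bounce Rate
United Kingdom,12450,00:03:12,42.10%
United States,980,00:01:05,61.30%
France,310,00:01:44,55.00%
"""

keep = ["Country", "Visits"]  # the only columns the map actually needs
reader = csv.DictReader(io.StringIO(exported))
out = io.StringIO()
# extrasaction="ignore" silently drops the columns we aren't keeping
writer = csv.DictWriter(out, fieldnames=keep, extrasaction="ignore")
writer.writeheader()
writer.writerows(reader)
print(out.getvalue())
```

In practice you would read the downloaded export file and write the slimmed-down CSV back to disk, ready to upload to whichever mapping tool you are trying.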
Google Fusion tables http://www.google.com/fusiontables/Home
This tool allows you to import data in a variety of file formats – .CSV, .XLS, .XLSX, .ODS and .KML – as well as Google spreadsheets. .CSV and .KML allow file sizes of up to 100MB, the others only 1MB. Using the tool is quite straightforward: you’ll need a Google account, and then from the Google Fusion Tables homepage you click the New table – Import table option, browse to find the file and upload it. We uploaded the CSV file from Google Analytics. Once your data is imported you can use the Visualize option and choose from a couple of map types (Map or Intensity Map). On the Map option it’s possible to change the marker type to a more prominent marker or change the colour used. The Intensity Map is essentially the same as the default you get in Google Analytics. You can also zoom in and out, but if you zoom out you get the world repeated several times. A useful feature is that you can export your data in KML format, which is used by Google mapping tools such as Google Maps.
Google Maps http://maps.google.co.uk/
Taking the KML file created through Google Fusion Tables, you can upload it into Google Maps. Again you need a Google account, and there are very similar options to the other Google tools, but with slight variations. For example, when you zoom out you get the continents repeated but the markers only appear once. To create your map, click on the My Maps option and then upload the KML file. The maps are optimised for viewing online, and if you want to produce a map that can be output as a JPEG file they aren’t ideal, as you can’t really get a full-screen-sized map with all the countries displayed.
Producing an image for use in a document
Taking a screenshot gives you something like the two examples shown (on the left from Google Maps, on the right from Google Fusion Tables). In Google Fusion Tables you can change the marker style, but you don’t seem to be able to in Google Maps. Neither is ideal unless it is being used online. Although they are good for an overview of how much of the globe is covered, you can’t easily see all the countries you have had visitors from, and I haven’t come across a full-screen version that would make it easy to take a screenshot. In an ideal world you would be able to set the size of the map, define whether you had one copy of the globe or several, choose whether to display the country names, and configure exactly how the markers or colouring are shown. It would also be good if there was consistency between the various Google tools, so you could do the same things in all of them.
Although the Google tools don’t do everything you’d want they are pretty easy to use to provide a quick map of visitor locations. It only takes a few minutes to extract the data from Google Analytics and load it into Google Fusion tables and then present it in a map. There are other tools around, such as Zeemaps, which may bear investigation to see if they have a more suitable output.
I’m always interested to see new ways of visualizing data, but I hadn’t come across this tool for visualizing Google Analytics data: Juicekit visualizations. The tool links in to your Google Analytics account (you have to agree to grant it access) and then offers a couple of different visualizations of the data: Referrer Flow and Keyword Tree.
Referrer Flow is essentially a treemap view of your website referrals. There are a few options, and you can click on one of the boxes to filter by a particular page.
The other option, Keyword Tree, looks at the keywords used by people coming to your site. Again there are options to filter by different words, and you can continue to drill down with combinations of words.
My immediate reaction, as with so many visualizations, is that it looks nice – but does it help you understand what is happening on the site, or make it easy to show other people what is happening? The treemap approach is often useful to help people see what gets most use. What is particularly nice is being able to do it without having to download CSV files and import them into Many Eyes or something similar.
One area it would be good to see visualized is users’ progression through a site – some form of flow or tree diagram showing the routes users take. I’m sure there are many other ideas for further development.
I’ll have to spend some more time working through the options and understanding a bit more about the message it is showing. The key things we want to show (which follows on a bit from a previous post about skills for Digital Librarians) are who is using our resources/website and where they come from, so we can assess the value of links in specific places. Anything that helps us understand that is a useful tool.
With the release of an API for Google Analytics (http://analytics.blogspot.com/2009/08/analytics-data-in-excel-through-our-api.html) we are now starting to see the appearance of tools that make use of it. A couple of these embed plug-ins into Excel, giving you the ability to run your GA queries directly from Excel and cutting out the whole .CSV export and import process.
A couple of Excel plug-ins are available. Excellent Analytics http://excellentanalytics.com/ works with Excel 2007 and Tatvic http://gaexcelplugin.tatvic.com/ works with both Excel 2007 and Excel 2003.
I’ve been playing around with Tatvic’s version in recent days. Once you have registered, downloaded and installed the plug-in you just have to enable it from the Tools | Add-Ins menu option – it shows up as Gaclient. The toolbar then shows up as Tatvic Google Analytics Excel Plugin in the View | Toolbars menu.
As a first step click on Login and enter your Google Analytics username and password to connect to your GA account.
Once you are logged in you can start to pull your data from GA into your spreadsheet. Click Add a new data block to start the query-building dialogue. The first screen lets you choose which website you want to analyse from your GA profiles, select the time period for the query, and decide whether you want the data broken down by dates.
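Under the hood, a data block like this amounts to a query against the Google Analytics Data Export API, with the profile, dimensions, metrics and date range passed as URL parameters. A rough sketch of the kind of request URL being built – the profile id is made up, and the exact parameters the plug-in sends are an assumption:

```python
from urllib.parse import urlencode

# Sketch of a GA Data Export API query of the kind the Excel plug-ins
# build for you; "ga:12345678" is a hypothetical profile id.
params = {
    "ids": "ga:12345678",
    "dimensions": "ga:date",
    "metrics": "ga:visits,ga:pageviews",
    "start-date": "2011-01-01",
    "end-date": "2011-01-31",
}
url = "https://www.google.com/analytics/feeds/data?" + urlencode(params)
print(url)
```

The response comes back as a data feed that the plug-in unpacks into spreadsheet rows, which is what saves you the manual .CSV round trip.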
One of the website tools that can help you make sense of the use being made of your website is Google Analytics www.google.com/analytics/. This is a free website metrics application that allows you to analyse traffic to your website using tracking code rather than website logfiles.
Google Analytics will provide you with the usual set of visits, page views, browser type and location data that you can get from most of the website statistics and analytics products. I’m going to concentrate on one of the tools in Google Analytics that can be used to find out some information about what users are doing on your site.
This useful tool allows you to look at where users are clicking on your web page. The ‘Click Map’ shows which elements of the page are being used, and careful use of this information can help you redesign text, images or the location of content to optimise your site. However, there is one ‘feature’ you need to be aware of: links that go to the same place have their clicks added together (and note that GA also treats any links that go to an external site as going to one place). So if you have more than one link going to the same URL, the results for each will show the same figure. The solution for internal links is to set up variations of your links so you can track them individually.
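One common way to create those variations is to append a distinguishing query parameter to each copy of the link, since Analytics treats URLs with different query strings as different pages. A sketch – the ‘from’ parameter name here is arbitrary, not a GA convention:

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_link(url: str, position: str) -> str:
    """Append a made-up 'from' parameter so that otherwise identical
    links can be told apart in the click report."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query.append(("from", position))
    return urlunparse(parts._replace(query=urlencode(query)))

# The same help page linked from two places on one page:
print(tag_link("/help/contact", "header"))  # /help/contact?from=header
print(tag_link("/help/contact", "footer"))  # /help/contact?from=footer
```

The trade-off is that the help page’s own pageviews are now split across several URLs in the content reports, so you may want a filter that recombines them when you are counting total use of the page.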
One of the things to bear in mind is that internal users, external users and people updating the website will exhibit different behaviour, which may skew your results. If you can identify the ranges of IP addresses used (by internal users, for example), Google Analytics lets you segment your users into different categories by IP address. In that way you can identify what different types of users are doing and get a better idea of what your actual users are clicking on.
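The segmentation itself is set up in the GA interface with a regular expression matched against the visitor IP address. As an illustration of the kind of pattern involved (the 10.0.x.x range here is a stand-in for whatever your internal range actually is):

```python
import re

# Hypothetical internal range 10.0.*.*; this is the sort of regular
# expression you would paste into a GA filter or advanced segment.
INTERNAL_IP = re.compile(r"^10\.0\.\d{1,3}\.\d{1,3}$")

visits = ["10.0.4.21", "86.20.11.9", "10.0.200.7"]
internal = [ip for ip in visits if INTERNAL_IP.match(ip)]
print(internal)  # ['10.0.4.21', '10.0.200.7']
```

Testing the pattern against a handful of known addresses like this before pasting it into a filter is worthwhile, because an over-broad regex will quietly misclassify external visitors as internal ones.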