Jan 31 2012

Web Analytics and a Website Redesign


I can’t believe my last post was almost a year ago. It’s probably time I started posting again, and there’s no time like now, with a completely shameless plug for an upcoming HigherEdExperts webinar I’m honored to be a part of – Redesign Bootcamp 2012, running February 7–9.

No matter what kind of redesign you might be doing (full scale or a smaller one), clickstream web analytics can both benchmark where you are (how do you know you’re successful if you don’t know where you started?) and help you come up with potential tasks for usability testing.

This past July we redesigned the Penn State World Campus website. The redesign was driven by data – using clickstream analytics, onsite surveys, as well as extensive usability testing. Our old website was last redesigned in early 2005 so we thought it was time, but how did we really know we needed to redesign the site? What did we want to accomplish and how did we use web analytics and usability testing to keep internal opinions out of the process?

First things first – we documented the goals of the redesign as well as what could be measured to know whether those goals were met – the KPIs. In coming up with the KPIs we *didn’t* do two things: 1) we didn’t just think about metrics we could get in Google Analytics, and 2) we didn’t limit ourselves to what we could measure at the time. The second one is important because it reminds you to implement measurement for those items on the new website.

Here is a really quick example of what we learned from our web analytics and how we applied it to usability testing.

We are obsessed with our internal site search keywords. People are telling us what they want in their own words. One of our goals was to decrease the usage of our internal site search and our knowledgebase for what we called “easy” topics. Granted, some people will always go right for site search instead of browsing via navigation, and that’s fine, but we wanted to see how many “easy” topics were searched for a lot within our internal search and our knowledgebase.

“Tuition” was the top keyword searched for the prospect audience (we filtered out current students). Although the popularity of the topic didn’t surprise us, the fact that people had to search for it to find it did. After all, it was listed clearly on the homepage.

Tuition link on the old website homepage.

Tuition is one of the most popular areas of our website. We assumed that it was easy to find, since it is linked not only from our homepage but also from all of our program pages. But seeing it as the number one searched-for item within our internal site search made us want to really test this out. Is this just because some of the population uses search and “tuition” is a popular topic? Could be. Is it because tuition is really hard to find (even though *we* think it’s easy)? That could be, too. We decided to find out.

We added a task for tuition to the usability test. We wanted to know how easy it was to find tuition *and* how people searched for it. Seems kind of obvious now, but we were a bit surprised that even when the task specified a degree program (i.e., “You’re interested in the Bachelor of Science in Criminal Justice; find the tuition rate for this degree program”), most users tended to search generally instead of drilling down into the program first. They were looking for tuition, then the degree, not the other way around. Again, might seem kind of obvious now, but because a lot of important information “depends” (from tuition, to the admissions process, etc.), information on the old site was definitely program-specific without a lot of “general” information.

As a result of that test, we put tuition in both places (generally and within each program area). Getting around the “it depends” issue was tricky, so we decided to have that information live in one place but be fed to both – to the specific degree page as well as to a general tuition page.

Main navigation on the new website.

“Tuition” is searched for much less now (it’s down to the 13th most popular term for prospects). But now we’re finding some other, unexpected topics at the top of internal site search, and you can bet we’ll be testing those out in the near future.

So there is a simple example of how our clickstream analytics told us what was happening and then drove us to dig deeper with usability testing. We’ll be talking about this and much more during our session of the series, including *how* to set up KPIs and segments for those KPIs in Google Analytics, how to report on your KPIs after the redesign, using onsite surveys to answer the *why*, specific examples of how usability testing helped us settle internal debates on layout and design, and much more.

It should be an awesome series, too. The first webinar in the series will be from Stewart Foss, founder of EduStyle, talking about the top trends in higher ed redesigned websites. Next up is Mike Richwalsky, Director of Marketing Services at John Carroll University, who will talk about planning for and managing a successful redesign project. Dave Housley and I will round out the series talking about using data in the redesign process – before, during, and after.

Hope to see you at Web Redesign Boot Camp!


Feb 20 2011

Using Google Analytics Events and Custom Reports to Help Track Off-Site Conversions

One of the biggest challenges in using web analytics in higher education is the fact that so many of our conversions happen off-site. When a visitor comes to our website and wants to request information or fill out an application or donate or buy a t-shirt from the bookstore, more often than not those things happen off-site.

By using a combination of Google Analytics events and custom reports, tracking the off-site conversion is a little easier. Does this give us a direct link from the visitor on the website to the actual conversion (if the conversion is off-site)? No. But it’s the next best thing – and it’s certainly better than guessing – or only paying attention to campaign click-throughs.

Manually Tracking Outbound or Download Links as Events

Events are not new to Google Analytics, but they are definitely underutilized. By adding an onClick to an exit link (or a download link, etc.) you can easily track those links as events and then tie them back to campaigns.

When you track an event, you specify the event category, action, label, and value. A more detailed description can be found in the GA event tracking guide. In this post we’ll only be talking about using events to track outbound (exit) links or downloads.

If you want to track an outbound link as an event, here’s how to do it:

Manual onClick event to show outbound links as events in Google Analytics.
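The screenshot above isn’t reproduced here, but here is a minimal sketch of that onClick, assuming the async (ga.js) tracking code and using the category, action, and label names referenced below (the destination URL is a placeholder):

    <!-- Outbound link tracked as an event:
         category "outbound", action "click", label "application link" -->
    <a href="https://apply.example.edu/"
       onclick="_gaq.push(['_trackEvent', 'outbound', 'click', 'application link']);">
      Start your application
    </a>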

Once that is in place, go into your Google Analytics reports. To view events, go to content >> event tracking.

Here is how it looks in the reports. “Outbound” is the category. “Click” is the action. In the below image, if you click on “click,” you then go to the event label screen shown in the second image below.

Event category and action names

“Application Link” is the event label in this example.

In this example the event label is called application link.

This report then shows you the number of times the outbound link was clicked. Be sure to use the same naming convention with your category, action, and labels. Otherwise you’ll end up with outbound, Outbound, and exit (when they all mean the same thing). Think campaign URL parameter names (email, e-mail, and E-mail) – keep it consistent. By the way, I chose “outbound” for the category and “click” for the action above because that is what is used by default in the gaAddons script (introduced below).
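To make the naming-convention point concrete, here is a hedged example of a campaign-tagged URL (the domain and values are made up). Pick one spelling for each value and use it everywhere; Google Analytics treats campaign values as case-sensitive, so utm_medium=email and utm_medium=E-mail show up as two different mediums.

    http://www.example.edu/open-house/?utm_source=newsletter&utm_medium=email&utm_campaign=spring-open-house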

So there you go. Simple, right? Now you’re tracking your outbound links as events. Well, hold on to your hats, here’s where it gets good.

Automatically Tracking Outbound or Download Links as Events

Stéphane Hamel, web analytics consultant, wrote a great script called gaAddons. This script automatically tracks outbound, download, and mailto links as events (and more). It’s so important for higher education, in my opinion, for a few reasons:

  1. So many of our conversions happen off-site.
  2. Creating onClicks manually is sometimes difficult (content creators might not know how to do it, they might simply forget when new content is created, or they might not have the time to go back and add manual onClicks to existing outbound or download links).
  3. Even if you can track events manually, if you’re not careful with naming conventions, the same event can be split across different names (i.e., specifying outbound and Outbound separately, etc.).

This script overcomes all 3 challenges.

Using the script is easy. There are step-by-step instructions for using gaAddons on the documentation page of the gaAddons website. Make a quick change to your Google Analytics tracking code, download the javascript file to your server and reference it, then reference jQuery and you’re done. No need to go back and manually put those onClick events on existing links. No need to remember to do it on newly created content.

If you’re still using the old version of the Google Analytics tracking code, there is an older version of the gaAddons script that works with it. Hopefully everyone has upgraded to the newer asynchronous tracking code by now, though; there are many advantages to async. gaAddons version 2.0 (for use with the async tracking code) also has many more options available. Of course you can track outbound, download, and mailto links as events, but you can do a lot more with it.
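As a rough sketch (not the official install instructions; follow the gaAddons documentation for the exact tracking-code change it requires), a page using the async tracking code plus gaAddons might reference the scripts like this. The gaaddons.js path and the UA number are placeholders:

    <!-- jQuery, then the downloaded gaAddons file (hypothetical path) -->
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
    <script src="/js/gaaddons.js"></script>
    <script>
      // Standard Google Analytics async snippet (ga.js era)
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXXX-1']);
      _gaq.push(['_trackPageview']);
      (function() {
        var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
      })();
    </script>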

Creating Custom Reports to Make It Easier to Tie Events to Campaigns

So, now that your outbound or download links are being tracked as events, you need to be able to easily see how well your campaigns are doing at driving people to those “events.”

This is where we’ll take advantage of custom reporting.

The way events are currently reported in Google Analytics is clunky. It’s great that they are there, but beyond just seeing those events, it’s hard to determine if campaigns are driving people to those events.

So we create a custom report to more easily show this data. To set up a custom report, in the left nav in Google Analytics, click on custom reporting >> manage custom reports. Then click on “create new custom report.”

Here’s how to set up this specific report. I call it “events by campaign.” You can call it anything you’d like.

Metrics and dimensions used to create custom report.
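The screenshot isn’t reproduced here, but based on how the report behaves below (campaigns on the first page, then events after drilling in), a likely configuration (a sketch, not necessarily the exact original settings) would be:

  • Metrics: Total Events, Unique Events
  • Dimensions (drilldown): Campaign >> Event Category >> Event Action >> Event Label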

You can also use the shared custom report by clicking here.

This custom report allows you to see all at once if campaigns led to any events. Here is the first page of the custom report:

Events by campaign list.

So all your campaigns that led to an event are listed. The “(not set)” row collects the events that weren’t attributed to any campaign.

Then, if you click on “campaign #1,” you drill down to see which specific events were credited to campaign #1.

List of events that campaign 1 drove.

Now you can both see your important outbound (exit) links as “events” and then easily tie those events to campaigns.

So there you have it. Not perfect (we’d all love to see how many off-site conversions our campaigns drove), but it’s better than guessing – or just looking at click-throughs.

What do you think? I’d love to get your feedback about tracking important outbound and download links.


Dec 19 2010

Visitor Recency – How long has it been since they’ve come back?


In last month’s post we talked about visitor loyalty and how it can help you gain insights around the loyalty of your visitors. Especially for websites with longer buying cycles (like admissions sites), visitor loyalty is essential.

Now let’s talk about visitor recency.

Visitor recency takes your returning visitors and measures how long it has been since their previous visit, within a certain date range. In other words – how recently have your returning visitors come back within the specified timeframe?

So who cares about visitor recency? How can it help?

Let’s take a couple of scenarios.

Scenario #1: you run a blog (maybe a student, admissions, or alumni blog). What is the goal of the blog? Whatever the goal – engagement, conversion, whatever – you need people to come back, right? We went over visitor loyalty last time. But you don’t just need people to come back – you want them to check back often. If you post fresh content frequently, you want people to come back frequently to check it out. For a blog, we want “high” visitor recency. In other words, we want people to come back often, within a day or two or three.

High visitor recency looks like the image below – visitors usually come back within a week. Depending on how frequently you update your posts, *high* visitor recency for your blog might be different. For instance, if your blog was updated multiple times a day, you’d want people to come back more frequently than a week. High recency for you might be within 1 or 2 days.

High Visitor Recency Example

Scenario #2: You run the admissions website. Although you want your visitors to come back, how many visitors are going to research the school, start the application, and submit it all in one visit? Not many.

Unlike the blog, although we want these visitors to come back, they probably won’t come back as soon as the blog visitors. The buying cycle is too long for this crew.

The example below isn’t a perfect low-recency example because there is some distribution at the high end, but pay attention to the bottom. Notice how much more of the distribution sits at the bottom than in the blog example above. This website has much lower recency.

Low Visitor Recency Example

Notice also that I filtered out new visitors. If you don’t, Google Analytics will show new visitors in the first row and base the percentages on all visitors. When you filter out new visitors, the percentages are based on returning visitors only, which gives you a better idea of the distribution you’re after with recency.

When should I use Recency as a key performance indicator?

Although I probably wouldn’t use recency as a KPI for a website whose target audience has a longer buying cycle, I would definitely use it for a blog or any other website where the goal is frequent engagement – you don’t just want people to engage, you want them to engage often.


Oct 18 2010

Visitor Loyalty – Do They Come Back?


This is the first part of a two-part post about visitor loyalty and recency. Let’s tackle loyalty first. Visitor loyalty is a good metric for most higher education websites (and non-ecommerce websites in general). Loyalty gives us insight into whether visitors have a reason to come back – to engage … again … in what our websites have to offer. Visitor loyalty is simply how many times visitors visit the site within a specified date range.

Do our visitors want to come back for more?

Let’s take an admissions website. The conversion usually doesn’t happen on the first visit. The visitor might research the admissions process and programs offered. They might take the virtual tour. Maybe they read some student stories. The point is, this is rarely linear – come to site, take virtual tour, read student stories, apply – all in one visit.

Ultimately, they usually need to come back.

So what does loyalty look like?

Low Visitor Loyalty

Notice that almost 80% of visitors only visited once. This website might be great at acquiring new visitors, but it needs to work on visitor loyalty.

This looks a bit better:

Better visitor loyalty distribution

There is still a large percentage that only came to the website once, but there is more of a distribution at the bottom.

Take a look at this segment of visitors – obviously a more engaged bunch. A good 48ish% is very loyal.

Visitor Loyalty - Loyal Segment

So, what kind of distribution is good? That’s going to be different for different websites. The more loyal the visitors, the bigger the bottom of the distribution will be. Depending on the website, though, a bottom-heavy distribution might not be realistic or necessary.

To come up with a goal for your website you can take a look at a few things. First, what is your current distribution? Where are the majority of your visitors? Use that as a benchmark. You can also take a look at how many visits it usually takes for someone to convert. Take a look at how many visits it takes for people to do other important things on your website (maybe your micro-conversions). Taking all these into account, come up with a number for your website. Set that goal, then see how the website improves over time.

The type of website is going to matter as well. If the website is for prospects, the distribution will probably be a bit more top-heavy. If the website is, let’s say an intranet or a blog, the distribution might be top *and* bottom heavy – meaning you get a good number of *new* visitors, but you also have a good number of very loyal visitors.

Warning: if your website caters to more than one audience – for instance, if your website is for prospects *and* current students – be sure to filter out the audience you’re not measuring at the moment when looking at the loyalty report.

Quick example – say your website goal is to increase applications and one of your KPIs for that goal is visitor loyalty (because you know that people don’t usually apply on their first visit). If you’re using loyalty as a KPI for your prospects, filter out your current students. If you don’t, the numbers will be skewed and misleading.

Using visitor loyalty with campaigns. Another great way to use loyalty is with campaigns. Hopefully your campaigns are tagged correctly. If they are, you can easily build an advanced segment for visitors coming in from specific campaigns (or a group of like campaigns – let’s say brand campaigns).

Then go to the visitor loyalty report showing data from the advanced segment that you just created. Traffic from campaigns is obviously nice, but how many times does the traffic come back? Are they one-hit wonders?

Cookie deletion and visitor loyalty. Does cookie deletion affect visitor loyalty? Yes. That’s why it’s important to set a goal and watch your trend over time. The cookie deletion rate stays roughly constant, so while it affects the absolute numbers, it shouldn’t change the trend much.

Obviously if you’re not able to use cookies for policy reasons at your school the visitor loyalty report won’t be useful.

Next up, visitor recency. Loyalty is great when used with visitor recency (when visitors do come back, how much time passes between visits – a day, a week, a month?). For websites whose content is updated frequently – a blog, an intranet, an IT alerts site, and so on – visitor recency is really important.

In part 2, we’ll talk about visitor recency. Stay tuned.


Jul 26 2010

Results are in – State of Web Analytics in Higher Ed

A couple months ago Karine Joly launched the analytics revolution in higher ed by asking everyone to fill out a survey about the state of web analytics in higher education. She received 399 submissions.

The executive summary is out and available at the HigherEdAnalytics website. Taking a look at it, a few initial thoughts came to mind. I’m just going to run through them here in no particular order.

First, I was ecstatic that 95% of respondents track website traffic (I know it’s not 100%, but in the words of Bill Murray – baby steps). What struck me, though, was that a full 35% did not track any conversions, and of the 65% that do, only a minority track clickstream and conversions from marketing campaigns (email, online advertising, print, etc.). Now, this may just mean that they aren’t in the marketing department. I’d love to see that data segmented by department. I’m hoping that the majority of folks in the marketing department do indeed track those stats. What’s more, even folks *outside* the marketing department should be tracking this if they do any kind of external communication via email, social media, etc.

The report also states that 15% of respondents said they do nothing with the data. That makes me sad. : (

The most interesting part of the report for me was the section on tracking conversions, with the “would like to track” column being the most intriguing. To me, this shows that we *want* to measure conversions, we just can’t for one reason or another. In other words, we need help. The more I thought about it, the more I wondered about the reasons why we don’t (or can’t) track conversions … maybe:

  1. The conversion doesn’t happen on our website and the third party site is either unable or unwilling to allow us to track.
  2. If the conversion happens off our site, we don’t have the right technology in place to tie a campaign to a conversion (let’s say a submitted inquiry form or application).
  3. We don’t think we have an actual conversion to track – for instance if the particular website in question is only informational, etc.
  4. There are several owners across all of our websites, and trying to track traffic, much less conversions, is so complicated and political that it’s just not worth it.
  5. Folks are worried about the implications for privacy.

Whatever the case, it seems as though we really want to. Now we just need help to be able to do it.

Another interesting area of the report was the question, “who spends at least 20% of his/her time working on analytics?” 35% responded with 1, 2, or 3 people. This astounds me – in a good way. I was shocked to find that number so large, since so many people in the higher ed web world are jacks of all trades. Obviously it would be awesome if at least some reported, “it’s my entire job,” but … baby steps. As an industry we’re certainly nowhere near that yet. It’s definitely a good start, though. I know 50% responded “nobody” : ( but this surprised me much less. I actually thought that number would have been higher.

Although the majority of people said they were tracking the basics – visits, page views, etc. – once we get past the basics, the percentage really drops off. I wonder why. Is it a lack of resources? Is it the lack of *insights* we’re getting? If it’s the lack of insights, the catch-22 is that you’ll almost never get insights from the very basic metrics, especially without segmentation (unless your site is down and your visits just flat-lined). That was something I also wondered about.

Anyway, I think the report shows both that we’re doing great stuff and there is also a long way to go. But we’re headed in the right direction. : )

Making web analytics a priority

So let’s get this party started! Starting August 12, on the second Thursday of each month, we will be collecting data to start the analytics revolution in higher education. Karine’s group will then release the previous month’s benchmarking data at the end of each month. For example, July’s benchmarking data will be released at the end of August, and so forth.

To get a benchmarking report, all you need to do is participate in the benchmarking. Go to the HigherEdAnalytics website to join the revolution.

So, I’ve blabbed long enough. Go read the executive summary. I’d love to know your thoughts.


May 17 2010

Help Start a Revolution


It’s about time we start a revolution … and we need your help! You know how important web analytics is for higher education websites, but we need to spread the word. So, where do we start? With a benchmark, of course.

Last week Karine Joly launched the “State of Higher Ed Online Analytics” survey to get a better idea of where we are as an industry with our use of web analytics.

Complete the survey today!

Enter your email address at the end of the survey to receive an executive summary in July highlighting the survey results.

Thanks to Karine for putting together the survey and starting the revolution!

Survey closes on May 24th.


Apr 24 2010

Quick Post – Commercials and Measuring Brand


Later today our annual intrasquad scrimmage (the “Blue/White game”) will be televised on ESPN2. We have a commercial airing twice during the game, and I’m getting ready to track its impact on traffic to our site.

Obviously this won’t be a super popular game on TV – mostly very loyal fans and alumni, but that’s ok. It will still have some impact.

Here is what I plan on monitoring over the weekend to see how much of an impact the commercial (and even the game) had on traffic:

  1. Visit trend
  2. % branded keyword referrals
  3. % direct traffic
  4. % referral traffic from our main university website (psu.edu)
  5. % goals completed from branded keyword referrals
  6. % goals completed from direct traffic
  7. % goals completed from main university referral traffic

When looking at these reports, I have to remember that “like” date ranges matter. For instance, I don’t want to measure the difference in those metrics from yesterday to today, because I know our traffic goes down naturally on weekends (Friday and Saturday aren’t like days). This weekend should be compared to last weekend. Further, I need to keep in mind that the game itself (regardless of the commercial) will likely have an impact on traffic. Unfortunately we don’t have a “like” weekend to run it against (a weekend where our Blue/White game aired on ESPN2 without the commercial). So we’ll have to make do, keeping in mind that the commercial itself may not be what caused the traffic.

I do think the metrics that have to do with branded keyword referrals and direct traffic can show the impact of the commercial itself. Why? Because they specifically reflect users seeking us out by our unit name (not just happening upon our site or coming to it from our main university site).

What do you think? What other metrics should I be looking at?


Apr 21 2010

Must Read Book – Advanced Web Metrics with Google Analytics – Second Edition

This post is long overdue. I wanted to do a review of Brian Clifton’s book Advanced Web Metrics with Google Analytics (second edition) last month when it came out. For those of us in higher education, this book is essential reading. Period.

When it comes to web analytics and, specifically, Google Analytics, Brian Clifton is at the top. The book is the second edition, but it is so much more than just an update of the first book – it’s almost a complete re-write. So much has happened since the first book came out. You can read all about it over on Brian’s blog.

Full disclosure: I was lucky enough to read the book before it was published and offer feedback and comments. I’ve never done that before, but what an excellent learning experience!

So, let’s get down to the book. The title says “advanced,” but you don’t need to be an advanced user of Google Analytics to get a lot out of it. It takes you from the very basics – what web analytics is and how to get started with both web analytics and Google Analytics – all the way to advanced topics and techniques.


Mar 28 2010

5 Segments to Help You Gain Insights

I’ve written about segmentation in the past, but I still don’t think we give it the importance it deserves. There are so many valuable insights you can gain from using segmentation. Even more important, if you are only looking at your data in aggregate, without segmenting it, you can be making decisions based on misleading information.

Take this example – let’s say you open up your analytics tool and see that, on average, users view 5 pages per visit. OK, pretty good. You take note and move on. But if you used segmentation, you might see that pages per visit is completely different depending on the type of user. Let’s say, on average:

  • IE users view 7 pages per visit
  • Firefox users view 4 pages per visit
  • Chrome users view 2 pages per visit
  • Mobile users only view 1 page per visit
  • Campaign A users only view 1 page per visit

Of course these are made up numbers, but you get my point. Doesn’t this tell you much more? My site doesn’t seem to render well on mobile devices and campaign A needs a good look.

The fact that my site averages 5 pages per visit actually tells me absolutely nothing.

This is why segmentation is essential. Averages are misleading.


Feb 13 2010

Web Analytics Community at Penn State

I love talking to colleagues about web analytics and yesterday I got to do just that – but on a much larger scale.

A little background … Penn State is very large. We have about 43,000 students at main campus and about 78,000 students throughout all of our campuses. Needless to say we have a lot of websites – I can’t even guess how many we have – 200? 400? 500? Not sure. It’s a lot, though. That much I know.

With so many websites owned by so many different units, community is important. We already have a great web community with a fantastic annual web conference, and yesterday we started a different kind of community – one devoted to web analytics.

Since most units at Penn State use Google Analytics, it was called the Google Analytics User Group kickoff event. Going forward we’re probably going to call it something less tool-specific, but we’ll see.

Some great folks at the Penn State Libraries – including doteduguru Nikki Massaro Kauffman – put the event together. It was an awesome event and the attendee list was completely full only a few days after invitations went out.

We gathered in the morning at the Libraries to kick off the event with an open panel discussing how Google Analytics was being used within different departments at the university. It was so great to hear the different ways web analytics is having an impact and how website owners are using it. Experience with the tool (and analytics in general) ran the gamut from just getting started to years of use.

There were many sessions throughout the day, including methodologies (which led into a discussion about privacy issues), Google Analytics implementation, new users, and reporting.

At the wrap-up discussion we talked about how we will continue with the community. I’m very excited to keep the conversation going within Penn State, and we already have ideas for specific projects going forward. What a great way to end the week!

