I can’t believe my last post was almost a year ago. It’s probably time I start posting again, and there’s no time like now with a completely shameless plug for an upcoming HigherEdExperts webinar I’m honored to be a part of – Redesign Bootcamp 2012, February 7–9.
No matter what kind of redesign you might be doing (full-scale or a smaller one), clickstream web analytics can help you benchmark where you are (how do you know you’re successful if you don’t know where you started?) and help you come up with potential tasks for usability testing.
This past July we redesigned the Penn State World Campus website. The redesign was driven by data – using clickstream analytics, onsite surveys, as well as extensive usability testing. Our old website was last redesigned in early 2005 so we thought it was time, but how did we really know we needed to redesign the site? What did we want to accomplish and how did we use web analytics and usability testing to keep internal opinions out of the process?
First things first – we documented the goals of the redesign along with what could be measured to know whether those goals were met – the KPIs. In coming up with the KPIs we *didn’t* do two things: 1) we didn’t limit ourselves to metrics we could already get in Google Analytics, and 2) we didn’t limit ourselves to what we could measure at the time. The second one is important: if a KPI isn’t measurable yet, you can make sure measurement for it gets built into the new website.
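To make that concrete, here is a minimal sketch of what documenting goals and KPIs side by side might look like. The goal and metric names below are hypothetical illustrations, not our actual KPIs; the point is the `measurable_today` flag, which turns every not-yet-measurable KPI into an instrumentation task for the new site.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    goal: str               # what the redesign should accomplish
    metric: str             # what we will measure to know we got there
    measurable_today: bool  # False means the new site needs instrumentation

# Illustrative entries only - not the real KPI list from our redesign
kpis = [
    KPI("Reduce reliance on site search for 'easy' topics",
        "searches for 'easy' keywords by prospects", True),
    KPI("Make tuition easier to find",
        "clicks on tuition links from program pages", False),
]

# Any KPI we can't measure yet becomes a build task for the new site
todo = [k.metric for k in kpis if not k.measurable_today]
print(todo)  # ['clicks on tuition links from program pages']
```

Keeping the “can we measure this yet?” question in the document itself is what stops the KPI list from quietly shrinking to whatever the current analytics setup happens to report.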
Here’s a quick example of what we learned from our web analytics and how we applied it to usability testing.
We are obsessed with our internal site search keywords. People are telling us what they want in their own words. One of our goals was to decrease the usage of our internal site search and our knowledgebase for what we called “easy” topics. Granted, some people will always go right for site search instead of browsing via navigation, and that’s fine, but we wanted to see how many “easy” topics were searched for a lot within our internal search and our knowledgebase.
“Tuition” was the top keyword searched by the prospect audience (we filtered out current students). Although the popularity of the topic didn’t surprise us, the fact that people had to search to find it did. After all, it was listed clearly on the homepage.
Tuition is one of the most popular areas of our website. We assumed that it was easy to find as it is not only linked from our homepage, but also from all of our program pages. But seeing it as the number one searched-for item within our internal site search made us want to really test this out. Is this just because some of the population uses search and “tuition” is a popular topic? Could be. Is this because tuition is really hard to find (even though *we* think it’s easy)? That could be, too. We decided to find out.
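The underlying analysis – tally internal-search keywords for one audience segment and rank them – can be sketched in a few lines. The row layout and the numbers below are made up for illustration; they stand in for whatever your analytics export actually looks like.

```python
from collections import Counter

def top_keywords(rows, audience="prospect"):
    """Sum searches per keyword for one audience, most-searched first."""
    counts = Counter()
    for row in rows:
        if row["audience"] == audience:  # e.g., filter out current students
            counts[row["keyword"].lower()] += int(row["searches"])
    return counts.most_common()

# Hypothetical export rows - not real Penn State World Campus data
rows = [
    {"keyword": "Tuition", "audience": "prospect", "searches": "940"},
    {"keyword": "tuition", "audience": "prospect", "searches": "310"},
    {"keyword": "login",   "audience": "student",  "searches": "2100"},
    {"keyword": "transfer credits", "audience": "prospect", "searches": "420"},
]
print(top_keywords(rows))
# [('tuition', 1250), ('transfer credits', 420)]
```

Lowercasing before counting matters: “Tuition” and “tuition” are the same intent, and splitting them across two rows can hide how dominant a topic really is.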
We added a task for tuition to the usability test. We wanted to know how easy it was to find tuition *and* how people searched for it. Seems kind of obvious now, but we were a bit surprised that even when the task specified a degree program (i.e., “You’re interested in the Bachelor of Science in Criminal Justice; find the tuition rate for this degree program”), most users tended to search generally instead of drilling down into the program first. They were looking for tuition, then the degree, not the other way around. Again, might seem kind of obvious now, but because a lot of important information “depends” (from tuition, to the admissions process, etc.), information on the old site was definitely program-specific without a lot of “general” information.
As a result of that test, we put tuition in both places – a general tuition page and within each program area. Getting around the “it depends” issue was tricky, but we decided to have that information live in one place and be fed to both: to the specific degree page as well as to the general tuition page.
“Tuition” is searched for much less now (down to the 13th most popular term for prospects). But now we are finding other, unexpected topics at the top of internal site search, and you can bet we will be testing those out in the near future.
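Tracking this over time is just a rank comparison between two reporting periods, plus a check for terms that are newly near the top. The keyword lists below are illustrative snapshots, not our actual reports.

```python
def rank_of(keyword, ranked):
    """1-based rank of a keyword in a ranked list, or None if absent."""
    try:
        return ranked.index(keyword) + 1
    except ValueError:
        return None

# Hypothetical top-of-list snapshots before and after a redesign
before = ["tuition", "financial aid", "apply"]
after  = ["financial aid", "apply", "transcripts", "tuition"]

print(rank_of("tuition", before), "->", rank_of("tuition", after))  # 1 -> 4

# Terms newly near the top become candidates for the next round of testing
new_topics = [kw for kw in after[:3] if kw not in before[:3]]
print(new_topics)  # ['transcripts']
```

The second half is the part that keeps paying off: each reporting period, whatever climbs into the top of internal search is a ready-made list of usability-test tasks.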
So there is a simple example of how our clickstream analytics told us what was happening and then drove us to dig deeper with usability testing. We’ll be talking about this and much more during our session of the series, including *how* to set up KPIs and segments for those KPIs in Google Analytics, how to report on your KPIs after the redesign, using onsite surveys to answer the *why*, specific examples of how usability testing helped us solve internal debates on layout and design, and much more.
It should be an awesome series, too. The first webinar in the series will be from Stewart Foss, founder of EduStyle, talking about the top trends in higher ed redesigned websites. Next up is Mike Richwalsky, Director of Marketing Services at John Carroll University, who will talk about planning for and managing a successful redesign project. Dave Housley and I will round out the series talking about using data in the redesign process – before, during, and after.
Hope to see you at Web Redesign Boot Camp!