Measuring the Success of the Online Course Catalog

The online course catalog is where prospects and students go not only to see what courses are offered, but also to find the prerequisites, the instructor, how many seats are left, and more.

The bottom line is that the online course catalog is essential for higher education websites. Creating and maintaining a *usable* course catalog is just as important. So how do you know if the catalog is usable?

Before going on, I’ll mention that it’s important not only to track the form page, the *results list* page, and the details page, but also to track what users are entering into the search form.
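
To give a rough idea of what that looks like (this is just a sketch, not our actual tagging code; `trackPageView` is a stand-in for whatever page-view call your analytics tool provides, such as Omniture's `s.t()` or classic Google Analytics' `_trackPageview`), each of the three pages gets a consistent, trackable name so a funnel can be built on top of them:

```typescript
// Hypothetical sketch: give each step of the catalog funnel a consistent,
// trackable page name. trackPageView() stands in for the real analytics call.

type CatalogStep =
  | "catalog: search form"
  | "catalog: results list"
  | "catalog: course detail";

function trackPageView(pageName: CatalogStep): void {
  // In a real implementation this would call your analytics library.
  console.log(`pageview: ${pageName}`);
}

// On catalog-search-form.asp:
trackPageView("catalog: search form");
// On catalog-results-list.asp:
trackPageView("catalog: results list");
// On course-detail-page.asp:
trackPageView("catalog: course detail");
```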

Another point I’ll mention quickly is that the search engine for our course catalog is independent of the search engine for the rest of our website. Both show up in the same report in our analytics tool, so we use a pipe (|) delimiter to denote course catalog searches (the pipe separates each input field), while overall site searches have no pipe delimiter. That way we can quickly and easily separate the two sets of results. This technique is probably not the best, but it works well for the time being.
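
Here’s a minimal sketch of that convention (the field names and the `trackSearch` call are placeholders, not our actual implementation):

```typescript
// Hypothetical sketch of the pipe-delimiter convention. trackSearch() stands
// in for whatever call sends a search term to your analytics tool.

interface CatalogSearch {
  keyword: string;
  semester: string;
  courseAbbreviation: string;
  scheduleNumber: string;
}

function trackSearch(term: string): void {
  console.log(`internal search: ${term}`); // placeholder for the real analytics call
}

// Course catalog search: join every field (even empty ones) with a pipe,
// so catalog searches are the only entries in the report that contain "|".
function trackCatalogSearch(search: CatalogSearch): void {
  const term = [
    search.keyword,
    search.semester,
    search.courseAbbreviation,
    search.scheduleNumber,
  ].join("|");
  trackSearch(term);
}

// Overall site search: no pipe, so it can be filtered apart from catalog searches.
function trackSiteSearch(query: string): void {
  trackSearch(query);
}

trackCatalogSearch({ keyword: "biology", semester: "Fall", courseAbbreviation: "BIO", scheduleNumber: "" });
trackSiteSearch("tuition deadline");
```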

Hard decisions about the course catalog. A little more than a year ago, we decided that something needed to be done about our online course catalog. How did we come to this decision? We not only looked at the analytics, but we also listened to the customer. We found that:

  1. We had a low “search conversion rate.”
  2. Our search form was confusing and thus users were abandoning it.
  3. Students and prospects were calling in with questions that could have been answered using the course catalog.

Low search conversion rate. What do we mean by that? We’ve talked before about conversions not being only for e-commerce sites. In this case, we wanted to figure out what percentage of the time users viewed the search form, searched and got to the results list page, and then clicked through to the course “detail” page.

For example:

  • catalog-search-form.asp → catalog-results-list.asp → course-detail-page.asp
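
To make the math concrete (the numbers below are invented purely for illustration, not our actual figures), the step-by-step and overall conversion rates fall straight out of the page-view counts for those three pages:

```typescript
// Hypothetical page-view counts; plug in your own numbers from your analytics tool.
const views = {
  searchForm: 10000,   // catalog-search-form.asp
  resultsList: 4000,   // catalog-results-list.asp
  courseDetail: 2200,  // course-detail-page.asp
};

const pct = (part: number, whole: number) => ((part / whole) * 100).toFixed(1) + "%";

// Step 1: how many visitors who saw the form actually searched?
console.log("form -> results:", pct(views.resultsList, views.searchForm));     // 40.0%
// Step 2: how many searchers clicked through to a course detail page?
console.log("results -> detail:", pct(views.courseDetail, views.resultsList)); // 55.0%
// Overall search conversion rate for the funnel:
console.log("form -> detail:", pct(views.courseDetail, views.searchForm));     // 22.0%
```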

We found that the overall conversion rate was quite low and, further, that the conversion rate between the form and the results list (the first step) was low. That meant a large number of users were looking at the search form and never actually searching for anything.

Because our search form was quite complex, our assumption was that users were getting confused when confronted with the search form.

Failed search results. Looking at failed search results is more complex. We had tagged our search form so that we knew the specific criteria users were entering into the form to search. At the time, our search form had many input fields.

It was basically an advanced search, but it was our default course catalog search. Our analytics tracked not only what users entered into each field, but how many results came back. What we found was that users were entering too many criteria, thus narrowing their searches so far that no results came up.
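
Here’s a rough sketch of that kind of analysis (the record shape, field order, and numbers are hypothetical): given the logged criteria strings and result counts, you can see how often searches come back empty and whether piling on criteria is what’s driving it:

```typescript
// Hypothetical shape of a logged catalog search: the pipe-delimited criteria
// string plus the number of results it returned.
interface SearchRecord {
  criteria: string;     // e.g. "biology|Fall|BIO|MWF|evening"
  resultCount: number;
}

function summarizeFailedSearches(records: SearchRecord[]): void {
  const failed = records.filter((r) => r.resultCount === 0);

  // How many of the pipe-delimited fields were actually filled in?
  const fieldsUsed = (r: SearchRecord) =>
    r.criteria.split("|").filter((v) => v.trim() !== "").length;

  const avg = (rs: SearchRecord[]) =>
    rs.length ? rs.reduce((sum, r) => sum + fieldsUsed(r), 0) / rs.length : 0;

  console.log(`failed searches: ${failed.length} of ${records.length}`);
  console.log(`avg criteria used (failed): ${avg(failed).toFixed(1)}`);
  console.log(`avg criteria used (successful): ${avg(records.filter((r) => r.resultCount > 0)).toFixed(1)}`);
}

summarizeFailedSearches([
  { criteria: "biology|Fall|BIO|MWF|evening", resultCount: 0 },
  { criteria: "biology|Fall|||", resultCount: 37 },
  { criteria: "|Fall|BIO||", resultCount: 12 },
]);
```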

Voice of the customer. We were also getting a high number of what we’d call *simple* calls: questions the callers could easily have answered themselves using our course catalog (if they knew how to find the information).

Time for some user testing. Next, we decided to do some user testing to see if the assumptions we’d drawn from our analytics were, in fact, correct. If possible, it’s important to user test and not just *assume* you know why users behave the way they do. This is another reason voice of customer is so important. Basic web analytics can tell you the what, but they cannot tell you the why. That’s where voice of customer (surveys, feedback forms, etc.) and user testing come in. We had our assumptions about what we *thought* was the reason for the poorly performing course catalog, but we couldn’t be sure until we saw users struggle with our own eyes.

After testing 8 users, we found our assumptions confirmed:

  1. When users were faced with the search form, it took all of them quite a long time to search for a course; they had to think hard about how to do it. Obviously we didn’t see the abandonment rate that showed up in our analytics, because we were asking these users to find a specific course, so they couldn’t abandon the form.
  2. When users typed in criteria to search for a course, more often than not they entered more criteria than necessary (guessing at some of it). Because of this, they were getting no results when, in fact, the course they were searching for was there.

A facelift and more analysis. After a redesign of the catalog (the form, the results list page, and the detail page), we were careful to analyze the data to see if things improved.

Before the redesign, we found from the analytics that the most used fields were the keyword search, the semester pulldown, and the course abbreviation pulldown. We decreased the search to 4 fields: the top 3 most popular from our analysis, plus a field for schedule number, since that’s how our current students are used to searching for courses.
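
The “most used fields” finding can come straight out of the pipe-delimited search strings described earlier. A hypothetical sketch (the field order and names are assumed, not our real form) of counting how often each position in the string is filled in:

```typescript
// Hypothetical: the order of fields in the pipe-delimited string is assumed
// to be fixed, so each position can be mapped back to a form field name.
const FIELD_NAMES = ["keyword", "semester", "course abbreviation", "meeting days", "time of day"];

function countFieldUsage(criteriaStrings: string[]): Map<string, number> {
  const usage = new Map<string, number>(FIELD_NAMES.map((f): [string, number] => [f, 0]));
  for (const criteria of criteriaStrings) {
    criteria.split("|").forEach((value, i) => {
      if (value.trim() !== "" && i < FIELD_NAMES.length) {
        usage.set(FIELD_NAMES[i], (usage.get(FIELD_NAMES[i]) ?? 0) + 1);
      }
    });
  }
  return usage;
}

// Sorting by count surfaces the handful of fields worth keeping on the simple form.
const usage = countFieldUsage(["biology|Fall|BIO||", "|Fall|||evening", "chem|Spring|||"]);
console.log([...usage.entries()].sort((a, b) => b[1] - a[1]));
```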

We did include an *advanced search*, so if users were looking for something more specific, the functionality was there for a very focused search.

The course catalog redesign worked. We saw a significant increase in conversion from the search form to the results list to the detail page. We found that the number of criteria users input has decreased dramatically, and thus users are getting results to choose from. Further, we found that users were not only getting search results, they were also clicking through to the detail page.

All in all, it had all the elements of a successful redesign: evaluation and analysis, user testing, redesign, more evaluation and analysis, and design tweaking.

When redesigning or significantly updating your websites, it’s important not to stop there, but to continue to analyze the data to make sure the new content and/or design is working.

If not, tweak and repeat.

2 Responses to “Measuring the Success of the Online Course Catalog”

  1. David Watters says:

    That’s very interesting! Thanks.
    As for collecting the Voice of Customer – I saw a few websites using Kampyle (www.kampyle.com) for collecting feedback. It seems to be a pretty cool feature – just my 2 cents.

  2. [...] Last week’s post concentrated on how we measure the success of our online course catalog. As I’ve stated previously, we do this with Omniture, but it can also be done using Google Analytics. This post is a practical guide to setting up Google Analytics goals and funnels to track online form submissions. You may have many forms on your website (request information, application, class registration, online course catalog search form, etc.). Google Analytics conversion goals let you track every step of these online forms so you can: [...]