A major project from early in my time at BCIT. We had two online program catalogues, meaning duplicate data, inconsistent user experiences, doubled content-management effort, and significant confusion for end users. We blended both sources of data into one end-user catalogue application.
What I did:
- business analysis & requirements
- IA and wireframes
- project lead & testing lead
Analysis & Research
This was a very large effort spanning several months. Building on existing research into the program catalogue, which had identified key usability problems, I conducted internal stakeholder research through group discovery sessions, follow-up interviews and presentations. From these two sources, plus our core project challenge of blending the data sources, I built the functional requirements for the project.
The longer single catalogue page above right was the official ‘Banner’ version of our program data, while the smaller multi-page sitelet above left was the marketing version, which the schools controlled.
- Restructure and blend data from two sources into one IA and user interface.
- CMS changes to accommodate new user interface, publishing and previewing content.
- Design changes to address ongoing user feedback:
- improve call-to-action of applying to program
- improve ability to contact someone about program
The problem we needed to solve is illustrated by the two adjacent images. First, the staff-published program ‘sitelet’, which was better structured for users but was not supposed to contain any duplicated Banner data. Second, the long, single-page output from our Banner eCRM system, shown here at only about a quarter of its full length. Key program data, but not very usable.
We began by looking at the existing publishing environment:
- Banner program data structure: 422 programs, 77 possible section headings, output to single long scrolling pages
- CMS marketing options: multi-section ‘sitelet’, 7 possible section pages
- Not all programs used marketing sitelets.
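The core blending problem above can be sketched in miniature: every program has Banner data, but only some have a marketing sitelet, so the merged catalogue record has to degrade gracefully. This is purely an illustrative sketch; the program codes, fields and structures below are hypothetical, not the real Banner or CMS schema.

```python
# Hypothetical sketch of blending the two publishing sources into one record.
banner = {
    "CST": {"title": "Computer Systems Technology", "tuition": "$4,200/yr"},
    "NUR": {"title": "Nursing", "tuition": "$5,100/yr"},
}
marketing = {
    "CST": {"tagline": "Build real software in year one."},
    # No entry for "NUR": not all programs had marketing sitelets.
}

def blend(code):
    """Combine required Banner data with optional marketing content."""
    record = dict(banner[code])             # Banner data is always present
    record.update(marketing.get(code, {}))  # marketing content only if a sitelet exists
    return record

print(blend("CST"))  # includes the marketing tagline
print(blend("NUR"))  # Banner data only, no marketing fields
```

The same fallback logic applied in the real system: programs without marketing content still rendered a complete catalogue page from Banner data alone.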
Working with our Registrar’s Office and the schools, we were able to reduce Banner structure significantly, down to 35 headings from 77. I examined the seven existing ‘marketing’ page categories and the new 35 Banner sections and came up with a new proposed structure. Using Optimal Sort, I ran an online closed card sort with staff and student volunteers to confirm my new IA and, once results were factored in, I built a sitemap and annotated front-end / publishing template wireframes (below).
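The consolidation above amounts to a many-to-one mapping from legacy headings onto the new IA, which the closed card sort then validated. A minimal sketch of that kind of mapping, with entirely hypothetical heading names rather than the actual Banner sections:

```python
# Illustrative only: regrouping legacy section headings under a smaller,
# card-sort-validated set of IA categories (all names are hypothetical).
HEADING_MAP = {
    "Tuition": "Costs & Supplies",
    "Books & Supplies": "Costs & Supplies",
    "Admission Requirements": "Entrance Requirements",
    "Prerequisites": "Entrance Requirements",
    "Program Outline": "Courses",
    "Course Listing": "Courses",
}

def consolidate(sections):
    """Regroup a program's legacy (heading, content) pairs under the new IA."""
    grouped = {}
    for heading, content in sections:
        new_heading = HEADING_MAP.get(heading, heading)  # unmapped headings pass through
        grouped.setdefault(new_heading, []).append(content)
    return grouped

program = [
    ("Tuition", "Domestic: $4,200/yr"),
    ("Books & Supplies", "Approx. $900"),
    ("Program Outline", "Level 1: ..."),
]
print(consolidate(program))
```

Collapsing many legacy headings into fewer, broader categories like this is what let 77 Banner sections shrink to 35 without losing any underlying content.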
Along with the new program IA above, the wireframes introduced new calls to action that made it easier to apply to the program, submit questions about it and request more information by mail, all identified by the existing research as problematic. These were some of the secondary goals I built into the project requirements.
In particular, to support the improved ability to contact staff and ask questions, we made a key functional change. We removed our form-driven ‘Ask an advisor’ question application, which effectively erected barriers to submitting questions because it took so long to get through. In its place, we implemented a simple ‘Contact Us’ form that went to our central enrolment services staff.
Iteration, testing & feedback
At key points in the design and development process, I tested our progress with internal stakeholders:
- Front-end wireframes and mockups were presented to all levels of the institute via demo and feedback sessions.
- New CMS editing page process and comps were ‘socialized’ with school CMS editors.
- When feedback was integrated and the new publishing process was functional, it was lab-tested with our CMS editors.
- It was during this phase we found we needed to add a new kind of ‘umbrella’ page, allowing publishers to select programs by code and add marketing content.
Further into the development process, but while design changes were still possible, I took our front-end development to student lab-based testing:
- Functional beta versions of program catalogue sites and umbrella pages were lab-tested.
- Students were given task completion tests with minimal intervention.
- We observed and took notes regarding issues.
- Think-aloud feedback was encouraged and the entire sessions were audio recorded.
- Notes and recordings were used to adjust visual and interaction design elements.
We successfully blended marketing and Banner data into one program catalogue site; the long single Banner pages were gone. Marketers could add promotional content to the drier Banner data and have it all appear in one place. End-user confusion and content duplication were eliminated. Follow-up research indicated that site visitors found the new design far more pleasing, and we could now track ‘Apply Now’ call-to-action clicks within a single user session.
As well, as a result of the interaction design changes, submitted questions and requests for program information by mail both increased substantially. In the case of program questions, new requests “went through the roof” immediately after the launch of the new catalogue. In fact, removing the Ask an Advisor application and replacing it with a clear call to action and a simple request form was the real A-ha! moment of this project. Requests increased by several hundred percent.
This meant we needed to qualify incoming questions better post-launch, but it was a real eye-opener as to how very small user experience changes can make a big improvement.