A very large IA and design exercise from several years ago. We had two online program catalogues, meaning duplicate data, inconsistent user experiences, and significant confusion for end users. We blended both sources of data into one end-user catalogue application.
What I did:
- business analysis & requirements
- IA and wireframes
- project lead & testing lead
Analysis & Research

This was a very large effort spanning several months. Leveraging existing research on the program catalogue, which identified key usability problems, I conducted internal stakeholder research via group discovery sessions, follow-up interviews and presentations. From these two sources and our basic project challenge to blend the data sources, I built the functional requirements for the project:
- Restructure and blend data from two sources into one IA and user interface.
- CMS changes to accommodate new user interface, publishing and previewing content.
- Design changes to address ongoing user feedback:
- improve call-to-action of applying to program
- improve ability to contact someone about program
The problem we needed to solve is illustrated by the two images above. First, the staff-published program ‘sitelet’, which was better structured for users but was not supposed to contain any duplicated Banner data. Second, the long, single-page output from our Banner eCRM system; the image shows only about a quarter of its full length. Key program data, but a usability nightmare.
IA & Interaction Design
We began by looking at the existing publishing environment:
- Banner program data structure: 422 programs, 77 possible section headings, output to single long scrolling pages
- CMS marketing options: multi-section ‘sitelet’, 7 possible section pages
- Not all programs used marketing sitelets.
Designing catalogue site taxonomy
Working with our Registrar’s Office and the schools, we were able to reduce the Banner structure significantly, from 77 headings down to 35. This came down to a simple exercise of eliminating unused Banner headings (of which there were a few) and blending headings that dealt with similar content. My hope was to keep the main taxonomy to seven sections, to align with the CMS-published marketing sites.
I examined the seven existing ‘marketing’ page categories and the new 35 Banner sections and came up with a new draft structure. Using Optimal Sort, I ran an online closed card sort with staff and student volunteers to confirm my new IA and, once card sort results were factored in, I built a new taxonomy with updated labels, which stuck to seven sections and accommodated all remaining Banner headings.
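The blending step above is essentially a many-to-one mapping from legacy Banner headings to the new seven-section taxonomy. A minimal sketch, with invented heading and section names (the real mapping covered all 35 retained headings):

```python
# Hypothetical mapping from legacy Banner headings to new sections.
# Names are illustrative only; headings absent from the map were the
# unused ones we eliminated.
SECTION_MAP = {
    "Entrance Requirements": "Admission",
    "Prerequisites": "Admission",
    "Tuition": "Costs & Fees",
    "Ancillary Fees": "Costs & Fees",
    "Program Outline": "Courses",
}

def blend_headings(program_sections):
    """Group a program's legacy (heading, content) pairs under the
    new taxonomy, dropping headings that were retired."""
    blended = {}
    for heading, content in program_sections:
        target = SECTION_MAP.get(heading)
        if target is None:
            continue  # retired legacy heading: nothing to carry over
        blended.setdefault(target, []).append(content)
    return blended
```

For example, `blend_headings([("Tuition", "..."), ("Ancillary Fees", "...")])` would collapse both entries under a single "Costs & Fees" section.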
Our CMS solution would automatically build program Banner data into the new sites, so schools didn’t have to use our CMS at all if they didn’t want to. If they wanted to add the marketing content they used to publish in the separate sites, they could edit the first part of every page of the new blended sites. This gave them the option to add a section that didn’t get built from Banner but existed in the new structure, and we allowed up to one additional section/page if the seven sections didn’t accommodate outlying content.
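The assembly rule just described can be sketched as follows. This is an illustration under assumed names, not the actual CMS code: each page puts any editor-supplied marketing intro first, then the auto-built Banner content, and publishers may add at most one extra page.

```python
# Hypothetical section labels for the seven-section taxonomy.
CORE_SECTIONS = ["Overview", "Admission", "Costs & Fees", "Courses",
                 "Schedule", "Outcomes", "Contact"]

def build_program_site(banner_data, marketing_data, extra_page=None):
    """banner_data and marketing_data map section name -> content.
    extra_page is an optional (name, content) pair for outlying content."""
    pages = {}
    for section in CORE_SECTIONS:
        parts = []
        if marketing_data.get(section):   # CMS-edited intro goes first
            parts.append(marketing_data[section])
        if banner_data.get(section):      # then the auto-built Banner data
            parts.append(banner_data[section])
        if parts:                         # empty sections are omitted
            pages[section] = "\n\n".join(parts)
    if extra_page:                        # at most one additional page allowed
        name, content = extra_page
        pages[name] = content
    return pages
```

A school that never touched the CMS would simply get pages built from `banner_data` alone, which is the "didn’t have to use our CMS at all" case.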
Page data & CMS wireframes
Along with the new program site data structure above, the new wireframes included new calls to action that made it easier to apply to the program, to submit questions about it and to request more information by mail, all identified by the existing research as problematic. These were some of the secondary goals of the project that I built into the requirements.
Additional features

To support the improved contact feature in particular, we made a key functional change. We removed our ‘Ask an advisor’ form-driven progressive question application, which essentially erected barriers to submitting questions because it took so long to get through. Instead, we implemented a simple ‘Contact Us’ form that went to our central enrolment services staff. Users had to read a short preamble and tick a checkbox before they could complete the form. Program staff could also add their own contact information above the form on the ‘Contact Us’ page. This improved the user experience by leaps and bounds.
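The gating rule for the new form is simple enough to sketch. This is a hypothetical server-side check with invented field names, not the production code: the preamble checkbox must be ticked before a question is accepted.

```python
def validate_contact_form(fields):
    """Return a list of validation errors; an empty list means the
    submission may proceed. Field names are illustrative."""
    errors = []
    if not fields.get("read_preamble"):   # the acknowledgement checkbox
        errors.append("Please confirm you have read the information above.")
    if not fields.get("question", "").strip():
        errors.append("Please enter your question.")
    return errors
```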
Iteration, testing & feedback
At key points in the design and development process, I tested our progress with internal stakeholders:
- Front-end wireframes and mockups were presented to all levels of the institute via demo and feedback sessions.
- New CMS editing page process and comps were ‘socialized’ with school CMS editors.
- When feedback was integrated and the new publishing process was functional, I lab-tested it with our CMS editors.
- It was during this phase we found we needed to add a new kind of ‘umbrella’ page, allowing publishers to select programs by code and add marketing content.
Further into the development process, but while design changes were still possible, I took our front-end development to student lab-based testing:
- Functional beta versions of program catalogue sites and umbrella pages were lab-tested.
- Students were given task completion tests with minimal intervention.
- We observed and took notes regarding issues.
- Think-aloud feedback was encouraged and the entire sessions were audio recorded.
- Notes and recordings were used to adjust visual and interaction design elements.
Outcome

We successfully blended marketing and Banner data into one program catalogue site, and the long single Banner pages were gone. Marketers could add promotional content to the drier Banner data and have it all appear in one place. End-user confusion and content duplication were eliminated. Follow-up research indicated that site visitors found the new design far more pleasing, and we were now able to track ‘Apply Now’ call-to-action clicks within a single user session. As a result of the interaction design changes, submitted questions and requests for program information by mail both increased substantially. In the case of program questions, new requests “went through the roof” immediately after the launch of the new catalogue. In fact, removing the Ask an Advisor application and replacing it with a clear call to action and a simple request form was a real a-ha moment in this project. Requests went up by several hundred percent.
This meant we needed to qualify questions better post-launch, but it was a real eye-opener as to how very small user experience changes can make big improvements.