The Process of Restructuring a School’s Information Architecture — a UX Case Study
In this article, I will take you through the entire process of restructuring a school's IA: the research we did to narrow down our persona, content auditing and cognitive walkthroughs, our contextual inquiry and heuristic evaluation of the current site, its competitive analysis, and finally the iterative process of restructuring the site, which includes methods like card sorting, treejack tests, usability tests and prototyping.
I’ve also included links to my notes and research.
Redesign the Information Architecture (IA) for Singapore Polytechnic's Professional and Adult Continuing Education (PACE) Academy.
Singapore Polytechnic, or SP for short, has a dedicated school under it for Continuing Education (CE). PACE Academy offers short courses, post-diploma programmes, online courses and earn & learn schemes for working professionals who want to up-skill and further their education.
We were also assigned 3 personas:
Mark, an 18-year-old prospective student
Jessica, a 21-year-old university student
John, a 38-year-old working professional
First, we did a content audit of Singapore Polytechnic's main site to better understand and visualise the content they had. A high-level sitemap of SP's main site allowed us to highlight the three personas' potential journeys in different colors and see which pages each persona would visit.
This quickly identified our persona as John, the 38-year-old working professional, since Mark would only use SP's main site, and not the PACE Academy's website.
Next, we worked on a content inventory and a more detailed sitemap of SP PACE, which allowed us to see the number of unique pages SP PACE had. We also did a qualitative sitemap/user flow that gave us a visual of every page at every step of the way.
View PACE’s content inventory here.
Now that we knew our content, we gathered 6 people who resembled our persona and gave them specific tasks to carry out on the PACE site.
By observing users working toward their assigned goals on the site, we got a first-hand view of the pain points they encountered when trying to navigate it. Most of the participants were frustrated navigating the site, found the pages to be too long and full of text, and some almost missed chunks of information because the pages lacked formatting.
These were some of the pain points that we took into consideration moving forward with the site restructure. You can find the full interviews here.
Heuristic Evaluation — Site Analysis
After we got users to use the site, our team ran an internal check of our own. Combing the site, we noted usability issues on the pages that affected our users' journey, and annotated and kept a record of them with UX Check. Given the tight timeline of 2 weeks, we then picked out the more severe heuristic violations and focussed on them. Here are some of our findings:
#1 Confusing placement of nav bars and breadcrumbs
Right off the bat, the unconventional placements of SP's main nav bar, PACE's nav bar and the breadcrumb proved to be the most confusing. Only the global nav bar, which led users to pages on SP's main site, was sticky. Scrolling further down would hide PACE's nav bar, leaving users thinking SP's global nav was also the nav for PACE.
The existing breadcrumb is also placed above both the sub-nav bar and hero banner, essentially defeating the purpose of the breadcrumb.
#2 Broken link left unchecked… or long stack of banners?
The first thing that catches your attention when you land on the PACE site is a long stack of 7 banners. We couldn’t tell if the link was broken, or if this was intentionally placed as such, but either way, users might not see the content hidden under the stack of banners because of the lengths (quite literally) they have to scroll to get there.
We also noticed that important information (such as course calendars and upcoming schedules) is part of these banners. Our recommendation is that this information take a permanent place within the website, as it affects whether or not the user chooses to enrol in the school.
#3 Severe lack of organisation
We found that SP PACE had a relatively low number of clicks to get to the end page. So what was taking users so long to find information that they needed?
Zooming in on PACE’s Course Catalog, it was easy to see:
The courses have been categorised alphabetically! This caused the page to run really long, discouraged exploration of the available courses, and quickly fatigued our users, who had no context for what they were searching for.
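The fix we had in mind is simple to express in code: group courses by subject area instead of listing one long alphabetical run. Here is a minimal sketch with hypothetical course titles and subject labels (not PACE's actual catalogue):

```python
from collections import defaultdict

# Hypothetical flat, alphabetised catalogue: (course title, subject area).
# The real catalogue would carry far more entries and metadata.
courses = [
    ("Accounting Basics", "Business"),
    ("Advanced Python", "IT"),
    ("Agile Project Management", "Business"),
    ("Basic Electronics", "Engineering"),
]

# Re-group the flat list under subject-area headers.
by_subject = defaultdict(list)
for title, subject in courses:
    by_subject[subject].append(title)

for subject in sorted(by_subject):
    print(subject, "->", by_subject[subject])
```

Grouping like this lets a user such as John scan a handful of subject headers instead of the whole alphabet.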
#4 Poor information structure
Say the users get to the course they want. The next step they face is this:
The information structure on single course pages was cluttered, messy, and lacked formatting, which also meant a lack of hierarchy. For John, time is of the essence, and a page with no hierarchy means he cannot quickly scan it for relevant information.
We then moved forward with competitive analysis. As our heuristic evaluation clearly pointed out specific problematic areas, we decided to do a qualitative heuristic comparison with other Continuing Education (CE) schools in Singapore.
Our team made points on what and how things could be improved.
The process of restructuring
With all the knowledge we had acquired thus far, we came to the meat of the process: the actual restructuring. We took the content inventory we had, and embarked on an iterative process of reorganising the architecture of the school's site, then testing it on users.
We went through the site internally as a team first, and restructured it into what we thought was better based on our competitive research. These changes included renaming section headers, shifting entire sections around, and removing redundant or repeated information.
When we all came to a consensus, we launched our first manual card sort with 3 users to gather qualitative feedback, then collected card sorts from 20 more people for more reliable results.
On closing our card sort, some headers had positive success rates, whereas others, such as "Finance" and "Training", did not fare so well. Titles that were names of subsidies, such as "National Silver Academy", also fared badly in the card sort.
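One way to spot these weak categories is to measure agreement per card: the fraction of participants who placed a card in its most popular category. A low score flags a card whose home is ambiguous. Below is a minimal sketch with made-up placements, not our actual card-sort data:

```python
from collections import Counter

# Each participant's sort: card -> chosen category (hypothetical data).
sorts = [
    {"National Silver Academy": "Schools", "SkillsFuture Credit": "Finance"},
    {"National Silver Academy": "Subsidies", "SkillsFuture Credit": "Finance"},
    {"National Silver Academy": "Schools", "SkillsFuture Credit": "Subsidies"},
]

def agreement(card):
    """Fraction of participants who placed `card` in its most popular category."""
    placements = Counter(s[card] for s in sorts if card in s)
    top_count = placements.most_common(1)[0][1]
    return top_count / sum(placements.values())

for card in ("National Silver Academy", "SkillsFuture Credit"):
    print(card, round(agreement(card), 2))
```

In this toy data both cards score 0.67, i.e. a third of participants disagreed with the majority placement, which is the kind of signal that sent us back to rename headers.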
We reconvened as a team and went through the sections that did not do well. We discussed how we could better handle the information, and what we could do to improve those headings.
We also discussed whether placing some sub-categories under more clearly defined headers lent context to them. For example, on a website where "National Silver Academy" sits under a menu titled "Subsidies", users would understand that it's a subsidy. However, when it is taken out of context to sort (as in this card sort), users think that the National Silver Academy is a type of school.
Eventually, we did not shift the page out of "Subsidies" as our users had sorted it; instead, we renamed the headers to more clearly encapsulate the pages within each menu item.
We tested whether our structure was sound with a treejack test, and this time we had a 79% success rate. When we were satisfied with our results, we went on to creating our wireframes.
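For readers unfamiliar with tree testing: the headline success rate is simply correct task completions over total attempts, pooled across tasks and participants. A minimal sketch with hypothetical task results (not our actual treejack data):

```python
# results[task] = one boolean per participant: did they reach the
# correct node in the tree? Task names and outcomes are illustrative.
results = {
    "find fees for a short course":   [True, True, True, False, True],
    "locate the course calendar":     [True, False, True, True, True],
}

attempts = sum(len(outcomes) for outcomes in results.values())
successes = sum(sum(outcomes) for outcomes in results.values())
print(f"success rate: {successes / attempts:.0%}")
```

With 8 successes out of 10 attempts, this toy data yields 80%; our real test across all tasks and participants came out at 79%.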
We started by creating the nav bar first. The structure of the nav bar was informed by our latest round of treejack tests. Then, we sketched the structures of the key pages and elements on paper before creating mid-fi wireframes in Axure.
We aimed to produce low- to mid-fidelity wireframes for our first usability test, as we still wanted users to test the flow of the website. However, we felt it was important to nail the right font sizes in the same prototype, because the feedback we received from the initial contextual inquiry concerned the legibility and readability of the fonts used.
When we finally had something visual and interactive to test, we went ahead with our usability test.
We tested 6 participants who matched our persona and gave them tasks similar to our first test to see how they would fare against the current site. Having similar tasks also ensured that the data we collected was consistent. You can view my interview questions and notes with 4 users here.
We then put observed behaviours and quotes from the usability tests into a feedback capture grid, and found certain trends in our data.
In addition to “what worked” and “things to be improved”, we discovered some interesting behaviour from our users that could not surface before we had a testable prototype: users were going back to the landing page as a means of navigation instead of using the sticky nav bar that we had worked hard to make easily accessible. When probed why, users said they recalled having seen a large area (both large placeholders and text) dedicated to the specific assigned task, and so went through the landing page, even if it meant more clicks.
With this insight in mind, we worked to incorporate our changes into a more hi-fi prototype.
Our latest prototype can be seen here.
Moving forward, I’ll be working to implement feedback and update this post with design refinements. Meanwhile, here’s the landing page with the updated design direction.
Thank you for reading.