Substantial Cost Savings & Reduced Call Time for a Top Bank
Information Architecture Research To Improve Customer Service Efficiency
Results
2 Minutes
Average Reduction in Call Handling Time
$6.2 Million
Saved in Operational Costs in a Quarter After Release
Research Contribution:
Card sorting revealed how customer service agents naturally organize information.
I analyzed these mental models and delivered clear recommendations for restructuring the Fare Rules interface.
These insights directly informed the redesign that reduced agent search time, shortened call duration, and improved operational efficiency.
After the Project: As is typical of formative research, this study was followed by a usability test, which yielded additional findings and led to a recommendation to integrate the Fare Rules application with the customer service agent dashboard.
Context: Fare Rules is a feature within a larger banking application that customer service agents use to determine travel costs.
Problem: The text-based interface made information hard to find and required agents to type commands manually, which slowed them down, increased human error, and cost the bank millions in operational expenses annually.
Role: As the Principal UX Researcher, I led method selection, data collection, and synthesis, and collaborated with product managers, help desk agents, and the design team.
Skills Applied: Card sorting, information architecture, R programming, statistical visualization, enterprise UX research, quantifiable ROI measurement, dendrogram analysis, mental model extraction.
Impact: Reduced call handling time by an average of 2 minutes, which translated into $6.2 million in operational cost savings within a quarter. Improved efficiency and usability for customer service agents by introducing a natural categorization structure. These insights directly shaped the new information architecture and improved customer service agent efficiency.
Opportunity: I identified Fare Rules redesign as a high-impact opportunity through discussions with the Senior Product Manager and clients.
Planning: I created a research roadmap six months in advance with product managers and client stakeholders, addressing application priorities that included research for Fare Rules.
Challenge: Text-based interface redesign required understanding agents' mental models of information hierarchy.
Strategy: I selected card sorting as the most suitable research approach for extracting natural categorization patterns from the agents who use the application daily.
This structured approach enabled comprehensive data collection and analysis: I completed data gathering, analysis, and presentation creation in 10 days.
The findings were then shared and discussed with the client and the UX design team in one-off conversations as needed over subsequent months.
Method:
Because Fare Rules is a complex application, I engaged stakeholders to determine the cards and confirmed a research plan before starting the study.
Participating customer service agents were scheduled through their manager, and data was gathered via Optimal Workshop.
Clustering was conducted in Optimal Workshop to identify which cards participants associated with one another.
The categories chosen for each group and the total number of groups are highlighted in yellow. The first group concerns ticket types.
Customer service agents grouped categories concerning discounts and billing in the second group.
In the third group, customer service agents grouped categories closely related to ticket policy.
The fourth group relates to booking and flight details. It contains the number and trip number cards, two fields currently used to search Fare Rules; this grouping indicates that the new design should support searching by both fields.
The Optimal Workshop analysis consolidated 51 fragmented categories into 21, defining four major groups.
Further Analysis in R
This analysis was conducted after the report was released. Working with the existing tool surfaced the following needs:
Optimal Workshop couldn't show agreement strength between individual cards and groups, making it difficult to prioritize which groupings mattered most.
Stakeholders needed clearer visual evidence to approve major interface changes affecting thousands of agents.
Statistical validation was necessary to confirm that the patterns weren't random but represented genuine mental models.
The design team required specific guidance on which elements absolutely needed to be grouped together versus which had flexibility.
The following is a dendrogram I created, a diagram used to determine which cards should be grouped together:
The dendrogram visualizes how participants grouped cards based on similarity, revealing five distinct information categories that agents naturally use in their workflow. These five clusters would later be used to design pages or sections of pages.
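As an illustration of the kind of R workflow behind a card-sort dendrogram, here is a minimal sketch. The file name, matrix format, linkage method, and cluster count are assumptions for illustration, not the study's exact script.

```r
# Minimal sketch (assumptions: a card-by-card co-occurrence matrix
# exported to "card_cooccurrence.csv"; average linkage; 5 clusters
# to match the groups described above).
co_occurrence <- as.matrix(read.csv("card_cooccurrence.csv", row.names = 1))

# Convert co-occurrence counts into distances: cards that participants
# frequently placed together end up close to each other.
distance <- as.dist(1 - co_occurrence / max(co_occurrence))

# Agglomerative hierarchical clustering, then plot the dendrogram.
hc <- hclust(distance, method = "average")
plot(hc, main = "Card Sort Dendrogram", xlab = "Cards", sub = "")

# Mark five clusters on the plot and extract cluster membership.
rect.hclust(hc, k = 5, border = "steelblue")
clusters <- cutree(hc, k = 5)

# One simple sanity check that the structure is not arbitrary:
# a cophenetic correlation near 1 means the tree preserves the distances well.
cor(cophenetic(hc), distance)
```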
The dendrogram is useful, but it does not show how strongly individual participants agreed on assigning cards to clusters. For that, I needed a heatmap, the diagram seen below:
The heatmap analysis showed precisely which features agents consistently associated with each other.
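For a sense of how such a heatmap can be produced in base R, here is a minimal sketch under the same assumptions as the dendrogram example; the actual analysis may have used different packages or inputs.

```r
# Minimal sketch, reusing the hypothetical co-occurrence matrix above.
co_occurrence <- as.matrix(read.csv("card_cooccurrence.csv", row.names = 1))
similarity <- co_occurrence / max(co_occurrence)

# Rows and columns are cards; brighter cells mean more participants
# placed that pair of cards in the same group. Rows and columns are
# reordered by the same average-linkage clustering used for the dendrogram.
heatmap(similarity,
        distfun = function(x) as.dist(1 - x),
        hclustfun = function(d) hclust(d, method = "average"),
        scale = "none",
        main = "Card Pairing Agreement Across Participants")
```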
As Senior Manager of User Research at Nabler, I led a three-person team in a mixed-methods study for a higher-education client. Building on prior data, we pinpointed the best email to send prospective students, increasing enrollment and boosting conversion rates.
As the Principal User Researcher at Tallwave, I authored a playbook to disseminate, standardize, and optimize methods across the UX team. Integrated into Community of Practice sessions, it saw widespread adoption, elevating how the team gathered and applied user insights.