🕣 Duration: 5 weeks
Mercer Brain – AI Powered Knowledge Platform
Mercer Brain is an AI-enabled cognitive search and content management solution. It was designed to unify and streamline search across all Mercer sites and apps, making it faster and easier for both clients and employees to locate internal content and to create new content.

100%
Of beta users stated it took them less time to load, tag and publish content to Brain compared to current repositories they use.
70%
Of respondents stated they would choose Brain over any other search tool they currently have available in the organisation.
8.7
NPS Score, reflecting healthy user adoption and enthusiasm for the product.
My Role
Year: 2019
Role: Product Designer & Lead
Responsibilities: Owned end-to-end design for Mercer Brain, an AI-powered knowledge platform serving 25,000+ consultants globally. Led discovery, research, and design strategy across a 5-week sprint, partnering with Product and Engineering leadership to balance ambitious AI capabilities with technical feasibility.
Drove the complete design process, from system mapping and competitive analysis through user validation, with 100% of beta users reporting a faster content workflow and 70% preferring Brain over existing tools.
Team: Collaborated with Product Owner, PM, and 2 engineers.
Key Pain Points
Content scattered across multiple systems, making information hard to find
Inefficient keyword-based searches returning irrelevant results
Version control issues causing use of outdated documents
Time lost on document retrieval rather than delivering client insights
Tedious document upload and approval processes causing delays



STEP 1
System Mapping to Wireframe-Flow
I started by mapping the AI content journey and underlying systems to create a technical blueprint, identifying key handoff points for human and AI collaboration. Early client syncs ensured the design aligned with business goals by assigning a priority level to each user goal and technical constraint.






STEP 2
Competitive Analysis
Building on the wireframe-flow, I mapped competitor flows to identify best practices and set clear UX standards for document creation, upload, and publishing.



STEP 3
Critical Design Screens
Using the competitive analysis, I created high-fidelity wireframes for the critical flow screens to gather early client feedback and align on the core user journey. Once they were approved, I started designing key components.






STEP 4
Task-Based User Research
I then ran targeted user tests, giving participants specific tasks—like uploading a document—alongside open-ended questions to validate the flows. See key findings below.






100%
Users knew how or where to upload a document.
2/5
Mentioned the need for Document Classification (Tags).
3/5
Weren't sure where to find their uploaded document.
Final Design






Trade-Offs


LLM-Driven Document Summarisation
Rejected: Provide immediate summaries of uploaded documents, eliminating the need to write them manually and reducing peer-review time.


Unfinished global search
We did not have enough time to fully understand how the search would work, as it was complex and needed to sync well with sister/parent platforms, i.e. Marsh McLennan.
Reflection and Key Takeaways
Speaking Up for Support: I learned to proactively scope and advocate for resources (time, technical insights, or budget) early in the process. If I could time travel, I would have told myself to request dedicated technical architecture review syncs with engineering right after the system mapping phase, not after the wireframes were complete.
Designing the Invisible: Because back-end systems (databases, APIs, AI models, servers, etc.) are invisible and designers often lack the technical tools to manipulate them directly, a close, communicative partnership with engineering is crucial. This partnership provides the essential technical blueprint needed to turn constraints into reliable user flows.




Fun Fact
This was my very first major project and I had to lead the whole thing. It was basically a crash course in leadership—and a masterclass in pretending I knew what I was doing!
STEP 5
Improvements After Research
I refined the design to highlight AI auto-tagging suggestions and improve the discoverability of uploaded files by adding a pulsing animation to the circle icon on the workbench.
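To illustrate the kind of attention cue I had in mind, here is a minimal sketch using the browser's Web Animations API; the selector name, timing, and scale values are assumptions for illustration, not the shipped implementation.

```typescript
// Illustrative sketch only: the selector, duration, and scale values are
// assumptions, not the production Mercer Brain code.
function pulseWorkbenchIcon(selector = ".workbench-upload-indicator"): void {
  const icon = document.querySelector<HTMLElement>(selector);
  if (!icon) return;

  // Web Animations API: gently scale and fade the circle icon a few times
  // to draw the eye toward a newly uploaded document without blocking input.
  icon.animate(
    [
      { transform: "scale(1)", opacity: 1 },
      { transform: "scale(1.15)", opacity: 0.7 },
      { transform: "scale(1)", opacity: 1 },
    ],
    { duration: 1500, iterations: 3, easing: "ease-in-out" }
  );
}

// e.g. trigger once an upload completes
pulseWorkbenchIcon();
```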



BEFORE
The initial version required Manual Tagging. However, user feedback indicated that finding suitable tags, particularly those that improved document visibility, was often difficult and time-consuming.



AFTER
We introduced an Automated Tagging Suggestion feature. Instead of requiring users to manually tag documents after upload, the system now offers AI-generated tag recommendations.
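For illustration, a rough sketch of how the suggestion flow could be wired on the client side; the endpoint path, response shape, and confidence threshold below are my assumptions rather than the actual Mercer Brain API.

```typescript
// Illustrative sketch only: the endpoint path, response shape, and threshold
// are assumptions; the case study does not document the real API.
interface TagSuggestion {
  tag: string;
  confidence: number; // 0..1 score from the tagging model
}

async function fetchSuggestedTags(documentId: string): Promise<TagSuggestion[]> {
  // Hypothetical endpoint returning AI-generated tag recommendations
  // for a freshly uploaded document.
  const response = await fetch(`/api/documents/${documentId}/suggested-tags`);
  if (!response.ok) throw new Error(`Tag suggestion failed: ${response.status}`);
  return response.json();
}

// Suggestions above a confidence threshold are pre-selected in the UI,
// but nothing is applied until the user confirms or edits them.
async function preselectTags(documentId: string, threshold = 0.7): Promise<string[]> {
  const suggestions = await fetchSuggestedTags(documentId);
  return suggestions.filter((s) => s.confidence >= threshold).map((s) => s.tag);
}
```

The key design choice was keeping the user in control: the system proposes tags, but the consultant always confirms them before publishing.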


If you need to hear the amazing story behind the
project, don't be a stranger and reach out
Anthony.kellly@gmail.com