About Mercer HDM

      Mercer has been consulting on health and benefits for clients since 1945. However, consultants and analysts are limited by the broking data they can access: they can rely only on who's in their network and anecdotal stories from colleagues to judge whether their clients' rates are competitive. At the same time, they have to re-enter the same client information manually into multiple tools. There was a clear demand for a tool that aggregates all the health and benefits data and manages client relationships for these consultants.

      That's what Mercer HDM does. It aggregates and ingests H&B data from structured and unstructured sources using machine learning, which allows us to 1) offer our clients a data-driven brokerage service, 2) optimize existing workflows, 3) automate the RFP process, and 4) automate the creation of key compliance documents. Together, these enable consultants to focus on strategic consulting and business development rather than repetitive data entry.


      Project Information

        My Role

          Design Lead in Product Team @ MMC


          Enterprise, machine learning, data management, usability


          Sep. 2018 - present


          Sketch, InVision, Zeplin
          plus a lot of paper sketches and meetings


        The main challenge we encountered at the beginning was that every consultant has their own way of working. Before defining a consolidated way of working for them, we did an extra round of exploratory research on our end users - the consultants.

      02.The 1st Diamond

      User Interviews

      To find a promising area where Mercer HDM could create value, a user researcher and I started by conducting exploratory interviews with real consultants from 15 different offices. The purpose of the interviews was to explore how the platform could be applied to their work, to discover potential problems in their workflow, and to identify the key aspects we should focus on.

      Refined Project Scope

      03.The 2nd Diamond

      Key Insights

      Based on the user interview findings, I synthesized the key values of HDM and further defined the problems I would like to address through the design.

      Opportunity Statement

      04.The 3rd Diamond: Planning

      Sprint Plan

      After analyzing all the data collected from the previous user research, the user researcher and I created personas and user journeys for consultants at different job levels. Based on those findings, the whole product team sat together and decided on our product value approaches and a sprint plan for our agile process.

      Approach 1: Human-in-the-loop

      The platform is designed to let humans and AI work together in a feedback loop, in which consultants review what the AI system extracts. The objective is twofold: to enable continuous crowd-training, so that machine learning accuracy matures with more users and more data, and to ensure sufficient governance and oversight. The latter is market best practice for AI adoption regardless of the accuracy level achieved, as AI ethics is attracting increasing attention and enforcement from many legislatures.

      Approach 2: API, Microservices, and reusability

      The platform’s architecture adopts microservices and APIs orchestrated via data gateways for the Machine Learning, Visualization, and Core Platform services. The platform not only reuses many MercerOS components but also generates some common renewal reports that may be shared with GBMA.
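      The gateway pattern described above amounts to a thin routing facade in front of independent services. The sketch below is an assumption for illustration; the service names and endpoints are invented, not the actual Mercer HDM deployment.

```python
# Illustrative data-gateway sketch: map an incoming request path to the
# microservice that owns it. All URLs and service names are hypothetical.

SERVICE_ROUTES = {
    "ml": "https://gateway.example.com/ml/v1",             # document extraction
    "visualization": "https://gateway.example.com/viz/v1",  # dashboards, reports
    "core": "https://gateway.example.com/platform/v1",      # clients, policies
}

def route(path: str) -> str:
    """Map a path like 'ml/extract' to the backing microservice URL."""
    prefix, _, rest = path.partition("/")
    if prefix not in SERVICE_ROUTES:
        raise KeyError(f"no service registered for '{prefix}'")
    return f"{SERVICE_ROUTES[prefix]}/{rest}"

print(route("ml/extract"))
```

      Because callers only ever see the gateway, individual services (and reused MercerOS components behind them) can be swapped or scaled without touching clients.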

      Approach 3: Change management to drive adoption

      Our change management strategy includes key messages by stakeholder group to ensure both a top-down and a bottom-up approach to messaging. Key communication activities include:
      a. Monthly focus group meetings with over 80 consultants
      b. Bi-weekly consultant design feedback sessions and usability testing
      c. Monthly usability testing
      d. Quarterly CSL market chats and monthly leadership calls

      How to Validate

      We decided to build user testing into our sprint plan. After each major sprint, we would validate the design through 6-8 one-on-one usability tests and one small-group feedback session with 20-30 consultants.

      Main Features

      Based on the user needs defined through the user research and the capabilities or constraints of Mercer OS, we were able to further delineate the main features of Mercer HDM.

      05.The 3rd Diamond: Deliver

      MVP Release

      We finally released Mercer HDM v1.0 in April 2019. It is the first centralized data repository in the U.S. health business. The core feature is the ability to load unstructured carrier documents and extract key policy information directly into the platform. By bringing structured and unstructured data together, we can start to get real-time access to business intelligence that both leadership and consultants benefit from. An additional benefit of centralizing the data is the opportunity to move from cash to accrual accounting.
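      Conceptually, the extraction step pulls a handful of key fields out of free-form carrier text. The sketch below is a toy stand-in: the field names and regex patterns are illustrative assumptions, and the real platform uses trained ML models rather than regexes.

```python
import re

# Toy stand-in for the ML extraction step: pull key policy fields out of
# unstructured carrier text. Patterns and field names are illustrative only.

FIELD_PATTERNS = {
    "carrier": re.compile(r"Carrier:\s*(.+)"),
    "effective_date": re.compile(r"Effective Date:\s*([\d/]+)"),
    "monthly_rate": re.compile(r"Monthly Rate:\s*\$([\d,.]+)"),
}

def extract_policy_fields(document: str) -> dict[str, str]:
    """Return whichever key fields can be found in a carrier document."""
    found = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = pattern.search(document)
        if match:
            found[name] = match.group(1).strip()
    return found

doc = """Renewal Notice
Carrier: Acme Health
Effective Date: 01/01/2019
Monthly Rate: $512.40
"""
fields = extract_policy_fields(doc)
print(fields)
```

      Once fields like these land in the central repository as structured records, they become queryable business intelligence instead of text trapped in PDFs.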

      It consists of a centralized data lake that makes it possible to analyze and report on the data in a consistent and visually impactful way, better informing consultants about key policy quoting trends by carrier.

      Scenario 1: Find Client's Data

      All client data is neatly laid out on the platform and easy to find, edit, and manage.

      Scenario 2: Flexibility

      The data structure suits all the needs a client might have, and data is easy to update or edit. Many data sets have common calculation methods built in, so results can be generated instantly.
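      As one hypothetical example of such a built-in calculation (the function name and figures are invented for illustration, not taken from the product): year-over-year premium change, a number consultants would otherwise compute by hand at every renewal.

```python
# Hypothetical built-in calculation attached to a policy data set:
# percentage change from the prior year's premium to the renewal premium.

def rate_change_pct(prior_annual_premium: float, renewal_annual_premium: float) -> float:
    """Percentage increase (or decrease) from prior year to renewal."""
    if prior_annual_premium <= 0:
        raise ValueError("prior premium must be positive")
    change = (renewal_annual_premium - prior_annual_premium) / prior_annual_premium
    return round(change * 100, 2)

print(rate_change_pct(100_000, 107_500))  # -> 7.5
```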

      Usability Testing

      We conducted usability testing after we finished mid-fidelity wireframes in each sprint. Each test covered only part of the user flow, sometimes with only a specific persona group. We usually started by listing all the research questions, so that we understood the main goal of the test and what to focus on, and then wrote the script we would use. During each test, we shared our interactive wireframes and asked participants to share their screen and perform tasks, while we took notes to collect both quantitative and qualitative data. When analyzing the collected data, we looked for patterns and kept a count of problems that occurred across participants.
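      The tallying step at the end of that analysis is simple to sketch. The problem labels below are made up for illustration; only the counting pattern reflects the process described above.

```python
from collections import Counter

# Sketch of the analysis step: count how many participants hit each
# usability problem, so the most frequent ones rise to the top of the fix list.
# Session notes and problem labels are invented examples.

session_notes = [
    ["missed filter", "confused by navigation"],
    ["missed filter"],
    ["confused by navigation", "missed filter"],
    ["slow to find renewal date"],
]

problem_counts = Counter(p for notes in session_notes for p in notes)

for problem, count in problem_counts.most_common():
    print(f"{count}/{len(session_notes)} participants: {problem}")
```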

      Wireframes and Iterations

      To improve the glanceability of the content and reduce the effort required to find information, we made a series of user interface iterations after usability testing. In version 1, we showed only plan information for each line of coverage, but we learned that consultants constantly need to find old data, so we added a filter to help them locate historical data more easily. In version 3, we learned that most consultants focus on one or two lines of coverage and don't need the main navigation to switch around, so we made the navigation collapsible and added breadcrumb titles above the name of each line of coverage.

      Post MVP

      To achieve a more desirable user experience, further technical exploration as well as additional user studies are needed.