
Tracking System Adoption and Usage

How do you measure the success of a design system?

October 7, 2020—There are many ways to assess the impact of a design system, ranging from fully automated approaches, such as scanning a codebase for design system code, to manual assessments that identify where and how the system is used. But the metric that underpins all these approaches is adoption. Product teams cannot get value from the system if they are not using it, and they cannot benefit from new capabilities if their implementation is not up to date.

At Morningstar we’ve been tracking adoption of our design system since the first release back in 2017, using a combination of surveys, coverage assessments, and direct engagement with product teams. While all of these methods have value, they are labor-intensive and provide an incomplete, subjective view of system use.

To more effectively serve product teams, we needed a comprehensive view, and we needed to replace time-consuming surveys with a self-service, automated approach.


Guiding Principles

As we began our discovery, we had a few principles that guided our approach:

  • Automation was key. The design system needs to serve product teams, not the other way around.
  • We needed to look beyond simple adoption to understand how the system is actually being used. When we make a breaking change to a component, who needs to know? Are any teams still using older, out-of-date versions of one or more components? To make well-informed decisions, we need accurate data.
  • We needed a self-service approach that puts product teams in control. While we believe strongly in the value of our system to increase velocity, cohesion, and quality across our product line, product teams own their road maps. Our goal is to inform decisions, not dictate them.


Introducing the MDS Adoption Dashboard

With these principles in mind, the MDS team will soon launch an automated, self-service dashboard to help product teams track their adoption of MDS version 3, following a three-step process that allows teams to adopt gradually in accordance with their road maps.

Adoption Dashboard

The first two steps focus on planning. Product teams complete a brief survey to share the link to their repo(s) along with a few other details, and then follow up with the MDS team to ensure we have the tools and resources to support their requirements. We will also use this data to provide targeted communications about updates and changes to the system.

The third step—for products to upgrade to MDS version 3 and remove dependencies on earlier versions—will be tracked automatically from usage data, based on the MDS dependencies defined for each project in source code.
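To make that third step concrete, here is a minimal sketch of how a dependency-based check might work, assuming MDS packages are published under a single npm scope. The `@mds/` scope, file names, and helper names below are illustrative, not the actual MDS package names:

```ts
// Hypothetical sketch: the "@mds/" package scope is an assumption,
// not the real MDS package naming.
import { readFileSync } from "fs";
import { join } from "path";

export interface MdsDependency {
  name: string;         // package name, e.g. "@mds/button" (illustrative)
  range: string;        // version range declared in package.json
  major: number | null; // major version inferred from that range
}

// Extract the first major version number from a semver range like "^3.2.1".
function inferMajor(range: string): number | null {
  const match = range.match(/(\d+)\./);
  return match ? Number(match[1]) : null;
}

// Read a project's package.json and return its declared MDS dependencies.
export function findMdsDependencies(repoPath: string): MdsDependency[] {
  const pkg = JSON.parse(readFileSync(join(repoPath, "package.json"), "utf8"));
  const deps: Record<string, string> = {
    ...pkg.dependencies,
    ...pkg.devDependencies,
  };
  return Object.entries(deps)
    .filter(([name]) => name.startsWith("@mds/")) // assumed scope
    .map(([name, range]) => ({ name, range, major: inferMajor(range) }));
}

// A project counts as "upgraded" when every MDS dependency is on major 3.
export function isOnVersion3(repoPath: string): boolean {
  const deps = findMdsDependencies(repoPath);
  return deps.length > 0 && deps.every((d) => d.major === 3);
}
```

A real check would likely also consult lockfiles and transitive dependencies, but the declared dependencies are enough to tell whether a project has moved off earlier major versions.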

The Truth is in the Code

Tracking MDS dependencies in source code ensures a single and comprehensive view of system usage. While components, capabilities, and even applications are often reused and deployed in multiple contexts, all source repositories are accessible through a common environment, allowing scans to be automated.
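As a rough illustration of what an automated scan might look like, the sketch below assumes each product repository has been cloned into a single workspace directory and reuses the dependency check from the previous sketch. The actual scan runs against the shared source-hosting environment rather than local clones:

```ts
// Hypothetical sketch: assumes product repos are cloned under one workspace
// directory; paths and module names are illustrative.
import { existsSync, readdirSync } from "fs";
import { join } from "path";
import { findMdsDependencies } from "./findMdsDependencies"; // sketch above

export function scanWorkspace(workspaceDir: string) {
  // Treat every subdirectory of the workspace as a candidate product repo.
  const repos = readdirSync(workspaceDir, { withFileTypes: true })
    .filter((entry) => entry.isDirectory())
    .map((entry) => join(workspaceDir, entry.name));

  // Only repos with a package.json can declare MDS dependencies.
  return repos
    .filter((repo) => existsSync(join(repo, "package.json")))
    .map((repo) => ({
      repo,
      mdsDependencies: findMdsDependencies(repo),
    }));
}
```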

Component usage by product

Since each MDS Vue component is a separate package, we can identify the specific versions of each component in use. This will allow us to identify which teams may be out of date with one or more components and which need to be notified when new versions are released. It will also help us direct investment toward the most used and most valuable components.
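A hedged sketch of how per-component version data could feed those notifications: given the latest published version of each component (the package names and versions below are illustrative), flag any product whose declared range is still on an older major:

```ts
// Hypothetical sketch: "latestVersions" would come from the package registry;
// component names and versions here are illustrative, not real MDS packages.
import { findMdsDependencies } from "./findMdsDependencies"; // sketch above

const latestVersions: Record<string, string> = {
  "@mds/button": "3.4.0",
  "@mds/modal": "3.1.2",
};

// Report which MDS components in a repo are behind their latest major release.
export function findOutdatedComponents(repoPath: string) {
  return findMdsDependencies(repoPath)
    .filter((dep) => dep.name in latestVersions)
    .map((dep) => ({
      component: dep.name,
      declared: dep.range,
      latest: latestVersions[dep.name],
      // A declared range on an older major is a strong signal that the team
      // should be notified before the next breaking change ships.
      behindMajor:
        dep.major !== Number(latestVersions[dep.name].split(".")[0]),
    }))
    .filter((entry) => entry.behindMajor);
}
```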

Usage Does Not Mean Quality

While usage metrics assess the scope and impact of design system use, it is important to note that usage does not indicate coverage or quality. The dashboard tells us which versions of MDS components a given application is using, but it does not tell us how many instances of those components exist, how they are deployed, or whether they are used in accordance with brand guidelines. Just as a design system does not replace the need for designers, this dashboard does not replace the need for close collaboration with product teams to ensure the system is meeting real user needs. It simply provides another lens to guide and inform those efforts as the system evolves.


What's Next?

Once the dashboard is live, the next step is to work with product teams to ensure all products and capabilities are accurately reflected on it. As adoption proceeds, we will follow up with tools to help product teams assess coverage and quality, and we will offer consulting to teams that need more focused, extended support from MDS to ensure they are getting the most out of the system.