S&C Electric Design System Dashboard · Shipped ✅

DS Analytics
Dashboard

An internal tool to monitor design system health — designed in 7 days under significant data constraints.

Duration
7-day sprint
Role
End-to-End UX & UI
Team
S&C Electric Design System Team
Tools
Figma · Miro · Bitbucket
Outcome
+17% adoption · −13% detachment
DS Analytics Dashboard — component adoption monitoring tool
Overview

Making a Design System's
Health Legible

Design systems are only as valuable as their adoption. S&C Electric's Design System team had built a strong component library — but had no way to answer a basic question: is anyone actually using it?

I was assigned this as a single sprint ticket within a 2-week cycle at my internship. The requirements doc was blank. The data was scattered. And the answer to "where do we even get the numbers?" took two stakeholder conversations to figure out.

The result is a single-page dashboard that pulls from multiple alternative data sources to give the team a continuous picture of component adoption, detachment rates, and team-level usage — the visibility they needed to prioritize roadmap decisions.

Duration
7-day design sprint
Role
End-to-End UX & UI Design · Research · Program
Team
S&C Electric Design System Team
Tools
Figma · Miro · Bitbucket
Status
Shipped ✅
Final Design

Three Views.
One Clear Picture.

The dashboard is organized into three interconnected views: a high-level summary, a product-by-product adoption breakdown, and a component-by-component usage analysis. Together they answer both the "are we healthy?" question and the "where should we focus?" question.

DS Analytics Dashboard — overview summary view
View 01 — Dashboard Overview

The Health of the
System at a Glance

The top-level view surfaces summary cards for overall adoption rate, total component usage, detachment rate, and active contributors — giving design leads an instant read on whether the system is growing, plateauing, or declining. A timeline chart shows adoption trends across the quarter so changes can be correlated with team events or new component releases.

Summary Cards · Adoption Trend · Single-page
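The summary-card metrics above can be sketched in a few lines. This is an illustrative assumption, not S&C's actual schema or formulas: the `Instance` record, its fields, and the idea that detachment rate is detached instances over total instances are all hypothetical.

```python
# Hypothetical records: one row per component instance found in a Figma file.
# Field names and metric formulas are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Instance:
    component: str
    team: str
    detached: bool  # True if the instance was detached from the library

def summary_cards(instances: list[Instance]) -> dict:
    """Compute the top-level dashboard numbers from raw instance records."""
    total = len(instances)
    detached = sum(1 for i in instances if i.detached)
    return {
        "total_usage": total,
        "detachment_rate": detached / total if total else 0.0,
        "active_teams": len({i.team for i in instances}),
    }

cards = summary_cards([
    Instance("Button", "Grid Ops", False),
    Instance("Button", "Grid Ops", True),
    Instance("Card", "Metering", False),
    Instance("Table", "Metering", False),
])
print(cards)  # {'total_usage': 4, 'detachment_rate': 0.25, 'active_teams': 2}
```

Recomputing these cards from raw records, rather than storing pre-aggregated numbers, is what lets the same data feed the per-product and per-component views below.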
Usage breakdown by product — which teams are using the design system
View 02 — Usage by Product

Which Teams Are
Building on the System

The product breakdown answers a key stakeholder question: which product teams are actually adopting the design system, and how intensively? Bar charts compare adoption rates across product lines, making it immediately visible if a particular team is lagging — prompting a targeted conversation rather than a broad guess about where to focus enablement efforts.

Team Breakdown · Adoption Rates · Comparative View
Usage breakdown by component — which components are used, detached, or ignored
View 03 — Usage by Component

Which Components Are
Loved, Ignored, or Broken

The most actionable view for the design system team: a component-by-component matrix showing usage counts, detachment rates, and variant selections. A high detachment rate signals a component that's being used as a starting point but not staying in sync — a direct flag that the component needs to be redesigned or better documented to fit real use cases.

Detachment Rate · Usage Counts · Variant Analytics
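The "flag high detachment" logic described above amounts to a simple threshold rule. A minimal sketch, assuming a per-component matrix of usage and detachment counts; the 30% cutoff and the component names are hypothetical:

```python
# Hypothetical flagging rule for the component matrix: surface components whose
# detachment rate exceeds a threshold. The 30% cutoff is an assumption.
def flag_high_detachment(matrix: dict[str, dict], threshold: float = 0.30) -> list[str]:
    """Return component names whose detached/usage ratio exceeds the threshold."""
    return sorted(
        name for name, row in matrix.items()
        if row["detached"] / row["usage"] > threshold
    )

matrix = {
    "Button": {"usage": 200, "detached": 20},   # 10% — healthy
    "Modal":  {"usage": 40,  "detached": 18},   # 45% — flagged for redesign
}
print(flag_high_detachment(matrix))  # ['Modal']
```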
The Problem

You Can't Improve
What You Can't Measure

S&C Electric's Design System team was flying blind. The component library was being consumed by product teams, but the team had no reliable way to know which components were popular, which were being detached, or which teams were genuinely adopting the system versus copying components one-off.

Without this visibility, roadmap decisions — which components to build next, which ones to fix, where to invest documentation effort — were based on gut instinct rather than evidence.

HMW

How might we monitor the success and health of S&C's Design System with limited data availability?

Research

Figuring Out the Why
Before the What

Before touching a single frame, I ran stakeholder interviews with management to understand what questions the dashboard needed to answer. The goal wasn't to design a dashboard — it was to surface the decisions the team was currently making blindly, then design exactly the visibility they needed to make those decisions well.

Six distinct use cases emerged from the research, each one representing a question the team needed to be able to answer on demand.

01
Component Popularity

Find out which components are used most frequently across all products — to validate roadmap priorities.

02
Detachment Signals

Identify components that are frequently detached — a signal that the component doesn't fit real design needs and should be reconsidered.

03
Variant Usage

See how collaborators use specific variants — to understand which design options teams actually reach for in practice.

04
Under-used Components

Surface components that exist but are rarely used — either a discovery problem, a documentation problem, or a fitness problem.

05
Cross-library Comparison

Compare adoption rates between two component libraries to inform decisions about which library to invest in and which to deprecate.

06
Team-level Breakdown

See which product teams are using each library — to identify who needs enablement support and who can be case studies for wider adoption.

Use cases and data mapping from stakeholder research
Use case mapping — stakeholder interviews translated into dashboard requirements
The Data Challenge

To Build the Dashboard,
I First Needed the Data

This is where the real constraint surfaced. A dashboard needs data — but S&C Electric's data situation was fragmented. After the initial stakeholder interviews, I scheduled a dedicated session with the engineering team to map what actually existed.

What We Had
Incomplete by Design
  • Figma analytics tracked design-side adoption only — development implementation wasn't captured
  • Components migrated from Sketch (pre-Figma era) had no analytics at all
  • Data existed in scattered file locations — not linked, not surfaced
  • Engineering team had no centralized tracking system for component usage in code
What We Found
Alternative Sources
  • Bitbucket file locations were accessible and could be manually checked for component references
  • Figma usage data remained valid for design-side metrics — adoption, detachment, variant selection
  • The file structure was consistent enough to build a manual extraction process
  • Three distinct data sources could be combined to cover the six use cases
3
Key Finding

Three separate data sources — Figma analytics, Bitbucket file references, and manual Figma file inspection — could together cover all six business use cases identified in stakeholder research. The data existed; it just needed to be found, connected, and made legible in a single interface.

Notes from the engineering team meeting — mapping data availability
Engineering team conversation — mapping available data sources against the use cases
Data structure mapped to use cases
Data structure × use case mapping — connecting available sources to the six dashboard goals
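Connecting the three sources into one per-component record is, at its core, a join on component name. A sketch under stated assumptions — the source dictionaries, their field names, and the sample values are all invented for illustration:

```python
# Sketch of joining the three data sources into one per-component record.
# Source shapes and field names are illustrative assumptions.
figma = {"Button": {"inserts": 120, "detach_rate": 0.18}}   # Figma library analytics
bitbucket = {"Button": {"code_refs": 42}}                   # Bitbucket file scan
manual = {"Button": {"variants_seen": 5}}                   # manual file inspection

def merge_sources(*sources: dict) -> dict:
    """Merge per-component field dicts from any number of sources."""
    merged: dict[str, dict] = {}
    for source in sources:
        for component, fields in source.items():
            merged.setdefault(component, {}).update(fields)
    return merged

rows = merge_sources(figma, bitbucket, manual)
```

Because the merge is keyed on component name, a component missing from one source still gets a row — which is itself a useful signal (e.g. designed but never referenced in code).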
Design Process

From a Blank Doc
to Three Explorations

With use cases confirmed and data sources mapped, I moved into design. Rather than committing to a single approach, I created three distinct explorations — each with a different structural logic — and presented them to the team before any polish was applied.

01
Stakeholder Interviews
Define the six business use cases directly from management needs.
02
Engineering Deep Dive
Map available data sources and find alternatives for gaps.
03
Wireframing
Sketch structure and information hierarchy before any visual decisions.
04
3 Explorations
Three distinct visual and structural approaches presented in parallel.
05
V3 Selected & Refined
Team selected V3 for user-friendliness; refined to high fidelity.
Wireframe draft showing early structural exploration
Early wireframe — establishing information hierarchy and the three-view structure

Design Explorations

Three approaches explored different trade-offs: information density, visual hierarchy, and navigational complexity. Each was presented to the team at the same fidelity so the decision could be made on structure, not on polish.

Design explorations V1, V2, V3
V1
Dense Grid Layout

High information density with all metrics visible at once. Power users appreciated the overview, but non-technical stakeholders found it overwhelming.

V2
Tab-based Navigation

Separated the three views into distinct tabs, reducing visual noise. But switching between tabs made cross-view comparison harder.

V3 — Selected
Scrollable Single Page

Single-page layout with summary cards at top and progressive detail below. The team selected V3 for its user-friendliness — everything accessible without navigation decisions.

Sample data structure used for design validation
Sample data structure — used to validate that the dashboard could surface real Bitbucket and Figma analytics
Impact

Numbers the Team
Finally Had

After shipping, the design system team gained the visibility they'd been lacking. The dashboard enabled data-driven prioritization for the first time — and the results of the decisions it informed were measurable within the quarter.

Component Adoption
+17%
increase in component adoption

With visibility into which teams were underusing the system, the design system team could reach out directly with targeted enablement. Adoption grew 17% in the quarter following the dashboard launch — a measurable outcome of having actionable data.

Component Detachment
−13%
decrease in component detachment

The component-level detachment view flagged four specific components with abnormally high detachment rates. Each was redesigned with input from the product teams detaching them. Overall detachment dropped 13% once those components were rebuilt to fit actual use cases.

Testimonial from the S&C Electric Design System team
Reflection

What This Sprint
Taught Me

Takeaway 01
Blank requirements are an invitation, not a blocker

Walking into a ticket with no requirements and a one-week deadline initially felt like a blocker. It turned out to be a design problem in its own right — the first deliverable was clarifying what the dashboard needed to answer. Translating ambiguous asks into concrete use cases was the most valuable design work I did on this project.

Takeaway 02
The data problem is part of the design problem

In most product design work, the data exists and the challenge is presenting it. Here, finding alternative data sources was design work — shaping what the dashboard could even show. Design constraints come in all forms, and engineering conversations are often where the real design decisions happen.

Takeaway 03
More stakeholder research would have surfaced more use cases

I interviewed management but didn't reach the product teams who were actually consuming the design system. Their perspective on what data would change their behavior would likely have surfaced additional use cases — or reshaped existing ones. Time pressure compressed the research, but the trade-off was real.

The best tool I could build wasn't a beautiful interface — it was the one that finally let the team make a decision with evidence instead of instinct.

— Shaw Chen, reflecting on the DS Analytics Dashboard sprint
Next Case Study
Product Sentiment Analysis →
← Back to all work