Prioritizing design systems

A step-by-step approach to managing and prioritizing requests in your design system.

This article is an adaptation of my talk from the Into Design Systems Conference earlier this year. I’ve been thrilled by the positive feedback and inspired by how many teams have applied their own versions of our process. By sharing this in written form, I hope to reach even more teams looking to bring structure and clarity to their evolving design systems. For those who didn’t catch the talk, or simply prefer reading, I hope you find this article helpful. Enjoy!

Agoda’s design system began in 2018 as a small developer-led side project and has since grown into a cross-functional team of nearly 20.

When I joined to lead the design of the system in 2022, a key challenge was helping leadership understand the value of a design system. As the Agoda Design System (ADS) grew and more teams adopted it, new challenges emerged, especially in managing increasing demands.

We now support over 60 product teams and 1,600+ designers and engineers, launching over 100 A/B tests across 4 platforms weekly. With nearly 100% monthly active adoption of ADS across our consumer-facing teams, a new challenge arose — dealing with all of the work!

As our workload grew, it became clear we needed a better way to prioritize requests. Conflicting needs from different teams left us struggling to justify why one request was prioritized over another.

At nearly 20 people, internal coordination became challenging, leading to misaligned priorities and dropped requests, frustrating both our team and stakeholders.

A common challenge we faced was managing timeline expectations from fast-moving, velocity-driven teams that relied on us. As Josh Clark put it in his great article Ship Faster by Building Design Systems Slower, “When a design system team isn’t delivering new features, components, or patterns as fast as product teams need them, the team believes it’s a bottleneck.”

We experienced this as well, compounded by the fact that many teams didn’t realize how many requests we were handling, which led to misaligned expectations. Additionally, a lack of transparency eroded trust, as teams struggled to track progress or understand our decision-making process.

To tackle these issues, we developed new processes, implemented a prioritization framework, and improved communication. Next, I’ll walk through these solutions and share resources to help your team.

This is what our new request journey looks like in ADS. It starts with a request, which can be submitted by anyone at Agoda.

Then our team will review, prioritize, and groom the request. Once it’s been groomed, it can be picked up by our team or any contributing team that needs the change — pretty straightforward so far.
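If it helps to picture the flow, the journey roughly maps to a simple status model. A minimal sketch in TypeScript, where the status names are illustrative shorthand rather than our exact Jira columns:

```typescript
// Illustrative lifecycle of an ADS request; the status names are
// assumed shorthand for the journey above, not the actual Jira columns.
type RequestStatus =
  | "submitted"   // anyone at Agoda files the request
  | "in-review"   // the ADS team reviews and scores it
  | "groomed"     // prioritized and ready to be picked up
  | "in-progress" // being built by ADS or a contributing team
  | "done";
```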

Let’s dive deeper into each part of this journey:

We’ve streamlined how we manage new requests by setting up a Jira board dedicated exclusively to requests. The board organizes everything by status and orders requests from highest to lowest priority.

You can easily see each request’s type and the total number of requests, and filters let you sort by status or platform for more specific views.

Our team supports a few different types of requests:

Feature requests — such as new components or added functionality in existing components, utilities, or patterns.

Visual assets — Icons, illustrations, country flags, logotypes, and so on.

Tokens — New design tokens or overhauls of existing ones.

Tooling — Improvements to our documentation platform, Figma plugin, or other tech tooling that our team provides.

Anyone can submit a new request, and we’ve tailored the fields based on request type to gather key details, like a problem statement and proposed solution. We’ve also provided Figma templates that include design specs, use cases, and other essential information to help us fully understand the request.
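As a rough TypeScript sketch, a request like this could be modeled as follows. Only the problem statement and proposed solution are fields I’ve described above; the rest are illustrative:

```typescript
// Hypothetical model of the intake form; only problemStatement and
// proposedSolution are fields explicitly described in the article.
type RequestType = "feature" | "visual-asset" | "token" | "tooling";

interface DesignSystemRequest {
  type: RequestType;
  title: string;
  problemStatement: string;  // why the change is needed
  proposedSolution: string;  // what the requester suggests
  figmaTemplateUrl?: string; // link to the filled-in Figma spec template
}
```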

While the process may seem detailed, it helps us ensure that we fully understand the requester’s needs and that the request is truly essential. We also offer plenty of other avenues for feedback, ideas, and bug reports — such as our Slack channels for each component, which serve as great spaces for discussions and inspiration.

Next, I’ll explain how we ensure unbiased and transparent prioritization, focusing on business needs.

To do this, we score requests based on four key criteria:

1. Product Area

2. Reusability

3. Alternative Solutions

4. Effort

Each criterion is rated on a scale from “high” to “won’t fix,” and the total score determines the final priority. We intentionally weight these criteria differently — reusability, for instance, carries more weight (15 points) than low effort (10 points) because of its long-term impact.
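To make the mechanics concrete, here’s a minimal sketch of how such weighted scoring could work. The 15-point reusability and 10-point effort weights are the real numbers mentioned above; the other two weights and the per-rating multipliers are illustrative placeholders chosen so the maximum adds up to 50:

```typescript
// Sketch of weighted request scoring. The reusability (15) and effort
// (10) weights match the article; the other weights and the rating
// multipliers are assumptions chosen so the maximum totals 50.
type Rating = "high" | "medium" | "low" | "wont-fix";

const WEIGHTS = {
  productArea: 15,          // assumed
  reusability: 15,          // stated: weighted higher for long-term impact
  alternativeSolutions: 10, // assumed
  easeOfImplementation: 10, // stated: low effort is worth 10 points
} as const;

type Criterion = keyof typeof WEIGHTS;

// Fraction of each criterion's weight awarded per rating (illustrative).
const MULTIPLIER: Record<Rating, number> = {
  high: 1,
  medium: 0.5,
  low: 0.25,
  "wont-fix": 0,
};

function scoreRequest(ratings: Record<Criterion, Rating>): number {
  return (Object.keys(WEIGHTS) as Criterion[]).reduce(
    (total, criterion) =>
      total + WEIGHTS[criterion] * MULTIPLIER[ratings[criterion]],
    0,
  );
}
```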

We chose to customize our framework to our specific needs, although it shares many similarities with more established frameworks, such as the RICE scoring method that Stuart Smith recommends in 3 ways we’ve energised our design system governance, an article I recommend reading.

To illustrate how this process works, let’s consider an example — a team requests an enhancement to our existing date picker component, asking for the ability to select years far in the past or future:

1 — Product Area

First, we assess the Product Area. Does the request address a need from a business-critical project, product, or team?

Let’s say that this year selector was requested for an important upcoming launch — in that case it would receive the highest score.

2 — Reusability

Second, we check if the solution can be reused across multiple platforms, features, or teams.

To evaluate all requests fairly, we had to define exactly what reusability means for us:

Platforms & Funnels: For the year selector, it’s relevant across all platforms and funnels — hotels, flights, and activities. It could also be used for things like date of birth or passport expiry, so reusability here is high.

End-user impact: Many users could benefit from this feature, but far from all use cases. We would score the end-user impact as medium, since a majority might benefit, but not all.

Impact on our supported teams: We’ve documented use cases from a handful of supported product teams in our organisation, so that’s medium as well.

Change frequency: Given its complexity, we foresee a lot of tweaks and change requests down the line, which could make this feature high-maintenance. That’s something we generally want to avoid, so we score this as low.

Overall, reusability would land at medium.
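For illustration, here’s one way these four signals could roll up into a single rating. The averaging rule is a simplification I’ve chosen for the sketch, not our exact rubric:

```typescript
// Assumed roll-up of the four reusability signals into one rating;
// the averaging rule is illustrative, not the team's documented method.
type SignalRating = "high" | "medium" | "low";

type ReusabilitySignals = {
  platformsAndFunnels: SignalRating;
  endUserImpact: SignalRating;
  supportedTeamImpact: SignalRating;
  changeFrequency: SignalRating; // high maintenance is scored low
};

const RANK: Record<SignalRating, number> = { low: 0, medium: 1, high: 2 };

function reusabilityRating(signals: ReusabilitySignals): SignalRating {
  const ranks = Object.values(signals).map((r) => RANK[r]);
  const avg = ranks.reduce((a, b) => a + b, 0) / ranks.length;
  return avg >= 1.5 ? "high" : avg >= 0.75 ? "medium" : "low";
}

// Year-selector example: high, medium, medium, low averages out to medium.
reusabilityRating({
  platformsAndFunnels: "high",
  endUserImpact: "medium",
  supportedTeamImpact: "medium",
  changeFrequency: "low",
}); // "medium"
```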

3 — Alternative Solutions

Third, do we already have something in place that could solve the problem?

For the example of the year selector, a simple dropdown or input field might work in some cases. But in other cases it’s essential to keep the year selection within the date picker, so we score this as medium.

4 — Ease of implementation

How much work is required? We prioritize low-hanging fruit, as even small fixes can unblock teams quickly. We’d rather tackle 10 small blockers than focus on a single, larger issue.

Date pickers are complex, with different UX requirements across implementations, so this change would require significant work. Therefore the score for this feature is low.

Final score

Summing up the scores, we get a total of 30 out of 50, putting this request in the medium-priority category.
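Using the illustrative weights and multipliers from the earlier scoring sketch, the breakdown for this example works out like this:

```typescript
// Assumed point breakdown for the year-selector request:
const productArea = 15;           // high: full 15-point weight
const reusability = 7.5;          // medium: half of the 15-point weight
const alternativeSolutions = 5;   // medium: half of the 10-point weight
const easeOfImplementation = 2.5; // low: a quarter of the 10-point weight

const total =
  productArea + reusability + alternativeSolutions + easeOfImplementation;
console.log(total); // 30 of a possible 50, i.e. medium priority
```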
