Episode 210 | February 16, 2026

Operationalizing UX research at TruStage

Learn how TruStage’s design team operationalized UX research with an embedded, scalable model that built stakeholder trust, doubled research output, and sped up insights.

How TruStage's design team operationalized UX research

Most teams don’t struggle with believing in research—they struggle with making it work in practice.

Designers know research is valuable. Stakeholders want confidence in decisions. But somewhere between good intentions and real-world delivery, UX research often becomes inconsistent, hard to explain, or easy to deprioritize. That was the challenge facing TruStage before they rethought how research fit into their design practice.

In a recent episode of Insights Unlocked, leaders from TruStage’s design and research organization shared how they moved from fragmented efforts to a scalable, trusted system for UX research operations—one that empowered designers, increased stakeholder confidence, and more than doubled research output year over year.

TruStage is a leading provider of insurance, investment, and technology solutions focused on helping credit unions and their members achieve financial security. Headquartered in Madison, Wisconsin, the company serves customers across wealth, commercial, and individual insurance segments—including retirement solutions, lending protection products, and auto, home, and life insurance. 

Following a major brand transformation that unified multiple entities under one TruStage brand, the organization has focused on operating as a single, customer-centric enterprise, driving innovation and improving experiences across both B2B and B2C channels.

The challenge: from good intentions to inconsistent execution

Before their transformation, TruStage wasn’t lacking talent or ambition. The design team had experienced researchers, strong designers, and leadership support. What they lacked was a shared system.

Nick Higbee, who leads TruStage’s enterprise-wide UX and digital design practice, described their early state as a collection of disconnected efforts rather than a unified approach.

Research was happening—but it wasn’t embedded.

Historically, research lived outside the design team. Designers partnered with separate research groups, which introduced delays, handoffs, and uncertainty. Teams spent time debating whether to do research, when to engage, and how to scope it—often before any learning even began.

As Nick put it, the team needed to move away from “having conversations about whether or not to reach out to a research team” and toward a model where research could “pivot on a dime and fire up learning when we needed it.”

Benny Brooks, UX design lead and design operations leader at TruStage, captured the problem with a metaphor that quickly stuck: “Previously we were sort of running our research practice a bit like a potluck.”

Everyone brought something valuable to the table, he said, but nothing was standardized, shared, or repeatable. Stakeholders saw only isolated outputs, not a coherent practice. Designers relied on personal experience rather than shared guidance. And scaling research meant reinventing the wheel every time.

This made operationalizing UX research feel risky. Without consistency, research was harder to defend, harder to plan, and harder to trust.

The cost of fragmentation

The impact went beyond inefficiency.

  • Stakeholders questioned timelines and value because research wasn’t visible
  • Designers hesitated to recommend methods without a clear framework
  • Research felt like something that had to be “sold” rather than requested
  • Measurement focused on activity, not quality or impact

Even when research happened, it often showed up too late or was misunderstood entirely.

As Betsy Drews, senior designer at TruStage, explained, when stakeholders don’t see how insights are produced, “they’re more likely to question it.”

The team knew they needed a new approach, one that balanced rigor with speed, and standardization with creativity.

The solution: designing a system, not just a process

Instead of treating research operations as a documentation problem, the TruStage team treated it as a design challenge.

They asked a simple but powerful question: What would it look like if research were as easy to understand as it was to execute?

The answer became a visual, metaphor-driven framework built directly into the team’s everyday tools: a research “cookbook.”

Importantly, the metaphor didn’t come first. The thinking did.

Through months of philosophical discussion, iteration, and partnership with UserTesting, the team realized that different audiences needed different levels of detail. Designers needed execution guidance. Stakeholders needed clarity and expectations. Product teams needed repeatable plans.

That insight shaped the system.

Recipes, menus, and meal plans

The cookbook framework introduced three core artifacts:

  • Recipes for designers and researchers, outlining how to plan and run specific studies
  • Menus for stakeholders, explaining value, effort, timing, and outcomes
  • Meal plans that bundle research methods across stages of product development

As Benny explained, “Recipes are for the cooks and menu listings are for the diners. They both describe a dish, but one tells you how to make it and the other tells you what you’re getting.”

This separation reduced cognitive load and eliminated unnecessary detail for stakeholders, without compromising rigor for practitioners.

The system also made room for embedded UX research, allowing designers to move fluidly between learning and building. Research was no longer something you escalated for; it was something you planned for.

Built where the team already worked: Figma

A critical decision was to house the entire system in Figma.

Rather than introducing new tools or repositories, TruStage met designers where they already spent their time. Figma became the “Grand Central Station” for research planning, linking out to UserTesting, agile backlogs, and supporting documentation.

Nick described this as a turning point: “The moment we decided to use the tools that we loved and have fun doing things that we were good at, that’s when this really became part of our practice.”

This approach reinforced UX design and research collaboration, making research feel less like an external dependency and more like a core design capability.

Turning complexity into clarity

One surprising lesson was that adding structure didn’t make things heavier; it made them easier.

For example, what started as a single “user testing” method eventually became three distinct research cards: concept testing, comparison testing, and usability testing. This shift aligned research more closely with the Double Diamond process and reduced confusion about when to use each method.

“It would have been more difficult to continue trying to pile these different things into a single card,” Benny said. 

The result was a system that supported research workflow optimization without sacrificing nuance.

The role of UserTesting and customer success

This transformation didn’t happen in isolation. TruStage worked closely with UserTesting’s Customer Success and consulting teams throughout the journey, using regular check-ins to refine their approach and pressure-test ideas.

As Benny shared during the episode, “I felt really supported. Half the time I felt crazy, and Natalie [Padilla] would be like, ‘No, this is cool, you should push.’”

That consistent partnership mattered.

Through regular check-ins, UserTesting’s Customer Success partner stayed close to TruStage’s goals and helped identify emerging needs early. As those needs became clearer, they brought in UserTesting’s consulting support to help scale the impact of insights across teams. Along the way, Customer Success helped broaden awareness of the platform across the organization—creating opportunities for more teams to understand what was possible and how to apply insights more effectively.

This collaboration reinforced a shared focus on time to insight in UX research, while still elevating quality and consistency.

The results: research that scales and sticks

The impact of the new system was measurable and immediate.

Within a year, TruStage more than doubled its research output—from 83 completed research stories to 201. But the numbers only tell part of the story.

“When we first started, we had a lot of anxiety about having to pitch research or sell research,” Nick said. 

But instead of designers advocating for research, Nick said, product leaders began requesting it. Research stories started appearing in backlogs initiated by stakeholders, signaling growing trust in the practice.

“It’s less of a push and more of a pull,” he said. “That’s a really good indicator for us.”

Designers also gained confidence. Whether seasoned researchers or newer practitioners, the cookbook gave everyone a starting point—supporting scaling UX research across experience levels.

Betsy highlighted how this changed stakeholder conversations: “We ran through our full research plan in about five minutes. The energy in the room changed. It removed a lot of effort for us and created immediate buy-in.”

Research became easier to explain, easier to plan, and easier to integrate into agile workflows: all key markers of research operations strategy maturity.

Why this approach worked

Several principles made the difference:

  • Co-creation across the team, not top-down mandates
  • Time invested in shared philosophy and language
  • Visual artifacts that traveled easily across contexts
  • Iteration treated as a feature, not a flaw

Most importantly, TruStage didn’t try to force research into existing constraints. They redesigned the system around how teams actually work.

What began as a “potluck” evolved into something closer to a Michelin-star kitchen: structured, reliable, and still creative.

And the work isn’t done. The team continues to iterate, refine, and expand the system as new needs emerge.

As Nick reflected toward the end of the conversation, the approach mirrors iterating on designs within Figma: “We’re giving ourselves the latitude to just get started and evolve as we go. It’s been huge.”

Frequently asked questions (FAQs)

What is UX research operations?
UX research operations (Research Ops) refers to the systems, processes, tools, and standards that enable teams to plan, execute, and scale research effectively. It ensures research is consistent, efficient, and aligned with business goals.

How did TruStage operationalize UX research?
TruStage embedded research directly within its design team and built a shared “cookbook” framework that clarified methods, expectations, and timelines for both practitioners and stakeholders.

What results did TruStage see?
Within a year, the team more than doubled its research output and shifted from “selling” research to seeing product leaders proactively request it.

Why does embedding UX research matter?
Embedded research reduces delays, improves collaboration, and enables faster time to insight—helping teams make confident, customer-centered decisions.