FUSE™: A Framework for Useful Learning Evaluation

FUSE™ is a brand-new learning evaluation model for a modern workforce.

A few years back, while coaching a group of instructional designers, Dr. Heidi Kirby was asked, "What learning evaluation model or framework should we be using?" It was a question she had dreaded. She had explored the mainstream evaluation models, both old and new, in depth during her PhD program, but none of them stood out as one she could recommend.

So she created her own to address the issues she found in other frameworks:
  • Adaptability: A solid framework should be adaptable to both your organization and your instructional design process, reflecting a post-pandemic work world and the integration of AI. It should apply equally well to large, long-term projects (like a 6-month leadership program) and small, one-off tasks (like a one-page job aid).
  • Accessibility: A well-adopted model isn't hidden completely behind a paywall. Enough of it should be public that people can use it successfully without being hounded by salespeople. It should also be easy to use and easy to follow visually.
  • Accountability: A great framework should consider how learning impacts everyone at the organization, not just its intended audience. Team members, stakeholders, and SMEs all do important work, and when you focus only on the audience, you miss out on metrics that impress your leaders and, most importantly, your CEO.

Join the wait list!

Sure, you can learn how to use the framework for free from this website, but the book is going to be jam-packed with ways to measure, best practices for sharing impact, and examples of how you can apply the framework. Plus, Heidi was a writer before she did any of this learning stuff -- she's pretty good.

Don't trust the LinkedIn algo to keep you posted -- sign up for the Useful Stuff newsletter for book launch alerts and exclusive behind-the-scenes updates!

What is FUSE?

FUSE™ is a learning evaluation model designed to be fused onto the instructional design process you already use.

It consists of five steps and is depicted here fused with ADDIE for demonstration purposes.
[Diagram: The FUSE™ Framework for Useful Learning Evaluation, showing a five-step workflow (Identify the outcome, Select measures, Collect data, Tell the story, and Iterate and refine) mapped across the stages of Analysis, Design, Develop, Implement, and Evaluate, with a table below categorizing measures by Organization, Team, and Audience.]

© 2026. FUSE™ is a trademark of Dr. Heidi Kirby. All images on this page are licensed under CC BY-NC-ND 4.0.

Following the Steps

  1. Identify the outcome - During your needs analysis phase or your intake process, determine the overall outcome and objectives of the learning project to guide your measure selection.
  2. Select measures - While you're designing the solution and determining the best delivery method, decide which measures from each area you'll report on, and create a plan to collect or create those data points.
  3. Collect data - Before you launch your learning project, collect any “before” data points and locate the data sources you'll need. After the project, collect the corresponding “after” data points.
  4. Tell the story - Using the collected data, share with leadership the impact the learning project had. Avoid L&D jargon.
  5. Iterate and refine - Based on feedback and the numbers, make edits to the learning project accordingly.

The Measures

The measures are probably the most important and complex part of the framework.

Here are some important things you need to know before diving in:

  • There is NO hierarchy to this model. That's intentional. It's organized from large to small (organization to audience), and the measures are listed alphabetically.
  • You're never, ever meant to use all of the measures. You may use more as projects increase in scale, impact, and visibility, but you should choose only what applies to that project.
  • You should try to include one measure from each bucket (organization, team, and audience) so the data you collect triangulates to demonstrate impact.
  • Both qualitative and quantitative examples of data are given. Aim for a mix of both for best results.
  • Some measures won't apply to certain projects, and that's just fine. For example, you might care to measure audience reception (how they feel about the learning experience) after an in-person training session, but you may not care to measure that for a quick one-page job aid meant to correct an unsafe behavior.


Specific definitions of the measures in each bucket are given below. These will help you better understand what to choose for real projects.

  • Organization: Organizational measures show how a learning initiative improves performance, saves time, and supports long-term company success, which helps leaders see your team as valuable to the future. These measures look at employees as one large group and focus on what happens after learning.
  • Team: Your true team in L&D consists of all the different people you work with on a learning project, including your teammates, manager, SMEs, and leaders. These measures ensure that the team is collaborating successfully and that projects meet expectations. They are designed to measure the process, not just the output, and look at what happens before the learning begins.
  • Audience: Audience measures include how people react to and engage with learning projects, as well as what they take away from the experience. They look at how each person performs, contributing to a larger picture of how employees are responding and improving, and they capture what happens both during and after learning.

*Note:* It’s important to remember that just because someone shows up doesn’t mean they’re learning (that's also why they're called the "audience" and not "learners" here).

Can I Use This?

Short answer: yes, of course! Feel free to make the concepts your own. You can add or remove measures to align with your company culture or client needs.

You can also share the image of the framework above with your manager, on social media, or even with Uncle Ralph. However, you cannot alter the image file in any way, and you must credit Dr. Heidi Kirby as the creator.

One more thing: To protect the integrity of this work, please do not upload this image into any AI platform, use it within eLearning modules, or copy/paste this website’s content into AI or training materials.

Stay tuned for even more!

In the coming days, we'll be adding even more info about FUSE, including Dr. Kirby's keynote from the IDTX conference.

In the meantime, if you have any questions, want to share how you're using FUSE, or want to collaborate on integrating FUSE into your course or product, reach out at FUSE@getusefulstuff.com.