
Understanding Use-Related Risk Analysis (URRA) and Use-Failure Modes and Effects Analysis (uFMEA) in Medical Device Development

Part I - Introduction: Why Risk Analysis Matters in Medical Device Development


Developing safe and effective medical devices requires a structured approach to risk management, particularly in human factors and usability engineering. Over the past few decades, emphasis on Use-Related Risk Analysis (URRA) has increased significantly as regulators and manufacturers have come to recognize human error as a major contributor to device-related adverse events. In parallel, Use-Failure Modes and Effects Analysis (uFMEA) has evolved from traditional risk management methodologies to systematically assess use-related failure modes.


A well-executed URRA and uFMEA framework is crucial for regulatory compliance and patient safety. These methodologies are integral to meeting the expectations set forth by FDA, IEC 62366-1, ISO 14971, AAMI HE75, and the EU MDR, ensuring that human factors risks are proactively identified and mitigated throughout the device development lifecycle.


But why do some medical devices feel intuitive and safe, while others lead to confusion, errors, and even patient harm? The difference often lies in how effectively human factors engineering and risk management principles were applied during development.


Two Essential Risk Assessment Tools in Human Factors and Usability Engineering


  • Use-Related Risk Analysis (URRA) – A task-based, qualitative method used to identify potential use errors by analyzing how users interact with a product interface. URRA focuses on critical tasks, mapping user actions to potential hazards and associated harms, and is foundational in demonstrating safety and effectiveness in both device and combination product systems.

  • Use-Failure Modes and Effects Analysis (uFMEA) – A semi-quantitative extension of FMEA tailored to analyze potential failure modes specifically related to user interactions. It evaluates risk using severity, occurrence, and detection ratings, and is often applied later in development to assess residual risks and support documentation aligned with ISO 14971.
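To make these two structures concrete, here is a minimal, illustrative sketch in Python of how a task-based URRA line item and a uFMEA risk score are often represented. The field names, the 1–5 rating scale, and the risk priority number (RPN) threshold are assumptions for illustration only; actual scales, fields, and acceptance criteria are defined by each team's risk management plan, not prescribed by ISO 14971 or IEC 62366-1.

```python
from dataclasses import dataclass

@dataclass
class UrraTaskLine:
    """One row of a task-based URRA: a user task mapped to a potential
    use error, the resulting hazard, and the potential harm.
    Field names are illustrative, not mandated by any standard."""
    task: str
    potential_use_error: str
    hazard: str
    potential_harm: str
    is_critical: bool  # could the error result in harm to user or patient?

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """uFMEA risk priority number: the common severity x occurrence x
    detection product. A 1-5 ordinal scale is assumed here; real scales
    and acceptability thresholds are team-defined."""
    for name, value in (("severity", severity),
                        ("occurrence", occurrence),
                        ("detection", detection)):
        if not 1 <= value <= 5:
            raise ValueError(f"{name} must be rated 1-5, got {value}")
    return severity * occurrence * detection

# Hypothetical example: a dose-setting task on an infusion device
line = UrraTaskLine(
    task="Set infusion rate",
    potential_use_error="User enters ten times the intended rate",
    hazard="Drug overdose",
    potential_harm="Serious patient injury",
    is_critical=True,
)
print(rpn(severity=5, occurrence=2, detection=3))  # → 30
```

Note the asymmetry this sketch reflects: the URRA row carries no numeric score, because URRA is qualitative and hinges on whether a task is critical, whereas the uFMEA score supports semi-quantitative prioritization of mitigations.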


Although both methodologies are critical, they are often misapplied or used in isolation – leading to regulatory setbacks, usability issues, and post-market safety concerns.


Why These Methods Matter


  • Regulatory Compliance – Align with FDA, EU MDR, and IEC 62366-1 requirements.

  • Proactive Design Improvements – Reduce usability-related failures before market release.

  • Error Mitigation – Address workflow inefficiencies, misinterpretations, and interface-related risks.


This article explores the history, definitions, methodologies, key differences, challenges, and best practices of URRA and uFMEA, providing medical device developers with a roadmap to integrate these essential risk analysis tools effectively into their product development process.


Evolution of Human Factors and Usability in Medical Device Risk Management: Use-Related Risk Analysis (URRA) and Why It Became Essential


For decades, medical device manufacturers relied primarily on FMEA to identify and mitigate risks. However, real-world experience showed that many adverse events weren’t caused by component failures but by human interaction errors.


The Turning Point: Understanding Use Errors in Medical Devices


Regulatory bodies, including the FDA and ISO, started seeing patterns in device-related incidents:

  • Complex interfaces confused users, leading to misinterpretation of critical information.

  • Unintended button presses, poor UI layout, and ambiguous workflows contributed to severe patient harm.

  • Even trained users made errors not due to negligence, but because of poor design.


Key Regulatory and Industry Historical Milestones That Shaped URRA

1980s–1990s: Identifying Usability as a Major Cause of Device Failures

Regulatory Awareness

  • The FDA, ISO, and AAMI began identifying user-related failures as a leading cause of adverse events in medical devices.

  • Traditional risk assessment methods, such as Failure Modes and Effects Analysis (FMEA), were found insufficient in accounting for human-device interaction errors.

Adverse Event Reports Highlight Human Error

  • Many device failures stemmed from:

    • Confusing or poorly designed user interfaces

    • Misinterpretation of alarms and system feedback

    • Workflow design flaws that led to incorrect operation

  • The industry began shifting focus from system-level faults to human factors and usability engineering.

2000–2010: Early Integration of Human Factors into Risk Management

2000: Introduction of ISO 14971

  • Established a formal risk management framework for medical devices

  • Focused primarily on device-related hazards and manufacturing defects

  • Lacked explicit guidance on usability and human factors risks

2007: ISO 14971 Update

  • Improved definitions for risk management processes

  • Still did not fully address usability-related risks

2009: AAMI HE75:2009

  • Introduced the first comprehensive usability engineering guidance specific to medical devices

  • Established key best practices:

    • Task analysis

    • Formative usability testing

    • User-centered design

    • Iterative mitigation of use errors

2011–2015: Formalization of Use-Related Risk Analysis (URRA)

2011: FDA Draft Guidance on Human Factors Engineering

  • Clarified the human factors engineering process and emphasized URRA, supported by task analysis, as a structured method within a risk-based framework

  • Defined critical tasks as those that, if performed incorrectly or not at all, could result in serious harm to the user or patient

2015: IEC 62366-1

  • Standardized the usability engineering process for medical devices

  • Required formal identification, analysis, and mitigation of use-related risks

  • Mandated human factors validation testing to demonstrate safe and effective use

2016–Present: Regulatory Adoption of Usability and URRA

2016: FDA Final Guidance on Applying Human Factors to Medical Devices

  • Made URRA a mandatory step in usability risk management

  • Formalized the integration of human factors engineering into medical device development and regulatory submissions

  • Reaffirmed that critical tasks are those that could result in serious harm if performed incorrectly or not at all

  • Required human factors engineering activities to be included in submissions (e.g., 510(k), PMA)

2017: EU MDR (Regulation 2017/745)

  • Strengthened human factors and usability requirements for CE marking

  • Made URRA essential for demonstrating compliance with General Safety and Performance Requirements (GSPR)

  • Required documented evidence of usability testing and risk mitigation

2019: ISO 14971:2019 Update

  • Reinforced the role of URRA in linking usability engineering to formal risk management

  • Expanded expectations for integrating human factors into overall safety processes

2021: IEC 62366-1 Update

  • Clarified validation requirements and strengthened traceability between user needs, risk analysis, and summative usability studies

  • Reaffirmed the importance of task analysis and user interface design

2023: AAMI HE75:2023 Expands URRA Methodologies

  • Expanded the scope of human factors engineering by requiring URRA to encompass the entire combination product user interface – including the device, drug or biologic component, packaging, labeling, and user training – as a unified system

  • Broadened the definition of critical tasks by removing the “serious” harm threshold: tasks are considered critical if, when performed incorrectly or omitted, they could result in any level of harm to the user or patient – expanding the scope of what must be assessed and validated

  • Reinforced that human factors validation must demonstrate users can safely and effectively complete all critical tasks across the integrated system prior to regulatory submission or pivotal clinical studies

  • Integrated post-market surveillance findings into usability testing protocols and design refinement

  • Updated best practices for:

    • Formative and summative usability testing

    • Usability risk management processes

    • Task analysis and interface design

    • Integration of user training and labeling into risk mitigation

    • Real-world evidence incorporation from post-market surveillance

    • Holistic, system-level validation of the entire user interface configuration

2023: FDA Final Guidance on Human Factors Studies for Combination Products

  • Required combination product manufacturers to approach usability from a system-level perspective, with integrated URRA across drug, device, labeling, and packaging

  • Formalized how human factors evidence should be incorporated into clinical study protocols and submissions for FDA review

  • Emphasized holistic safety and effectiveness of the entire product configuration under actual use conditions


In Summary...

The three FDA guidances compare as follows:

2011 Draft Guidance – Applying Human Factors and Usability Engineering to Optimize Medical Device Design

  • URRA Scope: Focused on the device interface, with URRA structured around task analysis and identification of use-related risks tied to critical tasks

  • Definition of Critical Tasks: Tasks that, if performed incorrectly or not at all, could result in serious harm to the user or patient

  • HF Validation Study Requirements: Demonstrate users can perform critical tasks in simulated-use scenarios reflective of real-world conditions

  • Integration with Clinical Development: Not explicitly addressed

  • Regulatory Focus: Establishes HF/UE expectations; aligns loosely with ISO 14971 and IEC 62366

2016 Final Guidance – Applying Human Factors and Usability Engineering to Medical Devices

  • URRA Scope: Maintains the task-based URRA approach; emphasizes early integration of HF into development and full traceability to summative testing

  • Definition of Critical Tasks: Same definition retained; requires validation of tasks with potential for serious harm

  • HF Validation Study Requirements: Required for submission; must test the final design with representative users and environments; the study must align with URRA findings

  • Integration with Clinical Development: HF validation required prior to marketing authorization; no formal integration with clinical trials

  • Regulatory Focus: Aligns with international standards; sets expectations for design controls, URRA documentation, and validation testing

2023 Combination Product Guidance – Application of HF Principles for Combination Products: Q&A

  • URRA Scope: Expands URRA to evaluate the entire combination product system, including the drug/biologic, device, labeling, packaging, and training as a unified interface

  • Definition of Critical Tasks: Refines the definition: critical tasks are those that, if performed incorrectly or not at all, could result in any harm – removing the “serious” qualifier and broadening what must be validated

  • HF Validation Study Requirements: Same expectations as 2016, but explicitly allows flexibility in timing (pre-, during, or post-clinical trials); validation must reflect full product system use, not isolated components

  • Integration with Clinical Development: HF validation may be conducted before, during, or after clinical trials if justified; particularly important if the user interface affects trial execution or interpretation

  • Regulatory Focus: Tailored for combination products (CDER/CBER/CDRH reviews); aligns HF expectations with the full product lifecycle and cross-center submission strategy

Remember…

  • URRA is now a central requirement in medical device risk management.

  • Global regulatory standards (FDA, ISO, IEC, EU MDR) now mandate and reinforce human factors and usability engineering practices to mitigate use-related risks.

  • Usability failures are recognized as major contributors to adverse events, leading to stronger regulations.

  • The integration of human factors into risk management continues to evolve, ensuring safer, more effective medical devices.





Essential Reading List for Human Factors in Medical Device & Combination Product Development

Regulatory & Industry Standards
  1. FDA (2016). Applying Human Factors and Usability Engineering to Medical Devices: Guidance for Industry and FDA Staff. U.S. Food and Drug Administration.

  2. FDA (2023). Application of Human Factors Engineering Principles for Combination Products: Questions and Answers. U.S. Food and Drug Administration.

  3. AAMI HE75:2023. Human Factors Engineering—Design of Medical Devices. Association for the Advancement of Medical Instrumentation (AAMI).

  4. ISO 14971:2019. Medical Devices – Application of Risk Management to Medical Devices. International Organization for Standardization.

  5. IEC 62366-1:2021. Medical Devices – Part 1: Application of Usability Engineering to Medical Devices. International Electrotechnical Commission.

Core Textbooks in Human Factors Engineering
  1. Wickens, C. D., Lee, J. D., Liu, Y., & Becker, S. E. G. (2015). An Introduction to Human Factors Engineering (2nd ed.). Pearson.

  2. Sanders, M. S., & McCormick, E. J. (1993). Human Factors in Engineering and Design (7th ed.). McGraw-Hill.

  3. Salvendy, G. (Ed.). (2012). Handbook of Human Factors and Ergonomics (4th ed.). Wiley.

  4. Norman, D. A. (2013). The Design of Everyday Things (Revised & Expanded Edition). Basic Books.

Task Analysis, Use Error, and Risk Methods
  1. Annett, J. (2003). Hierarchical Task Analysis. CRC Press.

  2. Stanton, N. A., Salmon, P. M., Walker, G. H., Baber, C., & Jenkins, D. P. (2013). Human Factors Methods: A Practical Guide for Engineering and Design (2nd ed.). CRC Press.

  3. Reason, J. (1990). Human Error. Cambridge University Press.

  4. Dekker, S. (2014). The Field Guide to Understanding 'Human Error' (3rd ed.). CRC Press.

  5. Hollnagel, E. (2012). FRAM: The Functional Resonance Analysis Method. Ashgate.

  6. Bogner, M. S. (Ed.). (1994). Human Error in Medicine. CRC Press.

Cognitive Systems, Design Psychology, and Applied Safety
  1. Vicente, K. J. (2003). The Human Factor: Revolutionizing the Way People Live with Technology. Routledge.

  2. Leveson, N. G. (2011). Engineering a Safer World: Systems Thinking Applied to Safety. MIT Press.

  3. Endsley, M. R., & Garland, D. J. (2000). Situation Awareness Analysis and Measurement. CRC Press.

  4. Gawande, A. (2010). The Checklist Manifesto. Metropolitan Books.


Follow us to learn more...

Interested in staying connected and learning more about medical human factors? Sign up for our HFUX Research newsletter here on our website, hfuxresearch.com, and follow our LinkedIn pages, HFUX Research and Safe & Effective Podcast, for industry insights, career tips, and networking opportunities.




 
 
 