If misuse of your medical device can harm patients or clinicians, human factors engineering (HFE) and usability engineering (UE) are essential for products intended for the U.S. market. While most people working on medical devices are aware of the need, there’s less certainty about the methods and decision points relevant to their unique device.

What should be considered when designing the user interface (UI) for your next medical device?

For context, it’s worth taking a step back to appreciate the broader landscape of HFE in medical devices. Many discussions start with health industry statistics describing the number of patients injured by complications arising from user interface interaction. But the most damning statistics are usually inferred rather than direct measurements of negative outcomes. It’s hard to isolate usability from dozens of other variables in the clinical setting, including patient variability, clinician judgment, and concurrent therapies and circumstances.

A different way to understand the problem is to focus on the pivotal shifts driving the U.S. Food and Drug Administration (FDA) when they released guidance on the topic in 2016. Applying Human Factors and Usability Engineering to Medical Devices is the FDA’s nonbinding set of recommendations to the medical device industry.

A prominent concept in the guidance is the term use error, which should not be confused with user error. User errors are the fault of the user and are non-systemic to the medical device; they include misreading a display, mental errors, insufficient training, and lapses of attention.

Use error, a term coined in 1995 in a medical device trade periodical, inverts the blame from user error and asserts that the device is responsible for mistakes made by a user, even when those mistakes stem from traits we normally associate with human shortcomings, such as mental error or inattention.

Users, and the complicated real-world environments in which they work, impose a permanent set of requirements on the medical device. Many such requirements can be substantially addressed through design: bigger, brighter displays, larger input buttons, and safety-aware workflows that avoid dependence on device training.

To demonstrate how strict the recognized standards are and how strongly the FDA feels about this, consider the following passages from the 2016 guidance:

“3.9 Use error: User action or lack of action that was different from that expected by the manufacturer and caused a result that (1) was different from the result expected by the user and (2) was not caused solely by device failure and (3) did or could result in harm.”

The first numbered clause, “a result that was different from the result expected by the user,” is pretty sweeping; how often do devices do something we don’t expect? And it applies not just to device actions but also to the lack of device actions. As if that weren’t broad enough, the phrase “...did or could result in harm” obligates the manufacturer to address both observed problems and theoretical ones.

From IEC 62366, Annex A

“…use errors are the direct result of poor user interface design”

As for who is culpable, this statement puts the blame for missed expectations directly on interface designers.

Use error causes

How does the concept of use error shape our thinking when designing medical device interfaces? Consider the common sources of use errors and the design questions they raise:

1. Managing user cognitive load

Is the user multi-tasking? Will the user be interrupted in the middle of a workflow? Is the information presented at each step in the workflow too broad, or too superficial?

2. Accounting for untrained users

Can a manufacturer guarantee that all users will be trained? If not, will untrained users make use errors? To what extent can the medical device mitigate the potential harm from an untrained user?

3. Allowing for urgent usage

Are there high-urgency use cases? If so, does the device support immediate use? Can information collection be deferred to the end of, or outside of, a workflow?

4. Accounting for novel use or streamlining

Will an experienced person use the device differently from a novice? Sometimes, physicians will use the device differently than other users, for example by disabling therapeutic limits. Should the device tolerate these novel uses? How will the device make these novel uses safe?

5. Balancing device security with access and workflow

Are device security and user authentication balanced with the needs of the use environment? For example, in a hospital’s ER, is urgent use balanced against the need to prevent access by unintended users? Is device security so cumbersome that clinicians disable or defeat it?

6. Compensating for user variation (height, hand size, color-blindness, visibility, audibility)

Do small buttons lead to key entry mistakes? Is the minimum font size visible at a distance for a user with corrected vision?

7. Accommodating a physically encumbered user

Do the requirements of a sterile environment conflict with the means of inputting data? Is the touchscreen sensitive to users wearing double gloves? Does the clinician have a free hand available to manipulate the device while performing other tasks?

Safety premium

Fortunately, preventing harm is a well-defined standard that can be addressed with design. It’s also consistent with the normal risk management practices inherent in medical device design. What may be less clear is a framework for eliminating use errors.

The FDA recognizes an international standard for human factors and usability engineering: Medical Devices - Part 1: Application of Usability Engineering to Medical Devices, better known by its designation, IEC 62366. The standard describes a process of risk mitigation that closely mirrors the process outlined in the risk management standard ISO 14971.

At a very high level, the general steps of IEC 62366 are:

1. Define the intended use, users, and environment

2. Itemize and analyze the hazards associated with use

3. Prototype solutions to the risks

4. Conduct user studies and evaluate performance
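The iterative loop these four steps describe can be sketched in code. The following is an illustrative toy model only, not anything prescribed by IEC 62366; the class names, scoring scheme, and acceptance threshold are all hypothetical:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names and thresholds are hypothetical,
# not prescribed by IEC 62366 or ISO 14971.

@dataclass
class UseHazard:
    description: str
    severity: int          # 1 (negligible) .. 5 (catastrophic)
    probability: int       # 1 (rare) .. 5 (frequent)
    mitigations: list = field(default_factory=list)

    def risk_score(self) -> int:
        return self.severity * self.probability

def usability_iteration(hazards, acceptable_score):
    """One pass of the analyze -> mitigate -> evaluate loop."""
    residual = [h for h in hazards if h.risk_score() > acceptable_score]
    # In practice, each residual hazard would drive a prototype change
    # followed by a user study before re-scoring.
    return residual

hazards = [
    UseHazard("Dose misread on small display", severity=4, probability=3,
              mitigations=["larger font", "confirmation step"]),
    UseHazard("Wrong units entered", severity=5, probability=2,
              mitigations=["unit lock per region"]),
]
remaining = usability_iteration(hazards, acceptable_score=6)
print([h.description for h in remaining])
```

Each pass over the residual list corresponds to one trip around the loop; the process ends only when every hazard falls below the predetermined acceptance threshold.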

Figure 1 (© ICS | https://www.ics.com/)

Figure 2 (© ICS | https://www.ics.com/)

Figure 1 presents a simplified flowchart showing the steps in more detail and how they iterate. The process is repeated until a predetermined threshold of safety has been achieved.

IEC 62366 contains in-depth guidance on how to perform each step of this process. A key part is producing a documented description of the use environment and conditions to support analysis. Another useful analytical tool is a model of how a user interacts with an interface.

The diagram in Figure 2 is a good model for segmenting the responsibilities that compose interaction with a user interface. The interesting pieces, often taken for granted, are on the user side: they allow us to separate Information Perception from Cognitive Processing and from efforts at control.

Decomposing the interaction this way allows risk management to tackle small, solvable problems. For example, when considering the UI visible while therapy is applied, Information Perception prompts the question: “Is the clinically relevant information immediately available?” If it is only available on another screen within the UI, the user must navigate to it. The design-time awareness that this imposes another six-step cycle of interaction before the user can access the clinically relevant information is motive enough to seek an alternative means of access or display. Using this type of model to uncover and address interaction challenges is a great tool for anticipating use errors and optimizing the user interface.
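The cost of that extra navigation can be made concrete with a toy calculation. The six-step cycle count comes from the interaction model discussed above; the screen names and helper function are hypothetical:

```python
# Toy illustration of interaction cost, based on the six-step
# interaction cycle discussed above. Screen names are hypothetical.

STEPS_PER_CYCLE = 6

def interaction_cost(navigation_path):
    """Interaction steps needed to reach information that is
    len(navigation_path) screens away from the current view."""
    return len(navigation_path) * STEPS_PER_CYCLE

# Information already on the therapy screen costs no extra steps:
print(interaction_cost([]))                  # 0
# Information buried two screens deep costs two full cycles:
print(interaction_cost(["menu", "history"])) # 12
```

Even this crude accounting makes the design trade-off visible: surfacing clinically relevant information on the active screen eliminates entire cycles of perception, cognition, and control.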

Use errors, the chief consideration

While designing a pleasing user experience has benefits in providing a clean and appealing device interaction, the highest priority for medical devices is designing to minimize use errors. Use errors are strongly correlated with patient harm. Thus, tools and considerations leading to the detection and mitigation of use errors will always have the highest value in, and be the chief consideration for, medical device interface design.


About the author: Milton Yarberry is the director of medical programs at Integrated Computer Solutions (ICS) and Boston UX. He has more than 15 years of experience working with large and small medical device and IVD manufacturers. Yarberry may be reached at milton_yarberry@ics.com.