Managing the Big Risk – The User
Presented by MaryBeth Privitera, PhD FIDSA – October 11, 2018
MaryBeth Privitera: I like to take pictures of things. I observe people for a living, and I observe the surroundings I live in.
These are just some of the funny pictures I take along my travels.
- Funny Pictures from MaryBeth Privitera's Travels
Things are odd, but you have to look at them. We’re so busy going through life that, sometimes, you need to stop and just take a look around.
What I'm here to talk to you about is managing the big risk. And that is the user.
I work at HS Design. It is a certified design firm. I teach at the university. I like both worlds. Both worlds give me insights into a whole host of things, and given that I like to observe the world around me, that’s pretty prime.
MaryBeth Privitera is impressive.
She co-chairs the AAMI human engineering committee, is the Human Factors and Research Principal for HS Design, and teaches biomedical engineering at the University of Cincinnati.
Naturally, she gave an excellent talk at 10x in October, here for your education.
About HS Design
HS Design is 40 years old. We do full service; we have engineers and designers who work there. I just wanted to give you the context of my world.
Overview
I’m here to talk about:
- the role of the user in risk management;
- how to uncover the users’ abilities, their actions and attitudes in the design process; and then,
- the application of human factors.
One of my other jobs is I co-chair the AAMI human engineering committee. Human factors and usability are near and dear to my heart. I’m an industrial designer by training, even though I teach biomedical engineering.
Role of the User in Risk Management
First and foremost, if you don’t have any use risk, you don’t have any problems.
The Lowly Scalpel
Think about a plain scalpel. Would a scalpel get approved today with our current methods of risk analysis?
No, it would never get approved because you could never get rid of all the associated risks.
Users are unpredictable, and that is the point. You never know exactly how a user will behave, but you can try to anticipate it.
When we do our risk and our use-risk analyses, we consider all the various ways a user could behave, but we’re very creative as individuals.
The best example of our creativity is if I were to open up everyone's cell phone: the apps you selected, the organization … we like to customize; we like to do things our own way. That is true of every person and their way of life.
This is a really good question – If a user performs the task incorrectly or fails to perform a task, could it cause serious harm?
That’s really what the agency is looking for – Could it cause harm?
Definition of Harm – CDRH vs CDER
When I look at the discussions between CDRH and CDER – for those that are in the pharma versus the device world – their definitions of harm vary.
So the definition of harm according to CDRH is, "Yes. You know what, a little bit is okay."
And that’s why the scalpel stays on the market – there’s no possible way I can do surgery without doing some level of harm.
Whereas the pharma world says, “Nope. No harm. No harm at all.”
So when we get into combination devices, it gets increasingly complex, which means that we must pay attention to use-based risk.
Know Your Users: Apply Human Factors
So what does that mean? The first and fundamental thing about assessing use-related risk is that I have to know my user inside and out.
- Know Your Users – Apply Human Factors
If I don’t know my user, I can’t actually do an adequate risk analysis. It’s just fundamental, and that means everyone.
So good design is going to start with knowledge of the user – What are their capabilities, their education? What’s their training? What’s their bias? How long have they been there? Is there a difference between somebody that has been working in the hospital for six months versus 10 years?
Yes, of course, there is. What are those differences, and how does that information impact the design? How do I account for disciplinary bias?
So let me give you an example of what I mean by disciplinary bias. And we all have them.
If you were to spend a day with an anesthesiologist, you'd notice they have an entire box of medications that they will mix themselves and then administer. If I give them an infusion pump that automates it, they don't like it. They like to mix it. That's a fundamental tenet of who they are.
So these disciplinary biases exist in everything, and it's about control, because we like control.
Hence, that’s why all of our cell phones are different. So you get to the element of creativity; you get to the element of how we go ahead and execute our life, how we execute our daily life and our daily practice. And it’s true for all of our users.
Why Apply Human Factors?
I have these two pictures from my academic world, and I think they are two really telling pictures.
- Why Apply Human Factors?
And the reason is, someday, God forbid, we could be that person, or a loved one could be the person in that bed in the ICU over there.
And if you look at the number of screens that are available and the amount of information coming to that critical-care team – the picture on the left is a group of critical-care physicians, residents, nurses, and pharmacists on rounds – the amount of information going to these people is overwhelming, and they're going to make the decisions that impact that patient's life.
So the way we need to think about human factors is really just making it easier for them to make the right decisions; harder for them to make the wrong decisions; and making sure that, when we consider the patient at the end of the day, they’re doing the right work for their patients.
I’d like for them to not really look at a screen but, actually, look at the patient. Those are my thoughts.
It's true throughout the hospital. This is an example of the number of screens you would find in anesthesia, and you can find complexities everywhere.
- All Devices For Anesthesia
If you look at the practice of medicine, it's abundant. Even when we bring devices into home care, they're complex. We're not following the IKEA model of instructions for use; you'd find them to be pretty elaborate.
But this is just an example of some of the world in which we have developed.
One of the challenges that we have as device developers is that we look at our own device in isolation. We don't look at our devices as part of an overall system, which is the perspective we need to take – a very specific "Here's my user, and here's the context in which it's used."
So when we design in context, it becomes a little bit easier, a little bit better for us to consider how that user is going to make that clinical decision.
And we know, from a business perspective, if it doesn't lead to clinical decision making, we don't actually have a product to sell. So we have to be involved in that clinical decision from the onset.
Uncover User Abilities, Actions & Attitudes
Someone’s going to prescribe our device; someone’s going to use our device; someone may be the recipient of that device. So how do we uncover our user’s abilities, attitudes, actions?
When we talk about considering capabilities and limitations, we’re talking really about –
- how big they are,
- their age,
- their dexterity,
- training and education,
- their experience,
- and their culture.
When I talk to physicians, they could be trained in a specific organization, and that is why they do what they do. It’s based on that training. They’ve always done it that way. If it’s not broke, don’t fix it.
And then, with more and more devices coming into home healthcare, age becomes something we must consider. It could be a child that uses the device.
It could be something as simple as how the color works. Color and design go beyond brand guidelines and preferences. We have to consider – are they going to be colorblind? How do we distinguish one element from another?
And that means we have to get user feedback, ask them questions, and see how they respond to the designs we put in front of them.
Physical and Sensory Characteristics
So physical and sensory characteristics include vision, hearing, manual dexterity, strength, and reach.
If you’ve ever designed a hand tool, one of the things with hand tools is “how do I actually reach all of the controls I need to reach and in what order?”
I had the fortunate experience of working for Ethicon. It was Ethicon Endo-Surgery back in the early 1990s. We did pistol grips, and it was pointed out that pistol grips for laparoscopic surgery were really ergonomically incorrect.
So, ergonomically incorrect meaning, I am like this when I do my cases.
- Image Showing the Ergonomically Incorrect Pistol Grip
And the edict was, “Why don’t you just lower the table then, doctor?”
Well, they’re used to it. Now they’re accustomed to holding it up.
If I went into laparoscopy today, I can demonstrate very easily that, biomechanically, this is the worst position for my physician to stay in for 45 minutes. But it doesn't matter because that's how they're trained.
And if I showed them something that wasn't a pistol grip, they would probably balk at it, because the change would be seen as a negative.
So there's a great deal of learned behavior wrapped up in manual dexterity and strength.
When I go like this, the reason it's so bad is that every time I have a joint deviation, I have reduced my ability to provide the strength to pull that lever.
So how does that translate into the real world? Let's just say I'm back at Ethicon, and I'm designing a linear stapler. And in order to get the linear stapler to fire two lines of staples and cut in the middle, I need to produce 75 pounds of force on this lever.
So, all the engineers in the room can think about the math. It’s inputs to outputs, when I think about it from an engineering perspective.
Well, I'm not physically able to do that if I'm a 50th percentile female. The reality is that medical school classes are increasingly made up of women. So it was designed for men, but now it's used by women.
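To make that "inputs to outputs" point concrete, here is a minimal sketch of the math behind it. The mechanical-advantage value and the function name are illustrative assumptions for this write-up, not figures from the talk; only the 75-pound firing force comes from her example.

```python
# Rough sketch (illustrative, not from the talk): how a handle's mechanical
# advantage translates the required firing force into the grip force the
# user must supply at the lever.

def required_grip_force(output_force_lbf: float, mechanical_advantage: float) -> float:
    """Grip force the hand must apply to produce the required output force."""
    return output_force_lbf / mechanical_advantage

staple_firing_force = 75.0   # lbf to fire two staple lines and cut (from the talk)
mechanical_advantage = 2.0   # hypothetical handle linkage ratio

print(required_grip_force(staple_firing_force, mechanical_advantage))  # -> 37.5 lbf
```

Whether that grip force is actually achievable then depends on the user's hand-strength percentile and on wrist posture, since, as she notes, every joint deviation reduces the strength available.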
So that's when gender, size, and all of the biomechanics come into play to impact the overall usability of a design. So it's important to pay attention to that.
I find that element of design to be the easiest. There’s a great standard. It’s called AAMI HE75. It has all of that in there. Those are the easy wins.
What's more difficult are knowledge-based tasks, cognitive tasks, where we have to remember something.
And that gets to the legibility and the discrimination – How do I hear an alarm? Can I hear an alarm over the top of what else is going on in the room? And I just demonstrated for you all of the complexities of the devices that are in the room. So how does one alarm sound versus another alarm? Can I provide more cues to those alarms? Can there be a visual with that auditory signal?