“Some is not a number and soon is not a time”

By Fiona Campbell (Medical Student at the University of Calgary)

It was refreshing to hear the insightful closing comments from all of the Telluride East participants today, and exciting to hear what we each plan to work towards as we return to our schools. It’s easy to see why we would all come away with such momentum and inspiration. This week was full of eye-opening discussions and thought-provoking workshops. It’s easy to feel empowered when surrounded by people who share the same passion, and to think that we really can make healthcare better around the world.

But it’s also easy to succumb to real life and let that momentum fizzle away. It’s easy to forget how important patient safety is when you’re once again surrounded by leaders who don’t value it. It’s easy to get caught up in all of the knowledge we are expected to learn at school and forget about pursuing initiatives that will improve the system.

I’m still in the newlywed zone, and every day I am reminded of the vows that I spoke one short week ago. They weren’t groundbreaking, but speaking them out loud in front of so many loved ones helps me hold myself accountable to following them. Today, we all vowed to each other to take what we’ve learned here, bring it back to our institutions, and create something from it. Let’s not let life get in the way of accomplishing what we promised to do, and let’s hold each other accountable for making change. But as Dr. Mayer pointed out, some is not a number and soon is not a time – we need to set realistic milestones if we hope to accomplish anything. So let’s create more specific goals for ourselves and share our successes and shortcomings along the way.

My first step will be to do a patient safety project with the Human Factors group at the University of Calgary. I will start by defining the scope of the project this month and come up with a manageable deliverable to be completed before I start Clerkship in March. I’ll come back to this blog at least twice along the way to share my progress and get inspiration. Thank you to all of the Telluride East participants and faculty for the knowledge, motivation, and support to work on making healthcare safer one project at a time.

Hope everyone has a safe drive home from the airport!

#TPSER9 Reflections: Day Three

By Mary Blackwell, Nursing Student, UPenn

By the end of day three my mind is saturated, and I feel so lucky to have the opportunity to be here at the Telluride East Conference. Aside from the twins in utero, as a rising senior in UPenn’s undergraduate nursing program I am certainly the youngest conference attendee. As a student, and a nursing student at that, I often feel like the lowest on the food chain in the hospital. But in this environment of open communication the medical hierarchy collapses, and it amazes me to see various healthcare professionals come together for the betterment of patient outcomes. Never before have I had personal connections or meaningful conversations with interdisciplinary healthcare students surrounding issues in healthcare. Because these conversations are so clearly valuable, I wonder why academic programs don’t put more effort into connecting students from the different health professions during their training. Having positive experiences with one another while we’re all still humbled by the title of student could change the culture of staff dynamics in our future clinical settings.

Over the course of the past two days we have heard several stories of near misses and sentinel events in hospitals across the country. As individuals focused on honing our clinical knowledge and skill, we inevitably put ourselves in the shoes of the providers in these cases. I shared the shame that several students expressed during these presentations. It was difficult to watch mistake after critical mistake during the Tears to Transparency videos. It made me cringe in frustration and anticipation as I watched medical professionals continue to cause harm to trusting patients. It’s hard to imagine missing the telling vital signs in the story of Lewis Blackman or passively allowing the sedated Michael Skolnik to sign a consent form. However, the focus of our discussions has been that many of these errors arise from faulty systems, not faulty health professionals.

During Terry Fairbanks’ discussion of human factors engineering I was able to understand how systems can keep professionals from practicing to their full potential. One of the most striking examples of a system hazard was the nurse who accidentally hit the wrong button on the defibrillator in an emergency situation. Instead of delivering the necessary shock to the patient in cardiac arrest, the defibrillator turned off. The resulting delay in life-saving care decreased the patient’s chance of survival. I could imagine myself making this mistake as easily as I do when quickly pressing the wrong button on my cell phone or in my car. A normal error such as this could happen to anyone in a high-stress environment. The instrument was poorly designed: the off button was green, while the correct button in this situation was red and flashing. In our society, green means go and flashing red means stop, so this design is counter-intuitive.

For the first time I appreciated my constant second-guessing as a student. It’s when patient care becomes routine and second nature that mistakes are easier to make without systems-level support. It’s going to be great for us all to go back to our institutions hyper-aware of these systemic hazards.

Trust and Safety in Medicine: Part 2 by Matthew Waitner, M2, Georgetown

Perhaps, as Terry Fairbanks said yesterday, we should look not to our individual pursuits but to the healthcare system that is in place. Individually, we are each committed to the reason we put on the white coat – to cure, heal, and do our best to care for each of our patients. And yet collectively, as a system, we are failing to achieve that very goal. How is it possible that such dedicated individuals are systemically failing? It would appear to be impossible, and yet the numbers certainly show that it’s more than just a few bad apples. Perhaps our system needs to be overhauled.

I was struck by the insight that Dr. Fairbanks shared. As a human factors engineer, he explained that nearly every other system in the world accounts for the natural errors of humanity. There are fail-safes embedded in most systems to catch errors before they cause undue harm. Such fail-safes are not present in the culture of healthcare. While every hospital claims to be patient-centered, we often fail to see the humanity within ourselves. I am firmly convinced at this point that a systemic culture change is the only solution to our never-ending problem of patient safety. Dr. Fairbanks made it clear that skill-based errors (or better put, automated errors) are the key to controlling our out-of-control safety issues. In fact, in our first-year neuroscience course we learned about types of memory: short-term, declarative (semantic and episodic), procedural, priming, associative, and non-associative. The last of these, non-associative learning, occurs when someone is exposed to the same event so habitually, over and over again, that the response becomes rote. This should sound familiar, as it is the same mechanism behind skill-based tasks that require automation, or limited cognitive input, in order to be performed.

To use the example from my class: when you live under a flight path (which I actually do), you tune out the sound of the overhead planes because you hear them every few minutes. Every once in a while a plane will fly lower, causing a louder-than-normal event that triggers you to notice them again, and then, with habituation, the sound disappears from your thoughts once more. So it is clear that our minds can become habituated to automated responses that require little cognition, and it is only when something differs from the routine that it gets noticed. Hence the case discussed today of the NICU heparin incident, where habituation caused major medical errors. I am fascinated by the idea that this system error could have been avoided if there had been a slight difference (a different shaped vial, a different drawer, any change from the expected norm). Dr. Fairbanks would argue that this is how well-meaning, caring professionals make simple mistakes that cost lives – it is not the individual practitioner, it is the system involved, regardless of how mindful we may be. The same type of error was noted in the book “Why Hospitals Should Fly,” when the flaps were not engaged yet three people confirmed a 15,15 green (indicating the flaps were extended) – it was the expected result and therefore it was what was seen. These sorts of systemic errors are where the greatest margin for improvement lies, because like any job, medicine becomes routine, we become habituated, and our brains are physiologically wired to become habituated.

To say this conference has been insightful would fall short of its true meaning in my eyes as a future medical professional. These past two days have shown me that our profession is far from perfect (even though each of us strives for perfection), and that it requires a major safety overhaul before the public catches wind of our missteps. In fact, I’m not sure how much longer we can pull the wool over their eyes before they catch us red-handed, and I’m not exactly sure how we have been fooling them for as long as we have. The numbers are out there, the mistakes make headlines, and yet we are still seen as more trustworthy than a stranger on the street – and we deserve no such pride. As medical professionals, it’s time to put egos aside and start doing what we swore an oath to do: do no harm.

Standards for Medical Technology by Fiona Campbell

We had a fantastic talk today by Dr. Terry Fairbanks on the role of Human Factors Engineering in healthcare. It was a very insightful presentation that sparked more questions than answers. Why do we insist, time and again, that people conform to technology and existing systems rather than designing with human limitations in mind? Why do we implement rules based on how work is supposed to be done rather than how work is being done, when we are all aware of the gap between the two? Why do we expect health professionals to achieve perfection when we accept errors from most other people?

It’s frustrating to see how far behind healthcare is compared to most other industries. We are slow to change and slow to adopt technology. It’s even more frustrating to see technology that we have adopted that looks like it was designed by a 10-year-old. As Dr. Fairbanks pointed out, we hold healthcare products to a different standard than consumer products. When it comes to quality of materials and processing (e.g. sterility), healthcare products are often held to a higher standard, as they should be (and as is reflected in their price). But when it comes to an intuitive user interface and logical design, the standard is often much, much lower. How can it be that virtually every website has a more intuitive user interface than the electronic medical records I’ve tried to use? Or that children’s toys seem to have more logic in their button design than the defibrillators used in situations when every minute matters? Is it because health professionals are supposed to be smart, educated people and therefore are up for the challenge of using more complicated technology than the average person? Perhaps those designing the technology don’t consult with, or bother to try to understand, their users? Well, no matter how smart and educated I become, I think I would take a defibrillator that looks like it’s from Toys-R-Us over one that has the ability to turn itself off, costing 2-3 minutes and possibly a life, just because I pushed the wrong button in a moment of panic.