PsychSafety.info is starting a new series on psychological factors in patient engagement. Part One is available here. Are you ready to engage?
Our new site for the Consultation and Research Institute is up! Visit us at:
After a summer of upheaval, I have assumed a new position as the director of the Consultation and Research Institute (CRI) at Angelo State University. I am refocusing the agency on issues of patient safety and patient engagement. My vision is that CRI will partner with my colleagues in the health care industry on a variety of projects in these areas. Please contact me if you have any questions.
Many soldiers have paid the ultimate price in wars they didn’t start or understand. What can we learn about our search for safety from this reality? My thoughts on that issue are here.
Interesting article regarding the FAA’s scolding of UAL/Continental linked here. It is a good example of how one set of activities that are normal to business in so many ways (e.g., retirements, new hiring) can be easily scapegoated when regulations are violated. How much of the “problematic behavior” highlighted by the FAA might be due to power struggles between the union and the company? This is a potential side effect of bureaucratic safety – regulatory oversight forces some bad behaviors out while decreasing corporate adaptability and providing opportunities for employees to use safety as a means to achieve other ends. The solution? More training and oversight. There is much we don’t know, obviously, but this smells like good old Newtonian “broken parts” mentality.
Meanwhile, in another industry, we get the distinct scent of “system drift,” the gradual acceptance of abnormality as normal. PG&E will pay $1.6B in damages for the San Bruno gas explosion in 2010. The article targets “safety failings by the utility and lax oversight by state regulators,” suggesting inter-organizational drift toward a riskier mode of operation, fueled over years by factors that had nothing to do with safety – motivation, costs, staffing, etc.
Punishing the “bad guys” still plays well in the papers, though. I wonder how much UAL and PG&E will actually learn? Or will they just be motivated to find sneakier ways to do what they want?
The site was down for a while – sorry about that. Nevertheless, I have added a new essay on the Germanwings disaster. As horrible as it is, I believe strongly we can learn from it.
I have added a new essay on the importance of knowing the strengths and weaknesses of the logic behind the investigation of adverse events.
How many times have we thought this before? An error surprises us, seeming to jump out of the woodwork. I provide a psychological perspective on this phenomenon and what we can do in response to it here.
Occasionally I will post some links to current topics on patient safety along with some comments. This is the first of those posts.
- Leadership rounds are discussed in this brief piece. This is not a new idea – physically connect those with decision-making power with those who are immersed in the dynamic chaos of patient care. There are many psychological benefits to this in theory, but care should be taken in implementation. Several cultural, structural, and perceptual factors could make leaders “on the floor” seem like prowling predators rather than helpful allies.
- Minnesota hospitals report relatively stable numbers of adverse events and iatrogenic deaths. Monitoring our systems for adverse events that are connected to “marginally safe” behavior by employees is admirable. However, as I have explained in other posts, these “year-over-year” studies are no more than rough approximations of the system that generates them. Errors are not a static category or species, and adverse events are only defined after the fact.
- Significant names in the PS world are arguing that our efforts need to be “rebooted.” This may sound surprising, but I argue that it is a normal phase in the development of a body of knowledge. Ultimately, “patient safety” is just “safety literacy.” Expertise in any area of knowledge requires consolidation and “pruning” from time to time, and it is encouraging to see leading voices admit that.
- Is the “safety logjam” breaking? This is an insightful piece that makes a number of important points, couched in a healthy substrate of realism. “Evidence” for whether health care is safer will be extremely hard to come by, because the definition of “safe” is so fuzzy. Are airplanes safer to fly in now than 50 years ago? Yes, statistically. But the “why” of that statistic is fertile ground for what organizational psychologists call garbage can behavior – using an opaque issue as a vehicle for agendas that are primarily self-beneficial. Sometimes the answer is simple – safety is now valued more than it was. That alone will make a difference.
Please comment or contact me if you would like to discuss any of these points.
It is difficult for some to fully grasp the concept of the error as “emergent property,” something that only exists potentially and “pops” into reality abruptly. I thought a metaphor based on an experience I had this morning might help.
I drove to work this morning on a brisk day, with the roads wet from rain and condensation. I was a little later than normal because my daughter took longer than usual to get ready for school. The parking lot outside my office is dark asphalt, and at 8 a.m. the sun sat in a cloudless sky at a severe angle to the ground. I park on the west side of a lot that is shaped like a heart – the entry and exit points merge at one end, and a concrete center median creates one-way traffic. Since I like to park facing east, I have to drive through the east side of the lot, turn around at the north end, and drive up the west side to my spot. The lot slopes slightly to the north, so water tends to gather at that end.
I behaved as I always do when navigating this area – same speed, same angle of approach to the corner at the north end. As I neared the turn, I perceived that I was alone in the lot.
Abruptly, a colleague appeared in front of my vehicle, seemingly out of nowhere. I applied the brakes with force and stopped before striking him. Upon quick reflection, I realized that the sun was creating a “washed-out” area in my visual field as it reflected off the standing water and wet asphalt. He was standing in that washed-out area as I approached, and only when he moved out of that area was he visible to me.
If I had struck him, accident investigators would likely have said that I was driving too fast, was not paying attention, or was even negligent. But in actuality, I was behaving quite normally given my assessment of the environment. On any other morning (e.g., a cloudy day, the sun at a different angle, driving my wife’s car instead of my truck, arriving at a more typical time, a lot facing a different direction, a decision to park elsewhere), I would have seen my colleague and easily avoided him.
The message here is that every morning that I navigate that lot, potential errors can emerge at any moment. They don’t actually exist yet – they are only observed when the components that construct them align in just the “right” way. I hope you will consider how my trip through the parking lot relates to the phenomenon of error in your workplaces.