Webcast: Mitigating Implicit Bias in Diagnosing Patients With Sepsis
Video Transcription
Hello, and welcome to today's webcast, Mitigating Implicit Bias in Diagnosing Patients with Sepsis. This webcast is funded by the Gordon and Betty Moore Foundation through a grant program administered by the Council of Medical Specialty Societies. Be sure to check out the companion podcast, which offers 0.25 hours of accredited continuing education. This content will be available on June 29, 2023, at 2 p.m. Central Time. My name is Mary Reedy. I'm Senior Manager of Research at the Society of Critical Care Medicine in Mount Prospect, Illinois. I will be moderating today's webcast. A recording of this webcast will be available within 5 to 7 business days in your MyLearning. To access the recording, log in to MySCCM.org, navigate to the MyLearning tab, and click on the Mitigating Implicit Bias in Diagnosing Patients with Sepsis course. You will find the handout, evaluation, and recording in the course section. A few housekeeping items before we get started. There will be a Q&A at the end of the presentations. To submit questions throughout the presentation, type them into the question box located on your control panel. Please note the disclaimer stating that the content to follow is for educational purposes only. And now, I'd like to introduce your speakers for today. Varun Ushedi is an intensivist at the Cleveland Clinic Foundation, Bearview Hospital in Shaker Heights, Ohio. Anne E. Thompson is Dean at the University of Pittsburgh School of Medicine in Pittsburgh, Pennsylvania. And now, I'll turn it over to our first presenter, Dr. Ushedi. Good afternoon, everyone. My name is Varun, and I'm soon going to be starting work at the Cleveland Clinic. Up until just a few days ago, I was with UPMC Hamot in Erie, Pennsylvania. I'm going to be talking about implicit bias, but first, I want to say that I have no disclosures, though I do have my own biases, as does everybody else. So I want to talk about why implicit bias matters, how it matters, and what we can do to change. 
I will be giving an overview, a big-picture perspective, of what implicit bias is. But before I go any further, I want to give a shout-out to my co-presenter, Dr. Thompson, who was one of the people who inspired me to learn more about implicit bias and get started on this journey. So thank you, Dr. Thompson, and you'll be hearing from her after me. So what is implicit bias? Implicit bias, like the name says, is not explicit bias. It is not the explicit racism or sexism that we so often hear about. It is bias that is automatic. It is unconscious. It is a prejudice that we hold, that we learn, without our control. Everybody has implicit bias. Every single one of us has implicit bias. We might not have the same biases, but everyone has them. And often, contemporary bias, especially in our circles, is largely unconscious. So we will dive in a little bit more to talk about why we have implicit bias and how it matters. This is an evolutionary trait. One of the theories used to explain implicit bias, and there are several schools of thought out there, is the dual process theory. The dual process theory describes two different kinds of thinking. The first type is intuitive, associative thinking, which is very fast. Most humans spend most of their time in type one thinking. Type two thinking is more deliberate, analytical, and slow. And of course, it takes up more energy, so we spend less time in type two thinking. Humans spend more time in type one, that is, the intuitive thinking, especially in times of stress or when we are sleep deprived. For example, if you're sleep deprived in the ICU, you will lean toward making associations from what you've learned in the past. The more reps you get in with your analytical, slower type two thinking, the more you develop those associations and then start to employ type one thinking, that is, the associative thinking. 
A good example of type one thinking: if you're in the ICU and you see a patient with low blood pressure, fever, and lung infiltrates, you'll think, okay, this patient is probably having septic shock or pneumonia; you'll send blood cultures and start them on antibiotics. That's one example of type one thinking. So this is the basis of understanding implicit bias. Implicit bias is a form of associative, type one thinking. How does implicit bias affect us? The same evolutionary traits of intuitive thinking that have helped us survive, that have helped us make associations, sometimes don't work so well for patients, because we are not in the same set of situations that we were in evolutionarily. So implicit bias does lead to health disparities. It begins with these individual biases, which often then lead to structural disparities. So I'm going to be talking about implicit bias under the purview of health disparities and career implications. We're going to go over some representative studies. There are lots of studies out there that demonstrate implicit bias and its potential consequences on patient outcomes. I won't be talking about all of them because there are just too many to count, but we're going to focus on some representative ones. One interesting study found that women and black men were less likely to be referred for cardiac catheterization than white patients. This was in the New England Journal of Medicine. When you dive into the statistics, you find that black women were less likely to be referred for cardiac catheterization than any other group. There was also a study by Green and Mahzarin Banaji that connected implicit bias to treatment decisions, showing that increasing pro-white bias was related to a higher likelihood of recommending thrombolysis in acute coronary syndrome for white men, as opposed to women and black people. 
And of course, this was an older study, when we were still using thrombolysis for acute coronary syndromes. Another study, a clinical vignette study, found that physicians were more likely to recommend knee replacement for men than for women with the same set of symptoms. A retrospective cohort study showed that Hispanic patients were twice as likely as white patients to not receive opioid medications. And you will see this trend replicated in several studies. Another study revealed that pediatricians with higher pro-white bias were less likely to prescribe opioids to black children than to white children. So these are not people who are obviously racist. Often these are people who will say that they believe in equality, that they are not racist, but we grow up with these biases that are out of our control, which is why it is so important for us to know that even well-meaning people can make decisions that lead to poorer patient outcomes. And again, there were two studies by Cascino et al, and both revealed similar things: women of all races and black patients were less likely to be referred for LVADs. Now it's important for us to remember that we are talking about these studies in the North American milieu. Implicit biases in India or in the Middle East might have a different flavor. There might be different characters in play over there, but the bottom line is that even well-meaning people will carry some biases as they grow up that are out of their control. I want to talk about a study by a former co-fellow of mine, Andrea Elliott, and her team. They studied nonverbal communication of physicians with patients. This was a simulation study. They gave patient vignettes to different physicians, and not only did they record what the physicians were saying to the patients, they also looked at nonverbal cues. How far were they standing from the patient? Were they leaning in? 
Were they closed up? So they were assigning points to each of these nonverbal cues as well as to verbal cues. And what they found was very interesting: physicians said the same things to black patients and white patients, but their nonverbal cues were far worse for black patients than for white patients. I think that's really important, and it's really important for us to keep in mind going forward that we might think we are doing the right thing, but our body language might be saying something else. Another study, and this is sepsis related, one that Dr. Thompson will talk about in more granularity: compared to white patients, critically ill black and Hispanic patients were less likely to receive inter-hospital transfers. This was a large retrospective cohort study that looked at Medicare data from thousands of patients, and they found these significant associations. And for us as professionals, implicit bias has career implications. Letters of recommendation for urology residency were more likely to express personal drive, work, and power for men than for women. Another really interesting study found that all members of the OSU College of Medicine admissions committee had a strong pro-white implicit bias. And I want to give them kudos for even doing this study, because it's not easy to say, all right, we're going to test our bias as a group and then see what kind of bias turns up. It takes a certain amount of courage and honesty. And following this study, they saw more diversity in the incoming class. So this, I think, is a really important study that demonstrates the value of knowing one's own implicit bias. So how do we mitigate bias? Dr. Thompson will dive into this more, but I want to talk about the implicit association test. The first step to mitigating bias is really knowing yourself. And the implicit association test, the link is on the screen, is a test that determines what kind of bias you have. 
One of the most common implicit association tests is the black-white bias test, where you respond using two buttons on your keyboard. In the first round, every time you see a black face, you press the same key used for positive words like joy, wonder, and glorious, and every time you see a white face, you press the key used for negative words such as agony, terrible, and evil. And then it switches: every time you see a black face, you press the key for words such as agony, horrible, and evil. Basically, the test compares how fast you make each set of associations, so it captures a certain amount of automaticity. A lot of people have taken this test, and there are many different versions of the implicit association test. I would strongly recommend that you take it. When I took it, I was surprised and embarrassed to find out that though I consider myself a feminist, I had a moderate bias against working women. So it was eye-opening to me. And it's important, because it helps me know something about myself, and it informs me on where some of my decisions might come from. And if I'm in a position of power, if I'm in a position to make policy, I will keep this in mind when I'm making those decisions. So I think this is a really good place to start. There are several other ways to mitigate bias; there is no one silver bullet. But really, knowing the individual helps overcome a lot of the biases and stereotypes that we have. As we know, stereotypes might be applicable to people in general, but they are not applicable to individuals. Or think about recalibration, which is just a fancy word for saying: be aware that, oh, I have a moderate bias against working women; I'm going to keep that in mind when I work with women or talk to women and make sure that I'm aware of it. Know that implicit bias increases during times of stress, which is basically our everyday condition. We are in the ICU. 
We are often in high-stress situations, and we know that implicit bias increases there. So if, for example, you take the implicit association test and realize that you have an implicit bias against black people or in favor of white people, and your black patient is in pain, you might want to stop yourself and say, okay, let me make sure I'm assessing whether this patient is actually in pain and that I'm treating their pain appropriately. Because there is so much in the literature about black people not being heard and their pain not being taken care of appropriately. All of those are real experiences with real implications for patients. One really interesting solution I found was implicit bias rounds, similar to morbidity and mortality conferences. Institutions and departments can identify cases to discuss on a monthly or quarterly basis and talk about how implicit bias may have played a role in certain patient outcomes or in the patient experience. Checklists are the great equalizer. Atul Gawande has written about this as well. We all know how great and beneficial checklists are, and they are also helpful in mitigating bias. So there are several strategies we can employ. My takeaway message would be: know that everybody has bias, and that it is important for us to be aware of our own biases, because without that awareness we can't bring about change. If this were a drug or a test, we'd be jumping on it much sooner, because it has such profound implications for patient outcomes. And remember that implicit bias is worse under pressure, and that we need to work together as individuals and as groups to find solutions, institutionally and regionally, for how we can mitigate bias. So with that, I would like to conclude my presentation. Thank you so much for your attention. I now want to hand over the reins to Dr. Thompson. Dr. Thompson, please take it away. Thank you, Varun. 
And thanks for your initial comments. I didn't know that I'd had any effect on you. Anyway, we'll move on now to discussing the impact of bias in ICU diagnosis, management, and, importantly, interactions among patients and members of the ICU team. We have developed a whole set of expectations and assessments of people, events, and information that we get from our background, and sometimes our very distant background, related to culture, race, ethnicity, socioeconomic status, and so on. But it also includes recent experiences, perhaps a negative experience with someone who in some way seems like a member of the healthcare team, or someone who is like this patient or family member. And of course, we all know that when there's a common infectious disease circulating, we tend to think that all pneumonia is influenza, and if we've recently missed a diagnosis, we are inclined to see it everywhere we look. The consequence is that we may not listen fully to a patient's history, thinking, people like this aren't reliable historians. We may not interpret symptoms as well, or may not listen to a team member's concern about a patient if for some reason we don't think well of that team member. But it goes beyond patient care. It interferes with hiring the best candidate for a job, with listening to someone's idea about changing processes in the ICU, with giving fair job reviews, or with providing mentoring to a younger colleague. And again, as Varun said, it's not a matter of good people or bad people. It's just part of us all. Even if we have a conscious intent and commitment to fairness and make an effort to behave without prejudice, we can still have unconscious biases that get in the way. Talking a little more about how our brains work, I like to think of the ways we make clinical errors as being very similar to the ways bias affects our everyday lives. 
We have these type 1 processes that, as you've heard, are fast, unconscious, intuitive, and take very little cognitive resource or brain energy. They're a bunch of mental shortcuts, heuristics; you could consider them habits of mind. And they allow rapid decision making. The good thing is that they increase with our growing expertise in medicine. The trouble is that most decisions result from type 1 processes, and these are the ones that are most vulnerable to error. So think about how that relates to patients. When we meet a patient and we recognize a pattern, we make a quick assessment of what's going on with that patient, just as Varun suggested with our quick diagnosis of septic shock. But if what we see is unfamiliar to us, it's a much slower process of gathering and analyzing information. Even in the first instance, if we're the careful clinicians we mean to be, we hold on to a little piece of our brain for considering what we might be missing. So if I make a diagnosis of meningococcemia, even as I'm initiating treatment, I try to think about what other infections could look like this, to be sure that I cover all the possibilities. Similarly, when we assess our patients, their families, and our colleagues as people, and maybe the literature we read, a similar process needs to occur. If we quickly assess them as unfamiliar, not like me, other, and we use our type 1 processing, they are vulnerable to our biases, and we're likely to make errors in our assessment of them. When we talk about diagnostic errors, we often think about premature closure on a diagnosis, and there are a bunch of cognitive factors that are known to contribute to that. For one thing, we may have faulty immediate recognition of a pattern. We may lack knowledge of or experience with something, or we may not have collected sufficient data and put it together correctly. 
As you've heard, stress, fatigue, and time pressure contribute energetically to making these errors. And as far as I'm concerned, that's sort of a definition of our everyday lives: stress, fatigue, time pressure. If you take these same cognitive factors and think about them from the point of view of bias: we make immediate assumptions about someone's intelligence, knowledge, and skills. We take inappropriate shortcuts in assessing them. And if we've had limited experience with certain kinds of people, we see them as not like me, as other, and if we fail to learn enough about the specific patient or colleague, we are pretty likely to make errors in our assessment of a patient, a family member, or a colleague. Other contributing factors are absence of self-evaluation, incapacity to recognize our own weaknesses, and just a little too much confidence. The equivalent in bias, I think, is when we haven't really thought about our thinking, when we haven't learned about our biases, and when we're unwilling to consider bias as a component of the decisions we're making. We fall back on our mental models, our associations, and our previous experiences. And when we have too much going on, too many things demanding our attention, and we haven't learned enough about the people involved and their experience, we make biased decisions, which I think you could view as diagnostic errors, about people and about diagnoses and treatment. It's important to think about who is providing this information to me. Is it someone I view as high-performing, or someone who, for some reason, I don't think is particularly strong? Just because somebody is usually right doesn't mean that all the information they provide is correct. And just because somebody made a mistake once doesn't mean that the information they provide to you is not to be attended to. Is this person experienced? Is this person a member of a group that you experience as other? 
If either a patient or a colleague, a staff member, is from a minority racial or ethnic group, we may not hear the information they provide as well as we might hear it from someone we consider like me. Maybe a student has a great idea, but, you know, what do they know? And certainly, we have biases among us in terms of specialty. Internists aren't so sure they value anything that comes from a surgeon, and surgeons aren't so sure they value anything that comes from an internist. We tend not to take input at face value, and our assessment of, or bias about, the person can get in the way of our assessment of the information. If we don't have a high opinion of the source of the information, we may not observe carefully and openly. Let's look at just a few examples of biases that have affected diagnosis or management. These are not necessarily directly related to sepsis, but they can occur in the ICU, and I think they deserve attention. You heard just a little bit about the diagnosis of myocardial infarction. Women are more likely to seek medical attention, but less likely to be told their symptoms are cardiac in origin, and less likely to receive guideline-supported care or undergo critical diagnostic or therapeutic procedures, particularly if they are members of a minority. They then have higher in-hospital mortality. Interestingly, their symptoms have been described as atypical, which is a little funny when you consider that women represent half the population; I guess it reflects the old idea that myocardial infarction was a man's disease. You sort of have to wonder, why did we ever think that anyway? There was an emergency department study of a fairly large number of patients with severe sepsis or septic shock who were admitted to a medical ICU, and it found that the mean time to the first antibiotic was 30 minutes longer in women than in men. 
There was no difference by race or ethnicity overall, but the numbers were not adequate to evaluate black women and Hispanic women in comparison to other women. Clearly, this is directly related to our experience in the ICU, but one of the questions that I have is: does it occur for women or some other group transferred to the ICU not from the ED but from a routine unit in the hospital, or in transport from another institution? Does it even occur in the ICU itself, with new-onset sepsis in patients already in the ICU? You've all been aware of the many papers about the importance, or lack of importance, of early goal-directed therapy. I don't want to go through all of those studies, though they're all worth thinking about, but one thing that drew my attention for this particular discussion is the study by Corl in 2019, which showed that protocol completion for black patients was lower than for white patients. The most interesting thing here was that the differences didn't occur within hospitals, but among hospitals: completion was lower at those primarily serving minorities. These are institutions with fewer resources, less QI infrastructure, more ED crowding, and unfavorable nurse-to-patient ratios. I think the implication is that these disparate outcomes are related to structural inequities in our healthcare system that most likely reflect biased policy development rather than necessarily individual practitioner bias. We don't, in our policies, assure that institutions serving poor minority people have the resources they need to provide excellent care. Another study, looking at adults with severe sepsis requiring mechanical ventilation, found, among other things, that black and Hispanic patients were less likely to be transferred to a higher level of care. Now, there are a lot of reasons why patients might choose not to be transferred or why transfer might not be appropriate. 
Nonetheless, I think the racial and ethnic findings are troubling. The question we might ask is: do we have similar problems with transfers within a hospital? One of the most disturbing papers I've read recently revealed that physicians perceive scientific evidence differently when it pertains to a politicized treatment compared to when that treatment isn't identified. This study asked physicians to evaluate a research abstract based on the TOGETHER trial, which demonstrated a lack of benefit of ivermectin for COVID-19. Some physicians evaluated the abstract with the drug referred to as GL-22; for others, ivermectin was named. Very conservative physicians evaluated the same evidence as less informative, less rigorous, and more likely to be the work of biased authors when the abstract named ivermectin, and they were five times more likely than liberal or moderate colleagues to administer the drug. What this says is that political ideology, another form of bias, has become increasingly relevant for understanding the beliefs and decisions of clinicians. We also have a study showing that we tend to evaluate research from high-income countries more highly than research from low-income ones. Doctors evaluated the same abstract as having greater relevance and strength of evidence, and were more likely to recommend it to a colleague, if it came from a high-income country. What that probably means is that much of the research from low-income countries, no matter how well done, tends to be discounted prematurely and perhaps unfairly. I've caught myself looking at the source of a paper whose title interested me before deciding whether to read it, because I saw where it came from and assumed that meant it was going to be good. 
Of course, we've learned a lot about the fallibility of pulse oximeters since the onset of the pandemic, even though the fact that they might not be reliable in patients with dark skin has been recognized for decades. The question for me is: have we changed our practice? Are we more careful in calibrating pulse oximetry against blood gases in individual patients? And moreover, does this weakness apply to other devices? Given the growing diversity of the U.S. population and, frankly, the world as a whole, I think we need to make sure that evaluation of devices across different populations is improved. So if this risk of poorer, or at least disparate, patient care is fairly common, what can we do to mitigate our biases? While opening our eyes to our own biases is enormously important, in fact absolutely essential, it is simply not sufficient to do the whole job. The current strategies for modifying bias are controversial. It's not clear how we can actually modify biases, but what we can do is modify our practices that may be vulnerable to biased action. If we can view our biases as habits of mind, then like other habits, we can break them, but it takes intentional, persistent effort. And frankly, overcoming biased judgments and actions will probably take lifelong hard work. The first step is to make sure that you and your institution truly want to treat patients, families, and colleagues equitably, and don't just assume that you already do. I think we all believe we come into work and do our very best with everybody, that we're already providing equitable care, but too frequently the data say we're not. Reaching that internal commitment, being really sure we want this, is the most powerful incentive to change. We each need to become aware of our own biases and take responsibility for personal change. My sense is that it's really important to read, read, and read about the impact of biases on individual patients and our colleagues. 
The more you read, the more you realize that this is a very critical issue. And then, once we've become somewhat more aware, we can help others become aware and, with them, speak up and become upstanders and allies for those who are the targets of our biases. We need to recognize bias within our institutions and point it out, and really examine policies that might well have been well intended but have inequitable results. And then, if we find inequities, we need to work with others to make institutional changes. One of the things that's become clear is that we need to increase diversity in clinical trials. We've been getting better at it, over the past couple of decades in particular, but it's still far from adequate. Overall numbers of underrepresented groups remain relatively low, so we are very limited in subgroup analyses by race and ethnicity, and sometimes by sex. We need to make sure we include women and minorities in research in numbers great enough that the data apply to them, including within subsets of the data. One thing that I think is worth doing: when you encounter a study done elsewhere that seems to reflect bias, consider repeating it in your own institution. Is it happening here? When we read the literature, we need to make sure that we are avoiding bias against lesser-known investigators and against low-resourced institutions and regions. Let's just evaluate the data, not necessarily where it came from. And then, if we find that we're doing well within our own institution on some problem that's been identified elsewhere, see which of our practices we might share with poorly resourced hospitals. We do that to improve care of patients in rural and safety net hospitals. We learned a lot during the pandemic about what we can do with telemedicine. Are there ways that we can use our greater resources to assist people in settings that don't have them? 
We may find that we then receive patients in better condition when they're transferred to us. But sometimes our individual efforts are simply inadequate. Becoming aware of our own biases is an important step, but sometimes what we need to do is get together and work together to influence policy. We took on the world in the Surviving Sepsis Campaign; can we do something similar to advocate for equitable care? If we look at these inter-hospital transfers, there are a lot of possible explanations for the disparate outcomes. It may be related to individual practitioner bias. It may be that minority patients and families are less likely to request, or be willing to accept, a transfer. There may be language barriers, worries about immigration status, a whole host of reasons why transfers that might have benefited the patient don't occur. But I think what we want to do, if there's a difference in outcome, is look at the determinants of the disparity. Which elements in individual practitioners, local centers, and institutional structure are appropriate, and which ones need to be modified? I think I touched on this earlier, but there are plenty of unanswered questions in the ICU. Our treatments and recommendations to patients are limited by our knowledge, our experience, whether positive or negative, and the available guidelines and treatment protocols. Our decisions can be affected not only by patient wishes, but also by our assessment of the patients. And we just need to make sure that it's the patient's well-being driving the decision, rather than some unconscious judgment about the patient's worth. And finally, treatment really shouldn't be affected by our political views. It's important to recognize good data when we see it; we need to attend to it even if it challenges one of our biases. Biases are formed over repeated exposure, and they are reinforced by the communities we live in, the media we attend to, and ongoing exposure to current events. 
Overcoming them will require repeated exposure to alternate perceptions and continued attention to trying to change. Like other bad habits, it's an extended and lasting challenge that will only be overcome by maintaining awareness. I like to think that we have a rather well-developed institutional approach to addressing errors in medicine. Could we adapt those mechanisms to help us recognize bias as a source of potential error? As I said before, the impact of current strategies to decrease bias is controversial. We need better-powered research into what strategies actually decrease implicit bias and cognitive biases, especially over the long term. We know that internal motivation is really crucial, and individual work is very important. But I think we also need to address the wider socioeconomic, political, and structural barriers faced by individual patients who need intensive care. All in all, we have a great deal of work to do. Thank you. What are your thoughts on how our clinical checklists and decision support algorithms may have been developed with bias in them? Wouldn't this in turn lead to flawed decision-making? Yeah, I think I can answer that, and Anne, you can follow. Yeah, absolutely. If you think about the GFR calculation that includes race in it, that has now been shown to be inaccurate. And removing race from that calculation is not simple, because that was leading to inaccuracies in the other direction. So there was a great study in the New England Journal that looked at cystatin C and creatinine in calculating GFR, and that led to more accurate assessments of GFR and did not need race. This came in because of the presupposition that race was biological and that race had a biological impact on our physiology. So absolutely, these individual biases will eventually lead to structural inequities that keep propagating themselves. 
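To make the GFR point above concrete: the 2021 "race-free" CKD-EPI refit simply dropped the race coefficient, so the same inputs give the same estimate for every patient. Below is a minimal sketch of the creatinine-only 2021 CKD-EPI equation (the New England Journal study mentioned also evaluated a combined creatinine plus cystatin C equation, which is not shown here). The coefficients are those of the published 2021 refit as best understood; this is for illustration only, not for clinical use.

```python
def egfr_ckd_epi_2021(scr_mg_dl: float, age_years: float, female: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) from the 2021 CKD-EPI creatinine
    equation, which contains no race term. Illustration only."""
    kappa = 0.7 if female else 0.9        # sex-specific creatinine threshold
    alpha = -0.241 if female else -0.302  # sex-specific low-creatinine exponent
    ratio = scr_mg_dl / kappa
    egfr = (142.0
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.200
            * 0.9938 ** age_years)
    return egfr * 1.012 if female else egfr
```

For example, a 50-year-old woman with a serum creatinine of 0.7 mg/dL gets an eGFR of roughly 105, the estimate falls as creatinine rises or age increases, and no question about race enters the calculation at all.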
I think also about getting consent from patients and families. This isn't exactly a checklist, but we often have a standard way of presenting information to people: asking whether they have any questions, whether this is okay, whether we can proceed. If we haven't learned how they may be hearing things differently, or that their prior experience has been negative, we can miss a lot. This applies across all patient populations, but it's particularly likely to occur when the practitioner and the patient don't see each other as alike. Even with a checklist for how you get permission, we may be refused and think this patient isn't very cooperative, when really the issue is that we haven't met them where they are.

Okay, I have another question here, and this one is specifically for Varun: when did you get so passionate about this topic?

Well, I was first made aware of my privilege when I started working with Doctors Without Borders in India. Then, when I moved to the U.S., with Michael Brown's shooting and the other shootings that followed, I started paying attention to bias in general in society. From there I worked into race in medicine and did a couple of talks on it, and then Dr. Thompson's talk in fellowship really sealed the deal. It's been growing for several years now. That's really how I got interested.

Thank you. We have another question: what are your thoughts on the bias around proceeding with and executing comfort care in the ICU? Is there a way to measure ourselves?

That's a really interesting question. I don't think we have anything close to adequate information about the best way to do that. The study that Varun mentioned earlier, about the body language of physicians talking with patients and surrogates about care, was directly about that.
It focused on standardized patients who had terminal illness with identical findings and identical wishes, and yet the connection between the clinician and the patient or surrogate might well not have been as good: the physician was holding back, and who knows what the surrogate and patient were experiencing. We know that particularly among some of our minority patients, both Black and Hispanic, the feeling is that there's a serious lack of trust. And if somebody perceives that we are giving up on a patient because of their race or their background, they're not going to be inclined to accept comfort measures only. I think building trust with each and every patient and family is the only way we're going to manage it, along with building a better healthcare system in general so that people have less reason to be distrustful. I don't know if that's a wishy-washy answer, but I think that's the problem.

No, I think it's a great answer. And just to piggyback on that, I want to share an anecdote. I was doing ICU rounds, and there was a patient who was post-arrest and had not yet woken up. I hadn't seen the patient at that point, and several of the nurses told me that the family was, quote unquote, difficult. When I went in for my family meeting, I found a Black patient and a Black family: a husband and two children, one male, one female. And I just developed such a connection. I saw that they passionately loved their mother, that he passionately loved his wife. I found out how she was as a person: that she loved to dance in the kitchen to music, that she loved to walk in the fall and see the fall leaves. And that developed a connection; it developed trust. This is what's mentioned in the literature: knowing people as individuals will help you overcome some of your biases.
And to end the anecdote: she did end up waking up, getting extubated, and leaving the ICU. My experience with the family was that they were not difficult at all. They were simply passionate advocates for their family member.

Thank you. We have another question here: how do we overcome the bias patients have in believing that hospitals, doctors, and pharmaceutical companies only care about making money and not about their needs, especially, and justifiably, among minority communities?

Oh, man, that's a curveball. Anne, do you want to go first, or shall I take it?

I think, you know, patients, being human, have biases too, and sometimes they aim them right at us, and they can be difficult. Again, I think it helps to find common ground: asking more about what experiences have brought them to the positions or opinions they're expressing. Hearing them out often really helps get past all that. And then at some point comes a question of the sort: I know you're worried about this issue right now; I think we're both concerned about this patient getting better; can we just focus on that? But I don't think you can ask that question until you've given them some time to vent their opinions.

Yeah, and I try to introduce nuance into the conversation. If patients say that, I say, yes, we know lots of pharma companies pursue profit, but that doesn't mean these treatments don't work; we actually see them work. When they talk about the COVID vaccine, from my experience in pediatrics with parents who are resistant to vaccines, we would talk about how, yes, vaccines are not 100 percent, but the chances of serious side effects are literally one in a million, and we follow those patients. So I try to introduce some nuance.
So I try to meet them where they are, not completely dismissing their concerns, but saying, hey, let's just look at what the facts are and what we see every day.

Okay, thank you. We have another question: given big data, is there a way to tease out, and then work to mitigate, these biases and care disparities?

I think the answer to that starts with: are we collecting that information in the first place? We probably have to start looking forward. My next question is, how is my implicit bias connecting to patient outcomes? Can I make that connection in more and more patients? Can I make the connection between implicit bias and prognostication? So I think that's a fantastic question, and those are the new avenues of research we need to focus on.

I think the very fact that we can see these differences by way of big data helps guide us to smaller, more precise studies that let us look at what factors might produce them. Is it the individual? Is it patient differences that aren't picked up in the big data sets? Is it some policy, and so on?

Okay. Well, thank you very much. That concludes our Q&A session. We want to thank Dr. Shetty and Dr. Thompson, and thank you to the audience for attending. Again, this webcast is being recorded, and the recording will be available to registered attendees within five to seven business days. To access the recording, log in to MySCCM.org, navigate to the MyLearning tab, and click on the Mitigating Implicit Bias in Diagnosing Patients with Sepsis course. You'll find the handout, evaluation, and recording in the course section. Remember to check out the companion podcast on the SCCM Diagnostic Excellence Program page. This content will be available on June 29, 2023, at 2 p.m. Central Time and offers 0.25 hours of accredited continuing education.
Also, keep an eye out for the resources, which will be available on the SCCM Diagnostic Excellence Program page. And lastly, please join us for the next CMSS-funded webcast, scheduled for July 26, 2023, at 1 p.m. Central Time. That concludes our presentation today. Thank you all.
Video Summary
The webcast titled "Mitigating Implicit Bias in Diagnosing Patients with Sepsis" discussed the importance of recognizing and addressing implicit bias in medical practice. The speakers, Dr. Varun Ushwade and Dr. Anne Thompson, discussed how implicit bias can impact patient care, leading to health disparities and unequal treatment. Implicit bias refers to unconscious biases and prejudices that individuals hold, which can influence their decision-making and behaviors. The speakers emphasized that everyone has implicit bias, and it is important to be aware of these biases in order to address them and make more equitable treatment decisions. The webcast highlighted several studies that demonstrated how implicit bias can affect patient outcomes. For example, studies showed that women and minority patients were less likely to receive certain treatments or interventions compared to white patients. The speakers also discussed the impact of implicit bias on end-of-life care decisions, such as the tendency to withhold comfort care for certain patients. They stressed the importance of building trust and establishing open communication with patients and families to overcome biased decision-making. Strategies to mitigate bias include taking the Implicit Association Test to become aware of personal biases, practicing recalibration by being mindful of biases during decision-making, and implementing checklists and protocols to standardize care. The speakers also emphasized the need for more research to understand and address implicit bias in healthcare. Overall, the webcast provided valuable insights and practical strategies to mitigate implicit bias and promote more equitable patient care.