So let's talk for a second about implicit bias,
also called unconscious bias; it goes by both names.
So you probably, when you left home today,
weren't threatened by, say, a saber-toothed tiger outside your door.
But implicit bias is really a reaction
to that ancient sort of threat
that our species had going back to the days
when we lived in caves.
And we certainly didn't have all these fancy drugs and MRIs around.
So it's an ancient reflexive system.
It's really based in the brain stem,
and in human beings it developed
to detect danger.
So, you know, you go outside your cave.
You hear a grrr! You see some claws.
You get back in the cave as fast as possible,
because that means the tiger is right there.
So I don't have to think about that.
I'm just doing it.
And that's how we stayed alive.
And therefore, it's hardwired in all of us
that we have this bias.
Now, we don't necessarily,
and I hope we don't,
have that level of danger presented to us on a daily basis.
But that implicit bias system is still in place.
That brain system is still making snap decisions every second,
every day, just walking down the street,
and it allows our brain to create social stereotypes
based on appearances alone.
And it's entirely unconscious.
So therefore, you know,
it's really hard to control that part of yourself.
Let's talk about bias.
So that's just one thing:
yes, it's great for dangerous situations,
but it doesn't necessarily help with our day-to-day lives anymore.
Could it really show up, you know, in patient care?
Well, I'm not racist.
So therefore, I'm not gonna practice differently
based on a patient's appearance, right?
Well, this is a study that presented two kinds of cases
to medical students and residents.
And these are classic cases for health disparities. Why?
Because we know for a fact,
in study after study in clinical medicine,
that patients who are black or Hispanic
are less likely to receive adequate pain control
than white patients for conditions
exactly like kidney stones and leg fractures.
So these students were also given
some false statements
about black versus white individuals.
So for example, one false statement on their exam was:
black people's blood coagulates more quickly than whites'.
Yet, 39% of students and residents thought that was true.
Blacks' skin is thicker than whites'.
A majority thought that statement was true.
It's actually false.
They were also given four true statements,
such as: whites are less susceptible to heart disease than blacks.
We know for a fact that blacks have higher rates of heart disease.
But only 43% of students and residents got that one right.
Blacks have denser stronger bones than whites.
And it's true: blacks have lower rates of osteoporosis
and fracture compared with older white adults.
But only 39% of the students and residents got that one right.
So overall, they found that half of respondents endorsed
at least one false belief about health based on race.
And I was struck that half of residents didn't know
that blacks are at higher risk
of cardiovascular disease than whites.
So that really shows there's a knowledge gap there.
By the time you are a resident, you should know this stuff.
After this talk, you'll definitely know this stuff, right?
And very disappointingly, but not surprisingly,
the black patient in the case scenarios,
if you think back to the kidney stone and fracture scenarios,
was less likely to receive analgesics
compared with the white patient.
But the critical link this study made,
which other studies had not demonstrated in the past,
was that a higher rate of false beliefs
correlated with worse pain management for the black patient.
So therefore, more bias equaled worse pain management.
So it does absolutely present
in the way that we take care of patients,
in the way that I take care of patients,
and the way that you take care of patients.
We have to acknowledge that we all have a bias.
And the first step in doing something about it
is acknowledging it.
Now, can we get rid of it? No.
I think it's constantly something that just needs attention
and needs regular re-evaluation,
but it doesn't mean we can just snap our fingers
or you can listen to this talk and say,
"I'm now unbiased. I'm gonna provide
perfectly equal care to all my patients."
It's impossible, and it's very much personal as well,
because everybody has different levels of bias,
and for different things.
It's not just about race and ethnicity;
it may be related to sex, or sexual identity.
It may even be related to body type.
There's a lot of ways that we are biased.
And I think that the important thing is to own your biases.
So in terms of combating bias,
a good way to go about it is to gather data.
We are in a data-driven business.
And that's positive.
So in your hospital,
what are the rates of PCI and do they differ by race and ethnicity?
What about pain control?
You know, are your female patients as satisfied as your male patients?
What are you doing about transgender health issues?
So getting some data is a good place to start.
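To make that concrete, here's a minimal sketch of what "gathering data" could look like in practice: computing an outcome rate stratified by a demographic group from your own system's records. The field names (`race`, `received_pci`) and the toy records are invented for illustration; a real analysis would pull from your data warehouse and use proper statistics.

```python
from collections import defaultdict

# Hypothetical discharge records; field names are invented placeholders.
cases = [
    {"race": "white",    "received_pci": True},
    {"race": "white",    "received_pci": True},
    {"race": "white",    "received_pci": False},
    {"race": "black",    "received_pci": True},
    {"race": "black",    "received_pci": False},
    {"race": "hispanic", "received_pci": False},
]

def rate_by_group(records, group_key, outcome_key):
    """Return {group: fraction of records with a truthy outcome}."""
    totals = defaultdict(int)
    hits = defaultdict(int)
    for r in records:
        totals[r[group_key]] += 1
        hits[r[group_key]] += bool(r[outcome_key])
    return {g: hits[g] / totals[g] for g in totals}

rates = rate_by_group(cases, "race", "received_pci")
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%}")
```

The same helper works for any of the questions above: swap in `"sex"` and a satisfaction flag, or pain-control orders by ethnicity, and a gap in the output is your starting point for the awareness conversation.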
The next step is creating awareness around that data:
going to your medical executive committee and saying,
"Hey, this is something we really should pay attention to.
We have a disparity within our own system."
And that could be a two-person practice
or that could be a 2,000-provider health system.
But creating awareness is a good step,
so that again, we can know that there is bias out there.
We can own it.
You can't trust your intuition too much.
Unfortunately, your intuition, that gut feeling of
"I just know something's going on with this patient,
but that one's okay,"
is exactly where bias lives.
Intuition is always part of practice;
it gives you some sense of when danger might be present.
But putting all your emphasis on it,
relying on it too much, leads to biased care.
Instead, using algorithms or other evidence-based tools
in a systematic fashion
is gonna reduce the risk of bias influencing your patients' care.
And of course, have diverse input into decisions.
So if your hospital system is considering a new treatment algorithm,
have a patient sitting on that committee.
Because if they are going through dialysis,
or they're the ones injecting insulin,
they're gonna have a very different perspective.
They're actually the ones doing it,
so they know the real day-to-day positives and negatives
of those treatment decisions.
And then one thing that I would really like to see expand
is the use of evidence-based decision tools that mean something.
So in real time, as you are seeing a patient and entering information
into your electronic record,
it will tell you,
"Here are the vaccinations recommended;
here's a possible hypertension treatment you could consider."
And if that algorithm was built in a non-biased way,
with diverse input into those treatment decisions,
it will help us avoid our own inherent biases:
maybe not giving that vaccination,
recommending a treatment that's inappropriate
for hypertension, or not recommending a treatment at all.
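The core idea of such a tool can be sketched as a rule-based check that looks only at clinical fields. The rules and thresholds below are invented placeholders, not clinical guidance; a real system would encode guideline criteria vetted by exactly the kind of diverse committee described above.

```python
# Minimal sketch of rule-based decision support. Thresholds and field
# names are hypothetical examples, not real clinical criteria.
def recommendations(patient):
    """Return advisory prompts based only on clinical fields.

    Note what is deliberately absent: race, sex, and other
    demographic fields are never consulted, so the same clinical
    inputs yield the same prompts for every patient.
    """
    advice = []
    if patient["age"] >= 65 and "flu" not in patient["vaccinations"]:
        advice.append("Flu vaccination recommended.")
    if patient["systolic_bp"] >= 140:
        advice.append("Consider hypertension treatment per guideline.")
    return advice

print(recommendations(
    {"age": 70, "vaccinations": [], "systolic_bp": 150}
))
```

Because the prompt fires from the recorded vitals and history rather than from a clinician's gut impression, it nudges every patient toward the same guideline-driven care.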
And then finally, this talk is all about knowing your biases
and owning them.
You know, we understand those health disparities are out there.
The best way to really combat them is to start with yourself,
because we all have these biases.
And I think when you own them,
and you're providing more even care to your patients,
I think that's not only better for your patients,
but that's better for you, too.
And it's gonna allow you to practice,
I think, in a more satisfying manner,
where you're happier with your work.
There's a quote from Barack Obama that I really like,
because it does start with us;
we can't wait for somebody else to swoop in and save us.
We can't wait for the next generation.
And I really believe in that.
I think we should all be working on this issue together
starting with ourselves.