In addition, white males are often the ones presenting scientific knowledge, whether through lectures, textbooks, or other forms of media. This means that minorities and women are excluded not only from the medical field but also from the discourse surrounding it. Many medical textbooks, for example, feature illustrations of white male bodies, and white men are most often the ones presented as experts in medical fields. This paints an inaccurate picture of who holds power and knowledge in medicine, and it excludes those with different perspectives and experiences.
White males are also often the ones in charge of setting medical policy and deciding what insurance will and will not cover. These decisions tend to favor those who are already privileged, allowing people with the means to access advanced medical treatments that others cannot. This further entrenches the idea that white males are the centerpiece of modern medicine, and that men are the ones we turn to for our healing, not Mother Earth. To me, this feels like a kind of betrayal of our mother. This article isn’t suggesting that modern medicine shouldn’t be looked to first; no one would go to an herbalist for a serious illness or traumatic injury. I’m simply saying that we may benefit from expanding our worldview to include some common-sense, verifiably tested alternatives.