Growing up in Satkhira, I spent a lot of time watching families wait in rural clinics. Their voices were often quiet, filled with the kind of anxiety that comes from a long day of uncertainty. Now that I work as a researcher, I find myself thinking about those voices in a new way. If one of those patients speaks to an AI today, will the system actually hear them?
In Bangladesh, we are rapidly building AI tools to help with dengue fever, maternal care, and disaster relief.
While these tools are exciting, they carry a hidden risk: epistemic injustice. In plain language, epistemic injustice occurs when a person is treated as though their own knowledge doesn’t count. In a medical setting, it means an algorithm might systematically discount what a patient knows about their own health.