Identifying bias when sensitive attribute data is unavailable: Techniques for inferring protected characteristics

To evaluate whether decisions in lending, health care, hiring, and beyond are made equitably across race or gender groups, organizations must know which individuals belong to each group. However, as we explored in our last post, the sensitive attribute data needed to conduct analyses of bias and fairness may not always be …
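When self-reported attributes are missing, one widely used technique for inferring them is Bayesian Improved Surname Geocoding (BISG), which combines surname-based and geography-based race probabilities via Bayes' rule. The excerpt above does not name a specific method, so this is a minimal illustrative sketch, and all probability tables below are toy numbers rather than real Census figures.

```python
# Sketch of BISG-style inference: combine P(race | surname) and
# P(race | geography) into a posterior, assuming surname and geography
# are conditionally independent given race:
#   P(race | surname, geo) ∝ P(race | surname) * P(race | geo) / P(race)

def bisg_posterior(p_race_given_surname, p_race_given_geo, p_race_marginal):
    """Return a normalized posterior over race categories."""
    unnormalized = {
        race: p_race_given_surname[race] * p_race_given_geo[race] / p_race_marginal[race]
        for race in p_race_given_surname
    }
    total = sum(unnormalized.values())
    return {race: p / total for race, p in unnormalized.items()}

# Hypothetical inputs: surname-conditional probabilities (in practice,
# from a Census surname list), tract-level demographics, and national
# marginal race proportions. These numbers are invented for illustration.
p_surname = {"hispanic": 0.90, "white": 0.05, "black": 0.03, "asian": 0.02}
p_geo     = {"hispanic": 0.30, "white": 0.50, "black": 0.15, "asian": 0.05}
p_marg    = {"hispanic": 0.18, "white": 0.60, "black": 0.13, "asian": 0.06}

posterior = bisg_posterior(p_surname, p_geo, p_marg)
```

The resulting probabilities can then be used either to assign each individual a most-likely group or, preferably, to weight individuals by their group probabilities when computing disparity metrics.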

Identifying bias when sensitive attribute data is unavailable

The perils of automated decision-making systems are becoming increasingly apparent, with racial and gender bias documented in algorithmic hiring decisions, health care provision, and beyond. Decisions made by algorithmic systems may reflect issues with the historical data used to build them, and understanding discriminatory patterns in these systems can be a challenging task [1]. Moreover, …