To evaluate whether decisions in lending, health care, hiring, and beyond are made equitably across race or gender groups, organizations must know which individuals belong to each group. However, as we explored in our last post, the sensitive attribute data needed to conduct analyses of bias and fairness are not always available (Bogen, Rieke, & Ahmed, 2020). To address this problem, numerous techniques have emerged for inferring individuals’ protected characteristics from available data.
Bayesian Improved Surname Geocoding
One example of such a technique is Bayesian Improved Surname Geocoding (BISG), a methodology that uses last names and geographic information to generate race probability estimates (Elliott et al., 2008; Elliott et al., 2009). Using Bayes’ Theorem, BISG computes the probability that a person belongs to each race group (e.g., White, Black or African American, Asian) based on demographic information associated with their last name, and then updates this probability using demographic information associated with the census block group they live in. This combined approach has been shown to outperform approaches that rely on last name or geographic information alone.
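As a rough illustration, the Bayes update at the heart of BISG can be sketched as follows. The probabilities below are invented for illustration only; a real implementation would draw the surname priors from the Census Bureau’s surname frequency file and the geographic distributions from census block group counts.

```python
def bisg_posterior(prior_by_surname, geo_dist_by_race):
    """Combine a surname-based race prior with geography via Bayes' theorem.

    prior_by_surname: P(race | surname), from a surname-frequency table.
    geo_dist_by_race: P(lives in this block group | race), from census counts.
    Returns the normalized posterior P(race | surname, block group).
    """
    unnormalized = {
        race: prior_by_surname[race] * geo_dist_by_race[race]
        for race in prior_by_surname
    }
    total = sum(unnormalized.values())
    return {race: p / total for race, p in unnormalized.items()}


# Hypothetical example (made-up numbers): a surname prior, and the share of
# each race group's national population living in one block group.
prior = {"white": 0.70, "black": 0.20, "asian": 0.10}
geo = {"white": 0.001, "black": 0.004, "asian": 0.002}

posterior = bisg_posterior(prior, geo)
```

In this toy example, living in a block group whose Black residents are overrepresented relative to the national population shifts the posterior toward “black” even though the surname prior favored “white” — exactly the kind of update the geocoding step contributes.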
Intended and actual use of BISG
BISG was developed to examine racial and ethnic disparities in the domain of health care; importantly, its developers have noted that BISG was designed not to estimate the race of particular individuals, but rather to examine the possibility of larger group disparities (Koren, 2016). Nonetheless, BISG was prominently used by the Consumer Financial Protection Bureau (CFPB) in a 2013 enforcement action against Ally Financial (Andriotis & Ensign, 2015). In connection with the Department of Justice’s finding that Ally Financial had overcharged hundreds of thousands of minority customers on auto loans, and lacking data on the race of individual borrowers, the CFPB used BISG to identify customers who were likely to be members of minority racial groups.
Although BISG is among the best-known techniques for inferring race and ethnicity in the absence of sensitive attribute data, its accuracy has been questioned, with some researchers highlighting the possibility that it can overestimate racial disparities (Baines & Courchane, 2014; Zhang, 2018). In our next few posts, we will analyze a proxy methodology similar to BISG in the context of mortgage lending, following an approach similar to that of Chen et al. (2019). Under the Home Mortgage Disclosure Act, the CFPB collects and publishes data annually on mortgage applicants in the U.S., including their race and the area they live in. By matching that geographic information with data from the U.S. Census Bureau, we can infer the race of each applicant and evaluate the performance of such a technique.
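The geography-only proxy step we will explore can be sketched minimally as follows; the tract identifier and demographic shares below are invented placeholders, where a real analysis would use HMDA census tract fields joined to Census Bureau tables.

```python
# Hypothetical tract-level race shares (made-up tract ID and proportions).
tract_demographics = {
    "36061021900": {"white": 0.55, "black": 0.25, "asian": 0.20},
}


def proxy_race(census_tract):
    """Assign the modal race group of the applicant's tract as the proxy label."""
    shares = tract_demographics[census_tract]
    return max(shares, key=shares.get)


label = proxy_race("36061021900")
```

Assigning every applicant in a tract its modal race group is the crudest version of such a proxy; keeping the full probability vector instead, as BISG does, is one of the refinements we will compare against.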
References
Bogen, M., Rieke, A., & Ahmed, S. (2020, January). Awareness in practice: Tensions in access to sensitive attribute data for antidiscrimination. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 492–500).
Elliott, M. N., Fremont, A., Morrison, P. A., Pantoja, P., & Lurie, N. (2008). A new method for estimating race/ethnicity and associated disparities where administrative records lack self-reported race/ethnicity. Health Services Research, 43(5p1), 1722–1736.
Elliott, M. N., Morrison, P. A., Fremont, A., McCaffrey, D. F., Pantoja, P., & Lurie, N. (2009). Using the Census Bureau’s surname list to improve estimates of race/ethnicity and associated disparities. Health Services and Outcomes Research Methodology, 9(2), 69.
Consumer Financial Protection Bureau. (2014). Using publicly available information to proxy for unidentified race and ethnicity: A methodology and assessment. Washington, DC: CFPB.
Koren, J. R. (2016). Feds use Rand formula to spot discrimination. The GOP calls it junk science. Los Angeles Times.
Andriotis, A., & Ensign, R. L. (2015, October 29). U.S. government uses race test for $80 million in payments. Wall Street Journal.
Baines, A. P., & Courchane, M. J. (2014). Fair lending: Implications for the indirect auto finance market. Study prepared for the American Financial Services Association.
Zhang, Y. (2018). Assessing fair lending risks using race/ethnicity proxies. Management Science, 64(1), 178–197.
Chen, J., Kallus, N., Mao, X., Svacha, G., & Udell, M. (2019, January). Fairness under unawareness: Assessing disparity when protected class is unobserved. In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 339–348).