A new study from Women’s World Banking found that the credit scoring artificial intelligence (AI) systems employed by global financial service providers are likely to discriminate against women, excluding them from loans and other financial services. The study’s findings suggest that financial technology companies are missing a major opportunity to close the existing $17 billion gender credit gap and to reach the nearly 1 billion women who remain unbanked.
The study, Algorithmic Bias, Financial Inclusion, and Gender, funded by the Visa Foundation, explores the promises and pitfalls of using digital tools to open up credit to women, both as individuals and as entrepreneurs. Specifically, it examines where biases in AI emerge, how they are amplified, and the extent to which they work against women.
“The financial services industry needs to act immediately to address sexism in credit scoring technology – not only because it’s the right thing to do but also to better equip the industry to take advantage of a $17 billion market opportunity given the gender credit gap,” said Mary Ellen Iskenderian, CEO of Women’s World Banking. “This issue isn’t hypothetical – sexist credit scoring systems pose a real threat to women’s livelihoods, their families, the growth of their businesses, and the health of the economies to which they could contribute.”
Women’s World Banking researchers examined the data that app-based digital credit providers collect in order to create algorithms and conducted interviews with thought leaders and academics as well as digital credit practitioners, including data scientists, entrepreneurs, app developers, and coders. Women’s World Banking has also created a free interactive tool that allows researchers and practitioners to explore various bias scenarios.
The study found:
- Algorithms themselves are often biased because the individuals creating them have unconscious biases that they code into the algorithms.
- Biases also emerge from the incomplete, faulty, or prejudicial data sets that companies use to “train” the algorithm.
- The majority of data sources are vulnerable to gender-based bias.
- Data scientists and algorithm developers, who are predominantly U.S.-based, male, and high-income, are not representative of the end customers being scored.
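The second finding above can be illustrated with a toy sketch. The data and the mimic-the-past "model" here are entirely hypothetical, not taken from the study: if historical lending decisions approved women less often despite identical repayment behavior, a system trained to reproduce those decisions inherits the same gap.

```python
# Toy illustration (hypothetical data): a model trained to mimic
# historical lending decisions inherits the bias in those decisions.
from statistics import mean

# Hypothetical historical records: (gender, repaid, approved).
# Repayment behavior is identical across genders, but approvals were not.
history = [
    ("F", True, False), ("F", True, True), ("F", True, False), ("F", False, False),
    ("M", True, True),  ("M", True, True), ("M", True, False), ("M", False, True),
]

def approval_rate(records, gender):
    """Share of applicants of the given gender who were approved."""
    return mean(approved for g, _, approved in records if g == gender)

# A scoring rule that simply learns to reproduce past approvals
# scores women lower even though repayment rates are equal.
print(approval_rate(history, "F"))  # 0.25
print(approval_rate(history, "M"))  # 0.75
```

The point is not the arithmetic but the mechanism: nothing in the training step needs to mention gender for the historical disparity to carry straight through into the new model.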
“Women have historically suffered from discrimination in lending decisions – and we can’t allow that to continue into the digital realm. Alternative credit scoring data can be a boon for women entrepreneurs who are often denied credit because of a lack of information. We need AI technologies to help women, not work against them,” Iskenderian added.
Algorithmic Bias, Financial Inclusion, and Gender recommends easily implementable and inexpensive strategies that financial institutions could use to reduce bias, including:
- Identifying gender-based discrepancies in data by producing regular reports evaluating the issue.
- De-biasing scoring models by creating audits or checks to sit alongside the algorithm, and/or running post-processing calculations to consider whether outputs are fair.
- Making bias everyone’s responsibility to address—from data scientists to the CEO. One way to do this is by establishing an internal committee to systematically review algorithmic decision-making.
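The first two recommendations above can be sketched in a few lines. This is a minimal, hypothetical example, not the study's methodology: the function name, data shape, and the 5% tolerance are all assumptions chosen for illustration. It produces the kind of regular disparity report the study recommends and flags a model whose approval rates diverge by gender, a simple post-processing check that can sit alongside any scoring algorithm.

```python
# Minimal disparity-report sketch (hypothetical names and threshold):
# flag a scoring model whose approval rates diverge by gender.
from statistics import mean

def disparity_report(decisions, max_gap=0.05):
    """decisions: list of (gender, approved) pairs from the scoring model.
    Returns per-gender approval rates, the largest gap between groups,
    and whether that gap exceeds the chosen tolerance."""
    genders = {g for g, _ in decisions}
    rates = {
        g: mean(approved for gg, approved in decisions if gg == g)
        for g in genders
    }
    gap = max(rates.values()) - min(rates.values())
    return {"rates": rates, "gap": gap, "flagged": gap > max_gap}

# Hypothetical batch of model decisions.
decisions = [("F", False), ("F", True), ("M", True), ("M", True)]
report = disparity_report(decisions)
print(report["gap"])      # 0.5
print(report["flagged"])  # True
```

Running such a report on every scoring batch, and routing flagged results to the internal review committee the study describes, is one inexpensive way to make the audit routine rather than ad hoc.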
To read the full study, Algorithmic Bias, Financial Inclusion, and Gender, visit https://www.womensworldbanking.org/insights-and-impact/algorithmic-bias-financial-inclusion-and-gender/.