Algorithmic bias is worsening in the United States, as people of color continue to bear the brunt of discriminatory code.
A recent investigation by The Markup, published in collaboration with the Associated Press, made some startling revelations about bias in a mortgage approval algorithm.
The investigation showed that people of color are discriminated against in mortgage approval: Black Americans are less likely than white Americans to have their mortgage applications approved.
This is not the first time Black people have been the victims of racially biased algorithms, and it is probably not the last.
The Markup's statistical analysis of mortgage applications found that lenders were more likely to deny mortgage loans to applicants of color than to white applicants with comparable financial profiles.
The Markup's reporters analyzed nearly two million mortgage applications from 2019 and found that Black, Latino, Asian American, and Native American applicants were denied loans even when their financial profiles were comparable to, or stronger than, those of their white counterparts.
Different factors in the mortgage approval process were examined during the research, and all of them pointed to bias in the approval algorithm.
Biased Algorithms in Mortgage Approval: Another Example of Marginalization of Black Communities
A human element may be involved in this disparity, but lending is becoming increasingly algorithm-driven, which points to something wrong with the algorithms themselves.
Reporters involved in the research say that lenders claim they are rejecting applications not because of racial discrimination but because of applicants' debt-to-income ratios and combined loan-to-value ratios.
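For readers unfamiliar with the two ratios lenders cite, here is a minimal sketch of how they are conventionally defined. The figures and thresholds in the example are illustrative assumptions, not the actual cutoffs used by any lender or by Freddie Mac or Fannie Mae.

```python
def debt_to_income(monthly_debt: float, monthly_income: float) -> float:
    """Share of gross monthly income that goes toward debt payments."""
    return monthly_debt / monthly_income

def combined_loan_to_value(loan_balances: list[float], property_value: float) -> float:
    """All mortgage liens on the property divided by its appraised value."""
    return sum(loan_balances) / property_value

# Hypothetical applicant: $2,150 in monthly debt on $5,000 gross income,
# with an $85,000 first mortgage plus a $10,000 second lien
# on a home appraised at $100,000.
dti = debt_to_income(2150, 5000)                        # 0.43 -> 43%
cltv = combined_loan_to_value([85000, 10000], 100000)   # 0.95 -> 95%
print(f"DTI: {dti:.0%}, CLTV: {cltv:.0%}")
```

The higher either ratio, the riskier the loan looks to an underwriting model, which is why these two numbers are the denial reasons lenders most often point to.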
Some key findings of the study are:
- Black applicants were 80 percent more likely to be denied
- Native American applicants were 70 percent more likely to be denied
- Asian/Pacific Islander applicants were 50 percent more likely to be denied
- Latino applicants were 40 percent more likely to be denied
All of the above percentages are relative to white Americans. The analysis also found that applicants from these groups were denied even when their financial profiles were stronger than those of comparable white applicants.
The study's complex statistical analysis of more than two million mortgage applications took seventeen different factors into account.
In the cases studied, applicants looked almost identical in terms of their financial profiles; the only differences were race, ethnicity, or national origin.
Some applicants of color had even stronger financial characteristics, yet lenders still denied them.
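The headline figures above, such as "80 percent more likely to be denied," can be illustrated with a toy calculation. The Markup's actual methodology was far more sophisticated, controlling for the seventeen factors mentioned above; the counts below are made up purely to show the arithmetic behind a relative disparity.

```python
def denial_rate(denied: int, total: int) -> float:
    """Fraction of applications in a group that were denied."""
    return denied / total

def relative_disparity(group_rate: float, baseline_rate: float) -> float:
    """How much more likely (as a fraction) one group is to be denied
    than the baseline group."""
    return group_rate / baseline_rate - 1

# Hypothetical counts: 180 of 1,000 Black applicants denied
# versus 100 of 1,000 comparable white applicants.
black_rate = denial_rate(180, 1000)   # 0.18
white_rate = denial_rate(100, 1000)   # 0.10
print(f"{relative_disparity(black_rate, white_rate):.0%} more likely to be denied")
```

With these made-up counts the disparity works out to 80 percent; the real study reached its figures only after adjusting for the financial characteristics of each applicant.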
Regional Variation in Mortgage Bias: Another Worry for People of Color
The Markup also conducted a regional study that was even more shocking than the national-level findings. The regional analysis covered towns and cities across the country and found that disparities in loan approval were common everywhere.
In Charlotte, for example, lenders were 50 percent more likely to deny applications from Black applicants whose financial profiles were similar to those of white applicants.
Similarly, the disparity was 150 percent in Chicago and 200 percent for Latino applicants in Waco, Texas. Asian/Pacific Islander applicants in Port St. Lucie, Florida, were denied at a 200 percent higher rate than white applicants, and Native Americans in Minneapolis at a 100 percent higher rate.
Similar disparities were found in 89 metropolitan areas across the country. Alderman Matt Martin, who represents Chicago's 47th Ward, says these disparities are historically rooted.
The practice of branding Black residents and immigrant groups as financial risks began in the 1930s and could still be found in the 1970s, especially in Chicago, around the time the Home Mortgage Disclosure Act was established.
The techniques used today differ from those of the 1930s or 1970s, but the outcome remains the same. Martin believes we should not continue to tolerate this deeply rooted, historical form of discrimination.
Historical Biases: The Prominent Cause of Disparity
Loan decisions at most institutions are made by loan officers and by underwriting software, much of which is tied to the government-sponsored enterprises Freddie Mac and Fannie Mae. Together, the two back about half of all mortgages in the United States and are considered the industry's trendsetters.
Freddie Mac and Fannie Mae use a specific credit scoring algorithm to assess applicants' financial capacity.
The scoring model Freddie Mac and Fannie Mae rely on is built on data that is almost three decades old. It disadvantages people of color while rewarding traditional forms of credit, to which white Americans have historically had far greater access, giving them an automatic upper hand.
It even penalizes applicants for past medical debt, even when that debt has already been paid off.
The Federal Housing Finance Agency and other regulators continue to allow the traditional model and have rejected every request from advocates, the industry, and even members of Congress to update the system.
Updated credit models also exist, such as the one developed by the "Big Three" credit bureaus to compete with the FICO score.
The Big Three estimated that their model would extend credit to almost 37 million applicants rejected under the traditional FICO model, about a third of them Black or Latino.
Most companies declined to answer The Markup's question about why they are reluctant to use the Big Three model instead of FICO.
Freddie Mac and Fannie Mae launched their color-blind underwriting software in 1995, promising that the algorithm would not consider an applicant's race or ethnicity.
However, supposedly color-blind algorithms in other fields, from criminal risk assessment to healthcare, have been found to produce racially biased decisions.
Experts believe that the data fed into such software reflects historical discrimination, so the software automatically reproduces that discrimination in its output.
No one outside the companies, including their regulators, knows exactly how Freddie Mac's and Fannie Mae's underwriting software works.
This is why The Markup's study does not include decisions made by these companies. Such data is supposed to be reported to the government, but in the case of Freddie and Fannie, the Consumer Financial Protection Bureau scrubs it before release, arguing that it could disclose applicants' private information.
It is a matter of grave concern that, after sidelining the Black community in so many other areas, racism now influences mortgage approval as well.
The need of the hour is to eliminate these embedded racist regimes from our algorithms before a social credit system takes full effect; otherwise, if the crisis remains untackled, the discrimination will be mammoth.
Eli is a Political Data Scientist with over thirty years of experience in Data Engineering, Analytics, and Digital Marketing. He uses his expertise to deliver the latest information and distinctive analysis on US Political News, US Foreign Affairs, Human Rights, and Racial Justice, equipping readers with unrivaled knowledge.