Six fintech companies and the National Community Reinvestment Coalition are asking the Consumer Financial Protection Bureau for guidance on how to use artificial intelligence in lending decisions without violating fair lending laws.
In a letter sent to the CFPB on Tuesday, the six companies – Affirm, LendingClub, Oportun, PayPal Holdings, Square and Varo Bank – publicly committed to upholding the "disparate impact" legal doctrine, which is largely unpopular among financial institutions. The disparate impact theory, which is part of the Fair Housing Act and other fair lending laws, holds that a policy can be unlawful if it has a discriminatory effect on a protected class even absent discriminatory intent, unless the policy serves a "substantial, legitimate, nondiscriminatory interest."
Signage outside the Consumer Financial Protection Bureau headquarters in Washington, D.C., on Monday, March 4, 2019. House Financial Services Committee Chair Maxine Waters held a hearing that week on the CFPB's semiannual review. Photographer: Andrew Harrer/Bloomberg
The letter marks the first time a group of lenders has sought common ground with consumer advocates on how to effectively apply the disparate impact legal theory to credit models that use artificial intelligence, machine learning algorithms and alternative data.
"Disparate impact addresses discrimination that can occur when decisions result from algorithms or data rather than human intent," the letter said. "We appreciate disparate impact's statistical, outcomes-based approach to identifying discrimination. We also believe this outcomes-based approach makes disparate impact an innovation-friendly framework for preventing discrimination."
The fintechs' request for guidance comes just days after the Department of Housing and Urban Development moved to reverse a Trump administration rule that had weakened the disparate impact standard.
The letter responds to a March request for information from five federal agencies on the use of artificial intelligence and machine learning, which could be a first step toward interagency policy. The comment deadline for that public request for information has been extended to July 1.
The letter also comes as the CFPB weighs how to update the Equal Credit Opportunity Act, which bars discrimination in credit decisions. The bureau issued a request for information a year ago on whether the 1974 law and its implementing regulation, Regulation B, should be revised in light of two separate Supreme Court rulings on gender identity and disparate impact discrimination.
The Equal Credit Opportunity Act covers intentional discrimination, and the CFPB has argued that unintentional discrimination, under the disparate impact theory, also applies under the ECOA. Many experts dispute that view. The Supreme Court has found that the Fair Housing Act does encompass a disparate impact standard.
The letter asks the CFPB to state explicitly that the disparate impact standard applies to models that use artificial intelligence. The companies also seek clarity on when disparities in credit decisions rise to the level of potential discrimination. The six fintechs sit on the NCRC's Innovation Council for Financial Inclusion. The NCRC is a national network of advocates for fair lending, fair housing and consumer rights.
"We want to lead the financial services industry toward effective use of fair lending analysis and an effective disparate impact framework to get this right," said Brad Blower, general counsel of the NCRC, in an interview Tuesday.
The fintechs and the NCRC also want the CFPB to provide clear guidance on when and how lenders should search for less discriminatory alternative models or data. They also ask the bureau to explain how lenders can collect demographic data for their own fair lending compliance testing.
Several of the fintech CEOs publicly reiterated their commitment to disparate impact rules as essential safeguards against racial bias in financial services.
"We know that racism in the financial services industry has contributed to much of the economic inequality we see today," said Colin Walsh, CEO of Varo Bank, in a press release.
Others said that disparate impact rules are crucial for both society and the industry.
"Strong regulatory protection against discrimination in lending is critical to a just society and to the success of technology innovations," said Armen Meyer, head of public policy at LendingClub. "Fintechs have come together to help strengthen these regulations, which address the risk of digital redlining."
The fintech lenders said CFPB guidance on how anti-discrimination laws apply to digital lending would also give investors the confidence to invest in the technology.
"The CFPB can protect consumers from digital discrimination by speaking directly to the public about how it will enforce fair lending laws in an increasingly digitized market," said Jesse Van Tol, CEO of the NCRC, in a press release. "We cannot stand by and allow algorithms to revive old prejudices in new packages or introduce new forms of discrimination hidden in proprietary code."
The growing everyday use of artificial intelligence and machine learning systems makes it "critical that [the CFPB] update its regulations to hold lenders accountable for building underwriting systems that prioritize consumers' rights, especially for historically disadvantaged and underserved groups," Van Tol said. "We must not allow complexity to become either an opportunity or an excuse for digital discrimination."
The fintechs may also be seeking to stake out their positions before Rohit Chopra, the Biden administration's nominee to lead the CFPB, takes the helm of the agency following Senate confirmation. Chopra, a Federal Trade Commission commissioner, has written extensively about technology companies, and many observers say he wants to look under the hood of the black-box algorithms used by many lenders that are touted as promoting "inclusivity."