All lenders must explain their rationale when denying credit to loan applicants, the Consumer Financial Protection Bureau affirmed Thursday, after looking into whether companies claiming exemption because they rely on “complex algorithms” were in violation of federal anti-discrimination law.
The requirement also applies to mortgage lenders, legal experts say. The CFPB, a government watchdog agency, said lenders aren’t absolved from adverse action notice requirements under the Equal Credit Opportunity Act if they use complex algorithms, which Rohit Chopra, the director of the CFPB, has dubbed “black-box models.”
The CFPB issued a statement to warn creditors that technology being “too complicated” is not an excuse for noncompliance.
“Creditors who use complex algorithms — including artificial intelligence or machine learning technologies — to engage in credit decisions must still provide a notice that discloses the specific, principal reasons for taking adverse actions,” CFPB said in a news release.
“There is no exception,” it added.
The bureau also urged whistleblowers in the tech field to come forward with information about companies violating ECOA.
According to Kris Kully, an attorney at Mayer Brown, the notice will affect mortgage lenders because many of the underwriting systems lenders use to measure default risk contain proprietary algorithms, or “black-box models.”
“The designers historically have been unwilling to tell lenders all the factors used and how they are weighted,” Kully said. “They often just say that they ‘don’t consider any prohibited factors, and it’s very predictive of default,’ and that’s about as much information as the lender gets, making it difficult for lenders to tell a borrower the reasons why an application is denied.”
Kully expects tweaks will be made so the systems “start to provide information, so that lenders can make those required disclosures.”
In his own statement Thursday, Chopra said companies have “legal responsibilities when they let a black-box model make lending decisions.”
“The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand,” Chopra said.
The CFPB has raised concerns in the past that algorithmic models perpetuate discriminatory mortgage lending practices.
In October 2021, a multi-agency effort to combat redlining was announced between the CFPB, the Department of Justice and the Office of the Comptroller of the Currency. At the time, Chopra said the effort also would include combating digital redlining.
“These algorithms are black boxes behind brick walls,” Chopra said during a press conference in 2021. “When families and regulators do not know how decisions are made by these algorithms, we are unable to participate in a fair and competitive market, free from illegal bias.”
What tangible results the multi-agency effort has produced remains unknown. The CFPB did not immediately respond to a request for comment.
The government watchdog in February also announced a potential clampdown on the use of automated valuation models (AVMs) by lenders and appraisers.
The CFPB said it is concerned automated valuation models may reflect bias in design and function. Specifically, the bureau said mathematical models may rely on biased data, resulting in inaccurate valuations.
Without proper safeguards, these models could digitally redline neighborhoods and perpetuate historical disparities, according to the agency.
To address those potential risks, the CFPB worked with other regulators — including the Office of the Comptroller of the Currency, the Federal Deposit Insurance Corporation, the National Credit Union Administration and the Federal Housing Finance Agency — to consider a potential interagency requirement. Under it, institutions would establish policies, practices, procedures and control systems to ensure their AVMs comply with applicable nondiscrimination laws, the bureau said.
But before introducing the proposed rule, the CFPB, in accordance with federal law, convened a review panel to get feedback from small businesses that could be affected by the proposal.
The feedback from the small business review panel was mixed, according to a summary published by the CFPB. Some panel members expressed concern about the cost of complying with the AVM rule and recommended the CFPB explore options for lowering compliance costs.
Others expressed support for the use of AVMs, while some said they prefer valuations by licensed appraisers because the appraisal process is easier to understand than AVM methods.
Panel members also stressed the need for greater clarity about how the government-sponsored enterprises, the Department of Housing and Urban Development (HUD), the Department of Veterans Affairs (VA) and other agencies and investors will allow originators and aggregators to rely on AVMs in the future.