Artificial Intelligence and Fair Credit Decisioning: Federal Regulators Lean In

Guest blog by Melissa Richard

Look up the definition of "innovation" in the Merriam-Webster Dictionary: a new idea, method, or device (novelty); the introduction of something new. In recent years, and particularly during the 2019-2020 cycle, federal financial services regulators – the CFPB, FRB, OCC, FDIC and NCUA – have stepped up efforts to partner with industry stakeholders, technology experts and housing experts to address, through innovation, the chronic disparities in US homeownership between white and minority communities.

US Census data for 3Q 2020 show that while the overall rate of homeownership improved to 67.4% in 2020 from 63.5% in 2016, Black and Hispanic households saw negligible improvement relative to White households over the same period.

 

| 3Q of Year | US National Average (%) | White Households (%) | Black Households (%) | Hispanic Households (%) |
|------------|-------------------------|----------------------|----------------------|-------------------------|
| 2020       | 67.4                    | 75.8 (+8.4)          | 46.4 (-21.0)         | 50.9 (-16.5)            |
| 2019       | 64.8                    | 73.4 (+8.6)          | 42.7 (-22.1)         | 47.8 (-17.0)            |
| 2016       | 63.5                    | 71.9 (+8.4)          | 41.3 (-22.2)         | 47.0 (-16.5)            |

Figures in parentheses are the difference, in percentage points, from the US national average for that year.

 

Home ownership is a primary source of wealth in the US. However, according to published studies by the Brookings Institution, the Urban League and others, the cost of this wealth gap to the US economy runs into the trillions of dollars. The CFPB identified this trend, as part of a broader disparity in access to affordable and fair credit for minority communities and small businesses, back in 2015. In its May 2015 study on "Credit Invisibles," the CFPB found that 26 million (roughly 1 in 10) US adults had no credit record with the nationwide credit bureaus and another 19 million US adults had too little information to be evaluated by a widely used credit scoring model. Based on these findings, the CFPB suggested that the regulatory community join with financial services industry and consumer protection stakeholders to explore how Artificial Intelligence and Machine Learning used in credit scoring and credit decisioning might expand credit access to this population. At the same time, the CFPB stated its hope that AI modeling innovation would increase transparency in credit transactions while raising the level of financial data privacy protections.

Interagency Statement Supporting Innovation in Fair Credit Decisioning

Building off that 2015 study, the CFPB along with the FRB, OCC, FDIC and NCUA (collectively, the "Agencies") published their Interagency Statement on the Use of Alternative Data in Credit Underwriting on December 13, 2019. In it, the Agencies defined "Alternative Data" as information not typically found in consumer credit reports or standard credit applications, but that can reliably predict an applicant's ability to repay debt obligations. For example, cash flow data drawn from reliable sources such as bank account records (for applicants who have a bank account), cell phone bills and utility bills can be effective in predicting the ability to repay debt obligations.
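
To make the concept concrete, here is a minimal, purely illustrative sketch of how cash-flow alternative data might feed a simple scoring model for a thin-file applicant. The feature names, synthetic data and choice of a logistic regression are hypothetical assumptions for illustration only; any real model would require the validation, fair-lending testing and compliance controls discussed below.

```python
# Illustrative sketch only: cash-flow "alternative data" feeding a toy scoring model.
# Feature names, data and weights are hypothetical, not a production approach.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic training rows: [avg_monthly_balance, months_utility_on_time,
# months_rent_on_time, overdrafts_last_12m]; label 1 = repaid, 0 = defaulted.
rng = np.random.default_rng(0)
X = rng.normal(loc=[1500, 10, 10, 1], scale=[800, 3, 3, 1], size=(500, 4))
true_logits = 0.001 * X[:, 0] + 0.2 * X[:, 1] + 0.2 * X[:, 2] - 0.8 * X[:, 3] - 4.0
y = (rng.random(500) < 1 / (1 + np.exp(-true_logits))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# A "credit invisible" applicant with no bureau file but a strong cash-flow history.
applicant = np.array([[1200.0, 12, 12, 0]])
print("Estimated repayment probability:", model.predict_proba(applicant)[0, 1])
```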

While the Agencies express support in their Interagency Statement for innovative credit decisioning solutions, they caution that alternative data models must be designed, developed, tested and monitored in compliance with the relevant consumer protection laws, primarily ECOA, FCRA and UDAAP. After all, alternative data algorithms are built by humans, and the data selected for use can have unintended discriminatory effects on the population the model was designed to serve. Accordingly, the Agencies instruct financial services industry stakeholders to develop compliance management programs around alternative credit decisioning models that address the risk of noncompliance with ECOA, FCRA and UDAAP, among other laws, and to continuously monitor for unintended discriminatory effects on protected borrowers.

For industry stakeholders, this means that a compliance management program must identify which alternative credit decisioning models will be used and in which loan programs, and then prescribe the controls to be put in place to measure and monitor the quality and suitability of the alternative data elements deployed for each loan product, so that consumer protection risks can be identified and addressed appropriately. The Interagency Statement points to the Supervisory Guidance on Model Risk Management published in recent years by the FRB, OCC and FDIC as an outline for creating the compliance management program.
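
As one concrete, simplified illustration of a monitoring control such a program might include, the sketch below computes an adverse impact ratio across demographic groups from a model's decision log (the "four-fifths" rule of thumb). The group labels, decision data and 0.8 threshold are hypothetical assumptions; real fair-lending monitoring involves additional statistical testing and legal review.

```python
# Illustrative sketch only: a simple adverse impact ratio check over a decision log.
# Group labels and decisions are hypothetical; thresholds require legal judgment.
from collections import defaultdict

# Each entry is (demographic_group, approved?) taken from a model's decision log.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
for group, approved in decisions:
    counts[group][0] += int(approved)
    counts[group][1] += 1

rates = {group: approved / total for group, (approved, total) in counts.items()}
benchmark = max(rates.values())  # highest group approval rate

for group, rate in rates.items():
    ratio = rate / benchmark
    flag = "flag for review" if ratio < 0.8 else "ok"  # four-fifths rule of thumb
    print(f"{group}: approval rate {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")
```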

Additional CFPB Efforts to Spur Innovation in AI Modeling and Fair Credit Decisioning

In 2020, the CFPB went further in its guidance and support of AI and Machine Learning innovation for fair credit decisioning. On July 7, 2020, the CFPB's Office of Fair Lending and Equal Opportunity teamed up with the CFPB's Office of Innovation to publish an Innovation Spotlight highlighting actions the Agency is taking to address disparities in homeownership and overall access to credit through innovation and collaboration with stakeholders. Among the CFPB's 2019-2020 efforts to promote innovation while reducing the risk of regulatory uncertainty were:

  • No Action Letter (NAL) Policy (Revised) – To streamline the Agency review process, focusing on the consumer benefits and risks of the product or service in question and affording regulatory compliance certainty through favorable NALs.
  • Trial Disclosure Program (TDP) Policy – Allowing parties seeking to improve consumer disclosures to conduct in-market testing of alternative disclosures for a limited prescribed time period.
  • Compliance Assistance Sandbox (CAS) Policy – Providing temporary “safe harbor” relief from risk of liability under TILA, EFTA and ECOA for innovators testing a new financial product or service where there is regulatory uncertainty. Innovators must abide by the prescriptive terms of the CFPB's approval.
  • Pronouncement on ECOA/FCRA Adverse Action Notice Requirements when applying AI/ML algorithms – These laws require creditors to provide consumers with the principal reasons for a denial of credit or other adverse action. Advancements in AI/ML modeling raise questions about how creditors meet their adverse action notice obligations when an AI model drives the decision. Here, the CFPB announced its finding that the existing regulatory framework has built-in flexibility that can be compatible with AI algorithms. For example, the Official Interpretation to Reg B provides that a creditor need not describe how or why a disclosed reason (factor) adversely affected an application [see 12 CFR pt. 1002, Comment 9(b)(2)-3] or, for credit scoring systems, how the factor relates to creditworthiness [see id. at 9(b)(2)-4]. Thus the CFPB writes, “[t]his flexibility may be useful to creditors when issuing adverse action notices based on AI models where the variables and key reasons are known, but which may rely upon non-intuitive relationships.” (One way a creditor might surface such key reasons from a model is sketched after this list.)

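To illustrate the kind of flexibility the CFPB describes, the sketch below derives candidate "principal reasons" for an adverse action notice by comparing a declined applicant's features to the average approved applicant under a simple linear model. The model, feature names and reason wording are hypothetical assumptions for illustration only, not a statement of what Reg B requires.

```python
# Illustrative sketch only: ranking features by their negative contribution relative
# to approved applicants, to surface candidate adverse action reasons.
# Model, features and reason wording are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["avg_monthly_balance", "months_utility_on_time",
                 "months_rent_on_time", "overdrafts_last_12m"]
reason_text = {
    "avg_monthly_balance": "Insufficient deposit account balances",
    "months_utility_on_time": "Limited history of on-time utility payments",
    "months_rent_on_time": "Limited history of on-time rent payments",
    "overdrafts_last_12m": "Number of recent overdrafts",
}

# Toy model trained on synthetic data (stand-in for a validated production model).
rng = np.random.default_rng(1)
X = rng.normal(loc=[1500, 10, 10, 1], scale=[800, 3, 3, 1], size=(500, 4))
true_logits = 0.001 * X[:, 0] + 0.2 * X[:, 1] + 0.2 * X[:, 2] - 0.8 * X[:, 3] - 4.0
y = (rng.random(500) < 1 / (1 + np.exp(-true_logits))).astype(int)
model = LogisticRegression(max_iter=1000).fit(X, y)

def principal_reasons(applicant, top_n=2):
    """Return the top_n features pulling this applicant's score furthest below
    the average approved applicant."""
    baseline = X[y == 1].mean(axis=0)                    # average approved applicant
    contributions = model.coef_[0] * (applicant - baseline)
    worst = np.argsort(contributions)[:top_n]            # most negative contributions
    return [reason_text[feature_names[i]] for i in worst]

declined_applicant = np.array([300.0, 2, 3, 6])
print(principal_reasons(declined_applicant))
```
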
During 4Q 2020, the CFPB rolled out two additional initiatives to help spur innovation in the use of AI algorithms for credit decisioning. The first was its innovation Tech Sprints. The CFPB hosted its Fall 2020 Adverse Action Tech Sprint on October 5-9, 2020, in which teams of five drawn from the technology, legal and policy fields interacted virtually with CFPB expert panelists to develop new ideas for delivering meaningful adverse action notices based on AI and Machine Learning credit decisioning. At the end of the week, teams were invited to continue their efforts by applying to the CFPB's TDP to field test their innovations. Then, on November 30, 2020, the CFPB finalized its Advisory Opinion Policy, under which the CFPB will provide written advisory opinions on topics of legal uncertainty and answer interpretive questions about the consumer protection laws and rules it administers. Advisory opinions from the CFPB are a useful way to reconcile new innovation methods with pre-existing regulations administered by the CFPB and to minimize the reputational risk of enforcement action.

Requests for an advisory opinion must be submitted by email to advisoryopinion@cfpb.gov.

HUD Diverges from Agencies’ Support in Its Final Disparate Impact Rule

HUD has not embraced innovation in the area of fair credit decisioning the way the Agencies did in 2019-2020. In September 2020, HUD finalized its Disparate Impact Rule, leaving out a proposed defense to discriminatory effect allegations involving the use of algorithmic models in credit decisioning where the models are shown to be unbiased. HUD states its reasoning for not including algorithmic modeling as a defense as follows: “HUD expects there will be further development in the law in the emerging technology area of algorithms, artificial intelligence, machine learning and similar concepts. Thus, it is premature at this time to more directly address algorithms.”

This is curious for two reasons. First, HUD plays an important counter-cyclical role in providing housing finance, through its FHA loan program, to US adults who do not have access to traditional, conventional mortgage financing for home ownership. Second, HUD has not subscribed to the definition of innovation – new systems, products and services going beyond what existing laws and regulations were originally designed for. Under HUD's premise, innovation should wait for laws and regulations to catch up, rather than laws and regulations responding to unintended risks as they emerge from the innovation.

Conclusion and Challenges for Mortgage Lender and Investor Stakeholders

So, what do mortgage lenders and, more importantly, government and private loan investors do with this Agency support for innovation in credit decisioning? Is 2021 the optimal time to embrace AI and Machine Learning for predictive ability-to-repay credit decisioning?

Arguments can be made at this point both for and against implementation in 2021. One positive factor is that the MBA forecasts an incremental rise in mortgage rates during 2021-2022 that will result in a sharp decline from the historic refinance loan volume seen in 2020, while purchase money volume will rise and perhaps beat historical records. While not every underserved, underbanked US adult described as credit invisible in the CFPB's 2015 study can or should achieve home ownership through mortgage financing today, the estimated size of that population, coupled with the Agencies' assertive efforts to promote innovation in order to bridge the home ownership and wealth gaps in the US, is a compelling reason to move forward cautiously.

Bottom line?

The key to bringing alternative mortgage programs to market for qualified US adults who are underbanked or underserved is for the public and private loan investor communities, together with federal, state and local housing regulatory agencies, to collaborate with fintech innovators on the development of fair credit decisioning models. Mortgage lenders that see 2021-2022 as ripe for boosting purchase money loan production by reaching out to these communities should follow the Interagency Statement and the Supervisory Guidance published by the Agencies in developing compliance management programs that address each model and corresponding loan program being deployed, along with the testing, monitoring and controls that will be used to ensure compliance with ECOA, FCRA, UDAAP and other affected consumer protection laws.
