Legal Considerations When Using Big Data and Artificial Intelligence to Make Credit Decisions

In an article for Lending Times, Jenner & Block Partner Kali Bracey and Associate Marguerite L. Moeller discuss the legal risks that can arise when companies use big data to make credit extension decisions. The article explains that although artificial intelligence and machine learning are increasingly used to model credit risk and make ostensibly unbiased credit determinations, big data can in fact produce an inadvertent disparate impact on protected classes. Lenders must ensure that they abide by the Fair Housing Act (FHA) and the Equal Credit Opportunity Act (ECOA), which prohibit lending decisions that discriminate on the basis of race, gender, or other protected characteristics. The FHA prohibits discrimination in the financing of housing, while the ECOA prohibits discrimination in credit transactions. Lenders must also comply with the Fair Credit Reporting Act (FCRA), which requires them to notify consumers when they deny credit, or charge more for it, based on information in a consumer report. The authors recommend that companies build compliance with federal fair lending and credit laws into their algorithmic models. While it is unclear how the current administration will address these issues, federal regulators are paying attention to the emerging field of big-data-based lending. Moreover, private plaintiffs and state attorneys general may still take action, whether by seeking punitive damages and equitable and declaratory relief or by enforcing state fair lending statutes.
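For readers unfamiliar with how a disparate impact might surface in an algorithmic model, the short Python sketch below applies the "four-fifths rule," a common heuristic in discrimination analysis, to hypothetical approval rates. The group labels, figures, and 80% threshold are illustrative assumptions only, not drawn from the article, and actual fair lending testing is considerably more involved.

```python
# Hypothetical illustration of a disparate impact check on model approval
# rates, using the "four-fifths rule" heuristic. All numbers are made up.

approvals = {
    # group label: (applications, approvals) -- illustrative data only
    "group_a": (1000, 620),
    "group_b": (1000, 410),
}

# Approval rate per group.
rates = {g: approved / applied for g, (applied, approved) in approvals.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    # Under the four-fifths rule, a group's approval rate below 80% of the
    # highest group's rate is a conventional red flag for further review.
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: approval rate {rate:.1%}, ratio {ratio:.2f} -> {flag}")
```

In this toy example, group_b's approval rate is roughly 66% of group_a's, which would fall below the four-fifths benchmark and prompt closer scrutiny of the model's inputs.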

To read the full article, please click here.