Author: Gikay, Asress Adimi

CONTENTS

1. Introduction
   The Context
   The Key Claim
   Raison D'être
   Structure
2. Automated Consumer Credit Scoring
   2.1. Consumer Credit Scoring
   2.2. Automated Consumer Credit Scoring--The Rise of Algorithms
      2.2.1. From Algorithms to Machine Learning
3. The Opportunities and Challenges in Automated Consumer Credit Scoring
   3.1. The Advantages of Automated Consumer Credit Scoring
      3.1.1. Efficiency
      3.1.2. Impartiality--From Algorithmic Score to an Impartial Loan Officer
      3.1.3. Financial Inclusion
      3.1.4. An Open Question?
   3.2. The Risks of Automated Consumer Credit Scoring
      3.2.1. Inaccuracy
      3.2.2. Bias and Discrimination
      3.2.3. The Risks of Automated Consumer Credit--A Recap
4. Effective Consumer Protection in Automated Consumer Credit Scoring in the EU and the US
   4.1. GDPR's General Prohibition of Solely (Individual) ADM
   4.2. Consent and Consumer Protection
      4.2.1. Fundamental Rights Approach to Data Protection in the EU and Consent
      4.2.2. Market-Oriented Approach to Data Protection in the US and Notice and Choice
   4.3. Transparency and Automated Consumer Credit Scoring
      4.3.1. Theories of Transparency
         A. The Black Box Theory
         B. The Opacity Theory
         C. The Disparate Impact Theory
      4.3.2. Transparency--the Rules in the EU and the US
         A. The EU Approach to Transparency
         B. The US Approach to Transparency
            i. The EU-US Privacy Shield Framework
            ii. The Fair Credit Reporting Act's Transparency Provisions
            iii. Oldie but Goodie
5. The Limits of the Law and the Future of Automated Consumer Credit Scoring Regulation
   5.1. Machine Learning Consumer Credit Scoring--A Dead End
   5.2. Risk-Based Approach to Regulation
   5.3. Regulatory Sandboxing
6. Conclusion

1. INTRODUCTION


Any financial institution that engages in the business of lending money in the European Union (EU) or the United States (US) has the right and the obligation to ensure a thorough assessment of the borrower's capacity to repay the loan. In the aftermath of the global financial crisis of 2008, attributed in part to the subprime mortgage crisis, (1) the responsibility to assess consumer borrowers' ability to repay has been standardized and strengthened in both jurisdictions. (2) In modern credit risk assessment, the likelihood of the borrower defaulting is evaluated through a statistical method using the consumer's credit data and is reduced to a single number--the credit score. (3) Over the years, creditworthiness assessment has evolved from interview-based evaluation and decisions made by loan officers (4) to automated decision-making with minimal human intervention. These decisions rest on data collected from the consumer, but also on far less conventional sources such as social networks. (5) Such automated decisions in financial services have attracted the attention of scholars, regulators, and consumer advocacy groups, who are often concerned that by using algorithms and big data, financial institutions may circumvent legal regimes that protect consumers and other vulnerable groups. The concern is that predictive analysis may bypass decision-making based on an objective assessment of the individual consumer's circumstances. (6)

In the EU, the most significant legal instrument governing automated consumer credit scoring is the General Data Protection Regulation (GDPR), (7) which contains a few provisions tailored to Automated Decision-Making (ADM). (8) The GDPR has triggered a great deal of change in how businesses manage personal data, not only for entities established in the EU but also for non-EU entities with a business link to the EU. (9) While it is yet to be proven whether the GDPR provisions on ADM achieve their intended objective of protecting the consumer from potentially arbitrary and opaque algorithmic decisions, the Regulation has been touted as a model for regulating not only data privacy in general but also automated consumer credit scoring in the US. (10)

In 2016, the EU Commission and the US Department of Commerce implemented the EU-US Privacy Shield (PS) Framework, under which US-based organizations to whom EU-based data controllers transfer data self-certify (11) compliance with the key principles of the GDPR. (12) The final document of the EU-US PS Framework excluded the GDPR principles on ADM. (13) In the US, the most significant federal statutes pertinent to ADM are the Financial Services Modernization Act of 1999, commonly known as the Gramm-Leach-Bliley Act (GLBA), (14) the Fair Credit Reporting Act (FCRA), and the Equal Credit Opportunity Act (ECOA). (15) Despite the existence of this sector-specific legislation, which the EU Commission found to be satisfactory, studies claim that automated credit scoring is inadequately regulated in the US. (16) This aligns with the overwhelming sentiment that the US is lagging behind in protecting consumer privacy. (17) In fact, in a New York University Law Review article, Hertza called for GDPR-inspired reform of the legal regimes governing ADM in consumer credit reporting in the US. (18)

The enactment of the first comprehensive privacy law in California in 2018--the California Consumer Privacy Act (CCPA) (19)--also suggests the pressure lawmakers feel to address privacy concerns in the US. This article argues that while US privacy law in general may require reform, the US does not need specific rules for automated consumer credit scoring. The existing literature calling for reform in the US rests on two flawed premises: (a) that the US legal rules governing consumer credit are incapable of addressing technology-driven legal challenges, and (b) that the GDPR provisions on ADM effectively protect the consumer. Neither assumption has been closely examined or validated against empirical evidence and actual enforcement cases.


This article argues that despite the differences in the approach to regulating ADM in consumer loan underwriting in the EU and the US, the two jurisdictions respond to the phenomenon in a fairly similar manner. It examines statutes and relevant enforcement cases, including judicial decisions, in both jurisdictions.

First, the article aims to provide a comprehensive comparative analysis of the two legal regimes with respect to automated consumer credit scoring. Second, it shows that, contrary to the prevailing view, the lack of a recently implemented legal regime governing ADM in the US does not mean that US consumers are worse off than their EU counterparts. Third, the challenge presented by the increasing sophistication of Artificial Intelligence (AI), especially machine learning, puts the EU and the US in the same regulatory and legal quandary, as neither jurisdiction is equipped to respond to autonomous, unpredictable, and unexplainable algorithms making critical decisions. (20)


There are three main reasons behind writing this article. First, there are calls for GDPR-inspired reform of the regulation of ADM in the US consumer credit industry, (21) whose validity requires scrutiny. Existing literature portrays the GDPR as a good model for reform--a view that this article questions. The theory is tested by analyzing the legal regimes in the two jurisdictions as well as enforcement cases (including judicial decisions) and empirical evidence on consumer behavior. In the two years since the GDPR came into force, no such work has been undertaken, despite academics not being shy about alluding to the superiority of the GDPR in regulating ADM.

Second, current legal developments could potentially depict US data privacy law as completely inapt to cope with technological challenges. On July 16, 2020, the Court of Justice of the European Union (CJEU) struck down the EU Commission's decision holding that the US has a privacy legal regime that provides adequate protection to EU consumers--'the Adequacy Decision' (22)--which had been valid since 2016. (23) Under this judgment, Facebook Ireland and, as a consequence, other EU data controllers were prohibited from transferring data to the US under the Adequacy Decision. (24) This judgment is likely to amplify the sentiment that US data privacy law in general is weak. Yet the court found the Adequacy Decision invalid only because data subjects whose data are transferred from the EU do not enjoy the same level of protection, owing to a lack of protective safeguards for consumer rights vis-a-vis public authorities. (25) These rights include access and enforceable rights, as well as channels for an effective remedy, in the context of data processing by public authorities in pursuit of national security and law enforcement interests. (26) In other words, the prevalence of state surveillance under various laws, such as the Foreign Intelligence Surveillance Act (FISA) (27) and Executive Order 12333, (28) enables public authorities to access data from private actors without sufficient safeguards, indicating that the US does not provide adequate data protection to EU consumers. (29) In the aftermath of the CJEU judgment, confusion about how good a model the GDPR is for reforming data privacy law pertaining to ADM is likely to reign, while the court's specific reasoning is likely to be neglected.

Third, the general contentment with the GDPR provisions governing ADM deters further necessary work to revise the rules. This article cautions against the false sense of security that seems to prevail regarding the level of consumer protection the GDPR can provide.

It is not within the purview of this article to show the extent and manner in which US data privacy law should be reformed. (30) But it argues that the rules governing ADM in the consumer credit industry available under the GDPR should not cloud...
