Credit Score Calculation and Data Privacy Concerns

The article discusses credit scores and the data privacy issues that crop up due to credit scoring activities.
S.S. Rana & Co - Vikrant Rana, Anuradha Gandhi

Introduction

On May 6, 2024, the Hon’ble Supreme Court issued notice in the case of Surya Prakash v. Union of India and Others. The plea was filed accusing four foreign credit information companies (TransUnion CIBIL, Experian Credit Information Company of India, Equifax Credit Information Services, and CRIF High Mark Credit Information Services) of the following:

  1. In collusion with the Reserve Bank of India (RBI) and Central ministries, these companies are blatantly violating the right to privacy of more than one billion individual private citizens and all the business organizations operating in the country;

  2. These companies are propagating the operation of a parallel underworld economy through credit scoring, while running an unethical, mutually beneficial business relationship with companies like Bank Bazaar, Paisa Bazaar, My Loan Care, Loan Adda and Credit Mantri;

  3. Violations have been claimed under the Credit Information Companies (Regulation) Act, 2005 (CICR Act). The CICR Act and related rules and regulations together form the regulatory framework governing the manner in which credit information is accessed by players in the lending ecosystem.

  4. These companies are violating the principle of data localization, since their computer servers and even data storage systems are located outside India (see the RBI's notification on localization of financial data).

  5. It was also emphasized that these companies are involved in generating credit scores and preparing the credit history of each and every customer of every bank and financial institution of this country. With the generation of credit scores and credit history from personal data, these companies pass judgment, convict citizens as 'untouchables' and discriminate against them based on creditworthiness, the plea said.

What are Credit Scores?

A credit score is a number within a fixed range, derived from an individual’s credit history. A credit history includes information such as the number of accounts, debt repayment history, and so on. Lenders use the credit score to quickly evaluate an individual’s creditworthiness, that is, the likelihood that loans will be repaid in a timely fashion.

What are the Privacy Concerns with Credit Scoring?

I. Processing of Sensitive Personal Information

Credit histories are among the most sensitive categories of financial data, as they reflect not only an individual’s present economic state but also map their economic journey over time. From this data, statisticians can build scores that predict the credit performance of population members with a high degree of accuracy.

Traditional credit scoring models assessed borrowers using the following information: payment history, outstanding debt, length of credit history, types of credit in use, new credit, and credit utilization.
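The factors above can be illustrated with a minimal sketch of a weighted-factor score. The category weights below mirror FICO's published weighting (35% payment history, 30% amounts owed, 15% length of history, 10% new credit, 10% credit mix); the per-factor sub-scores, the mapping to a 300–900 band, and the sample borrower are hypothetical.

```python
# Illustrative weighted-factor credit score. Weights follow FICO's
# published category weighting; everything else here is assumed.
WEIGHTS = {
    "payment_history": 0.35,
    "outstanding_debt": 0.30,
    "history_length": 0.15,
    "new_credit": 0.10,
    "credit_mix": 0.10,
}

def credit_score(components: dict, lo: int = 300, hi: int = 900) -> int:
    """Map per-factor sub-scores in [0, 1] to a score in [lo, hi]."""
    weighted = sum(WEIGHTS[k] * components[k] for k in WEIGHTS)
    return round(lo + weighted * (hi - lo))

# Hypothetical borrower with a strong repayment record but a short history.
borrower = {
    "payment_history": 0.95,   # few missed payments
    "outstanding_debt": 0.70,  # moderate utilisation
    "history_length": 0.60,
    "new_credit": 0.80,
    "credit_mix": 0.50,
}
print(credit_score(borrower))
```

Real scoring models are proprietary and far more granular, but the structure is the same: each factor is normalised, weighted, and mapped onto the published score band.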

Alternative credit scoring, on the other hand, collects data from many more data points. An applicant’s digital footprint can be a powerful form of alternative data for credit scoring models. Social media footprints, telecommunications data, digital payment data, online behavioral data, calling patterns, e-commerce activity, geolocation data, SMS and browsing history, education and contacts - all of these can serve as indicators of a person's income or their ability to repay a loan. These technologies have further enabled lending institutions to harness big data and machine learning to draw insights into consumer behavior online. Yet the lack of transparency in the operation of such credit scoring systems raises concerns about their propensity to produce arbitrary and discriminatory results.

II. Discriminatory Practices and Profiling

In a study conducted on real-world mortgage data, economists Laura Blattner at Stanford University and Scott Nelson at the University of Chicago showed that differences in mortgage approval rates between minority and majority groups are not just down to bias, but are also linked to the fact that minority and low-income groups have less data in their credit histories. Thinner files can compound existing biases once this information is fed into the system: when such data is used to calculate a credit score, and that score is used to predict loan default, the prediction is less precise. It is this lack of precision that leads to inequality, not bias alone.

A study on the effect of machine learning on US credit markets observed that “gains from new technology are skewed in favour of racial groups that already enjoy an advantage, while disadvantaged groups are less likely to benefit in this data set.”
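The thin-file effect described above can be illustrated with a small simulation using hypothetical numbers: two groups of borrowers share the same true default rate, but one group's credit files contain far fewer observations per person, so any risk estimate built from those files is noisier even when the model itself is unbiased.

```python
# Hypothetical simulation of the thin-file effect: identical underlying
# risk, but fewer observations per borrower yields noisier estimates.
import random
import statistics

random.seed(42)
TRUE_DEFAULT_P = 0.10          # same underlying risk in both groups
THICK_FILE, THIN_FILE = 50, 5  # observations per borrower (assumed)

def estimated_risk(n_obs: int) -> float:
    """Estimate a borrower's default rate from n_obs repayment records."""
    defaults = sum(random.random() < TRUE_DEFAULT_P for _ in range(n_obs))
    return defaults / n_obs

thick = [estimated_risk(THICK_FILE) for _ in range(2000)]
thin = [estimated_risk(THIN_FILE) for _ in range(2000)]

print(f"thick-file estimate spread: {statistics.stdev(thick):.3f}")
print(f"thin-file estimate spread:  {statistics.stdev(thin):.3f}")
# The thin-file spread is several times larger: borrowers with identical
# behaviour can receive very different risk estimates purely because
# their files contain less data.
```

Both groups average the same estimated risk, but the spread of estimates for thin-file borrowers is much wider, which is the imprecision, rather than model bias alone, that the Blattner-Nelson study links to unequal outcomes.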

How Does Data Privacy Legislation Help?

Data privacy legislation reduces the number of observations available to statisticians to build their credit scoring models on, thereby limiting the statistical utility of credit scores.

The European Union's Artificial Intelligence Act has also classified credit scoring as a high-risk activity, requiring special protections (see: EU Parliament gives final nod to landmark AI law).

On December 7, 2023, the Court of Justice of the European Union (CJEU) considered complaints against a German scoring firm, Schufa - a private company that provides businesses with information on the creditworthiness of consumers by creating a credit score.

In the case at hand, SCHUFA provided a credit institution with a score for the applicant, which served as the basis for refusing the credit for which she had applied. The applicant then requested the erasure of the entry concerning her and access to the corresponding data. However, SCHUFA merely informed her of the relevant score and the broad outline of the principles underlying the calculation method, without informing her of the specific data included in that calculation or the relevance accorded to them, asserting that the calculation method is a trade secret.

The issue put to the CJEU was whether credit scoring falls within automated decision-making for the purposes of Article 22(1) of the GDPR where a third party draws strongly on that credit score to reach a decision. The CJEU held that SCHUFA applied only a mathematical and statistical procedure, with no individual evaluation or assessment by a human being in establishing the score; the scoring therefore constitutes an automated decision within the terms of Article 22 GDPR.

Secondly, the Court noted that such a refusal of credit is likely to have a legal impact on the financial situation of the data subject, altering their status.

Further, the method used by SCHUFA produces a score based on certain criteria, from which conclusions can be drawn regarding the creditworthiness of the data subject. The CJEU therefore held that creating a credit score amounts to profiling and falls within the prohibition on solely automated decision-making under Article 22 of the GDPR.

SCHUFA had refused to disclose to the applicant certain information concerning her and the calculation method, on the ground that it was a trade secret. The Court, however, held that controllers have an obligation to provide ‘meaningful information about the logic involved’, and that this information must include sufficiently detailed explanations of the method used to calculate the score and the reasons for a given result. In general, the controller should provide the data subject with general information, notably on the factors taken into account in the decision-making process and their respective weight at an aggregate level, which the data subject can also use to challenge any such decision.

Notably, the following rights of the data subject over his personal data were reiterated in this judgement:

1. Rights of the data subject under General Data Protection Regulation or GDPR

a. Right to access his personal data (Article 15)

b. Right to object to processing of his personal data (Article 21)

c. Definition of profiling (Article 4(4): automated processing of personal data to evaluate or predict personal aspects of a natural person)

d. Right to rectification and erasure of personal data (Articles 16 and 17)

e. Protection against a decision based solely on automated processing, especially when it produces legal or similarly significant effects (Article 22(1) read with Recital 71)

2. Data controller must ensure (for data processing relating to that of the data subject)

a. A lawful basis for processing: the data subject’s explicit consent for specific purposes, the performance of a contract, or the legitimate interests of the controller.

b. Purpose compatibility: any subsequent processing must be compatible with the purpose for which the data was initially collected.

The Indian Stance

SPDI Rules

The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules) impose enhanced obligations on the collection and processing of “sensitive personal data or information” (SPDI), which includes financial information such as bank account, credit card, debit card, or other payment instrument details. Under the SPDI Rules, consent must be obtained in writing regarding the purpose of use before SPDI is collected. The data subject must be informed of: (i) the fact that SPDI is being collected; (ii) the purpose of collection; (iii) the intended recipients of the SPDI; and (iv) the address of the agency collecting or retaining the SPDI.

The Digital Personal Data Protection Bill, 2022 set out certain situations in which seeking an individual’s consent for processing of their personal data is “impracticable or inadvisable due to pressing concerns.” In such situations, the individual’s consent is assumed or deemed, and they need not be notified of the processing. One such situation is processing in the ‘public interest’, and Clause 8(8)(d) of the Bill listed ‘credit-scoring’ as such a purpose. This provision does not appear in the Digital Personal Data Protection Act as enacted.

Conclusion

With the introduction of the Digital Personal Data Protection Act, 2023, the rights of the data subject have been underlined, much as under European privacy law. Further, the Central Government may, by notification, restrict the transfer of personal data by a Data Fiduciary for processing to notified countries or territories outside India.

To read further about data localization and the handling of data by banks, please refer to our article: RBI action on one of the renowned Indian Banks- Is your data in other banks safe?

The World Bank has also issued guidelines on the processing of data for credit scoring, balancing the rights of consumers over their data against the needs of lenders.

About the author: Vikrant Rana is the Managing Partner of S.S. Rana & Co. Anuradha Gandhi is a Managing Associate at the Firm.

Research was conducted by Ahana Bag, Former Junior Associate at the Firm.


Bar and Bench - Indian Legal news
www.barandbench.com