Criticism of credit scoring systems in the United States

Credit scoring systems in the United States have garnered considerable criticism from various media outlets, consumer law organizations, government officials, debtors unions, and academics. Racial bias, discrimination against prospective employees, discrimination against medical and student debt holders, poor risk predictability, manipulation of credit scoring algorithms, inaccurate reports, and overall immorality are some of the concerns raised regarding the system. Danielle Citron and Frank Pasquale list three major flaws in the current credit-scoring system:
 * 1) Disparate impacts: The algorithms systematize biases that have been measured externally and are known to impact disadvantaged groups such as racial minorities and women. Because the algorithms are proprietary, they cannot be tested for built-in human bias.
 * 2) Arbitrariness: Audits show substantial variation in scoring, and responsible financial behavior can be penalized.
 * 3) Opacity: Credit score technology is not transparent, so consumers are unable to know why their credit scores are affected.

The scoring system has also been critiqued as a form of classification that shapes an individual's life-chances, a form of economic inequality. Since the 1980s, neoliberal economic policy has created an inverse correlation between the expansion of credit and a decline in social welfare: deregulation incentivizes financing for the consumption of goods and services that the welfare state would otherwise provide. Credit scoring systems are seen as a scheme to classify individuals' creditworthiness, necessitated by the loss of these collective social services. The credit scoring system in the United States has been compared to, and was the inspiration for, the Social Credit System in China.

The use of credit information in connection with applying for various types of insurance or in landlord background checks (for rental applications) has drawn similar amounts of scrutiny and criticism, because obtaining and maintaining employment, housing, transport, and insurance are among the basic functions of meaningful participation in modern society, and in some cases (such as auto insurance) are mandated by law.

Discriminatory effects
Credit scores are widely used as the basis for decisions to allow or deny individuals the opportunity to take out loans, buy houses and cars, and open credit cards and other kinds of accounts. This practice has been criticized as having discriminatory effects. Credit companies purport to measure creditworthiness by looking at information such as the number of accounts held, the age of those credit accounts, the consumer's payment history on borrowed money, and the punctuality and consistency of payments.

As credit scores have become necessary to maintain credit and purchasing power, this system has been criticized as a wall between favored and disfavored classes of people. The expansion of accessible credit can come with a downside of exclusion, as people with poor credit (those whom credit scoring systems consider high risk) become dependent on short-term alternatives such as licensed money lenders (the home credit industry), pawn brokers, payday lenders, and even loan sharks. Credit scores can function as a form of social hierarchy that creates opportunities to exploit poor Americans. This can also prevent people from ever escaping poverty or a poor financial past.

Credit scoring systems also act as a way to treat individuals as objects defined by a particular set of quantifiable attributes. In addition, they have a degrading potential that celebrates calculability over human needs. Discriminatory responses to poor credit create a self-fulfilling prophecy: they raise the cost of future financing, which increases the likelihood of unemployment or insolvency. Because credit scores aim to classify people, other markets have expanded their applicability for use as a screening or assessment tool. Credit is no longer used just for financial products such as mortgage loans, but is increasingly applied cross-institutionally to other services such as:
 * car insurance
 * health insurance
 * starting utilities (electricity, natural gas, water, etc.)
 * employment
 * rental housing
 * small purchase financing (e.g. cell phones, appliances, etc.)

Alternative credit scoring systems can use data such as rental payments, utility payments, subprime credit, and cell phone bills. Other sources are social media activities, internet browsing history, employment history, student history, past loan application dates and locations, or the method one uses when purchasing gasoline. Scores have also been used for bespoke purposes such as dating. Prior to the formation of the Fair, Isaac and Company (FICO) or the Fair Credit Reporting Act of 1970, early credit scoring systems such as the Retail Credit Company (now Equifax) in Atlanta, Georgia gathered information on individuals' sexual lives, disabilities, political ideologies, and social behaviors. Today, some scoring systems, such as those developed by Versium Analytics, are moving far beyond scores for financial products to measure the probability that a consumer will commit fraud, cancel a subscription, be at risk of identity theft, buy environmentally friendly goods, or donate to charity, among other behaviors.

Racism
Credit scoring systems are well known to contain racial bias and have been shown to increase racial disparities: studies show that African American and Latino populations have substantially lower scores on average than the white American population. Racial discrimination also affects the credit scores and economic security of communities of color, which ultimately "entrenches and reinforces inequality by dictating a consumer's access to future opportunities".

Numerous studies have found racial disparities in credit scoring:
 * A 1996 study found that African Americans were three times as likely as whites to have FICO scores below 620, and that Hispanics were twice as likely.
 * A 1997 study found that consumers in Black, Indigenous, and people of color (BIPOC) neighborhoods had lower credit scores.
 * A 2004 study found that zip codes with high BIPOC populations had significantly worse scores than other zip codes.
 * A 2004 study found that African American and Hispanic consumers constituted over 60% of the consumers with the worst credit scores.
 * A 2004 study found that the median credit score for whites in 2001 was 738, while the median was 676 for African Americans and 670 for Hispanics.
 * A 2004 research study found that fewer than 40% of consumers living in high-BIPOC neighborhoods had credit scores over 701.
 * A 2006 study of US counties with high BIPOC populations determined that those counties had lower average credit scores than predominantly white counties.
 * A 2007 study by the Federal Trade Commission found that African Americans and Hispanics were strongly overrepresented in the lowest-scoring categories in auto insurers' use of credit scores.
 * A 2007 report matching 300,000 credit files with Social Security records found significant racial disparities, with African American scores being half those of white, non-Hispanics.
 * A 2010 study found that in majority African American zip codes in Illinois, 54.2% of individuals had credit scores below 620; in majority Latino zip codes, 31.4% of individuals had scores below 620 and only 47.3% had scores above 700.
 * A 2012 study of the credit scores of about 200,000 consumers found that the median FICO score in majority-minority zip codes was in the 34th percentile, compared with the 52nd percentile in low-minority zip codes.

The outcomes of this bias for Black Americans are higher interest rates on home and auto loans, longer loan terms, more debt collection default lawsuits, and increased reliance on predatory lenders. FICO has defended the system, stating that income, property, education, and employment are not evenly distributed across society, and that it is irrational to expect an objective measure not to exhibit these discrepancies. Tamara Nopper, a sociologist at The Center for Critical Race & Digital Studies, has argued that truly addressing the system's racism requires not merely regulating it, as policy debates tend to focus on, but eliminating it in favor of publicly owned banks that serve the community instead of shareholders.

A related concept of insurance scoring has also been shown to discriminate along racial lines, disproportionately harming black and Latino populations.

Employment
Employers cannot access credit scores on the credit reports sold for employment screening, but they can acquire debt and payment history. Credit reports are legal to use for employment screening in all states, although some have passed legislation limiting the practice to certain positions. John Ulzheimer, president of The Ulzheimer Group and the founder of CreditExpertWitness.com, stated in a CNBC report that, "[credit scores] indicate if you're in financial distress. These are attributes that are important to employers. For example, would you want to hire someone in your accounting department who can't manage their own obligations?". This practice has been criticized as discriminatory, since such decisions can prevent applicants from gaining employment. Eric Rosenberg, director of state government relations for TransUnion, has also stated that there is no research showing any statistical correlation between what is in somebody's credit report and their job performance or their likelihood to commit fraud. The National Consumer Law Center (NCLC) has stated that credit scoring perpetuates economic inequality by controlling access to future opportunities as well as important necessities such as employment.

In 2009, TransUnion representatives testified before the Connecticut legislature about their practice of marketing credit score reports to employers for use in the hiring process. Legislators in at least twelve states introduced bills, and three states have passed laws, to limit the use of credit checks during the hiring process.

Medical debt holders
Medical debt is often a barrier to obtaining credit, housing, and employment. Because medical situations are often unexpected, they can cause an individual or family to experience financial distress, especially when unanticipated or "surprise" bills are unable to be paid.

The debt is reported to credit bureaus due to payment delays, insurance disputes, confusion, or the dysfunctional nature of the US healthcare finance system.

Credit scores treat medical debts the same as any other debts despite their involuntary nature (unlike opening a credit card, for example). Some states have implemented laws to protect consumers against medical debts affecting their scores, including:
 * Prohibiting the reporting of medical debt for a certain time period after billing.
 * Protections within payment plans for consumers.
 * Restriction of reporting of medical debt for uninsured or underinsured patients or for patients that are negotiating disputes with their health insurance company.
 * Requirements of notice when debt is reported.
 * Protections focused on vulnerable patients (such as children).

The NCLC recommends eight key requirements for policy reform: 1) expansion of public financial assistance; 2) financial assistance minimum standards; 3) large health care facilities must screen for eligibility for insurance; 4) language assistance for understanding the financial process; 5) payments start after 90 days; 6) clarification of contractual violations for a hospital's forgiveness of a patient's copay, coinsurance, etc.; 7) protecting family members from a loved one's debts; and 8) enforcement of the statute through a private right of action.

Student debt holders
The non-profit organization Student Debt Crisis, along with Summer, a social impact startup that helps student debt holders, published a national survey in 2018 that found 59% of respondents were prevented from making large purchases, 56% from buying a home, and 42% from buying a car. 58% reported that their credit scores had declined due to the debts, 28% were unable to start a business, 10% reported failing a credit check for a job prospect, and 13% failed a credit check for an apartment application. Rental application rejections and the inability to find sufficient housing are well-known consequences of credit scores, leaving college graduates unable to participate in society. Even if loan payments are never late, debt-to-income ratios can be too high for landlords to approve an application. Buying a home can be even more difficult, if not impossible, as student loans are often as big as or larger than an average mortgage.
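The debt-to-income ratio described above is a simple quotient of monthly debt obligations to gross monthly income. A minimal sketch of the arithmetic (the dollar figures are illustrative assumptions, not drawn from the survey):

```python
def debt_to_income(monthly_debt_payments, gross_monthly_income):
    """Debt-to-income ratio: total monthly debt payments / gross monthly income."""
    return sum(monthly_debt_payments) / gross_monthly_income

# A $400/month student loan plus a $300/month car payment
# against $3,500 gross monthly income:
dti = debt_to_income([400, 300], 3500)  # 0.20, i.e. 20% of income goes to debt
```

Because student loan payments enter the numerator regardless of whether they have ever been late, a large balance alone can push the ratio past a landlord's or lender's cutoff.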

Inaccuracies and algorithmic subjectivity
Consumers in the US have very little control over how they are scored and even less ability to dispute unfair, biased, or inaccurate credit report assessments. Scoring is automated and often lacks oversight, which can have serious consequences. Credit reports by the three largest companies are commonly found to be incorrect, with thousands of cases going to court each year. Federal law requires agencies to investigate disputed information; however, "the agencies have operated for decades with systems that make it nearly impossible to conduct a comprehensive investigation, attorneys and consumer advocates say. The law is so nuanced, they say, that credit bureaus can essentially wash their hands of meaningful review." In 2020, 280,000 complaints were filed with the CFPB regarding credit reporting error issues. One of the alleged reasons for the excess of errors, according to Matt Litt, consumer campaign director with U.S. Public Interest Research Group, is that the credit reporting agencies are not incentivized to fix them because consumers are not the customers, but are instead the product; lenders, landlords, and other businesses seeking credit information are the customers. CNBC reported that there is an "astounding number of errors in the credit reports that are the result of misaligned economic and legal incentives", and a Morning Consult poll found that 74% of respondents want new laws or regulations to deal with credit bureaus. CNBC proposed three solutions to the issue of inaccurate reports:
 * Liability for incorrect data must be changed as currently, there is no one held accountable and no penalties for not investigating disputes.
 * Credit reports should be free and proactively available for consumers to monitor for inaccuracies.
 * The information usable in reports should be expanded using big data.

A large percentage of credit scores is estimated to contain inaccuracies. A portion of these stem from misattribution errors, in which data is intermixed because of similar names or information. Alternative data, which uses personal data outside the scope of traditional credit scoring, is also known to contain inaccuracies. Further, none of this data collection, the methods, or the parameters used to determine creditworthiness is public information. Unfair judgments of creditworthiness create an unfair and socially unjust system that restricts participation in society. These algorithmic inaccuracies driven by big data can have serious implications for human identity and status in society, a concept known as the "scored society".

Incentivizing debt
Because a significant portion of the FICO score is determined by the ratio of credit used to credit available on credit card accounts, one way to increase the score is to increase the credit limits on one's credit card accounts. This has been criticized because it incentivizes the accumulation of debt, disincentivizes people from self-financing purchases through saving, and normalizes the credit-debt system and consumerism.
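The mechanism is straightforward arithmetic: utilization is total revolving balances divided by total credit limits, so raising a limit lowers the ratio even when spending is unchanged. A minimal sketch (the dollar amounts are illustrative, not FICO parameters):

```python
def utilization(balances, limits):
    """Credit utilization: total revolving balances / total credit limits."""
    return sum(balances) / sum(limits)

# The same $2,000 balance, but a doubled credit limit halves utilization.
before = utilization([2000], [5000])    # 0.40
after = utilization([2000], [10000])    # 0.20
```

This is the source of the criticism: a consumer improves the measured ratio by acquiring more available credit, not by carrying less debt.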

Credit invisibility
The concept of "credit invisibility" (a term used by the Consumer Financial Protection Bureau, the CFPB) is a factor here, as many individuals do not use or need credit (often the elderly), avoid using credit, or avoid participating in the credit system. Being credit invisible puts consumers at a disadvantage. Hispanic Americans are typically more likely to pay in cash and pool resources with extended family. None of this is visible to credit reporting agencies, and it therefore leaves Hispanics without the ability to make major purchases. Another group of Americans left in the "digital poorhouse", a phrase coined by social scientist Virginia Eubanks, are the young, in particular millennials. This is due to access versus ownership: unable to purchase because of low credit, they seek alternatives to buying cars or houses. They also use credit cards less than cash and rely on mobile payment apps like Venmo. None of these transactions are captured by credit reporting agencies, leaving these consumers credit invisible. Further, millennials report believing that being debt free is a sign of financial success. Because building a credit score requires taking on debt, the credit score effectively acts as a debt score.

Alternative scoring systems
Credit invisibility combined with the rise of big data and artificial intelligence has given rise to a new market that challenges the traditional FICO model of credit scoring. The use of alternative data has been pursued as a means to access more consumers, a form of market competition in an industry seeking greater profits. Controversy exists regarding the invasive nature of the technology. Some of the issues are summarized here:
 * Violation of due process may result as artificial intelligence scores may miscategorize consumers. Due process laws along with regulations based on this tradition must be used as a protective measure.
 * Credit scoring systems using AI lack transparency in decision making because the technology is proprietary.
 * Predictive algorithms run a high risk of being inaccurate and unfair, affecting people's lives in discriminatory or arbitrary ways.
 * Alternative data collection can be invasive as it collects data beyond the scope of financial transactions (such as paying utility bills) to generate "digital characters" based on social media accounts or internet browsing history.
 * Violation of consumer protection and fair lending laws (as well as human and civil rights violations) may result as privacy and security may be jeopardized.
 * Big data is attempting to address the issue of traditional credit scoring's inability to accurately predict risk, "credit invisible" populations, and "thin file" populations (people that have very limited or outdated credit histories). The aim is to build credit histories based on alternate information; however, it may result in lower scores instead of no scores (especially for people who are low-income) due to financial prioritization such as getting behind on utilities for high-cost months in favor of critical items.
 * Policymakers and regulators must focus on data accuracy, verifiable predictiveness, and the potential for discrimination. Research strongly indicates none of these are being met by alternative credit scoring companies.
 * Redlining may return due to hidden biases in the algorithms.
 * The more data points used for a credit assessment, the greater the difficulty in transparency.

Poor predictor of risk
Credit scores are enhanced by having multiple credit cards, using credit cards, and having installment loans. However, financially secure individuals who do not use multiple credit cards, or who self-finance expenses, may nonetheless be assigned a lower credit score. Some have blamed lenders for inappropriately approving loans for subprime applicants, despite signs that people with poor scores were at high risk of not repaying the loan. By not considering whether the person could afford the payments if they were to increase in the future, many of these loans may have put the borrowers at risk of default. Some banks have reduced their reliance on FICO scoring. For example, Golden West Financial abandoned FICO scores for a more costly analysis of a potential borrower's assets and employment before giving a loan.

Non-transparency
Credit scoring technologies are not public information as they are proprietary trade secrets of the companies that invent them.

Regulation
Very little regulatory framework exists to ensure credit scoring algorithms are fair. It has been suggested that scored individuals need to be granted rights for the various steps in the scoring process such as the method of data collection, how the score is calculated, to whom the score is disseminated, as well as how the score is used. The Federal Trade Commission has also been targeted as the institution that should have greater regulatory oversight of the credit-scoring process as well as have access to credit-scoring systems to ensure fairness and accuracy.

Ethics, morality, and inequality
Credit scores have been criticized as a systematic way to measure morality. They track consumption choices over time and are used to reflect a person's ability to manage money. The classification system of credit scores "rewards consumers who belong to the right category" and excludes those who are on the fringes of classification; credit scores nominally intended as a gauge of reliability as a borrower become instead a gauge of morality. Companies keep records of purchasing behavior, which suggests certain behavior patterns, some of which are rewarded and others punished, usually in ways that broaden the economic and (perceived) moral gaps between richer and poorer persons. These punishments can include higher premiums, loss of privileges, poorer service, or higher interest rates, which ultimately affect credit score and purchasing power. This idea is similarly expressed in the Social Credit System in China, which acts as a tool to "[fix] moral decay" and "encourage positive economic and moral behaviours". The parallel between the two systems is that China's operates outside the market while the United States' operates within it, so the latter goes unnoticed as an issue of morality. Jonathan Cinnamon of the University of Exeter states the unfairness of credit scores and how they impede our ability to function in society: "Inability to secure a loan, mortgage, job, or health insurance due to inaccurate placement in a ‘risk’ category is clearly unfair, however the accuracy of the classification is perhaps unimportant in the context of social justice—accurate or not, personal scoring systems ‘make up people’ (Hacking 1999); they produce new social categories of difference and restrict our ability to shape our own sense of self, a clear threat to parity of participation in social life."
Jackie Wang of the University of Southern California writes in Carceral Capitalism about how credit scores ultimately make moral judgments that increase inequality: "Nowadays, credit scores have a number of often invisible effects on our lives. Credit scores (and even more dubious 'e-scores' determined by private data mining companies) are often used for hiring purposes because employers believe that credit scores are a reliable way to index a person's level of responsibility. Yet considering that medical debt is the most common cause of bankruptcy in the United States, and that there are racialized structural barriers to accessing nonpredatory forms of credit, it is outrageous to use credit scores as a way to measure someone's personal character and make moralistic judgments about them. You could have a terrible credit score simply by being an uninsured black or brown person (without accumulated wealth) who gets into a bicycle accident. In short, using credit scores to punish poor people exacerbates already-existing socioeconomic inequalities." Marion Fourcade of the University of California, Berkeley and Kieran Healy of Duke University discuss credit scoring as a tool for moral judgment, as "übercapital", and as a form of class struggle: "In the 1960s, there was a debate centered on the notion that ‘‘the poor pay more’’ (Caplovitz, 1963). With the Great Society and the expansion of welfare programs, it waned. But its main idea—that being poor costs money, that firms looking to do business with the poor know this, and systematically exploit it—is worth retooling for a neoliberal era. Debt has become more accessible, but also a lot more expensive at the bottom end of the social scale. And now it is not simply the ‘poor’ that pay more, but much more specific categories of people, measured and targeted by moralized market instruments and differentiated market institutions. Classification situations may have become the engine of modern class situations."
Frank Pasquale, a legal expert on artificial intelligence, algorithms, and machine learning, and Danielle Citron of the University of Virginia School of Law contend that the algorithms used to decide credit scores need moral justification because of the large impact they can have on individuals. "Predictive scoring may be an established feature of the Information Age, but it should not continue without check. Meaningful accountability is essential for predictive systems that sort people into 'wheat' and 'chaff,' 'employable' and 'unemployable,' 'poor candidates' and 'hire away,' and 'prime' and 'subprime' borrowers. Procedural regularity is essential given the importance of predictive algorithms to people's life opportunities-to borrow money, work, travel, obtain housing, get into college, and far more. Scores can become self-fulfilling prophecies, creating the financial distress they claim merely to indicate. The act of designating someone as a likely credit risk (or bad hire, or reckless driver) raises the cost of future financing (or work, or insurance rates), increasing the likelihood of eventual insolvency or un-employability. When scoring systems have the potential to take a life of their own, contributing to or creating the situation they claim merely to predict, it becomes a normative matter, requiring moral justification and rationale."