
The Black Box Society: The Secret Algorithms That Control Money and Information is a 2016 academic book authored by law professor Frank Pasquale that interrogates the use of opaque algorithms—referred to as black boxes—that increasingly control decision-making in the realms of search, finance, and reputation.

Pasquale uses the term "black box" as a metaphor with dual meanings: a black box can describe both a recording device (such as a data-monitoring system) and a system whose inner workings are secret or unknown.

The 319-page academic book, published by Harvard University Press, contains six chapters. Chapter one introduces the challenge of investigating technologies whose functions are overwhelmingly complex and closely guarded. Chapter two examines citizens’ digital reputation and the automated decision-making that can perpetuate systemic disadvantage for some while advantaging others. Chapter three exposes the hidden mechanisms of profit-driven search engines through a series of disputes over bias and abuse of power in Silicon Valley. Chapter four investigates the deeply problematic, unethical use of automated decision-making in the finance industry. Chapter five makes the case for opening black boxes, while chapter six stresses the emergent threat that black boxes pose to democratic societies and capitalist economies, as well as the need for an informed, autonomous citizenry.

The Black Box Society has been reviewed in several academic journals by experts in the field, who largely praise the book for both its originality and timeliness as well as its vital contributions to the areas of law, technology, and social science. However, the book has received some critical feedback on its conception of transparency as a solution to black boxes—raising questions surrounding privacy protection and ethics—as well as its operationalization of the term "black box."

Definition of "Black Box"
In academic discourse, the usage of the term “black box” dates back to at least 1963 with Mario Bunge’s work on a black box theory in mathematics.

The term “black box,” as used throughout The Black Box Society by author and law professor Frank Pasquale, is a dual metaphor for a recording device such as a data-monitoring system and for a system whose inner workings are secret or unknown. According to Pasquale, in a black box society, individuals encounter these dual meanings every day: as people are pervasively tracked by private firms and government, they come to lack a meaningful understanding of where their data travels, how their data is used, and the potential consequences of data usage and disclosure by these organizations.

Central Thesis
Published by Harvard University Press in 2016, The Black Box Society has six chapters, totalling 319 pages. In his review of the book for Business Ethics Quarterly, law professor Alan Rubel identifies Pasquale’s central thesis: the algorithms that control and monitor individual reputation, information seeking, and data retrieval in the search, reputation, and finance sectors embody some combination of too much power, too little subtlety, and too much obscurity.

Chapter One: The Need to Know
Chapter one of The Black Box Society introduces the epistemological problem that emerges from the increasing adoption of enigmatic technologies: “We cannot understand, or even investigate, a subject about which nothing is known.” The author contends that American law has grown silent as questions of personal privacy increasingly arise, instead prioritizing corporate secrecy. This incongruity of secrecy and complexity in the information economy is the focus of The Black Box Society. Pasquale declares that a black box society “closely resembles a one-way mirror.” In other words, processes of Big Data collection, usage, and disclosure by private and public organizations loom invisibly. Pasquale argues that this is especially true in the areas of reputation, search, and finance—increasingly expressed through algorithms—thereby compromising individual freedoms and market fairness.

The author's discussion of the power of secrecy is informed by the work of legal scholar Jack M. Balkin, who argues that one of the most important forms of power entails the privilege to interrogate others without being interrogated oneself. Pasquale repurposes this insight to argue that Internet companies resistant to regulation collect increasingly massive quantities of sensitive personal data on their users, with no way for those same users to exercise meaningful consent over their "digital dossiers." The concept of digital dossiers is associated with the work of privacy expert Daniel J. Solove, who analyzes the pervasive collection and aggregation of personal information by private sector entities.

According to Pasquale, while companies assemble digital dossiers about the intimate lives of private individuals, those same Internet and finance businesses increasingly make decisions about how individuals can or should live their lives by way of credit scoring, predictive analytics, and recommendation engines.

Chapter Two: Digital Reputation in an Era of Runaway Data
In Digital Reputation in an Era of Runaway Data, chapter two of The Black Box Society, Pasquale declares that private data is a significant source of profit for other people, but often at the expense of the person to whom the data belongs. This assertion is supported by scholars John Gilliom and Torin Monahan, who argue that in the wrong hands, the collection and analysis of your data will, in fact, cost you.

According to the author, data brokers use data mining to analyze private and public records in order to draw inferences about people, largely unrestrained by government laws; it is only leaks, investigations, and complaints that crack open the black boxes of reputation analysis. The author labels the exchange of sensitive personal data between commercial and government organizations as a form of “dataveillance,” a term first used by Roger A. Clarke in 1988 to describe the replacement of traditional, physical surveillance with the monitoring of people through their data.

Pasquale explains the pervasive process of digital scoring, wherein “many of our daily activities are processed as ‘signals’ for rewards or penalties, benefits or burdens” by algorithms. Pasquale’s main concern here is that black-boxed techniques that originated in credit reporting are now spreading to other relatively unregulated sectors, including medicine, law enforcement, marketing, and employment. The author deems this “runaway data” not only “creepy” but also substantially costly.

Pasquale highlights the concern related to the emergent racial bias inherent in automated decision-making alongside other forms of identity-based prejudice and discrimination. These decisions are “subtle manipulations that are nearly impossible to discern for ordinary citizens not privy to the internal computer code.” Reputation-ranking algorithmic systems are programmed by human beings who cannot easily avoid embedding their implicit biases and values into the software that they code.

The author cites a 2012 research study on algorithmic discrimination by computer scientist Latanya Sweeney, former director of the Data Privacy Lab at Harvard University and now senior technologist at the Federal Trade Commission, who found that online advertisements suggestive of arrest records appeared more frequently in searches for names commonly associated with African Americans.

Chapter two concludes with a call to action in regard to justice system reform: “It is time to reclaim our right to the presumption of innocence, and to the security of the light.”

Chapter Three: The Hidden Logics of Search
Chapter three of The Black Box Society, The Hidden Logics of Search, exposes the new masters of the Internet: search engines that influence, reshape, and answer billions of queries every day. Pasquale asserts that search engines have a dark side: opacity. Platforms make decisions daily that impact personal, commercial, and democratic matters by way of automated personalization. The author thus asks how far the public can trust the people who make these black boxed systems that possess the social, cultural, and political power to include, exclude, and rank people in the digital economy. Furthermore, Pasquale emphasizes that market forces are inadequate to protect optimal levels of privacy, writing, “In an era where Big Data is the key to maximizing profit, every business has an incentive to be nosy.”

The author highlights how these algorithmic systems, though socially constructed as neutral and objective, regularly make value-laden decisions imbued with bias that come to shape people’s perception of the world. This chapter examines four areas wherein the behaviour of major search engines and companies raises concerns over trust—transparency, competition, compensation, and control—using case studies of disputes over bias and abuse of power in Silicon Valley.

One case study interrogated by Pasquale is the deflection of concerns by microblogging platform X, then named Twitter, over its automated decision-making regarding trending hashtags. On the platform, an algorithm nominates certain hashtags as “trending”—that is, interesting and relevant—and tweets containing trending hashtags are recommended and broadcast to all users. According to Pasquale, in late September of 2011, the Occupy Wall Street protest movement began gaining media traction, and despite the hashtags #OWS and #occupy being used in numerous tweets, these tweets did not appear on Twitter’s official Trending Topics list. Supporters and activists accused Twitter of algorithmic suppression, condemning the censoring of a politically controversial movement. Twitter subsequently released a statement explaining that Trending Topics depends on tweet velocity, not popularity, and the contention settled, though it sparked discussions of media literacy, notably from researcher Tarleton Gillespie. Still, Pasquale asks a series of important questions surrounding search: “[A]t what point does a platform have to start taking responsibility for what its algorithms do, and how their results are used? […] Shouldn’t we know when they’re working for us, against us, or for unseen interests with undisclosed motives? […] Who are the men behind the curtain, and how are their black boxes sorting and reporting our world?”

Key inquiries in this chapter coincide with researcher Kate Crawford’s finding that private technology businesses such as Facebook, Google, and TikTok operate with scant oversight of their secret artificial intelligence (AI) processes, while failing to provide meaningful opportunities for public contestation when they perpetuate forms of disadvantage.

Chapter Four: Finance’s Algorithms: The Emperor’s New Codes
In chapter four of The Black Box Society, Finance’s Algorithms: The Emperor’s New Codes, Pasquale investigates the increasing computerization and consequent lack of transparency in the finance sector over the past several decades. The author contends that the worldwide 2007–2008 financial crisis exposed the hidden practices of large banks: bad data, bad apparatuses, devious corporate structures, and shoddy insurance.

According to Pasquale, secret algorithms are “obscured by a triple layer of technical complexity, secrecy, and ‘economic espionage’ laws that can land would-be whistle-blowers in prison,” thereby restricting citizens from truly knowing how major financial firms collect, use, and disclose their personal data. Citizens also cannot know when complex, trade-secret mathematical calculations, including rating formulas, are drawing from biased, incomplete, or inaccurate data. This obscurity causes instability and conflict, according to the author.

This chapter explores two sources of obfuscation and opacity resulting from the black boxing of the finance industry: illegality and algorithmic complexity. Problematically, black box finance opens endless possibilities “for self-serving or reckless behavior” among insiders in the finance sector. In supporting this argument, the author quotes sociologist William Davies, who asserts that the financial actors “tasked with representing, measuring, and judging capitalist activity are also seeking to profit from capitalist activity.” Pasquale builds upon Davies’ argument, concluding that strong moral principles of honesty and decency are not safeguarded in capitalist activity. Pasquale emphasizes that dissent from whistle-blowers in financial corporations is often silenced, as demonstrated by a case study of the Countrywide Financial political loan scandal, in which business executives and U.S. politicians allegedly received favourable mortgage rates.

Pasquale ultimately asserts that contemporary finance depends upon mechanisms that obscure important information from borrowers, lenders, clients, regulators, and the general public.

Chapter Five: Watching (and Improving) the Watchers
In the fifth chapter of The Black Box Society, Watching (and Improving) the Watchers, Pasquale argues that societies need to watch, and improve, the major business firms that control the flows of information and media as well as individuals' financial destinies.

This chapter calls for transparency, with the author declaring that solutions must build upon successes in opening black boxes in distinct corporate sectors, and subsequently apply these successful methods to the entire black box society—thereby illuminating Big Data-driven decision-making from credit scoring to digital advertising and employment. Pasquale insists that “[d]ata is the fuel of the information economy, and the more data a company already has, the better it can monetize it,” warning of the cost that this monetization imposes on individuals. Hence, the first step toward de-obscuring the reputation, search, and finance sectors is to learn more about their corporate practices, which need to be reformed to level commercial playing fields and ensure integrity in business transactions.

The author contends that opening black boxes raises key issues surrounding the who, when, what, and why of disclosure—viewing depth, scope, and timing as a spectrum between preserving secrecy and providing transparency.

Pasquale perceives online notice privacy policies as failed disclosure regimes, aligning with research by scholars Jonathan A. Obar and Anne Oeldorf-Hirsch. Obar and Oeldorf-Hirsch find that “the biggest lie on the Internet,” known anecdotally as “I agree to the terms and conditions,” is not just a reality but a pressing concern.

According to Pasquale, notice policy merely perpetuates a fiction of privacy. Pasquale’s assertion is substantiated by Jonathan A. Obar’s concept of “the fallacy of data privacy self-management,” the misconception that self-governance by digital citizens through notice policy is possible in a Big Data digital universe. The author therefore proclaims that citizens, in digital and physical spaces, deserve the right to inspect and correct their data in reputation, search, and finance. Societies must also ensure that independent citizens, who do not themselves operate surveillance apparatuses, have some role to play in processing the overwhelming quantity of information that even a data oversight program would produce.

Chapter Six: Toward an Intelligible Society
Pasquale begins the sixth and final chapter of The Black Box Society, Toward an Intelligible Society, by referencing Cory Doctorow’s story Scroogled. In this story, Doctorow imagines Google intertwined with the United States Department of Homeland Security, creating a politically powerful institution of information control. Speaking to America’s current reality, Pasquale writes, “Doctorow’s story confronts us with a stark question: Do we permit Google to assert trade secrecy to the point that we can’t even tell when a scenario like that has come to pass?”

The author argues that black boxes embody the paradox of the “information society”: data has become a vast, valuable resource, yet that resource is available only to the watchers. Pasquale claims that a black box society creates a rule of reputational scores and bets, separate and unequal economies, invisible powers, and wasteful and unfair competitions. This chapter consequently unveils “the lure of the black box,” “surfac[ing] its seductions—and their limits.” Summarizing Pasquale’s proposals for a way forward in our black box society in an academic book review for Church, Communication and Culture, business professor Kevin de Souza writes that “[i]n matters of reputation we have to focus not so much on controlling the collection of data but its use.”

Pasquale acknowledges the efficiency afforded by black boxes but proposes reforms that would make ordinary transactions slower by shifting responsibility from algorithms back to human beings. The author argues that “only humans can perform the critical function of making sure that, as our social relations become ever more automated, domination and discrimination aren’t built invisibly into their code.” Pasquale states that massive Internet and finance companies “present a formidable threat” to necessary values of fairness, integrity, and privacy, with this threat increasingly intertwined with government powers and far too frequently obscured by the secrecy of black boxes.

The author emphasizes how, in capitalist democracies, algorithms fulfil the longstanding human desire to predict the future, “tempered with a modern twist of statistical sobriety.” Pasquale concludes that despite their wonder, black boxes have created a threat to democratic society and the economy, insisting that only a knowledgeable citizenry can combat this danger.

Positive Reviews
In one academic book review in Church, Communication and Culture, business professor Kevin de Souza writes that Frank Pasquale’s book is a “thought-provoking” result of ten years of research—a contribution to the theory and practice of social justice covering aspects of social science, law, and technology. Similarly, in one academic book review in Business Ethics Quarterly, law professor Alan Rubel asserts that Pasquale’s greatest accomplishment in the book is “connecting several forms of obscurity (actual secrecy, legal secrecy, and deliberate obfuscation) to broader social phenomena.”

Moreover, in the European Data Protection Law Review, business professor Alessandro Spina’s academic book review praises The Black Box Society as “an important, intelligent and timely contribution” in legal doctrine. Spina argues that the book serves as a crucial reference point for informing public debates over data protection and privacy, especially where new legislation has emerged such as in Europe with the General Data Protection Regulation (GDPR).

Critical Reviews
Rubel states that the extensive scope of The Black Box Society is impressive; however, Pasquale’s emphasis on the black box metaphor raises two critical questions. The first question asks exactly what work the black box metaphor is doing, and whether black boxes in the reputation, search, and finance sectors share commonalities other than obscurity. The second question addresses the moral salience of black boxes and the varying importance of transparency from this perspective. Nevertheless, Rubel writes that The Black Box Society, as a complex and compelling book, should encourage scholars of political philosophy and business ethics to seek answers to these looming questions.

In an academic book review for the European Journal of Risk Regulation, law professor Sofia Ranchordás notes that although The Black Box Society “sheds new light on the problem of asymmetric data collection and makes a very important contribution to the literature,” readers with an interest in regulation “might miss some specific normative suggestions.” Ranchordás adds that the asymmetry of knowledge and transparency defined in chapter one “appears to become unfortunately diluted in the general narrative at the end of the book.” Likewise, law professor Anupam Chander, in another academic book review in the Michigan Law Review, argues that despite Pasquale’s careful and significant contribution to a line of scholarship interrogating the role of automated algorithms in our lives, The Black Box Society “lends itself to a misdiagnosis of the discrimination problem likely to lie in algorithmic decisionmaking.” Chander consequently emphasizes how even a transparent algorithm can still lead to discriminatory outcomes. Chander nonetheless appreciates Pasquale’s critical recognition that algorithmic systems are neither neutral nor objective technologies.