
The Black Box Society
The Black Box Society is a book written by Frank Pasquale that examines the impacts of black-boxed practices on society. Pasquale uses the term “black box practices” to refer to the covert collection and use of people’s personal information through algorithmic technology; in many contexts, a “black box” refers to a process whose inner workings are unknowable to an observer. Specifically, Pasquale writes about the data collection and data use practices carried out by major societal institutions such as national security agencies, police forces, banks, financial institutions and commercial businesses. In coining the term “Black Box Society”, Pasquale refers to how the general public is largely unaware of how these data collection processes work. He suggests that the unintelligibility of these practices to the people about whom data is collected is not accidental. Rather, he says, the organizations carrying out black box operations deliberately conceal their methods behind the veneer of corporate privacy and freedom of information legislation, as well as behind technological practices too complex for the average person to understand.
Pasquale sees political and economic bodies as using black box practices to suppress political dissent and maximize profit, respectively. Of concern to Pasquale is evidence suggesting that black-boxed practices, in their current state, serve to exacerbate existing systemic discrimination and economic disparity. Moreover, the lack of transparency surrounding the technological operations and underlying rationale of these algorithmic processes makes it difficult to demonstrate the harm these algorithms cause. Flaws that could give algorithms a negative impact on people are difficult to identify and fix, because the specifics of algorithmic operations remain unclear to the general public.
In terms of why policy has proven ineffective at curbing black-boxed practices, Pasquale says policy makers lack the technological expertise needed to keep up with and regulate black-boxed technological practices. However, he points out that tech firms themselves have humans intervene in and alter their algorithms, and that governments could hire technical experts from the private sector to monitor algorithms and ensure their compliance with existing laws, though they have so far been reluctant to do so. By the same token, Pasquale says that corporations and the law put too much faith in the alleged objectivity and infallible accuracy of algorithms. Especially where people’s finances are involved, he advocates for technical experts to monitor algorithms to ensure their accuracy and ethicality. He also suggests that people negatively impacted by potentially inaccurate algorithms should be given a legal outlet to appeal the algorithms’ findings and have any actions taken against them overturned.
Pasquale outlines three strategies by which institutions conceal their data-collection practices: “real” secrecy, legal secrecy, and obfuscation. He defines “real” secrecy as secrecy facilitated by making information accessible exclusively to those authorized by the data’s owner (in this case, the data-collecting institutions). In contrast, legal secrecy sees rules and regulations obliging the members of an institution to preserve the confidentiality of the data the institution collects. Finally, obfuscation refers to an institution’s efforts to keep its data secret after that data has been exposed.
Pasquale closes his introduction with a reflection on the “self-preventing prophecy”: the notion that if people anticipate, and are disturbed by, the dystopian future posed by existing black box practices, they will take the necessary steps to prevent it.

Events Leading Up to "The Black Box Society"
Pasquale's writing is informed by a number of political-economic and technological practices, and by revelations about digital technology and the internet, including:


 * American credit bureaus of the early 1960s and 70s engaging in covert surveillance to gather prejudicial information about people, upon which their credit files were then based
 * Edward Snowden's leak of National Security Agency (NSA) documents, which suggested that the NSA is

Working directly with (or hacking) our largest telecom and Internet companies to store and monitor communications; that the agency can seize and bug computers that have never been attached to the Internet; and that it can crack many types of encryption that have previously been thought secure


 * The current process of credit scoring in America, facilitated in part by computer algorithms, in which credit bureaus do not make public the criteria upon which credit scores are based


 * Medical websites and apps collecting medical and other personal information from users, which is then sold to credit bureaus, insurers and other third parties, who use it to determine how they will treat those people
 * Stores, businesses and online marketers collecting personal data about individuals to target them with ads and fliers tailored specifically to them
 * A data breach in Target’s databases, in which hackers stole such sensitive personal information about customers as mailing and email addresses and phone numbers. Pasquale saw this as serious because customers were not aware that such data had been collected about them, and so could not have preemptively ensured their data was protected against such attacks
 * The release of a report from the White House, whose findings suggest that Big Data practices pose the risk of facilitating violations of civil rights protections in multiple areas, including employment
 * The United States Department of Homeland Security, state law enforcement and tech companies like Google coordinating their black-boxed data collection processes in “fusion centers” looking to cut down on crime. Scholars have argued that such enterprises open the door for discriminatory hyper-surveillance and policing, and the subsequent reification of economic disadvantage in marginalized communities
 * The growing use of reputation-tracking software by employers when hiring employees
 * A study conducted by the former head of Harvard University’s Privacy Lab, which suggests that Google and the online background-check service Instant Checkmate prejudicially target people with names “associated with blacks” with ads regarding criminality and arrest
 * A "legitimate data broker" leaking the personal information of millions of Americans to identity thieves
 * American police arresting and interrogating people who had posted content suggesting they might engage in social activist activity; according to police, such content constituted reasonable grounds to suspect these people of terrorist activity
 * A Reuters report finding that the NSA, the Drug Enforcement Administration (DEA) and the Internal Revenue Service (IRS) had illegally shared “tips” about known and potential criminals
 * The establishment of the Federal Deposit Insurance Corporation (FDIC), under which investment bank owners served as investors in their banks and so had a vested interest in being prudent with their customers’ finances
 * The surge in derivative contracts in the 1990s. These contracts saw financial institutions “swapping risks” with their customers by promising to pay their customers money should their insurance rates rise or fall. The result was widespread engagement in high-risk ventures by investment banks, which subsequently concealed their methods for determining risk so as to exploit their customers’ finances in pursuit of increased profits. A number of major losses resulted, and Pasquale asserts that these banking practices had detrimental impacts on America’s economy.
 * The “going public” of the aforementioned investment banks which, though making some of their processes more transparent to the public, also shielded bank owners from the consequences of any financial risks their banks may have taken. As well, bank owners had a vested interest in concealing any losses experienced by their banks, in order to remain attractive to traders. Hence, Pasquale infers that investment banks would have nothing to lose and everything to gain from taking financial risks with their customers’ money and concealing any failed ventures via black-boxed practices.

Theories Drawn Upon in "The Black Box Society"
Pasquale draws upon many theories in his writing. In part, he draws upon Adam Smith’s theory of the “Invisible Hand”, which holds that the uninterrupted operation of the market will result in overall prosperity for a society, and which therefore argues for minimal government intervention in economic affairs. Pasquale writes that society operates under an “irredeemably opaque” version of Smith’s Invisible Hand. Writing in the U.S. context, he suggests that the American government is reluctant to intervene in unethical practices engaged in by corporate bodies. This reluctance to hold corporate entities to the law has allowed businesses to circumvent requirements of transparency and ethics. Unbound by these rules, Pasquale suggests, big business engages in covert (“black-boxed”), unethical data collection practices. Pasquale also situates his work within the broader field of “agnotology”, the study of the construction of ignorance and misinformation by social institutions.
As well, Pasquale associates black-boxed practices with a laissez-faire approach to market regulation. Similar to the Invisible Hand, a laissez-faire economic system sees the market operating with minimal intervention or regulation by the state. In particular, Pasquale aligns with a vein of scholars who see a laissez-faire economy as prioritizing economic growth over mitigating the social consequences of that growth on the members of a society. Hence, Pasquale argues, the tenets of laissez-faire see the profit-generating potential of Big Data valued more highly than the ethical management of people’s data, and black-boxed practices are thus allowed to operate largely uninterrupted. Stemming from the notion of laissez-faire, Pasquale sees the opacity surrounding data collection practices as a reflection of the “knowledge problem”.
Posited by laissez-faire theorist Friedrich von Hayek, the knowledge problem refers to the fact that knowledge of the market is distributed among multiple bodies within a political-economic system. The result is that policymakers will never have all the knowledge of the economy that they would need to effectively regulate the markets. What Pasquale seems to suggest, both in The Black Box Society and in his later contribution to Andreas Sudmann's The Democratization of Artificial Intelligence, is that knowledge of how economic systems actually work is deliberately “distributed” by market entities to avoid exposing such illicit practices as non-consensual data collection. This means that the opacity of Big Data practices is not inevitable, but also that the answer to the problem must come from outside the powerful institutions that stand to gain from it. Pasquale also references the knowledge problem in arguing for “localized decision making”: the idea that data should be collected by entities close to the groups whose data is being collected, rather than by a data-collecting organization largely removed from the affairs of that community. To illustrate this concept, Pasquale gives the example that “a loan officer in Phoenix would be far more likely to recognize dodgy local mortgage applicants than a high-level manager several hundred miles away”.

Academic Texts Referencing "The Black Box Society"
Pasquale's The Black Box Society is cited in a number of academic texts. His observations about how black box practices can facilitate discrimination are mentioned briefly in Safiya Umoja Noble's Algorithms of Oppression: How Search Engines Reinforce Racism, which examines “technological redlining”: the use of algorithms to exacerbate existing forms of discrimination and create new ones. Pasquale is also mentioned in Richard E. Susskind and Daniel Susskind's The Future of the Professions: How Technology Will Transform the Work of Human Experts. In this book, the authors write that society is moving away from a reliance on professionals, people with a high degree of expertise who pass on their skills to, and use their skills for the benefit of, others. Instead, the authors suggest, society is becoming more reliant on machines operating automatically to perform the tasks previously performed by such human experts. The authors reference Pasquale’s writing to explain the current obfuscation of professional practices, in which only the inputs and outputs, not the processes that turn the former into the latter, are knowable to the public. Unlike Pasquale, however, the authors suggest that the impending usurpation of professional practices by machines will lead to greater transparency in professional practices. Pasquale’s The Black Box Society is further referenced in Learning Theory and Online Technologies, a text that aims to provide educators and researchers with guidance on learning about and incorporating digital technologies into their current pedagogies. The Black Box Society is referenced there as indicating that social media platforms and networks have become the primary gatekeepers of knowledge online. The risks posed by this technological control are posited by the authors as a counterargument to assertions, made by proponents of connectivism, that technology should be allowed to control our experiences in the offline world.
Moreover, Pasquale’s notion of opacity is discussed in the article How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms. Specifically, the article posits that opacity is deliberately constructed within the political and economic realms so that powerful actors can keep the general public in the dark and maintain their monopolistic power. The article further concurs with Pasquale's observation that economic and state institutions deliberately conceal the nature of their operations to avoid regulation, and goes on to recommend Pasquale’s proposal of having an auditor ensure that black-boxed algorithms operate in service of the public interest. Another scholarly text citing Pasquale's The Black Box Society is The Mediated Construction of Reality by Nick Couldry and Andreas Hepp. The subject of this text is what the authors term “mediatization”: the interconnected transformation of media technologies and social practices. Within their book, Couldry and Hepp concur with Pasquale that black-boxed data collection practices are inherently discriminatory and, in being used to determine the distribution of social and economic resources, reproduce existing forms of marginalization and economic disparity. Hence, like Pasquale, they encourage critical inquiry into the technological operations and underlying logic of these systems. Similarly, Mittelstadt et al. offer some direction for structuring the debate around the degree to which algorithms should be allowed to mediate real-world affairs. In their article, The Ethics of Algorithms: Mapping the Debate, the authors agree with Pasquale's suggestion that oversight from technologically proficient auditors is a potential solution to the problem of covert algorithms having severely detrimental consequences on people's lives.

In the Popular Press
In addition to being featured in a variety of academic literature, Pasquale's The Black Box Society is referenced in multiple articles in the popular press. In an interview featured on the Harvard University Press website, Pasquale discusses the profit-making enterprise of black-boxed practices on the radio program The Brian Lehrer Show. In 2019, The Black Box Society was cited in a Washington Post article about the use of algorithms to lower the number of prisoners in Virginia. The algorithm in question was found to have “led some judges to impose harsher sentences for young or black defendants, and more lenient ones for rapists.” Within the article, Pasquale is quoted as explaining that judges have not been properly trained to interpret the algorithm. He also says that judges may be basing their decisions on the algorithm only selectively, when its findings support their preconceived, prejudicial judgments. In 2020, The Black Box Society was referenced in a Guardian article about the collection of personal health information and its use in hiring practices, in credit ratings and to target people with personalized ads. The article expands Pasquale’s original list of black-boxed practices to include the illegal sharing of individual health information by the UK’s National Health Service (NHS). Additionally, Pasquale’s writings are referenced in a CBC News article about bots interacting with ads, streaming sites and app stores to produce falsified reviews, view counts and ratings. Adding a further dimension to Pasquale’s Black Box Society, this article suggests that the false impressions of online services created by bots affect the psyche of online users.
It argues that these fake metrics lead people to misconstrue the things they value: the online media content, social engagement and virtual spaces where they seek entertainment, information and connection, and where they actively construct an integral part of themselves. Furthermore, in the CBC article “The Case Against Facebook”, Pasquale is referenced as the author of The Black Box Society. The article looks at how the monopoly or “dataopoly” structure of online platforms like Facebook means that the platform faces less competition and users have fewer alternative platforms to choose from. The result is that “dataopolistic” platforms hold users hostage, compelling them to accept a potentially sub-standard quality of service and ineffective data protection policies, or else forgo what few online platforms they have to choose from. In his quote for the article, Pasquale asserts that laws regulating how platforms manage user data must be accompanied by antitrust measures to deny these companies the power to coerce users and evade regulation. A similar CBC article looks at how Apple, Amazon, Facebook, Microsoft and Alphabet have come to dominate a data-driven economy. Discussing how the economic and technological power of these companies gives them heightened surveillance capabilities, Pasquale points out that this monitoring power is being used to analyze “the strengths and vulnerabilities of their suppliers, competitors and customers”. In the process, the companies acquire a monopoly on internet advertising, starve other online institutions such as e-journalism of advertising revenue, and dominate the internet economy as users’ only go-to destination for information, socialization and entertainment. Writing about how the American government could help regulate these companies’ actions within U.S. borders, Pasquale notes that governments would need a far greater volume of “money, personnel and technical capacity” to rival that of these enormous tech companies.