Digital obsolescence

Digital obsolescence is the risk of data loss caused by the inability to access digital assets when the hardware or software required for information retrieval is repeatedly replaced by newer devices and systems, resulting in increasingly incompatible formats. The threat of an eventual "digital dark age" (in which large swaths of important cultural and intellectual information stored on archaic formats become irretrievably lost) received little attention until the 1990s. Since then, digital preservation efforts in the information and archival fields have implemented protocols and strategies such as data migration, technical audits, and the salvage and emulation of antiquated hardware and software to limit the potential damage to long-term information access.

Background
A false sense of security persists regarding digital documents: because an unlimited number of identical copies can be created from original files, many users assume that their documents have a virtually indefinite shelf life. In reality, the media used for digital information storage and access present unique preservation challenges compared to many of the physical formats traditionally handled by archives and libraries. Paper materials and printed media migrated to film-based microform, for example, can remain accessible for centuries if created and maintained under ideal conditions, compared to the mere decades of physical stability offered by magnetic tape, disk, and optical formats. Digital media therefore face more urgent preservation concerns than the printed word, whose chief long-term threat is the gradual change of written or spoken language.

Little professional thought in the fields of library and archival science was directed toward digital obsolescence as computerized systems grew more widespread and commonplace, but discussion began to emerge in the 1990s. Even then, few options were proposed as genuine alternatives to the standard method of continuously migrating data to newer storage media, employed since magnetic tape began succeeding paper punch cards as practical data storage in the 1960s and 1970s. These basic migration practices persist into the modern era of hard disk and solid-state drives, as research has shown that many digital storage media last considerably less time in the field than manufacturer claims or laboratory testing suggest, leading to the facetious observation that “digital documents last forever—or five years, whichever comes first.”

The causes of digital obsolescence are not always purely technical. Capitalistic accumulation and consumerism have been labeled key motivators of digital obsolescence in society, with newly introduced products frequently assigned greater value than older ones. Digital preservation relies on the continuous maintenance and usage of hardware and software formats, which the threat of obsolescence can interfere with. Four types of digital obsolescence exist in the realm of hardware and software access:

 * Functional obsolescence, or the mechanical failure of a device that prevents information access, which can be the result of damage through rough handling, gradual wear from extended usage, or intentional failure through planned obsolescence;
 * Postponement obsolescence, or intentionally upgrading some information systems within an institution, but not all of them, that is often implemented as part of a "security through obsolescence" strategy;
 * Systemic obsolescence, or deliberate design changes made to programs and applications so that newer updates are increasingly incompatible with older versions, forcing the user to purchase newer software editions or hardware;
 * Technical obsolescence, or the adoption of newer, more accessible technologies with the intention to replace older, often outdated software or hardware, occurring on the side of the consumer or manufacturer.

Examples of digital obsolescence
Because the majority of digital information relies on two components, hardware and software, for curation and retrieval, it is important to separately classify how digital obsolescence impacts digital preservation through each medium.

Hardware
Hardware concerns are two-fold in the archival and library fields: in addition to the physical storage medium of magnetic tape, optical disc, or solid-state computer memory, a separate electronic device is often required for information access. While proper storage can help mitigate some environmental vulnerabilities of storage formats (including dust, humidity, radiation, and temperature) and extend preservation for decades, other endangering factors are inevitable. Magnetic tape and floppy disks are vulnerable both to the deterioration of the adhesive holding the magnetic data layer to its backing and to the demagnetization of the data layer, commonly called “bit rot”. Optical discs are specifically susceptible to physical damage to their readable surface and to oxidation occurring between improperly sealed outer layers, a process referred to as “disc rot” or, inaccurately, “laser rot” (particularly in reference to LaserDiscs). Older forms of read-only-memory chip-based storage such as cartridges and memory cards encounter their own form of bit rot when the electrons representing individual bits of binary information change polarity (called “flipping”) and the data is rendered unreadable.
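The damage a single flipped bit can do, and how a checksum-based fixity check exposes it, can be sketched in Python (the sample bytes and the flipped position are hypothetical):

```python
import hashlib

# Hypothetical stored "document" and its recorded checksum (fixity value).
original = bytes(b"Digital documents last forever")
reference_digest = hashlib.sha256(original).hexdigest()

# Simulate bit rot: a single bit of one byte "flips" polarity.
corrupted = bytearray(original)
corrupted[8] ^= 0b00000001  # 'd' (0x64) becomes 'e' (0x65)

# Comparing against the recorded fixity value exposes the damage,
# even though only one of the payload's 240 bits has changed.
damaged = hashlib.sha256(bytes(corrupted)).hexdigest() != reference_digest
print("corruption detected:", damaged)
```

Archives run checks like this periodically against stored checksums; a mismatch triggers restoration from a redundant copy.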

The appropriate playback or recording devices for a format possess their own vulnerabilities. Cassette decks and disk drives rely on the functionality of precision-manufactured moving parts that are susceptible to damage caused by repetitive physical stress and foreign materials like dust and grime. Routine maintenance, calibration, and cleaning can help extend the lifetime of many devices, but broken or failing parts will need repair or replacement: sourcing parts becomes more difficult and expensive as the supply stock for older machines grows scarce, and users' technical skills are increasingly challenged as newer machines and storage formats use fewer electromechanical parts and more integrated circuits and other complex components.

Only a decade after the 1970s Viking program, NASA personnel discovered that much of the mission data stored on magnetic tapes, including over 3,000 unprocessed images of the Martian surface transmitted by the two Viking probes, was inaccessible due to a multitude of factors. While the agency possessed notes written by long-departed or deceased programmers, these proved indecipherable, and the computer hardware and source code needed to correctly run the decoding software had been replaced and disposed of. Information was eventually recovered after more than a year of reverse engineering how the raw data was encoded onto the tapes, which included consulting the original engineers of the Viking landers' cameras and imaging hardware. NASA experienced similar issues when attempting to recover and process images from 1960s lunar orbiter missions. Engineers at the Jet Propulsion Laboratory acknowledged in 1990, following a one-year search that located a compatible data tape reader at a United States Air Force base, that a missing part might need to be rebuilt in-house if a replacement could not be sourced from computer salvage yards.

Software
Over the past several decades, a number of once industry-standard file formats and application platforms for data, images, and text have been repeatedly superseded by newer software formats and applications, often with increasingly greater degrees of incompatibility between each other and along their own product lines. Such incompatibilities now frequently extend to which version of an operating system is installed (for instance, versions of Microsoft Works predating Version 4.5 are unable to run on Windows 2000 and later). One example of a developer cancelling an instance of planned obsolescence occurred in 2008, when Microsoft, facing intense public outcry, retracted its intention to drop support for a number of older file formats in an Office service pack.

Systemic obsolescence in software can be exemplified by the history of the word processor WordStar. A popular option for WYSIWYG document editing on CP/M and MS-DOS operating systems during the 1980s, WordStar lost significant market share to competitors WordPerfect and Microsoft Word by 1991 after a delayed port to Windows 1.0. Further development of the Windows version stopped in 1994, and WordStar 7 for MS-DOS was last updated in 1999. Over time, every version of WordStar grew increasingly incompatible with versions of Windows beyond 3.1, to the frustration of long-devoted users, including authors William F. Buckley, Jr. and Anne Rice.

Digital obsolescence has a prominent effect on the preservation of video game history, since many older games and hardware were regarded by players as ephemeral products, due to the continuous process of computer hardware upgrading and home console generation cycles. Such cycles are often the result of both systemic and technical obsolescence. Some of the oldest computer games, like 1962's Spacewar! for the PDP-1 commercial minicomputer, were developed for hardware platforms so outdated that they are virtually nonexistent today. Many older games of the 1960s and 1970s built for contemporary mainframe terminals and microcomputers can only be played today through software emulation. While video games and other software applications can be orphaned by their parent developers or publishing companies and classified as abandonware, the copyright issues surrounding software are a very complicated hurdle in the path of digital preservation.

A prime example of copyright issues with software was encountered during preservation efforts for the BBC Domesday Project, a 1986 UK multimedia data collection survey that commemorated the 900th anniversary of the original Domesday Book. While the project's specially customized LaserDisc reader posed its own hardware-based preservation problems, the combination of one million personal copyrights belonging to participating civilians and corporate claims on the specialized computer hardware means that publicly accessible digital preservation efforts might be stalled until 2090.

Prevention strategies
Organizations possessing digital archives should perform assessments of their records in order to identify file corruption and reduce the risks associated with file format obsolescence. Such assessments can be accomplished through internal file format action plans, which list digital file types in an archive's holdings and assess the actions taken in order to ensure continued accessibility.

One emerging strategic avenue for combating digital obsolescence is the adoption of open-source software, due to source code availability, transparency, and potential adaptability to modern hardware environments. For example, the Apache Software Foundation's OpenOffice application supports access to a number of legacy word processor formats, including Version 6 of Microsoft Word, and offers basic support for Version 4 of WordPerfect. This contrasts with the criticism directed by the open-source community toward Microsoft's own purportedly open Office Open XML format over non-disclosure agreements and translator requirements.

Standard strategies for digital preservation utilized by information institutions are frequently interconnected or otherwise related in function or purpose. Bitstream copying (or data backup) is a foundational operation often employed before many other practices, and facilitates establishing redundancy across multiple storage locations. Refreshing is the transfer of unchanged data, frequently between identical or functionally similar storage formats, while migration converts the format or encoding of digital information to enable moving it between different operating systems and hardware generations. Normalization reduces organizational complexity for archival institutions by reducing the number of similar filetypes through conversion, and encapsulation bundles digital information with its associated metadata to guarantee accessibility. Digital archives employ canonicalization to ensure that key aspects of documents have survived the process of conversion, while a reliance on standards established by regional archival institutions maintains organization within the broader field.

Technology preservation (also called the computer museum approach) and digital archeology respectively involve institutions maintaining possession of or access to legacy hardware and software platforms, and the salvage methods employed to recover digital information from damaged or obsolete media and devices. Following recovery, some data, such as documentation, can be converted to analog backups in the form of physically accessible copies, while executable code can be launched through emulation platforms: modern hardware and software environments designed to simulate obsolete computer systems.

Writing in 1999, Jeff Rothenberg criticized many contemporary preservation procedures for improperly addressing digital obsolescence, which he considered the most prominent problem in long-term digital information storage. Rothenberg disapproved of the reliance on hard copies, arguing that printing digital documents stripped them of their inherently digital qualities, including machine readability and dynamic user functionality. Computer museums were also cited as an inadequate practice: only a limited number of locations can realistically maintain obsolete hardware indefinitely, constraining full access to legacy digital documents, and most older data rarely exists in forms that take full advantage of its original hardware or software environment. Two digital preservation processes he specifically criticized were the implementation of relational database (RDB) standards and an overreliance on migration. While designed for standardization, RDBs and the features of their management systems (RDBMS) often promoted unintentionally tribalistic practices among regional institutions, introducing incompatibilities between RDBs; meanwhile, the ubiquity of file and program migration frequently risked failing to compensate for paradigm shifts in conversion between successive software environments. Rothenberg argued that emulation, with the digital data supported by an encapsulation of metadata, documentation, and software and emulation environment specifications, was the ideal preservation practice in the face of digital obsolescence.

The UK National Archives published a second revision of its Information Assurance Maturity Model (IAMM) in 2009, overviewing digital obsolescence risk management for institutions and businesses. After instructing senior information risk owners on the initial requirements for determining both the potential risk of digital obsolescence and the mitigating actions to counter it, the guide details a multi-step process for maintaining the digital continuity of archival information. Such steps run the gamut from assigning responsibility for information continuity and confirming the completeness of content metadata, to ensuring critical information remains discoverable through institutional usage and that system migration does not affect information accessibility, to guaranteeing IT support and enforcing contingency plans for information survivability through organizational changes.

In 2014, the National Digital Stewardship Alliance recommended developing file format action plans, stating "it is important to shift from more abstract considerations about file format obsolescence to develop actionable strategies for monitoring and mining information about the heterogeneous digital files the organizations are managing". Other important resources for assessment support are the Library of Congress' Sustainability of Digital Formats page, and the UK National Archives' PRONOM online file format registry.
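The signature-based format identification that registries like PRONOM support can be illustrated with a minimal Python sketch (the three signatures shown are genuine "magic byte" patterns, but the table is far smaller than any production registry):

```python
# A tiny signature table; registries like PRONOM catalog thousands of patterns.
SIGNATURES = {
    b"%PDF-": "PDF document",
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"PK\x03\x04": "ZIP container (also used by DOCX and ODF)",
}

def identify(leading_bytes: bytes) -> str:
    """Match a file's first bytes against known format signatures."""
    for magic, name in SIGNATURES.items():
        if leading_bytes.startswith(magic):
            return name
    return "unknown format"

print(identify(b"%PDF-1.4 ..."))  # -> PDF document
```

Because magic bytes survive file renaming, this approach is more reliable than trusting extensions when auditing holdings of unknown provenance.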

CERN began its Digital Memory Project in 2016, aiming to preserve decades of the organization's media output through standardized initiatives. CERN determined that its solution would require continuous access to metadata, the implementation of an Open Archival Information System (OAIS) archive as soon as possible to reduce costs, and the advance execution of any new system's archiving plan. Building on OAIS, CERN pursued certification as a trustworthy digital repository (TDR) under the ISO 16363 standard, and implemented E-Ternity as the prototype for its compliant digital archive model.

On January 1, 2021, Adobe ended support for its Flash Player and blocked content from running in it, citing advancements in open standards for the Web. The move, announced in July 2017, affected the user experience of millions of websites to varying degrees. Since January 2018, BlueMaxima's Flashpoint has been one of several Adobe Flash Player preservation projects, salvaging more than 110,000 animations and games.