The Myceloom's Heirloom Layer: Stopping the Library of Alexandria from Burning Daily

Protocol Specification — A Digital Archaeological Investigation
Authors: Josie Jefferson & Felix Velasco (Unearth Heritage Foundry)
Collab: Claude 4.5 (Opus & Sonnet) & Gemini (2.5 & 3 Pro)
Date: January 2026
Version: 1.0
DOI: TBD
Abstract

The Library of Alexandria did not burn in a single, definitive fire. Neglect and disinvestment killed the institution; the forgetting of what made keeping matter sealed its fate. The same death occurs daily across modern infrastructure: link rot and platform murder destroy continuity, and systems designed for consumption erase human memory, completing the destruction. This specification introduces archaeobytology as the discipline for excavating and preserving the artifacts of the network. Its vocabulary (vivibytes, umbrabytes, petribytes) establishes the framework necessary for understanding data as living heritage rather than disposable content. Research documenting link rot rates and platform mortality patterns reveals that contemporary infrastructure produces more information than any civilization in history while ensuring the shortest artifact lifespan ever recorded. The Myceloom Protocol's Temporal layer offers architectural solutions: the Heirloom principle mandates that all systems document succession paths, use human-readable formats, and treat data as inheritance rather than consumption. A fundamental shift from User to Steward is the condition for our civilization surviving its own abundance.

I. Introduction: The Fire That Never Stops Burning

Popular imagination holds that the Library of Alexandria burned in a singular, devastating moment, with flames consuming scrolls and knowledge turning to ash. Civilization's memory supposedly erased in an afternoon. Reality proves less dramatic but more instructive. The Library died over centuries through "neglect, decline, and slow decay."1 Funding dried up. Scholars dispersed. Collections scattered. Successive conquests claimed the remnants not because conquerors targeted knowledge, but because the infrastructure necessary to protect the collection had failed.

The Library of Alexandria burns again. Not once but continuously: a fire consumes the equivalent of the entire collection every few weeks. The blaze generates no headlines; no firefighters mobilize, and no international outcry follows. The fire burns in the quiet death of links that once connected human knowledge and in the sudden darkness of platforms that once held human memory. Its fuel is the systematic erasure of artifacts by systems architecturally incapable of preservation.

History will vanish into a "digital dark age," warns Vint Cerf, a "father of the internet."2 The warning seems paradoxical: humanity has never produced more documentation of its existence. Every photograph, message, and transaction generates data. Yet abundance masks a crisis of impermanence. The average webpage survives less than a decade. Platforms outlive user investment by mere years. The average link breaks within two years of creation.

Ours is the most documented civilization in history, and it is producing the least accessible historical record.

Archaeobytology provides the discipline necessary for understanding the crisis: the study of digital artifacts across their lifecycles, from creation through survival or decay.3 Archaeobytology requires both technical methodology and vocabulary. The Taxonomy of Ghosts distinguishes between vivibytes (living artifacts maintained through stewardship), umbrabytes (liminal artifacts existing in states of uncertain accessibility), and petribytes (fossilized data preserved through archival action).4 Precision replaces vague gestures toward "content" and "material."

Architectural solutions emerge from the Myceloom Protocol's Temporal layer, designated the Heirloom principle.5 Current infrastructure treats data as resource. The Heirloom layer mandates that all myceloom-aligned systems document succession paths and implement human-readable formats. Design for "Abyssal Time" (the temporal scales across which preservation must operate) is mandatory.

The fire burns. Unlike Alexandria's slow death, analysts understand the phenomenon while the destruction happens. The question of action remains.


II. The Empirical Crisis: Measuring Decay

A. The Statistics of Disappearance

Precision quantifies the scale of decay. A 2024 Ahrefs study examining the link profiles of over two million domains found that 66.5% of links created since 2013 are now dead, returning errors rather than content.6 The total loss rate approaches 74.5% when combined with links never captured by analysis tools. Two-thirds of the web's connective tissue has rotted within a decade.

Pew Research Center's 2024 analysis provides institutional validation. Examining webpages collected between 2013 and 2023, researchers found that 25% of all webpages that existed at some point during this period are no longer accessible as of October 2023.7 The decay accelerates with time: 38% of pages from 2013 have vanished, compared to 8% of pages from 2023. The study examined government websites, news organizations, and Wikipedia references, finding rot across all categories: 21% of government pages contained at least one broken link, as did 23% of news pages.

Academic literature faces particular vulnerability. A 2014 Harvard Law School study found that 50% of URLs cited in U.S. Supreme Court opinions are broken.8 The same study revealed that 70% of citations in the Harvard Law Review suffer from "reference rot"—either complete link death or "content drift" where the linked content no longer matches what the citation originally referenced. Legal scholarship's foundation increasingly rests on references that lead nowhere.

Temporal dynamics of link rot follow predictable patterns. A 2003 study established that approximately one link out of every 200 breaks each week, suggesting a half-life of 138 weeks—less than three years.9 Studies consistently demonstrate that the vast majority of online content disappears within two decades of creation.
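The half-life cited above follows directly from the weekly break rate. A minimal sketch of the arithmetic, assuming simple discrete exponential decay (the 1-in-200 rate is the study's figure; the code itself is illustrative):

```python
import math

# One link in every 200 breaks each week (the 2003 study's observed rate).
weekly_break_rate = 1 / 200

# Discrete decay: after t weeks a fraction (1 - rate)**t of links survives.
# Solve (1 - rate)**t = 0.5 for t to obtain the half-life.
half_life_weeks = math.log(0.5) / math.log(1 - weekly_break_rate)

print(round(half_life_weeks))  # 138 weeks, i.e. under three years
```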

Social media accelerates the dynamics. Pew Research's 2023 analysis of Twitter/X found that nearly 20% of tweets become inaccessible within three months of posting.12 Disappearance rates vary by language: more than 40% of tweets written in Turkish or Arabic vanish within that window. Permanence proves illusory.

B. The Mechanisms of Decay

Artifacts disappear for architectural rather than accidental reasons. Link rot emerges from multiple intersecting causes. Each cause reflects infrastructure design decisions prioritizing present function over persistence.

Website redesigns and migrations constitute a primary vector. When organizations rebuild web presence to adopt new content management systems or respond to platform shifts, URL structures typically change. Without deliberate implementation of redirects, every external link to the old site breaks. The 2023 CNET case exemplifies the pattern. The technology news site deleted thousands of articles in a stated effort to improve search engine optimization, erasing years of journalism to serve algorithmic preferences.13

Domain expiration and server shutdown create more permanent disappearance. When organizations cease operations, their online presence typically vanishes unless explicitly transferred or archived. Economic calculus drives the loss: preservation incurs continuous cost, while deletion requires no investment.14

Content management systems themselves introduce instability. Dynamic URL generation (producing addresses that include session identifiers, database queries, or timestamp parameters) creates links that may cease to function even when underlying content remains unchanged. The same page may be accessible through dozens of URLs, and all become invalid when system configuration changes.
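One defensive practice against this class of rot is URL canonicalization: mapping the dozens of volatile addresses back to a single stable one. A minimal sketch in Python; the VOLATILE_PARAMS list is hypothetical, since the parameters worth stripping differ by content management system:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Hypothetical set of volatile query parameters; real CMSes differ.
VOLATILE_PARAMS = {"sessionid", "sid", "timestamp", "utm_source", "utm_medium"}

def canonicalize(url: str) -> str:
    """Strip session identifiers and tracking parameters so that the
    same page maps to one stable, citable address."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in VOLATILE_PARAMS]
    # Drop the fragment as well: it never reaches the server.
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonicalize("https://example.org/article?id=42&sessionid=abc123"))
# → https://example.org/article?id=42
```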

Platform murder, the elimination of services following corporate consolidation, accelerates decay.15 Google's history includes the termination of Google+, Google Reader, Picasa, and other services, each deletion erasing user-generated content that existed nowhere else. A pattern recurs across the industry: acquisition, extraction of valuable elements, then shutdown of the acquired platform.

C. Platform Mortality: Case Studies in Extinction

The GeoCities extinction event demonstrates platform mortality's scale. Yahoo acquired GeoCities in 1999 for $3.57 billion, recognizing the domain as the third-most-visited site on the early web.16 By 2009, Yahoo announced the platform's termination. Users received months to migrate approximately 38 million pages representing an estimated 190 million hours of human creative labor.17 The Archive Team, a volunteer preservation organization, mounted emergency rescue operations and ultimately preserved a portion of GeoCities content through distributed torrent distribution. But a portion implies significant loss: entire communities, personal histories, and early web art vanished because the economic calculus no longer favored their existence.

GeoCities represented more than a hosting service. Organized into "neighborhoods" (SiliconValley for technology, Hollywood for entertainment, Athens for philosophy), the service constituted the first mass experiment in ordinary people creating web presence. The platform's destruction eliminated not merely content but evidence of how early web culture organized itself, acting as a methodological loss for any future study of internet history.

Google+ provides a more recent example. Launched in 2011 as Google's social networking challenger to Facebook, the platform accumulated millions of users before data breaches (affecting 500,000 users in March 2018 and 52.5 million users in November 2018) accelerated its planned shutdown.18 Users received a ten-month migration period before deletion—generous by industry standards, yet insufficient for users who had treated the platform as permanent archive. Content existed on Google+ and nowhere else; when Google+ ended, that content ended.

Platform death extends to Vine (2012-2017), MySpace's abandonment of pre-2016 content, LiveJournal's ownership transfers, and smaller communities like Nexopia and Friendster. Each event represents the "databound" condition.19 Users become emotionally and practically attached to data they produced but cannot control. Content remains stored on infrastructure whose continued existence depends on corporate decisions made without user input.


III. The Alexandria Precedent: Learning from Ancient Loss

A. The Mythology of the Sudden Blaze

Popular culture remembers the Library of Alexandria as destroyed by fire, whether by Caesar's troops, Christian mobs, or Arab conquerors. Each narrative serves an ideological purpose: Caesar's fire indicts Roman militarism, Christian destruction indicts religious zealotry, Arab destruction indicts Islam. Modern scholarship rejects these narratives as historically inaccurate or oversimplified.20

Reality proves less dramatic and more instructive. Ptolemy I Soter founded the Library of Alexandria in the third century BCE as part of the Mouseion. Aggressive acquisition policies accumulated an estimated 400,000 scrolls.21 The Library represented the ancient Mediterranean world's most comprehensive collection of knowledge.

The Library's decline occurred through multiple factors operating across centuries. Ptolemy VIII expelled scholars around 145 BCE, creating a diaspora of Alexandrian learning to other Mediterranean centers.22 Julius Caesar's fire during the Alexandrian War (48 BCE) damaged portions of the collection, though the Library itself survived. Roman period administrators progressively reduced funding and institutional support. Physical damage accumulated through earthquakes, tsunamis (the 365 CE event submerged Alexandria's royal quarter), and successive military conflicts.23

The destruction of the Serapeum (a temple housing a daughter library) by Theophilus I in 391 CE eliminated a significant portion of the remnants. By this point, the Library had already experienced centuries of decline. The Arab conquest of 641 CE, often blamed for final destruction, likely found little remaining to destroy. The dramatic story of Caliph Omar ordering the burning is almost certainly later fabrication unsupported by contemporary sources.24

B. Lessons from Gradual Decline

The Alexandria precedent offers lessons for preservation. First, narratives of sudden destruction obscure the more common pattern of gradual decay through disinvestment. The Library did not burn; abandonment killed it. Contemporary infrastructure faces the same pattern: not dramatic destruction but neglect. Link by link and platform by platform, the structure erodes. Funding decision by funding decision, the foundation crumbles.

Second, preservation requires institutional commitment across generations. The Library flourished while Ptolemaic rulers valued the prestige it conferred; decline followed when Roman administrators did not. Digital preservation initiatives face similar vulnerability: grant cycles, organizational priorities, and political winds threaten the sustained commitment preservation requires. The Internet Archive's 2024 difficulties illustrate the precarity.25

Third, loss of Alexandria meant loss of context. The scholarly apparatus made the collection meaningful. Community effort organized and cataloged the scrolls. Scholars kept, annotated, and interpreted the texts. Dispersal preceded and accelerated physical collection loss. The modern challenge is analogous. Saving bits without saving technical and interpretive context produces archives that cannot be used.

C. The BBC Domesday Paradox

The BBC Domesday Project provides perhaps the most instructive modern parallel to Alexandrian dynamics. In 1986, the BBC commissioned an ambitious multimedia survey of the United Kingdom, commemorating the 900th anniversary of William the Conqueror's original Domesday Book of 1086.26 The project collected photographs, maps, census data, and local histories from communities across Britain. The data was stored on specially formatted LaserDiscs accessible only through dedicated hardware.

Within fifteen years, the BBC Domesday Project had become nearly unreadable. The custom LaserDisc players aged out of production. The specialized software required to interpret the discs became unavailable. By 2002, the project existed as a museum curiosity rather than accessible resource—while the original 1086 Domesday Book, written on vellum in medieval Latin, remained perfectly readable after nine centuries.

The CAMiLEON Project (Creative Archiving at Michigan and Leeds: Emulating the Old on the New) mounted rescue operations in 2002, developing emulation software that allowed modern computers to simulate the original BBC Micro hardware.27 The Domesday86 Project subsequently created tools to extract data directly from original LaserDiscs. Engineers saved the content, but only through extraordinary intervention that most digital artifacts will never receive.

The paradox illuminates the central challenge: physical durability does not ensure practical accessibility. Medieval monks preserved documents through a hierarchy of materials reflecting intended permanence: wax tablets held ephemeral notes, parchment held records meant to last, and inscribed stone served as the medium for permanent monuments.28 Digital media collapse the hierarchy: everything feels permanent, yet nothing endures. The 1086 Domesday Book will outlive most files created today.


IV. The Science of Survival: Frameworks and Failures

A. The OAIS Reference Model

The Open Archival Information System (OAIS) Reference Model, published as ISO 14721 in 2002 and revised in 2012, provides the dominant conceptual framework for digital preservation.29 Developed initially by the Consultative Committee for Space Data Systems (CCSDS) to address long-term preservation of scientific data, OAIS has since been adopted by libraries, archives, and cultural heritage institutions worldwide.

OAIS defines preservation in terms of information packages moving through functional entities. A Submission Information Package (SIP) enters the archive from producers. The archive transforms the package into an Archival Information Package (AIP), which includes both the Content Data Object (the artifact itself) and Representation Information (the metadata necessary to render and interpret the object). When users request access, the archive generates a Dissemination Information Package (DIP) appropriate to their specified needs.30

The model's sophistication lies in its attention to interpretive context. OAIS mandates that archives preserve not merely bits but the information necessary to understand the bits. Structure Information describes technical formats, data structures, and encoding schemes. Semantic Information explains meaning. Preservation Description Information documents provenance, integrity verification (checksums), and relationships to other objects. The goal is "independent understandability": information sufficiently complete that the designated community can interpret the data without external assistance.31
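The package structure can be made concrete in a few lines. A minimal sketch of an AIP as a data structure; the field names are illustrative, not the ISO 14721 schema:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class RepresentationInformation:
    """Metadata needed to render and interpret the content."""
    structure: str   # technical format, e.g. "text/plain; charset=ascii"
    semantics: str   # what the content means to the designated community

@dataclass
class ArchivalInformationPackage:
    """A toy AIP: the artifact plus the context needed for
    'independent understandability'."""
    content: bytes
    representation: RepresentationInformation
    provenance: str = ""                        # Preservation Description Info
    fixity: str = field(init=False, default="")

    def __post_init__(self) -> None:
        # Integrity verification: a checksum recorded at ingest.
        self.fixity = hashlib.sha256(self.content).hexdigest()

aip = ArchivalInformationPackage(
    content=b"Scroll catalogue, 245 BCE",
    representation=RepresentationInformation(
        structure="text/plain; charset=ascii",
        semantics="Library accession record",
    ),
    provenance="Submitted by producer; ingested 2026-01-15",
)
print(aip.fixity[:12])  # first characters of the SHA-256 fixity value
```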

OAIS has proven influential but implementation remains inconsistent. The model describes what archives should do without specifying how to do it. Institutions adopt OAIS vocabulary while implementing radically different practices. The gap between OAIS as ideal and preservation as practiced reflects both resource constraints and the difficulty of achieving genuine long-term thinking within institutions structured around shorter horizons.

B. Format Migration and Emulation Strategies

Literature identifies two primary strategies for maintaining access to objects over time: migration and emulation. Migration involves periodically transforming objects from obsolete formats to current ones. Examples include converting WordPerfect documents to contemporary Word formats or migrating JPEG images to more sustainable formats as standards evolve.32 Emulation involves preserving the ability to run original software environments, allowing objects to be accessed in the original technical context.

Each strategy carries costs and risks. Migration may alter objects: formatting may shift, features may vanish, and metadata may not transfer correctly. Each migration cycle introduces potential for error and loss. Over decades, repeatedly migrated objects may diverge significantly from their originals, the digital equivalent of photocopying photocopies until the image degrades.
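A common mitigation is to pair each migration with a check that the archive's significant properties survived the transformation. A minimal sketch, with hypothetical format names and a deliberately cheap property check (word count standing in for richer comparisons):

```python
# Hypothetical legacy and current formats; real migrations are messier.
def migrate(record: dict) -> dict:
    """Convert a record from the legacy format to the current one."""
    return {
        "format": "current/v2",
        "title": record["title"],
        "body": record["body"].replace("\r\n", "\n"),  # normalize line endings
    }

def significant_properties(record: dict) -> dict:
    """Properties the archive insists must survive migration."""
    return {"title": record["title"], "words": len(record["body"].split())}

legacy = {"format": "legacy/v1", "title": "Field notes",
          "body": "line one\r\nline two"}
migrated = migrate(legacy)

# Refuse the migration if anything the archive cares about was lost.
assert significant_properties(migrated) == significant_properties(legacy)
print(migrated["format"])  # current/v2
```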

Emulation preserves fidelity but requires maintaining or recreating complex software environments. The Carnegie Mellon Olive Project demonstrates both emulation's promise and demands: the project preserves historical software and games as executable content, but doing so requires substantial technical infrastructure and expertise.33 Emulation scales poorly. The strategy works for canonical objects worth significant investment but cannot address the long tail of artifacts requiring preservation.

A third approach (sometimes termed "normalization") converts objects to standardized archival formats upon ingest, accepting initial transformation in exchange for reduced long-term migration burden. The Library of Congress maintains recommended format specifications across media types, guiding institutions toward formats with better archival characteristics.34 Normalization, however, cannot address objects whose meaning depends on format-specific features, and the recommended formats themselves may eventually require migration.

C. The Economics of Preservation

Preservation economics systematically disadvantage long-term thinking. The costs of preservation are immediate and concrete: storage, personnel, system maintenance, and format migration. The benefits are diffuse and future-oriented: materials preserved today will serve researchers not yet born. This temporal asymmetry makes preservation difficult to fund through normal institutional mechanisms.

The numbers illuminate the challenge. A 2015 study estimated the cost of preserving one terabyte of data for ten years at approximately $2,000—a figure that accounts for storage, redundancy, format migration, and institutional overhead.35 Global data production now exceeds 120 zettabytes annually. Preserving even a fraction of this output at institutional levels would require resources no current institution possesses.
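The scale mismatch is easy to verify from the figures above. A back-of-envelope sketch using the cited numbers ($2,000 per terabyte per decade, 120 zettabytes per year):

```python
# Figures from the text: ~$2,000 preserves one terabyte for a decade;
# global data production is ~120 zettabytes per year.
cost_per_tb_decade = 2_000          # USD
tb_per_zb = 1_000_000_000           # 10^9 terabytes per zettabyte
annual_output_tb = 120 * tb_per_zb

# Preserving even 1% of a single year's output for one decade:
fraction = 0.01
cost = annual_output_tb * fraction * cost_per_tb_decade
print(f"${cost:,.0f}")  # $2,400,000,000,000, roughly $2.4 trillion
```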

The result is triage: institutions preserve what they can, often based on criteria that reflect present assumptions about future value. But present assumptions prove unreliable. The papers of obscure historical figures become invaluable when their correspondents achieve prominence. Technical documentation dismissed as ephemeral becomes essential when systems must be maintained or replicated. The GeoCities neighborhoods no one thought worth saving now constitute irreplaceable evidence of early web culture.

Market mechanisms exacerbate the problem. Platforms have no economic incentive to preserve user content beyond the period of active user engagement. Advertising revenue depends on current attention, not historical access. When platforms calculate that content no longer generates sufficient revenue to justify storage costs, deletion follows regardless of potential value to creators or researchers.


V. The Heirloom Layer: Architectural Solutions to Data Mortality

A. From User to Steward

The Myceloom Protocol's Temporal layer (the Heirloom principle) represents a fundamental change in how infrastructure conceptualizes time.36 Where current systems treat data as consumable resource, Heirloom architecture treats the material as inheritance: artifacts to pass down rather than use up, keeping rather than processing.

Vocabulary drives the shift. Current discourse describes people who interact with systems as users, a term connoting consumption and extraction. Temporary engagement defines the user experience. The Heirloom layer proposes steward as the appropriate term. A steward maintains, preserves, and passes on responsibility.37 The shift encodes a transformed relationship from consuming content to curating collections. The move goes from passive reception to preservation and from convenience to responsibility.

Technical requirements, interface designs, and success metrics differ for stewards. Systems designed for users optimize for engagement, ease, and satisfaction. Systems designed for stewards optimize for durability, portability, and accessibility. The difference manifests in design at every level.

B. Heirloom Layer Technical Requirements

The Myceloom Protocol specifies concrete requirements for Heirloom-aligned systems. Such requirements emerge from preservation science research while extending the science into infrastructure-level mandates:

MUST Requirements: Heirloom-aligned systems MUST document succession paths, MUST store and export artifacts in human-readable, open formats, and MUST provide complete data export independent of platform continuation.

SHOULD Requirements:

MAY Requirements:

Conformance Criteria:

Requirements reflect lessons from preservation failures. Human-readable formats (plain text over binary, open standards over proprietary) survive technical change better than optimized alternatives. Succession documentation addresses the orphan problem. Export capabilities ensure users retain control over data regardless of platform decisions.
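What such an export might look like in practice: a minimal sketch of a Heirloom-style package bundling the artifact with succession and provenance metadata in plain, human-readable JSON. All field names are hypothetical; the Protocol publishes no schema here:

```python
import json

def heirloom_export(artifact_text: str, steward: str, successor: str) -> str:
    """Bundle an artifact with the metadata a future steward needs.
    Field names are illustrative, not a published Myceloom schema."""
    package = {
        "format": "text/plain",      # open, human-readable content format
        "steward": steward,          # who maintains the artifact now
        "succession": successor,     # who inherits responsibility
        "content": artifact_text,
    }
    # indent=2 keeps the file legible to humans, not just parsers.
    return json.dumps(package, indent=2, ensure_ascii=False)

print(heirloom_export("Neighborhood index, Athens/1997",
                      steward="J. Doe", successor="City Archive"))
```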

C. Abyssal Time: Designing for Deep Futures

The Heirloom principle operates under what the Protocol terms Abyssal Time: the temporal scales across which preservation must function.39 Current infrastructure operates on timescales measured in product cycles and quarterly reports; strategic plans rarely exceed five years. Abyssal Time demands thinking in decades, centuries, and potentially millennia.

The phrase evokes oceanic depths: regions untouched by surface turbulence, operating according to rhythms incomprehensible to surface observers. Preservation requires similar temporal reorientation: designing for futures beyond imagination, for users beyond anticipation, for contexts in which today's technology will seem indistinguishable from magic. The 1086 Domesday Book's creators did not anticipate use 940 years later. They built to last regardless, and the work survived.

Abyssal Time thinking transforms design priorities. Optimization for present efficiency matters less than resilience across unknown future conditions. Dependence on current technical infrastructure (cloud services, specific software, particular platforms) becomes liability rather than convenience. Builders must choose formats, protocols, and architectures not for present-day elegance but for probable survival across technological discontinuities.

Contemporary tools remain useful, but using them requires maintaining escape paths, documenting alternatives, and preserving independence. The Heirloom layer does not reject the present; it refuses to be imprisoned by it.

D. The Archive and Anvil Methodology

The Protocol implements Heirloom principles through what it terms the Archive and Anvil methodology: simultaneous preservation of existing artifacts (Archive) and creation of new artifacts designed for durability (Anvil).40 The methodology recognizes that preservation cannot be solely retrospective; infrastructure must be redesigned to produce preservable outputs from inception.

Archive practices address the preservation of existing digital artifacts; Anvil practices govern the creation of new ones, designed for durability from the start.

The dual methodology reflects preservation science's core insight: it is far easier to create preservable artifacts than to preserve poorly-designed ones retrospectively. Archival intervention can rescue some content, but content created with preservation in mind survives better with less intervention. The Anvil shapes what the Archive will hold.


VI. The Taxonomy of Ghosts: A Vocabulary for Digital Heritage

A. Vivibytes, Umbrabytes, Petribytes

Archaeobytology requires conceptual vocabulary adequate to digital artifacts' distinctive characteristics. The Taxonomy of Ghosts provides this vocabulary, distinguishing artifacts by their preservation status and lifecycle position:41

Vivibytes designate living artifacts: data actively maintained, regularly accessed, and continuously updated. The term fuses Latin vivus (living) with the computational byte, capturing artifacts that exist in states of active use. Vivibytes are the objects of current attention: documents being edited, websites actively maintained, databases regularly queried. The associated preservation challenge is not survival but transition: ensuring that when active maintenance ceases, artifacts persist rather than perish.

Umbrabytes designate liminal artifacts existing in states of uncertain accessibility. The term derives from Latin umbra (shadow), capturing objects that exist in penumbral zones between full accessibility and complete loss. An unmaintained website still technically online but increasingly broken constitutes an umbrabyte. A file format readable by aging software not yet entirely obsolete constitutes an umbrabyte. The category captures the extensive middle ground between living data and dead links—artifacts neither fully alive nor definitively lost.

Petribytes designate fossilized data: artifacts preserved through deliberate archival action and removed from active use. Maintenance for future access defines the category. The term derives from Latin petra (stone) through the concept of petrification—organic matter transformed into mineral preservation. The Internet Archive's Wayback Machine produces petribytes: snapshots of websites extracted from living web contexts and stored in archival amber. Petribytes are no longer vivibytes (no longer updating, responding, or interacting) but survive.
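The three categories can be sketched as a simple classifier. The taxonomy itself is qualitative; the thresholds below (eighteen months of silence, an explicit archival flag) are illustrative assumptions, not part of the Protocol:

```python
from datetime import date, timedelta
from enum import Enum

class Ghost(Enum):
    VIVIBYTE = "living: actively maintained and updated"
    UMBRABYTE = "liminal: accessibility uncertain"
    PETRIBYTE = "fossilized: deliberately archived"

def classify(last_modified: date, archived: bool, today: date) -> Ghost:
    """Toy classifier: deliberate archival makes a petribyte; long
    silence without archival makes an umbrabyte; otherwise living."""
    if archived:
        return Ghost.PETRIBYTE
    if today - last_modified > timedelta(days=548):  # ~18 months
        return Ghost.UMBRABYTE
    return Ghost.VIVIBYTE

today = date(2026, 1, 15)
print(classify(date(2025, 11, 1), archived=False, today=today).name)  # VIVIBYTE
print(classify(date(2019, 6, 1), archived=False, today=today).name)   # UMBRABYTE
print(classify(date(2019, 6, 1), archived=True, today=today).name)    # PETRIBYTE
```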

B. Platform Murder and Digital Haunting

Additional terminology addresses the social and institutional dimensions of digital decay:

Platform murder designates the termination of digital platforms by their owners, resulting in loss of user-created content.42 The term's severity reflects the reality: platform deaths are not natural disasters but corporate decisions. GeoCities did not die of old age; Yahoo killed it. Google+ did not expire; Google terminated it. The language of murder correctly attributes agency and responsibility.

Digital haunting describes the persistence of traces after platform murder or content deletion. Fragments indexed by search engines, references in other documents, and partial captures in archive services remain. Hauntings remind observers that digital death rarely achieves completeness; something usually survives, though often in degraded or decontextualized form. The GeoCities torrent represents a haunting: incomplete, decontextualized, but present.

Databound captures the condition of users emotionally and practically attached to data they produced but cannot control.44 The term evokes both bondage (users bound to platforms) and binding (emotional attachment to digital creations). The databound user has invested time, creativity, and often identity in a platform presence, yet lacks authority over that presence's continuation.

C. The Stewardship Imperative

Transformation of users into stewards and consumers into curators remains the imperative. Passive recipients must become active preservers. Each term in the taxonomy implies responsibilities.

Recognizing vivibytes requires planning for transition (understanding that active data will become an archival challenge). Identifying umbrabytes demands intervention before decay becomes irrecoverable. Valuing petribytes means supporting the institutional infrastructure that produces and maintains the data.

Acknowledging platform murder means refusing platforms that will not commit to user data portability. Understanding digital haunting means recognizing that deletion rarely achieves completeness, for better and worse. Accepting the databound condition means working to change the conditions that produce it.

The vocabulary is deliberately normative. Naming these phenomena asserts that they matter, and that their dynamics carry stakes worth engaging.


VII. Conclusion: Becoming the Monks Who Copy

The Library of Alexandria teaches that neglect, not sudden disaster, is the common killer of knowledge. Dramatic destruction rarely causes the great losses of human knowledge; the failure to maintain the infrastructure of preservation does. Fire did not burn the scrolls. Copying and repair ceased, funding and attention stopped, and the absence of care achieved what destruction could not.

Medieval monasteries understood what Alexandria's later administrators forgot: preservation requires commitment sustained across generations. Monks copied manuscripts not because copying was easy but because their orders understood that copied texts survive while uncopied texts perish. The orders developed classification systems, established scriptoria, and trained successors, maintaining institutional continuity across centuries of chaos. Monastic choices about what to copy shaped what survived into the present.

Modern infrastructure faces an analogous moment. Humanity now produces more recorded expression than all previous centuries combined, yet that expression exists in formats, on platforms, and within systems architecturally incapable of survival. The gap between production and survival grows daily. Broken links, dead platforms, and unreadable files represent loss that occurs so continuously it no longer draws notice.

The Myceloom Protocol's Heirloom layer offers architectural response: infrastructure designed from inception to support preservation rather than frustrate it. Human-readable formats ensure that artifacts remain interpretable across technological discontinuities. Succession documentation ensures that systems can outlive creators. Export capabilities ensure that users retain ultimate control over data regardless of platform decisions. Such requirements translate preservation science into infrastructure mandates.

Architecture alone cannot preserve. The monks enacted not merely institutional procedures but a conviction about knowledge's value across time. Technical solutions enable preservation; conviction drives it. The necessary shift is one of consciousness, from user to steward: from consumer of content to custodian of heritage.

Archaeobytology provides the discipline. The Taxonomy of Ghosts provides the vocabulary. The Heirloom layer provides the architecture. The choice remains ours: continue treating artifacts as disposable, or begin treating them as inheritance; accept platform murder, or demand systems that honor our investment; leave preservation to chance and let the fire keep burning, or build preservation into the foundation and secure the future.

The Library of Alexandria burned because the copying stopped. For the modern library, the commitment to copy has not yet been made. The fire will not stop itself. The alternative to letting the Library fade is to become, once again, the monks who copy.

The Heirloom layer provides the scriptorium. Entry remains a choice.

  1. Lionel Casson, Libraries in the Ancient World (New Haven: Yale University Press, 2001), 31-47. Casson documents the gradual decline of the Library through reduced funding, scholarly dispersal, and institutional neglect across centuries, explicitly rejecting narratives of singular, total destruction.
  2. Vinton G. Cerf, "Digital Vellum and the Expansion of the Library of Alexandria," keynote address at American Association for the Advancement of Science Annual Meeting, San Jose, CA, February 13, 2015. Cerf's widely-reported warning introduced the "digital dark age" concept to public discourse.
  3. Unearth Heritage Foundry, "Archaeobytology," in The Unearth Lexicon of Digital Archaeology (2025), https://unearth.wiki. See also Digital Preservation, Vivibyte, Petribyte.
  4. Unearth Heritage Foundry, "Taxonomy of Ghosts," in The Unearth Lexicon of Digital Archaeology (2025), https://unearth.wiki. See also Vivibyte, Umbrabyte, Petribyte.
  5. The Myceloom Protocol Suite, "Temporal Layer (Heirloom)," MCP-1 V2 Specification (January 2026), Section 5.3.2.
  6. Tim Soulo and Joshua Hardwick, "At Least 66.5% of Links to Sites in the Last 9 Years Are Dead," Ahrefs Blog, February 2, 2024.
  7. Pew Research Center, "When Online Content Disappears," May 17, 2024.
  8. Jonathan Zittrain, Kendra Albert, and Lawrence Lessig, "Perma: Scoping and Addressing the Problem of Link and Reference Rot in Legal Citations," Legal Information Management 14, no. 2 (2014): 88-99.
  9. Steve Lawrence et al., "Persistence of Web References in Scientific Research," Computer 34, no. 2 (2001): 26-31.
  10. Adam Liptak, "When Links Die, Legal Scholarship Gets Harder to Verify," New York Times, December 6, 2021.
  11. Pew Research Center, "When Online Content Disappears."
  12. Thomas Germain, "CNET Is Deleting Thousands of Old Articles to Game Google Search," Gizmodo, August 9, 2023.
  13. Herbert Van de Sompel et al., "Scholarly Context Not Found: One in Five Articles Suffers from Reference Rot," PLoS ONE 9, no. 12 (2014): e115253.
  14. Unearth Heritage Foundry, "Platform Murder," in The Unearth Lexicon of Digital Archaeology (2025), https://unearth.wiki.
  15. Ian Bogost, "When the Internet Disappeared," The Atlantic, February 2017.
  16. Jason Scott, "The GeoCities Rescue," Archive Team, October 2009.
  17. Kevin Roose, "Google+ Is Shutting Down After a Security Bug Exposed User Data," New York Times, October 8, 2018.
  18. Odom et al., "Understanding Why We Preserve Some Things and Discard Others in the Context of Interaction Design," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (New York: ACM, 2009), 1053-1062.
  19. Diana Delia, "From Romance to Rhetoric: The Alexandrian Library in Classical and Islamic Traditions," American Historical Review 97, no. 5 (1992): 1449-1467.
  20. Roger S. Bagnall, "Alexandria: Library of Dreams," Proceedings of the American Philosophical Society 146, no. 4 (2002): 348-362.
  21. Casson, Libraries in the Ancient World, 38-40.
  22. Christopher Haas, Alexandria in Late Antiquity: Topography and Social Conflict (Baltimore: Johns Hopkins University Press, 1997), 159-169.
  23. Bernard Lewis, "The Arab Destruction of the Library of Alexandria: Anatomy of a Myth," in What Happened to the Ancient Library of Alexandria?, ed. Mostafa El-Abbadi and Omnia Fathallah (Leiden: Brill, 2008), 213-217.
  24. Internet Archive, "About the Internet Archive," accessed January 2026, https://archive.org/about/.
  25. Mike Hally, Electronic Brains: Stories from the Dawn of the Computer Age (London: Granta, 2005), 172-181.
  26. Jeffrey van der Hoeven, Bram Lohman, and Remco Verdegem, "The CAMiLEON Project: Emulation and Migration," Library Hi Tech 22, no. 3 (2004): 236-247.
  27. Rosamond McKitterick, The Carolingians and the Written Word (Cambridge: Cambridge University Press, 1989), 135-164.
  28. Consultative Committee for Space Data Systems, Reference Model for an Open Archival Information System (OAIS), CCSDS 650.0-M-2 (Washington, DC: CCSDS, 2012).
  29. Brian Lavoie, "The Open Archival Information System (OAIS) Reference Model: Introductory Guide," 2nd ed., DPC Technology Watch Report 14-02 (London: Digital Preservation Coalition, 2014).
  30. Consultative Committee for Space Data Systems, Reference Model for an OAIS, Section 4.2.1.
  31. Margaret Hedstrom, "Digital Preservation: A Time Bomb for Digital Libraries," Computers and the Humanities 31, no. 3 (1997): 189-202.
  32. Mahadev Satyanarayanan et al., "Olive: Expanding the Horizon for Executability of Digital Records," Communications of the ACM 58, no. 11 (2015): 96-106.
  33. Library of Congress, "Recommended Formats Statement 2024-2025," accessed January 2026.
  34. Stephen Abrams et al., "The State of the Digital Preservation Workforce," International Journal of Digital Curation 9, no. 2 (2014): 38-50.
  35. The Myceloom Protocol Suite, "Temporal Layer (Heirloom)," MCP-1 V2 Specification (January 2026), Section 5.3.2.
  36. Unearth Heritage Foundry, "Steward," in The Unearth Lexicon of Digital Archaeology (2025), https://unearth.wiki.
  37. The Myceloom Protocol Suite, "Layer 7: Temporal (Heirloom)," MCP-1 V2 Specification, Section 5.3.2.
  38. Unearth Heritage Foundry, "Abyssal Time," in The Unearth Lexicon of Digital Archaeology (2025), https://unearth.wiki.
  39. Unearth Heritage Foundry, "Archive and Anvil," in The Unearth Lexicon of Digital Archaeology (2025), https://unearth.wiki.
  40. Unearth Heritage Foundry, "Taxonomy of Ghosts."
  41. Unearth Heritage Foundry, "Platform Murder."
  42. Odom et al., "Understanding Why We Preserve Some Things and Discard Others."

Works Cited

Abrams, Stephen, Patricia Cruse, and John Kunze. "The State of the Digital Preservation Workforce." International Journal of Digital Curation 9, no. 2 (2014): 38-50.

Bagnall, Roger S. "Alexandria: Library of Dreams." Proceedings of the American Philosophical Society 146, no. 4 (2002): 348-362.

Bogost, Ian. "When the Internet Disappeared." The Atlantic, February 2017.

Casson, Lionel. Libraries in the Ancient World. New Haven: Yale University Press, 2001.

Cerf, Vinton G. "Digital Vellum and the Expansion of the Library of Alexandria." Keynote address at American Association for the Advancement of Science Annual Meeting, San Jose, CA, February 13, 2015.

Consultative Committee for Space Data Systems. Reference Model for an Open Archival Information System (OAIS). CCSDS 650.0-M-2. Washington, DC: CCSDS, 2012.

Delia, Diana. "From Romance to Rhetoric: The Alexandrian Library in Classical and Islamic Traditions." American Historical Review 97, no. 5 (1992): 1449-1467.

Germain, Thomas. "CNET Is Deleting Thousands of Old Articles to Game Google Search." Gizmodo, August 9, 2023.

Haas, Christopher. Alexandria in Late Antiquity: Topography and Social Conflict. Baltimore: Johns Hopkins University Press, 1997.

Hally, Mike. Electronic Brains: Stories from the Dawn of the Computer Age. London: Granta, 2005.

Hedstrom, Margaret. "Digital Preservation: A Time Bomb for Digital Libraries." Computers and the Humanities 31, no. 3 (1997): 189-202.

Lavoie, Brian. "The Open Archival Information System (OAIS) Reference Model: Introductory Guide." 2nd ed. DPC Technology Watch Report 14-02. London: Digital Preservation Coalition, 2014.

Lawrence, Steve, Frans Coetzee, Eric Glover, David Pennock, Gary Flake, Finn Nielsen, Andries Kruger, and C. Lee Giles. "Persistence of Web References in Scientific Research." Computer 34, no. 2 (2001): 26-31.

Lewis, Bernard. "The Arab Destruction of the Library of Alexandria: Anatomy of a Myth." In What Happened to the Ancient Library of Alexandria?, edited by Mostafa El-Abbadi and Omnia Fathallah, 213-217. Leiden: Brill, 2008.

Liptak, Adam. "When Links Die, Legal Scholarship Gets Harder to Verify." New York Times, December 6, 2021.

McKitterick, Rosamond. The Carolingians and the Written Word. Cambridge: Cambridge University Press, 1989.

Odom, William, John Zimmerman, and Jodi Forlizzi. "Understanding Why We Preserve Some Things and Discard Others in the Context of Interaction Design." In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1053-1062. New York: ACM, 2009.

Pew Research Center. "When Online Content Disappears." May 17, 2024.

Roose, Kevin. "Google+ Is Shutting Down After a Security Bug Exposed User Data." New York Times, October 8, 2018.

Satyanarayanan, Mahadev, Gilbert Tran, and Trent Jaeger. "Olive: Expanding the Horizon for Executability of Digital Records." Communications of the ACM 58, no. 11 (2015): 96-106.

Scott, Jason. "The GeoCities Rescue." Archive Team, October 2009.

Soulo, Tim, and Joshua Hardwick. "At Least 66.5% of Links to Sites in the Last 9 Years Are Dead." Ahrefs Blog, February 2, 2024.

The Myceloom Protocol Suite. MCP-1 V2 Specification. January 2026.

Unearth Heritage Foundry. The Unearth Lexicon of Digital Archaeology. 2025. https://unearth.wiki.

van der Hoeven, Jeffrey, Bram Lohman, and Remco Verdegem. "The CAMiLEON Project: Emulation and Migration." Library Hi Tech 22, no. 3 (2004): 236-247.

Van de Sompel, Herbert, Martin Klein, and Harihar Shankar. "Scholarly Context Not Found: One in Five Articles Suffers from Reference Rot." PLoS ONE 9, no. 12 (2014): e115253.

Zittrain, Jonathan, Kendra Albert, and Lawrence Lessig. "Perma: Scoping and Addressing the Problem of Link and Reference Rot in Legal Citations." Legal Information Management 14, no. 2 (2014): 88-99.

Digital Archaeological Investigation conducted by Unearth Heritage Foundry. This work is intended for publication at myceloom.im.