Knowledge Preservation


Preservation is the activity of protecting something from loss or danger. The condition of being preserved for future use. Kept intact or in a particular condition. Keep or maintain in unaltered condition. Cause to remain or last. A process that saves organic substances from decay. To keep constant through physical or chemical reactions or evolutionary change. An occurrence of improvement by virtue of preventing loss or injury or other change.

Documentation - Knowledge Keeper - Legacy - Long Term Thinking - Data Storage Types

Keeping is the responsibility of a guardian or keeper. The act of retaining something. Supply with necessities and support. Maintain for use and service. Preserve information for retrieval. Conformity or harmony.

Protecting is to shield from danger, injury, destruction, or damage. Shielding against harm or discomfort. Error Correction - Noise.


Egyptian Hieroglyphs. One of our biggest concerns is: will our most important information and knowledge stand the test of time? Losing knowledge is like losing your memory. You can write it down, but if it's never heard or seen, no one will know, like writing sheet music that instruments never play.

Knowledge Organization - Knowledge Management - Petroglyphs

Information Destruction throughout History: Info-Graphic (not including the suppression of valuable information). Hoarding knowledge and information is normal.

History - Academic Papers - Library Science - Ontology 

Digital Preservation is a formal endeavor to ensure that digital information of continuing value remains accessible and usable. It involves planning, resource allocation, and application of preservation methods and technologies, and it combines policies, strategies and actions to ensure access to reformatted and "born-digital" content, regardless of the challenges of media failure and technological change. The goal of digital preservation is the accurate rendering of authenticated content over time.

Data Preservation is the act of conserving and maintaining both the safety and integrity of data. Preservation is done through formal activities that are governed by policies, regulations and strategies directed towards protecting and prolonging the existence and authenticity of data and its metadata. Data can be described as the elements or units in which knowledge and information is created, and metadata are the summarizing subsets of the elements of data; or the data about the data. The main goal of data preservation is to protect data from being lost or destroyed and to contribute to the reuse and progression of the data.

Backup refers to copying computer data into an archive file so it may be used to restore the original after a data loss event. Backups have two distinct purposes: the primary purpose is to recover data after its loss, be it by data deletion or corruption; the secondary purpose is to recover data from an earlier point in time. Redundancy. A backup is a copy of a file or directory kept on a separate storage device, such as memory storage on an electronic memory device.

Backup is someone who takes the place of another, as when things get dangerous or difficult. The act of providing approval and support. Make a copy of a computer file, especially for storage in another place as a security copy. A copy of a file or record, stored separately from the original, that can be used to recover the original if it is destroyed or damaged.

Redundancy - Compression

Extra is something additional of the same kind.

Reserve is to hold back or set aside, especially for future use or contingency. Something kept back or saved for future use or a special purpose. Conservation.

Spare is something kept in reserve especially for emergency use.

Backup Software refers to computer programs used to perform backups; they create supplementary exact copies of files, databases or entire computers. These programs may later use the supplementary copies to restore the original contents in the event of data loss.
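As a rough illustration of what such a program does at its simplest, the sketch below (a hypothetical example, not any particular backup product; the function names and paths are made up) copies a file to a backup location and verifies the copy with a SHA-256 checksum before trusting it for later restore.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_file(source: Path, backup_dir: Path) -> Path:
    """Copy 'source' into 'backup_dir' and verify the copy matches byte-for-byte."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    destination = backup_dir / source.name
    shutil.copy2(source, destination)  # copies data and timestamps
    if sha256_of(source) != sha256_of(destination):
        raise IOError(f"Backup verification failed for {source}")
    return destination

# Example usage (paths are placeholders):
# backup_file(Path("notes.txt"), Path("/mnt/backup_drive"))
```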

Archive is a depository containing historical records and documents. Archive is an accumulation of historical records or the physical place they are located. Archives contain primary source documents that have accumulated over the course of an individual or organization's lifetime, and are kept to show the function of that person or organization. Professional archivists and historians generally understand archives to be records that have been naturally and necessarily generated as a product of regular legal, commercial, administrative, or social activities. They have been metaphorically defined as "the secretions of an organism", and are distinguished from documents that have been consciously written or created to communicate a particular message to posterity. In general, archives consist of records that have been selected for permanent or long-term preservation on grounds of their enduring cultural, historical, or evidentiary value. Archival records are normally unpublished and almost always unique, unlike books or magazines for which many identical copies exist. This means that archives are quite distinct from libraries with regard to their functions and organization, although archival collections can often be found within library buildings. A person who works in archives is called an archivist. The study and practice of organizing, preserving, and providing access to information and materials in archives is called archival science. The physical place of storage can be referred to as an archive (more usual in the United Kingdom), an archives (more usual in the United States), or a repository. When referring to historical records or the places they are kept, the plural form archives is chiefly used. The computing use of the term 'archive' should not be confused with the record-keeping meaning of the term.

Archivist of the United States is the head and chief administrator of the National Archives and Records Administration of the United States. The Archivist is responsible for the supervision and direction of the National Archives. The archives contain 13.5 billion records.

Archival Science is the study and theory of building and curating archives, which are collections of documents, recordings and data storage devices.

Storage is a place to keep things for future use like storing things in a Warehouse, which is a building used for storage of goods. Data Storage.

Store is a supply of something available for future use. A depository or place for goods to lay aside for future use. To put away for storage. An electronic memory device.

Depository is a facility where things can be deposited or put somewhere for storage or safekeeping.

Repository is a place, building, or receptacle where things are or may be stored. A place in which something, especially a natural resource, has accumulated or where it is found in significant quantities. A central location in which data is stored and managed.

Institutional Repository is an archive for collecting, preserving, and disseminating digital copies of the intellectual output of an institution, particularly a research institution. Core Samples.

Disciplinary Repository is an online archive containing works or data associated with these works of scholars in a particular subject area.

Software Repository is a storage location from which software packages may be retrieved and installed on a computer.

Software Heritage. We collect and preserve software in source code form, because software embodies our technical and scientific knowledge and humanity cannot afford the risk of losing it. Software is a precious part of our cultural heritage. We curate and make accessible all the software we collect, because only by sharing it can we guarantee its preservation in the very long term.

Information Repository is an easy way to deploy a secondary tier of data storage that can comprise multiple, networked data storage technologies running on diverse operating systems, where data that no longer needs to be in primary storage is protected, classified according to captured metadata, processed, de-duplicated, and then purged, automatically, based on data service level objectives and requirements. In information repositories, data storage resources are virtualized as composite storage sets and operate as a federated environment.

Computer Data Storage is a technology consisting of computer components and recording media used to retain digital data. It is a core function and fundamental component of computers. Compression.

Data Warehouse is a central repository of integrated data from one or more disparate sources. They store current and historical data in one single place that are used for creating analytical reports for workers throughout the enterprise.

Energy Storage is the capture of energy produced at one time for use at a later time. Battery.

Storage (bit). In the earliest non-electronic information processing devices, such as Jacquard's loom or Babbage's Analytical Engine, a bit was often stored as the position of a mechanical lever or gear, or the presence or absence of a hole at a specific point of a paper card or tape. The first electrical devices for discrete logic (such as elevator and traffic light control circuits, telephone switches, and Konrad Zuse's computer) represented bits as the states of electrical relays which could be either "open" or "closed". When relays were replaced by vacuum tubes, starting in the 1940s, computer builders experimented with a variety of storage methods, such as pressure pulses traveling down a mercury delay line, charges stored on the inside surface of a cathode-ray tube, or opaque spots printed on glass discs by photolithographic techniques. In the 1950s and 1960s, these methods were largely supplanted by magnetic storage devices such as magnetic core memory, magnetic tapes, drums, and disks, where a bit was represented by the polarity of magnetization of a certain area of a ferromagnetic film, or by a change in polarity from one direction to the other. The same principle was later used in the magnetic bubble memory developed in the 1980s, and is still found in various magnetic strip items such as metro tickets and some credit cards. In modern semiconductor memory, such as dynamic random-access memory, the two values of a bit may be represented by two levels of electric charge stored in a capacitor. In certain types of programmable logic arrays and read-only memory, a bit may be represented by the presence or absence of a conducting path at a certain point of a circuit. In optical discs, a bit is encoded as the presence or absence of a microscopic pit on a reflective surface. In one-dimensional bar codes, bits are encoded as the thickness of alternating black and white lines. Off Loading our Memories.

History of Hard Disk Drives (wiki) - In 1956, IBM made a 5 MB storage device that was as big as a car and cost about $160,000. By 2017, Samsung had made a 16 terabyte (16,000 GB) SSD that fits in your pocket.
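A quick back-of-the-envelope calculation, using only the approximate figures quoted above, shows how steep that growth curve is:

```python
# Figures as quoted in the text above; both are rough approximations.
ibm_1956_bytes = 5 * 10**6        # ~5 MB (IBM's 1956 disk storage unit)
samsung_2017_bytes = 16 * 10**12  # ~16 TB SSD

growth = samsung_2017_bytes / ibm_1956_bytes
print(f"Capacity grew roughly {growth:,.0f}x in about 60 years")
# -> Capacity grew roughly 3,200,000x in about 60 years
```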

Side by Side is a 2012 American documentary film that investigates the history, process and workflow of both digital and photochemical film creation. It shows the transition from film to digital, and how media preservation has many challenges. The film shows what artists and filmmakers have been able to accomplish with both film and digital and how their needs and innovations have helped push filmmaking in new directions. Interviews with directors, colorists, scientists, engineers and artists reveal their experiences and feelings about working with film and digital media.

Curate is to organize a collection of things, such as in a library or in a museum.

Conservator - Alexandria - Library Science - Human Curation

Curator is the custodian of a collection as of a museum or library.

Digital Curation is the selection, preservation, maintenance, collection and archiving of digital assets. Digital curation establishes, maintains and adds value to repositories of digital data for present and future use. This is often accomplished by archivists, librarians, scientists, historians, and scholars. Enterprises are starting to use digital curation to improve the quality of information and data within their operational and strategic processes. Successful digital curation will mitigate digital obsolescence, keeping the information accessible to users indefinitely.

Data Curation is the organization and integration of data collected from various sources. It involves annotation, publication and presentation of the data such that the value of the data is maintained over time, and the data remains available for reuse and preservation. Data curation includes "all the processes needed for principled and controlled data creation, maintenance, and management, together with the capacity to add value to data". In science, data curation may indicate the process of extraction of important information from scientific texts, such as research articles by experts, to be converted into an electronic format, such as an entry of a biological database.

Content Curation is the process of gathering information relevant to a particular topic or area of interest. Services or people that implement content curation are called curators. Curation services can be used by businesses as well as end users. Art and Science of Curation.

Library is a curated collection of sources of information and similar resources, selected by experts and made accessible to a defined community for reference or borrowing. It provides physical or digital access to material, and may be a physical location or a virtual space, or both. A library's collection can include books, periodicals, newspapers, manuscripts, films, maps, prints, documents, microform, CDs, cassettes, videotapes, DVDs, Blu-ray Discs, e-books, audiobooks, databases, and other formats. Libraries range widely in size up to millions of items. In Latin and Greek, the idea of a bookcase is represented by Bibliotheca and Bibliotheke: derivatives of these mean library in many modern languages, e.g. French bibliothèque. The first libraries consisted of archives of the earliest form of writing—the clay tablets in cuneiform script discovered in Sumer, some dating back to 2600 BC. Private or personal libraries made up of written books appeared in classical Greece in the 5th century BC. In the 6th century, at the very close of the Classical period, the great libraries of the Mediterranean world remained those of Constantinople and Alexandria. The libraries of Timbuktu were also established around this time and attracted scholars from all over the world. A library is organized for use and maintained by a public body, an institution, a corporation, or a private individual. Public and institutional collections and services may be intended for use by people who choose not to—or cannot afford to—purchase an extensive collection themselves, who need material no individual can reasonably be expected to have, or who require professional assistance with their research. In addition to providing materials, libraries also provide the services of librarians who are experts at finding and organizing information and at interpreting information needs. Libraries often provide quiet areas for studying, and they also often offer common areas to facilitate group study and collaboration. Libraries often provide public facilities for access to their electronic resources and the Internet. Modern libraries are increasingly being redefined as places to get unrestricted access to information in many formats and from many sources. They are extending services beyond the physical walls of a building, by providing material accessible by electronic means, and by providing the assistance of librarians in navigating and analyzing very large amounts of information with a variety of digital resources. Libraries are increasingly becoming community hubs where programs are delivered and people engage in lifelong learning. Pooling all the knowledge together that has been collected from all of the great minds who have ever lived.

The Goal of the Curator is to make the viewer more intelligent and more understanding of themselves and the world around them. Of course, most people are unfamiliar with knowledge and information, so it will take time for some people to see all the different layers and interpret them in a meaningful way. But the more a person looks and reads, the more they will see and the more they will understand. The information loop should not be a closed loop that's controlled by one company. Librarians have degrees, are trained and publicly accountable, and are the ones curating books for their communities, in concert with their communities. If you're a public librarian, university librarian, or school librarian, collection development (what you buy for the public to have access to) is done in tandem with the public you're serving.

"The curator is never more interesting than the collection that's inside the museum".

Library of Alexandria in Alexandria, Egypt, was one of the largest and most significant libraries of the ancient world. The library was part of a larger research institution called the Mouseion, which was dedicated to the Muses, the nine goddesses of the arts. The idea of a universal library in Alexandria may have been proposed by Demetrius of Phalerum, an exiled Athenian statesman living in Alexandria, to Ptolemy I Soter, who may have established plans for the Library, but the Library itself was probably not built until the reign of his son Ptolemy II Philadelphus. The Library quickly acquired many papyrus scrolls, due largely to the Ptolemaic kings' aggressive and well-funded policies for procuring texts. It is unknown precisely how many such scrolls were housed at any given time, but estimates range from 40,000 to 400,000 at its height. Alexandria came to be regarded as the capital of knowledge and learning, in part because of the Great Library. Many important and influential scholars worked at the Library during the third and second centuries BC, including, among many others: Zenodotus of Ephesus, who worked towards standardizing the texts of the Homeric poems; Callimachus, who wrote the Pinakes, sometimes considered to be the world's first library catalogue; Apollonius of Rhodes, who composed the epic poem the Argonautica; Eratosthenes of Cyrene, who calculated the circumference of the earth within a few hundred kilometers of accuracy; Aristophanes of Byzantium, who invented the system of Greek diacritics and was the first to divide poetic texts into lines; and Aristarchus of Samothrace, who produced the definitive texts of the Homeric poems as well as extensive commentaries on them. During the reign of Ptolemy III Euergetes, a daughter library was established in the Serapeum, a temple to the Greco-Egyptian god Serapis. Despite the widespread modern belief that the Library of Alexandria was burned once and cataclysmically destroyed, the Library actually declined gradually over the course of several centuries, starting with the purging of intellectuals from Alexandria in 145 BC during the reign of Ptolemy VIII Physcon, which resulted in Aristarchus of Samothrace, the head librarian, resigning from his position and exiling himself to Cyprus. Many other scholars, including Dionysius Thrax and Apollodorus of Athens, fled to other cities, where they continued teaching and conducting scholarship. The Library, or part of its collection, was accidentally burned by Julius Caesar during his civil war in 48 BC, but it is unclear how much was actually destroyed and it seems to have either survived or been rebuilt shortly thereafter; the geographer Strabo mentions having visited the Mouseion in around 20 BC and the prodigious scholarly output of Didymus Chalcenterus in Alexandria from this period indicates that he had access to at least some of the Library's resources. The Library dwindled during the Roman Period, due to lack of funding and support. Its membership appears to have ceased by the 260s AD. Between 270 and 275 AD, the city of Alexandria saw a rebellion and an imperial counterattack that probably destroyed whatever remained of the Library, if it still existed at that time. The daughter library of the Serapeum may have survived after the main Library's destruction. 
The Serapeum was vandalized and demolished in 391 AD under a decree issued by Coptic Christian Pope Theophilus of Alexandria, but it does not seem to have housed books at the time and was mainly used as a gathering place for Neoplatonist philosophers following the teachings of Iamblichus. Bibliotheca Alexandrina.

Agora is a 2009 historical drama film. The biopic stars Rachel Weisz as Hypatia, a mathematician, philosopher and astronomer in late 4th-century Roman Egypt, who investigates the flaws of the geocentric Ptolemaic system and the heliocentric model that challenges it. Surrounded by religious turmoil and social unrest, Hypatia struggles to save the knowledge of classical antiquity from destruction. Max Minghella co-stars as Davus, Hypatia's father's slave, and Oscar Isaac as Hypatia's student, and later prefect of Alexandria, Orestes. The story uses historical fiction to highlight the relationship between religion and science at the time amidst the decline of Greco-Roman polytheism and the Christianization of the Roman Empire. The title of the film takes its name from the agora, a public gathering place in ancient Greece, similar to the Roman forum. The film was produced by Fernando Bovaira and shot on the island of Malta from March to June 2008. Justin Pollard, co-author of The Rise and Fall of Alexandria (2007), was the historical adviser for the film. The Neoplatonic School of Alexandria was a remarkable center of learning due to the blending of Greek and Oriental influences. Pagan and Christian students were mainly taught the works of Plato and Aristotle there from the mid-5th until the early 7th century CE.

Dead Sea Scrolls are ancient Jewish and Hebrew religious manuscripts that were found in the Qumran Caves in the Judaean Desert, near Ein Feshkha on the northern shore of the Dead Sea in the West Bank. Scholarly consensus dates these scrolls from the last three centuries BCE and the first century CE. The texts have great historical, religious, and linguistic significance because they include the second-oldest known surviving manuscripts of works later included in the Hebrew Bible canon, along with deuterocanonical and extra-biblical manuscripts which preserve evidence of the diversity of religious thought in late Second Temple Judaism. Almost all of the Dead Sea Scrolls are held by the state of Israel in the Shrine of the Book on the grounds of the Israel Museum, but ownership of the scrolls is disputed by Jordan and Palestine. The Digital Dead Sea Scrolls.

Cave of Letters is a cave in Nahal Hever in the Judean Desert where letters and fragments of papyri from the Roman Empire period were found. Some are related to the Bar Kokhba revolt (circa 131-136), including letters of correspondence between Bar-Kokhba and his subordinates. Another notable bundle of papyri, known as the Babatha cache, comprises legal documents of Babatha, a female landowner of the same period. The cave is located at the head of Nahal Hever in the Judean desert, about 40 kilometres (25 mi) south of Qumran, 20 km south of Wadi Murabba'at. The site is a few kilometers southwest of En-gedi, approximately 10 kilometers north of Masada, on the western shore of the Dead Sea. The cave has two openings, three halls and some crevices.

Cave of Horror is the name given to what archaeologists have catalogued as Cave 8 of the Judaean Desert, Israel, where the remains of Jewish refugees from the Bar Kokhba revolt were found. The cave lies in the southern cliff of the Nahal Hever wadi, adjacent to the Cave of Letters located on the northern cliff of the stream, where many documents from the Bar Kokhba revolt were uncovered. Cave of horror, or just a good hiding place?

Hiding Place is a secret place for concealing someone or hiding something. Hiding is to keep something out of sight for protection and safety. The activity of keeping something secret. Prevent something from being seen or discovered. To cover as if with a shroud. To make something undecipherable or imperceptible by obscuring or concealing.

Rosetta Disk is the physical companion of the Rosetta Digital Language Archive, and a prototype of one facet of The Long Now Foundation's 10,000-Year Library.

Ancient Symbols - Petroglyphs - Carving - Sacred Text - Symbols - Decoding - Writing

Rosetta Stone is a granodiorite stele inscribed with three versions of a decree issued in Memphis, Egypt in 196 BC during the Ptolemaic dynasty on behalf of King Ptolemy V Epiphanes. The top and middle texts are in Ancient Egyptian using hieroglyphic and Demotic scripts respectively, while the bottom is in Ancient Greek. The decree has only minor differences between the three versions, making the Rosetta Stone key to deciphering the Egyptian scripts.

Internet Archive is a non-profit library of millions of free books, movies, software, music, websites, and more.

Archive Team is a loose collective of rogue archivists, programmers, writers and loudmouths dedicated to saving our digital heritage.

Preservation Library and Archival Science refers to the set of activities that aims to prolong the life of a record and relevant metadata, or enhance its value, or improve access to it through non-interventive means. This includes actions taken to influence records creators prior to selection and acquisition.

Wikipedia Terminal Event Management Policy is an official policy of Wikipedia detailing the procedures to be followed to safeguard the content of the encyclopedia in the event of a non-localized event that would render the continuation of Wikipedia in its current form untenable. The policy is designed to facilitate the preservation of the encyclopedia by a transition to non-electronic media in an orderly, time-sensitive manner or, if events dictate otherwise, the preservation of the encyclopedia by other means. Editors are asked to familiarize themselves with the procedures and in the unlikely event that the implementation of these procedures proves necessary, act in accordance with the procedural guidelines, inasmuch as circumstances allow.

Cryptosteel Cold Storage Wallet

Biogenesis is the production of new living organisms or organelles.

Cells - Evolution

Digital Amnesia (Bregtje van der Haak, VPRO) (youtube)

The End of Memory is a documentary film directed by Vincent Amouroux and produced by ARTE France - ZED (Diff: ARTE; France, 2014, 52 min). This film is a scientific investigation into the challenges of memory storage and the short lifespan of current storage formats.

Big Data

File Format is a standard way that information is encoded for storage in a computer file. It specifies how bits are used to encode information in a digital storage medium. Document Formats - Standards.
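One common way a file format announces itself is with a "magic number" at the start of the file. The sketch below is a simplified, hypothetical detector of a few well-known signatures, not a complete or authoritative list:

```python
# A few well-known file signatures ("magic numbers") and the formats they mark.
MAGIC_NUMBERS = {
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"%PDF-": "PDF document",
    b"PK\x03\x04": "ZIP archive (also DOCX, EPUB, ...)",
    b"GIF87a": "GIF image",
    b"GIF89a": "GIF image",
}

def guess_format(path: str) -> str:
    """Guess a file's format from its leading bytes."""
    with open(path, "rb") as f:
        header = f.read(16)
    for magic, name in MAGIC_NUMBERS.items():
        if header.startswith(magic):
            return name
    return "unknown format"

# Example: guess_format("report.pdf") -> "PDF document"
```

This is why long-term preservation depends on keeping format specifications as well as the data itself: the bytes are meaningless without a documented way to decode them.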

Encoding Memory is the ability to encode, store and recall information. Memories give an organism the capability to learn and adapt from previous experiences as well as build relationships. Encoding allows the perceived item of use or interest to be converted into a construct that can be stored within the brain and recalled later from short-term or long-term memory.

Working Memory stores information for immediate use or manipulation which is aided through hooking onto previously archived items already present in the long-term memory of an individual. Volatile Memory.

Knowledge Ark (a Noah's Ark for knowledge) is a collection of knowledge preserved in such a way that future generations would have access to said knowledge if current means of access were lost. Scenarios where access to information (such as the Internet) would be lost could be described as Existential Risks or Extinction Level Events. A knowledge ark could take the form of a traditional Library or a modern computer database. It could also include images only (such as photographs of important information, or diagrams of critical processes). A knowledge ark would have to be resistant to the effects of natural or man-made disasters to be viable. Such an ark should include, but would not be limited to, information or material relevant to the survival and prosperity of human civilization. Current examples include the Svalbard Global Seed Vault, a Seed Bank which is intended to preserve a wide variety of plant seeds (such as important crops) in case of their extinction.

Photo Ark - Joel Sartore - Sending Human Knowledge into Outer Space - Encyclopedias - Voyager

Knowledge Vault is a knowledge base created by Google. As of 2014, it contained 1.6 billion facts which had been collated automatically from the Internet. Knowledge Vault is a potential successor to Google's Knowledge Graph. The Knowledge Graph pulled in information from structured sources like Freebase, Wikidata and Wikipedia, while the Knowledge Vault is an accumulation of facts from across the entire web, including unstructured sources. "Facts" in Knowledge Vault also include a confidence value, giving the capability of distinguishing between knowledge statements that have a high probability of being true from others that may be less likely to be true (based on the source that Google obtained the data from and other factors). The concept behind the Knowledge Vault was presented in a paper authored by a Google Research team. Google has indicated that Knowledge Vault is a research paper and not an active product in development, as of August 2014.

Print Wikipedia printed 106 of the 7,473 volumes of English Wikipedia as it existed on April 7, 2015 and also included wallpaper displaying 1,980 additional volumes. A 36-volume index of all of the 7.5 million contributors to English Wikipedia is also part of the project. The table of contents takes up 91 volumes of 700 pages each. The printed volumes only include the text of the articles; images and references are not included. DNA Printed Out on Paper.

Norway World Arctic Archive is built in “Mine 3,” an abandoned coal mine close to the Global Seed Vault. Countries are being encouraged to submit data that is particularly significant to their culture. Grønland 56, 3045 Drammen, NORWAY, +47 905 33 432, office@piql.com.

A Lunar Ark has been proposed which would store and transmit valuable information to receiver stations on Earth. The success of this would also depend on the availability of compatible receiver equipment on Earth, and adequate knowledge of that equipment's operation. Other types of knowledge arks might include genetic material. With the potential for widespread personal DNA sequencing becoming a reality, an individual might agree to store their genetic code in a digital or analog storage format which would enable later retrieval of that code. If a species was sequenced before extinction, its genome would remain available for study even in the case of extinction.

The Arch Mission Foundation is building a space-based archive designed to survive for 6 billion years or more — a million times longer than the oldest written records in existence today. One of the primary evolutionary challenges that we face is amnesia about our past mistakes, and the lack of active countermeasures to repeating them. The Arch Lunar Library contains a 30 million page archive of human history and civilization, covering all subjects, cultures, nations, languages, genres, and time periods. Israel’s Beresheet spacecraft little lunar probe carries a 30-million-page archive of human knowledge etched into a DVD-size metal disc. The Arch Lunar Librar represents the first in a series of lunar archives from the Arch Mission Foundation, designed to preserve the records of our civilization for up to billions of years. It is installed in the SpaceIL “Beresheet” lunar lander, scheduled to land on the Moon in April of 2019. The Library is housed within a 100 gram nanotechnology device that resembles a 120mm DVD. However it is actually composed of 25 nickel discs, each only 40 microns thick, that were made for the Arch Mission Foundation by NanoArchival. The first four layers contain more than 60,000 analog images of pages of books, photographs, illustrations, and documents - etched as 150 to 200 dpi, at increasing levels of magnification, by optical nanolithography. The first analog layer is the Front Cover and is visible to the naked eye. It contains 1500 pages of text and images, as well as holographic diffractive logos and text, and can be easily read with a 100X magnification optical microscope, or even a lower power magnifying glass. The next three analog layers each contain 20,000 images of pages of text and photos at 1000X magnification, and require a slightly more powerful microscope to read. Each letter on these layers is the size of a bacillus bacterium. Also in the analog layers of the Library is a specially designed “Primer” that teaches over a million concepts in pictures and corresponding words across major languages, as well as the content of the Wearable Rosetta disc, from the Long Now Foundation, which teaches the linguistics of thousands of languages. Following the Primer, are a series of documents that teach the technical specifications, file formats, and scientific and engineering knowledge necessary to access, decode and understand, the digital information encoded in deeper layers of the Library. Also in the analog layers, are several private archives, including an Israeli time-capsule for SpaceIL, containing the culture and history of Israel, songs, and drawings by children. Beneath the analog layers of the Library are 21 layers of 40 micron thick nickel foils, each containing a DVD master. Collectively, the digital layers contain more than 100GB of highly compressed datasets, which decompress to nearly 200GB of content, including the text and XML of the English Wikipedia, plus tens of thousands of PDFs of books — including fiction, non-fiction, a full reference library, textbooks, technical and scientific handbooks, and more. The digital layers also contain the Panlex datasets from the Long Now Foundation, a linguistic key to 5000 languages, with 1.5 billion translations between them. All the necessary specifications for extracting the file formats and content within the digital layers are provided in the analog layers above. Has a independent third party proof read this material?

Scientists devise method to secure Earth's biodiversity on the moon. Proposed lunar biorepository could store genetic samples without electricity or liquid nitrogen.

Voyager 1 and 2 - The Test of Time - Mind Map

Time Capsule is a historic cache of goods or information, usually intended as a deliberate method of communication with future people, and to help future archaeologists, anthropologists, or historians. The preservation of holy relics dates back for millennia, but the practice of preparing and preserving a collection of everyday artifacts and messages to the future appears to be a more recent practice. Time capsules are sometimes created and buried during celebrations such as a world's fair, a cornerstone laying for a building, or at other ceremonies. Commercially-manufactured sealable containers are sold for protection of personal time capsules; some of the more durable waterproof containers used for geocaching may also be suitable. Many underground time capsules are destroyed by groundwater infiltration after short periods of time; caches stored within the wall cavities of buildings can survive as long as the building is used and maintained. In 2016, the art collective Ant Farm displayed a show, The Present Is the Form of All Life: The Time Capsules of Ant Farm and LST, at the art center Pioneer Works, in Brooklyn, New York. The artists had previous experiences with failed time capsules, and were now exploring "digital time capsules" as a more durable form of preservation. They have said, "We've come to understand that the best way to preserve digital media is to distribute it." Blockchain and cognitive learning are now used in time capsule technology. Researchers have started to study methods of preserving digital data in forms that will still be usable in the distant future.

Message in a Bottle is a form of communication in which a message is sealed in a container (typically a bottle) and released into a conveyance medium (typically a body of water). Time Travel.

UNESCO established the Memory of the World Programme in 1992.

Transgenerational is acting across multiple generations.

Encyclopedia Galactica is a fictional or hypothetical encyclopedia containing all the knowledge accumulated by a galaxy-spanning civilization. The name evokes the exhaustive aspects of the real-life Encyclopædia Britannica.

Venice Time Machine is an open digital archive of the city's cultural heritage covering more than 1,000 years of evolution. The project aims to trace circulation of news, money, commercial goods, migration, artistic and architectural patterns amongst others to create a Big Data of the Past. Its fulfillment would represent the largest database ever created on Venetian documents. The project is an example of the new area of scholar activity that has emerged in the Digital Age.

Digital Humanities are ways of doing scholarship that involve collaborative, transdisciplinary, and computationally engaged research, teaching, and publishing. It brings digital tools and methods to the study of the humanities with the recognition that the printed word is no longer the main medium for knowledge production and distribution.

Earth's Black Box will record every step we take towards climate catastrophe. Hundreds of data sets, measurements and interactions relating to the health of our planet will be continuously collected and safely stored for future generations. The Mind.

Flight Recorder is an electronic recording device placed in an aircraft for the purpose of facilitating the investigation of aviation accidents and incidents. Flight recorders are also known by the misnomer black box—they are, in fact, painted bright orange in color to aid in their recovery after accidents. There are two different flight recorder devices: the flight data recorder (FDR) preserves the recent history of the flight through the recording of dozens of parameters collected several times per second; the cockpit voice recorder (CVR) preserves the recent history of the sounds in the cockpit, including the conversation of the pilots. The two devices may be combined into a single unit. Together, the FDR and CVR objectively document the aircraft's flight history, which may assist in any later investigation. The two flight recorders are required by international regulation, overseen by the International Civil Aviation Organization, to be capable of surviving the conditions likely to be encountered in a severe aircraft accident. For this reason, they are typically specified to withstand an impact of 3400 g and temperatures of over 1,000 °C (1,830 °F), as required by EUROCAE ED-112. They have been a mandatory requirement in commercial aircraft in the United States since 1967. After the unexplained disappearance of Malaysia Airlines Flight 370 in 2014, commentators have called for live streaming of data to the ground, as well as extending the battery life of the underwater locator beacons.


Long Term Thinking


Futures Studies is the study of postulating possible, probable, and preferable futures and the worldviews and myths that underlie them. Seeks to understand what is likely to continue and what could plausibly change. Part of the discipline thus seeks a systematic and pattern-based understanding of past and present, and to determine the likelihood of future events and trends. Future studies is also called Futurology or Foresight Organizations.

Futurist is a person who studies the future and makes predictions about it based on current trends. Futurists are people whose specialty or interest is futurology, or the attempt to systematically explore predictions and possibilities about the future and how they can emerge from the present, whether that of human society in particular or of life on Earth in general.

Thought Leaders - Oracle - Technological Advancements - Scenario Planning - Quotes about the Future

Planning Far into the Future - Long Now - Long-Term Thinking - Manual for Civilization - Seven Generation Sustainability

Long Now Foundation seeks to start and promote a long-term cultural institution. It aims to provide a counterpoint to what it views as today's faster/cheaper mindset and to promote slower/better thinking. The Long Now Foundation hopes to "creatively foster responsibility" in the framework of the next 10,000 years. In a manner somewhat similar to the Holocene calendar, the foundation uses 5-digit dates to address the Year 10,000 problem (e.g., by writing the current year "02021" rather than "2021"). The organisation's logo is X̄ (a capital X with an overline), a representation of 10,000 in Roman numerals.
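The five-digit convention is easy to reproduce. A minimal sketch of formatting years the Long Now way, zero-padded to five digits so the year 10,000 does not overflow the format:

```python
def long_now_year(year: int) -> str:
    """Format a year with five digits, as the Long Now Foundation writes it."""
    return f"{year:05d}"

print(long_now_year(2021))   # -> "02021"
print(long_now_year(10000))  # -> "10000"
```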

Strategic Foresight is a planning-oriented discipline related to futures studies, the study of the future. Strategy is a high level plan to achieve one or more goals under conditions of uncertainty. Strategic foresight happens when any planner uses scanned inputs, forecasts, alternative futures exploration, analysis and feedback to produce or alter plans and actions of the organization.

Space Song Foundation supports long-range space missions, while promoting long-term thinking at the intersection of art, science, and design.

"When you look down you can only see a few feet ahead. When you look straight forward you can see miles ahead. Feet = Days and Miles = Years. Don't just look down, look ahead. Live in the moment but also live for the future."

History - Libraries - Curate - Knowledge Ark

Clock of the Long Now or 10,000-year clock, is a mechanical clock under construction that is designed to keep time for 10,000 years. It is being built by the Long Now Foundation.



Knowledge Storage Types


Writing on Stone could last 10,000 years. Stone Carving - History in Granite.

Writing on Paper could last 1,000 years (permanent paper without bleaches, using acrylic ink).

Writing on Vinyl could last 50 years.

Writing on a CD could last 20 years.

Writing on Quartz Stone could last millions of years. Laser-etched quartz glass will store data for millions of years (Hitachi and Kyoto University's Kiyotaka Miura). Researchers teleport information within a Diamond.

Optical data storage breakthrough. Physicists have developed a technique with the potential to enhance optical data storage capacity in diamonds. This is possible by multiplexing the storage in the spectral domain.

Holographic message encoded in simple plastic. Important data can be stored and concealed quite easily in ordinary plastic using 3D printers and terahertz radiation, scientists show. Holography can be done quite easily: A 3D printer can be used to produce a panel from normal plastic in which a QR code can be stored, for example. The message is read using terahertz rays -- electromagnetic radiation that is invisible to the human eye.

Whatever matter we choose to store our valuable information on, the information needs to be transferred periodically, because all matter eventually degrades or goes through a transition. That is why DNA gets passed on to a new human body, so that it will not get lost or be deleted unless an extinction happens. DNA Storage.
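As a toy illustration of that migration principle, the sketch below uses the nominal lifespans listed above (the figures and the safety-margin parameter are assumptions for the example, not archival standards) to flag when data written to a given medium is due to be copied forward:

```python
# Nominal media lifespans in years, as listed above; real-world values vary widely.
MEDIA_LIFESPAN_YEARS = {
    "stone": 10_000,
    "paper": 1_000,
    "vinyl": 50,
    "cd": 20,
    "magnetic_tape": 30,
}

def needs_migration(medium: str, year_written: int, current_year: int,
                    safety_margin: float = 0.5) -> bool:
    """Return True once data has lived through a set fraction of its medium's lifespan."""
    lifespan = MEDIA_LIFESPAN_YEARS[medium]
    return (current_year - year_written) >= lifespan * safety_margin

print(needs_migration("cd", year_written=2005, current_year=2025))     # True  (20 of ~20 years)
print(needs_migration("paper", year_written=1900, current_year=2025))  # False (125 of ~1,000 years)
```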

3D Optical Disk with 100 Layers has an ultra-high capacity of 1.6 petabits, or about 200 terabytes (200,000 gigabytes).

NanoFiche Deck is made of 16 layers of nickel, including the cover sheet, riding on the surface of Intuitive Machines' Odysseus lunar lander, which touched down on the Moon. The Intuitive Machines IM-1 mission made its historic lunar landing as part of NASA's Commercial Lunar Payload Services ("CLPS") initiative. This was the first time the US had landed on the Moon since the Apollo program, over 50 years ago.

Slow Fire describes paper embrittlement resulting from acid decay or continuous acidification of paper. Solutions to this problem include the use of acid-free paper stocks, reformatting brittle books by microfilming, photocopying or digitization, and a variety of deacidification techniques to increase the pH of acidic paper on a large scale. Although acid-free paper has become more common, a large body of acidic paper still exists in books made after the 1850s because of its cheaper and simpler production methods. Acidic paper, especially when exposed to light, air pollution, or high relative humidity, yellows and becomes brittle over time. During mass de-acidification an alkaline agent is deposited in the paper to neutralize existing acid and prevent further decay.

Machine-Readable Medium is a medium capable of storing data in a format readable by a mechanical device (rather than human readable). Examples of machine-readable media include magnetic media such as magnetic disks, cards, tapes, and drums, punched cards and paper tapes, optical discs, barcodes and magnetic ink characters. Common machine-readable technologies include magnetic recording, processing waveforms, and barcodes. Optical character recognition (OCR) can be used to enable machines to read information available to humans. Any information retrievable by any form of energy can be machine-readable. MARC standards or MARC (Machine-Readable Cataloging) standards are a set of digital formats for the description of items catalogued by libraries, such as books.

Magnetic Tape is a medium for magnetic recording, made of a thin, magnetizable coating on a long, narrow strip of plastic film (15-30 years). Magnetic Storage is the storage of data on a magnetised medium. Magnetic storage uses different patterns of magnetisation in a magnetisable material to store data and is a form of non-volatile memory. The information is accessed using one or more read/write heads. Non-Volatile Memory is a type of computer memory that can retrieve stored information even after having been power cycled (turned off and back on). The opposite of non-volatile memory is volatile memory, which needs constant power in order to prevent data from being erased. Examples of non-volatile memory include read-only memory, flash memory, ferroelectric RAM, most types of magnetic computer storage devices (e.g. hard disk drives, floppy disks, and magnetic tape), optical discs, and early computer storage methods such as paper tape and punched cards.

Robust high-performance data storage through magnetic anisotropy. A technologically relevant material for HAMR data memories are thin films of iron-platinum nanograins. An international team has now observed experimentally for the first time how a special spin-lattice interaction in these iron-platinum thin films cancels out the thermal expansion of the crystal lattice.

Wire Recording was the first early magnetic recording technology, an analog type of audio storage in which a magnetic recording is made on thin steel or stainless steel wire. The first crude magnetic recorder was invented in 1898 by Valdemar Poulsen.

Nearline Magnetic Tape - Artificial Brain

Magnetic Hard Drives go atomic. Physicists demonstrate the first single-atom magnetic storage. Existing hard drives use magnets made of about 1 million atoms to store a single bit of data. Chop a magnet in two, and it becomes two smaller magnets. Slice again to make four. But the smaller magnets get, the more unstable they become; their magnetic fields tend to flip polarity from one moment to the next. Now, however, physicists have managed to create a stable magnet from a single atom.

A new ultrafast control scheme of ferromagnet for energy-efficient data storage. Using a single laser pulse that did not switch the ferrimagnetic layer, researchers demonstrated a much faster and less energy consuming switching of the ferromagnet. The digital data generated around the world every year is now counted in zettabytes, or trillions of billions of bytes -- equivalent to delivering data for hundreds of millions of books every second. The amount of data generated continues to grow. If existing technologies remained constant, all the current global electricity consumption would be devoted to data storage by 2040.

Rewritable Atomic-Scale Memory Storage Device: Little patterns of atoms can be arranged to represent English characters, fitting the content of more than a billion books onto the surface of a stamp.

A Kilobyte Rewritable Atomic Memory - Nano Technology

Data shrunk to a microscopic size is encapsulated between two sapphire disks. It can preserve 10,000 letter-size pages at 150 dpi, or 2,700 650×850 pictures, keeping personal data intact for 1,000 years. Any magnifying device (200x) is sufficient to access the saved data. (Nanoform).

Data Storage Device is a device for recording and storing information or data. Recording can be done using virtually any form of energy, spanning from manual muscle power in handwriting, to acoustic vibrations in phonographic recording, to electromagnetic energy modulating magnetic tape and optical discs. A storage device may hold information, process information, or both. A device that only holds information is a recording medium. Devices that process information (data storage equipment) may either access a separate portable (removable) recording medium or a permanent component to store and retrieve data. Electronic data storage requires electrical power to store and retrieve that data. Most storage devices that do not require vision and a brain to read data fall into this category. Electromagnetic data may be stored in either an analog data or digital data format on a variety of media. This type of data is considered to be electronically encoded data, whether it is electronically stored in a semiconductor device, for it is certain that a semiconductor device was used to record it on its medium. Most electronically processed data storage media (including some forms of computer data storage) are considered permanent (non-volatile) storage, that is, the data will remain stored when power is removed from the device. In contrast, most electronically stored information within most types of semiconductor (computer chips) microcircuits are volatile memory, for it vanishes if power is removed. Except for barcodes, optical character recognition (OCR), and magnetic ink character recognition (MICR) data, electronic data storage is easier to revise and may be more cost effective than alternative methods due to smaller physical space requirements and the ease of replacing (rewriting) data on the same medium.

Solving a memristor mystery to develop efficient, long-lasting memory devices. Newly discovered role of phase separation can help develop memory devices for energy-efficient AI computing. Phase separation, when molecules part like oil and water, works alongside oxygen diffusion to help memristors -- electrical components that store information using electrical resistance -- retain information even after the power is shut off, according to a recent study. To better understand the underlying phenomenon driving nonvolatile memristor memory, the researchers focused on a device known as resistive random access memory or RRAM, an alternative to the volatile RAM used in classical computing, and are particularly promising for energy-efficient artificial intelligence applications.

Plastination is a technique or process used in anatomy to preserve bodies or body parts. The water and fat are replaced by certain plastics, yielding specimens that can be touched, do not smell or decay, and even retain most properties of the original sample.

For 20 years, beginning in the 1950s, states laminated documents to try to protect them. But it caused a chemical reaction. The natural acids from the paper mixed with the degrading laminate to create a noxious vinegar. Each passing year will further degrade the document until it's gone. There are as many as 6 million laminated historical documents.

Printing on Plastic Sheets will last a long time. Plastic takes millions of years to degrade.

Ultra-high density optical data storage in common transparent plastics.

Scientists develop plastic flexible magnetic memory device.

Plastic Film is a thin continuous polymeric material. Thicker plastic material is often called a “sheet”. These thin plastic membranes are used to separate areas or volumes, to hold items, to act as barriers, or as printable surfaces. Plastic films are used in a wide variety of applications. These include: packaging, plastic bags, labels, building construction, landscaping, electrical fabrication, photographic film, film stock for movies, video tape, etc.

How to store information in your clothes invisibly, without electronics - Sensors

Storing Data in Music. Researchers have developed a technique for embedding data in music and transmitting it to a smartphone. Since the data is imperceptible to the human ear, it doesn't affect listening pleasure. The transmission principle behind this technique is fundamentally different from the well-known RDS system as used in car radios to transmit the radio station's name and details of the music that is playing. Radio Data System is a communications protocol standard for embedding small amounts of digital information in conventional FM radio broadcasts. Subliminal Messages.

Piggyback transportation refers to the transportation of goods where one transportation unit is carried on the back of something else. It is a specialized form of intermodal transportation and combined transport. Piggybacking Data Transmission (wiki) - Gaining access to a restricted communications channel by using the session another user already established. Embedded System (wiki).

Say goodbye to the dots and dashes to enhance optical storage media. Purdue University innovators have created technology aimed at replacing Morse code with colored "digital characters" to modernize optical storage. This advancement allows for more data to be stored and for that data to be read at a quicker rate. Morse code has been around since the 1830s.

Scientists from the RIKEN Center for Emergent Matter Science and collaborators have shown that they can manipulate single skyrmions—tiny magnetic vortices that could be used as computing bits in future ultra-dense information storage devices—using pulses of electric current, at room temperature.

Researchers solve mystery surrounding dielectric properties of unique metal oxide. A research team has solved a longstanding mystery surrounding strontium titanate, a metal oxide semiconductor, providing insight for future research on the material and its applications to electronic devices and data storage.


DNA Information Storage


Writing on DNA could last over 100,000 years. (maybe someone has already done this millions of years ago?)

ETH Zurich is writing digital information on DNA and then encapsulating it in a protective layer of glass. DNA.

DNA Digital Data Storage (wiki) - (Instead of zeros and ones, we use the 4 letters C, T, A and G, or "SeeTag").

DNA Knowledge - Li-Fi - Digital Inheritance - Genetic Memory

Researchers Store Computer Operating System and Short Movie on DNA. DNA is an ideal storage medium because it's ultra-compact and can last hundreds of thousands of years if kept in a cool, dry place, as demonstrated by the recent recovery of DNA from the bones of a 430,000-year-old human ancestor found in a cave in Spain. They compressed the files into a master file, and then split the data into short strings of binary code made up of ones and zeros. Using an erasure-correcting algorithm called fountain codes, they randomly packaged the strings into so-called droplets, and mapped the ones and zeros in each droplet to the four nucleotide bases in DNA: A, G, C and T. The algorithm deleted letter combinations known to create errors, and added a barcode to each droplet to help reassemble the files later. They generated a digital list of 72,000 DNA strands, each 200 bases long. To retrieve their files, they used modern sequencing technology to read the DNA strands, followed by software to translate the genetic code back into binary. They recovered their files with zero errors, the study reports. They also demonstrated that a virtually unlimited number of copies of the files could be created with their coding technique by multiplying their DNA sample through polymerase chain reaction (PCR), and that those copies, and even copies of their copies, and so on, could be recovered error-free. Finally, the researchers show that their coding strategy packs 215 petabytes of data on a single gram of DNA. The capacity of DNA data-storage is theoretically limited to two binary digits for each nucleotide, but the biological constraints of DNA itself and the need to include redundant information to reassemble and read the fragments later reduces its capacity to 1.8 binary digits per nucleotide base. The team's insight was to apply fountain codes, a technique Erlich remembered from graduate school, to make the reading and writing process more efficient. With their DNA Fountain technique, Erlich and Zielinski pack an average of 1.6 bits into each base nucleotide. That's at least 60 percent more data than previously published methods, and close to the 1.8-bit limit. The researchers spent $7,000 to synthesize the DNA they used to archive their 2 megabytes of data, and another $2,000 to read it.
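As a much-simplified illustration of the basic idea, the sketch below uses a plain 2-bits-per-base mapping (the theoretical limit mentioned above) without the fountain codes, error correction, or addressing barcodes the researchers actually used; it is not the DNA Fountain method itself, just the core transcription step:

```python
# Simplified sketch: map every 2 bits to one nucleotide. Real systems such as
# DNA Fountain add redundancy and addressing, reaching about 1.6 bits per base.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def bytes_to_dna(data: bytes) -> str:
    """Encode bytes as a DNA strand, 4 bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(strand: str) -> bytes:
    """Decode a DNA strand back into the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"KEEP"
strand = bytes_to_dna(message)
print(strand)                        # -> "CAGTCACCCACCCCAA"
assert dna_to_bytes(strand) == message
```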

How DNA is preserved in Archaeological Sediments for thousands of years.

Life Ship adds your DNA to a time capsule of life from Earth launching to the Moon.

MIT researchers have devised a way to encapsulate DNA into a thermoset polymer known as cross-linked polystyrene. After the DNA is embedded into the polymer, it can be released again by treating the polymer with cysteamine.

Discovery of world's oldest DNA breaks record by one million years. Microscopic fragments of environmental DNA were found in Ice Age sediment in northern Greenland. Using cutting-edge technology, researchers discovered the fragments are one million years older than the previous record for DNA sampled from a Siberian mammoth bone. The ancient DNA has been used to map a two-million-year-old ecosystem which weathered extreme climate change. The incomplete samples, a few millionths of a millimetre long, were taken from the København Formation, a sediment deposit almost 100 metres thick tucked in the mouth of a fjord in the Arctic Ocean at Greenland's northernmost point. The climate in Greenland at the time varied between Arctic and temperate and was between 10 and 17°C warmer than Greenland is today. The sediment built up metre by metre in a shallow bay. Scientists discovered evidence of animals, plants and microorganisms including reindeer, hares, lemmings, birch and poplar trees. Researchers even found that the mastodon, an Ice Age mammal, roamed as far as Greenland before later becoming extinct. Previously it was thought the range of the elephant-like animals did not extend as far as Greenland from their known origins in North and Central America.

If DNA can be used as information storage, then there must be information in our DNA from our creator. There must be messages in our DNA that our creator has left us.

Capturing the immense potential of microscopic DNA for data storage. Researchers at NUS CDE pioneer an innovative 'biological camera' that ushers in a new paradigm of information storage. A 'biological camera' bypasses the constraints of current DNA storage methods, harnessing living cells and their inherent biological mechanisms to encode and store data. This represents a significant breakthrough in encoding and storing images directly within DNA, creating a new model for information storage reminiscent of a digital camera. Led by Principal Investigator Associate Professor Chueh Loo Poh from the College of Design and Engineering at the National University of Singapore, and the NUS Synthetic Biology for Clinical and Technological Innovation (SynCTI), the team's findings, which could potentially shake up the data-storage industry, were published in Nature Communications on 3 July 2023.

Storing Data in everyday objects. Researchers have discovered a new method for turning nearly any object into a data storage unit. This makes it possible to save extensive data in, say, shirt buttons, water bottles or even the lenses of glasses, and then retrieve it years later. The technique also allows users to hide information and store it for later generations. It uses DNA as the storage medium. DNA of Things. Several developments of the past few years have made this advance possible. One of them is Grass's method for marking products with a DNA "barcode" embedded in minuscule glass beads. These nanobeads have various uses; for example, as tracers for geological tests, or as markers for high-quality foodstuffs, thus distinguishing them from counterfeits. The barcode is relatively short: just a 100-bit code (100 places filled with "0"s or "1"s). This technology has now been commercialised by ETH spin-off Haelixa. At the same time, it has become possible to store enormous data volumes in DNA. Grass's colleague Yaniv Erlich, an Israeli computer scientist, developed a method that theoretically makes it possible to store 215,000 terabytes of data in a single gram of DNA. And Grass himself was able to store an entire music album in DNA -- the equivalent of 15 megabytes of data.

Abstract: The ability to write a stable record of identified molecular events into a specific genomic locus would enable the examination of long cellular histories and have many applications, ranging from developmental biology to synthetic devices. We show that the type I-E CRISPR-Cas system of E. coli can mediate acquisition of defined pieces of synthetic DNA. We harnessed this feature to generate records of specific DNA sequences into a population of bacterial genomes. We then applied directed evolution to alter the recognition of a protospacer adjacent motif by the Cas1-Cas2 complex, which enabled recording in two modes simultaneously. We used this system to reveal aspects of spacer acquisition, fundamental to the CRISPR-Cas adaptation process. These results lay the foundations of a multimodal intracellular recording device.

For 3 billion years, one of the major carriers of information needed for life, RNA, has had a glitch that creates errors when making copies of genetic information. Researchers at The University of Texas at Austin have developed a fix that allows RNA to accurately proofread for the first time. Certain viruses called retroviruses copy RNA into DNA, a process called reverse transcription. This process is notoriously prone to errors because an evolutionary ancestor of all viruses never had the ability to accurately copy genetic material. The new innovation engineered at UT Austin is an enzyme that performs reverse transcription but can also "proofread," or check its work while copying genetic code. The enzyme allows, for the first time, large amounts of RNA information to be copied with near perfect accuracy.

Molecular Memory is a term for data storage technologies that use molecular species as the data storage element, rather than e.g. circuits, magnetics, inorganic materials or physical shapes. The molecular component can be described as a molecular switch, and may perform this function by any of several mechanisms, including charge storage, photochromism, or changes in capacitance. In a perfect molecular memory device, each individual molecule contains a bit of data, leading to massive data capacity. However, practical devices are more likely to use large numbers of molecules for each bit, in the manner of 3D optical data storage (many examples of which can be considered molecular memory devices). The term "molecular memory" is most often used to indicate very fast, electronically addressed solid-state data storage, as is the term "computer memory". At present, molecular memories are still found only in laboratories.

Molecular recordings by directed CRISPR spacer acquisition.

Molecular thumb drives: Researchers store digital images in metabolite molecules. In a step toward molecular storage systems that could hold vast amounts of data in tiny spaces, researchers have shown it's possible to store image files in solutions of common biological small molecules. DNA molecules are well known as carriers of huge amounts of biological information, and there is growing interest in using DNA in engineered data storage devices that can hold vastly more data than our current hard drives. But new research shows that DNA isn't the only game in town when it comes to molecular data storage.

A Smear of DNA Can Hold 10,000 Gigabytes of Data. The U.S. is investing $48 million to turn DNA into living hard drives. Intelligence Advanced Research Projects Activity - Molecular Information Storage.

New approach to DNA data storage makes system more dynamic and scalable, giving users the ability to read or modify data files without destroying them and making the systems easier to scale up for practical use. Current systems rely on sequences of DNA called primer-binding sequences that are added to the ends of DNA strands that store information. In short, the primer-binding sequence of DNA serves as a file name. When you want a given file, you retrieve the strands of DNA bearing that sequence. Many of the practical barriers to DNA data storage technologies revolve around the use of PCR to retrieve stored data. Systems that rely on PCR have to drastically raise and lower the temperature of the stored genetic material in order to rip the double-stranded DNA apart and reveal the primer-binding sequence. This results in all of the DNA -- the primer-binding sequences and the data-storage sequences -- swimming free in a kind of genetic soup. Existing technologies can then sort through the soup to find, retrieve and copy the relevant DNA using PCR. The temperature swings are problematic for developing practical technologies, and the PCR technique itself gradually consumes -- or uses up -- the original version of the file that is being retrieved.
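
A toy sketch (Python; the data and names are hypothetical) of the "primer-binding sequence as file name" idea: every stored strand starts with a short primer sequence, and retrieving a file means collecting every strand that begins with the requested primer. Real systems do this chemically, for example with PCR, rather than by string matching.

```python
# Toy model of "primer-binding sequence as file name": each stored strand
# starts with a short primer sequence; retrieval means collecting every
# strand that begins with the requested primer.
stored_strands = [
    "ACGTACGT" + "TTAGGCAT",   # file "ACGTACGT", fragment 1
    "ACGTACGT" + "GGCATTAA",   # file "ACGTACGT", fragment 2
    "TTTTCCCC" + "AACCGGTT",   # a different file
]

def retrieve(primer: str) -> list[str]:
    return [s[len(primer):] for s in stored_strands if s.startswith(primer)]

print(retrieve("ACGTACGT"))    # ['TTAGGCAT', 'GGCATTAA']
```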

Catching electrons in action in an antiferromagnetic nanowire. Because of a property called spin, electrons behave like tiny magnets: just as a bar magnet's magnetization is dipolar, pointing from south to north, the electrons in a material have magnetic dipole moment vectors that describe the material's magnetization. When these vectors are in random orientation, the material is nonmagnetic. When they are parallel to each other, it's called ferromagnetism, and antiparallel alignments are antiferromagnetism. Current data storage technology is based on ferromagnetic materials, where the data are stored in small ferromagnetic domains. This is why a strong enough magnet can mess up a mobile phone or other electronic storage. Depending on the direction of magnetization (whether pointing up or down), data are recorded as bits (either a 1 or 0) in ferromagnetic domains. However, there are two bottlenecks, and both hinge on proximity. First, bring an external magnet too close, and its magnetic field could alter the direction of magnetic moments in the domain and damage the storage device. And, second, the domains each have a magnetic field of their own, so they can't be too close to each other either. The challenge with smaller, more flexible, more versatile electronics is that they demand devices that make it harder to keep ferromagnetic domains safely apart. In a study published in Nano Letters, physicists from Michigan Technological University explore alternative materials to improve capacity and shrink the size of digital data storage technologies.


Analog


Analog refers to signals or information represented by a continuously variable physical quantity such as spatial position, voltage, etc.

Digital Information - Analog Computer

Analog Signal has a theoretically infinite resolution. In practice an analog signal is subject to electronic noise and distortion introduced by communication channels and signal processing operations, which can progressively degrade the Signal-to-Noise Ratio or SNR, which is a measure used in science and engineering that compares the level of a desired signal to the level of background noise. It is defined as the ratio of signal power to the noise power, often expressed in decibels. A ratio higher than 1:1 (greater than 0 dB) indicates more signal than noise. While SNR is commonly quoted for electrical signals, it can be applied to any form of signal (such as isotope levels in an ice core or biochemical signaling between cells). Errors.

Analogue Electronics are electronic systems with a continuously variable signal, in contrast to digital electronics where signals usually take only two levels. The term "analogue" describes the proportional relationship between a signal and a voltage or current that represents the signal. The word analogue is derived from the Greek word ανάλογος (analogos) meaning "proportional".

Analog Device is usually a combination of both analog machine and analog media that can together measure, record, reproduce, or broadcast continuous information, for example, the almost infinite number of grades of transparency, voltage, resistance, rotation, or pressure. In theory, the continuous information (also analog signal) has an infinite number of possible values with the only limitation on resolution being the accuracy of the analog device.

Analog-to-Digital Converter is a system that converts an analog signal, such as a sound picked up by a microphone or light entering a digital camera, into a digital signal. An ADC may also provide an isolated measurement such as an electronic device that converts an input analog voltage or current to a digital number proportional to the magnitude of the voltage or current. Typically the digital output is a two's complement binary number that is proportional to the input, but there are other possibilities. There are several ADC architectures. Due to the complexity and the need for precisely matched components, all but the most specialized ADCs are implemented as integrated circuits (ICs). A digital-to-analog converter (DAC) performs the reverse function; it converts a digital signal into an analog signal.
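
A rough sketch (Python; the full-scale range and bit depth are illustrative) of what an ideal converter does: a voltage within a chosen range is quantized to the nearest of 2^bits levels. Real ADCs differ in architecture and often output two's complement codes; this example uses a simple unsigned code for clarity.

```python
# Rough sketch of an ideal analog-to-digital converter: a voltage within the
# full-scale range is quantized to one of 2**bits discrete levels.
def adc(voltage, v_min=-1.0, v_max=1.0, bits=8):
    levels = 2 ** bits
    step = (v_max - v_min) / levels               # size of one quantization step
    clamped = min(max(voltage, v_min), v_max - step)
    return int((clamped - v_min) / step)          # unsigned code 0 .. levels-1

print(adc(0.0))     # 128 -> mid-range code for an 8-bit converter
print(adc(-1.0))    # 0   -> bottom of the range
print(adc(0.999))   # 255 -> top of the range
```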

Breaking the scaling limits of analog computing. New technique could diminish errors that hamper the performance of super-fast analog optical neural networks. A new technique greatly reduces the error in an optical neural network, which uses light to process data instead of electrical signals. With their technique, the larger an optical neural network becomes, the lower the error in its computations. This could enable them to scale these devices up so they would be large enough for commercial uses. As machine-learning models become larger and more complex, they require faster and more energy-efficient hardware to perform computations. Conventional digital computers are struggling to keep up.


Noise - Filtering Noise - Errors


Noise in electronics is a random fluctuation in an electrical signal, a characteristic of all electronic circuits. Noise generated by electronic devices varies greatly as it is produced by several different effects. Thermal noise is unavoidable at non-zero temperature (see fluctuation-dissipation theorem), while other types depend mostly on device type (such as shot noise, which needs a steep potential barrier) or manufacturing quality and semiconductor defects, such as conductance fluctuations, including 1/f noise. Magnonic devices can replace electronics without much noise - Magnetism.

Johnson-Nyquist Noise is the electronic noise generated by the thermal agitation of the charge carriers (usually the electrons) inside an electrical conductor at equilibrium, which happens regardless of any applied voltage. Thermal noise is present in all electrical circuits, and in sensitive electronic equipment such as radio receivers it can drown out weak signals and can be the limiting factor on the sensitivity of an electrical measuring instrument. Thermal noise increases with temperature. Some sensitive electronic equipment such as radio telescope receivers are cooled to cryogenic temperatures to reduce thermal noise in their circuits. The generic, statistical physical derivation of this noise is called the fluctuation-dissipation theorem, where generalized impedance or generalized susceptibility is used to characterize the medium. Thermal noise in an ideal resistor is approximately white, meaning that the power spectral density is nearly constant throughout the frequency spectrum. When limited to a finite bandwidth, thermal noise has a nearly Gaussian amplitude distribution.
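
A quick calculation (Python; the resistor value, bandwidth and temperature are illustrative) of the RMS thermal noise voltage across a resistor, sqrt(4 · k_B · T · R · bandwidth):

```python
import math

def thermal_noise_vrms(resistance_ohms, bandwidth_hz, temperature_k=300.0):
    """RMS Johnson-Nyquist noise voltage: sqrt(4 * k_B * T * R * bandwidth)."""
    k_B = 1.380649e-23   # Boltzmann constant, J/K
    return math.sqrt(4 * k_B * temperature_k * resistance_ohms * bandwidth_hz)

# A 10 kilo-ohm resistor over a 20 kHz audio bandwidth at room temperature:
print(thermal_noise_vrms(10e3, 20e3))   # ~1.8e-6 volts (about 1.8 microvolts)
```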

Signal-to-Noise Ratio is the ratio of the strength of an electrical or other signal carrying information to that of interference, sometimes expressed in decibels. A measure used in science and engineering that compares the level of a desired signal to the level of background noise. SNR is defined as the ratio of signal power to the noise power, often expressed in decibels. A ratio higher than 1:1 (greater than 0 dB) indicates more signal than noise.
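
A short Python sketch of the definition above, expressing the ratio of signal power to noise power in decibels:

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(signal_power / noise_power)

print(snr_db(1.0, 1.0))     # 0 dB  -> equal signal and noise power (the 1:1 ratio)
print(snr_db(100.0, 1.0))   # 20 dB -> 100 times more signal power than noise
```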

Crosstalk is any phenomenon by which a signal transmitted on one circuit or channel of a transmission system creates an undesired effect in another circuit or channel. Crosstalk is usually caused by undesired capacitive, inductive, or conductive coupling from one circuit or channel to another. Crosstalk is a significant issue in structured cabling, audio electronics, integrated circuit design, wireless communication and other communications systems. Auditory Hallucination.

Noise is unwanted sound judged to be unpleasant, loud or disruptive to hearing. From a physics standpoint, noise is indistinguishable from sound, as both are vibrations through a medium, such as air or water. The difference arises when the brain receives and perceives a sound.

Filter - White Noise - Sensory Interactions - Misinterpret - Statistical Noise - Statistics

Noise in analog video and television, is a random dot pixel pattern of static displayed when no transmission signal is obtained by the antenna receiver of television sets and other display devices. The random pattern superimposed on the picture, visible as a random flicker of "dots" or "snow", is the result of electronic noise and radiated electromagnetic noise accidentally picked up by the antenna. This effect is most commonly seen with analog TV sets or blank VHS tapes. There are many sources of electromagnetic noise which cause the characteristic display patterns of static. Atmospheric sources of noise are the most ubiquitous, and include electromagnetic signals prompted by cosmic microwave background radiation, or more localized radio wave noise from nearby electronic devices. The display device itself is also a source of noise, due in part to thermal noise produced by the inner electronics. Most of this noise comes from the first transistor the antenna is attached to.

Noise Reduction is the process of removing noise from a signal. All signal processing devices, both analog and digital, have traits that make them susceptible to noise. Noise can be random or white noise with an even frequency distribution, or frequency dependent noise introduced by a device's mechanism or signal processing algorithms.
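
A minimal example (Python, illustrative values) of one of the simplest noise-reduction techniques, a moving-average filter, which smooths random fluctuations at the cost of blurring fast changes in the signal:

```python
import random

# Average each sample with the previous few samples to smooth out random noise.
def moving_average(samples, window=5):
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

clean = [1.0] * 20                                   # a constant "signal"
noisy = [x + random.uniform(-0.3, 0.3) for x in clean]
print(noisy[-1], moving_average(noisy)[-1])          # the filtered value tends to sit closer to 1.0
```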

Noise Cancelation - Noise Suppression - Information Paradox

Potential application of unwanted electronic noise in semiconductors. Random telegraph noises in vanadium-doped tungsten diselenide can be tuned with voltage polarity. Random telegraph noise in semiconductors is typically caused by two-state defects. Two-dimensional van der Waals layered magnetic materials are expected to exhibit large fluctuations due to long-range Coulomb interaction; importantly, these fluctuations could be controlled by a voltage, unlike in 3D counterparts, which have large charge screening. Researchers reported electrically tunable magnetic fluctuations and RTN signals in multilayered vanadium-doped tungsten diselenide by using vertical magnetic tunneling junction devices. They identified bistable magnetic states in the 1/f2 RTNs in noise spectroscopy, which can be further utilized for switching devices via voltage polarity.

Signal Separation is the separation of a set of source signals from a set of mixed signals, without the aid of information (or with very little information) about the source signals or the mixing process. It is most commonly applied in digital signal processing and involves the analysis of mixtures of signals; the objective is to recover the original component signals from a mixture signal. The classical example of a source separation problem is the cocktail party problem, where a number of people are talking simultaneously in a room (for example, at a cocktail party), and a listener is trying to follow one of the discussions. The human brain can handle this sort of auditory source separation problem, but it is a difficult problem in digital signal processing.

Independent Component Analysis is a computational method for separating a multivariate signal into additive subcomponents. This is done by assuming that the subcomponents are, potentially, non-Gaussian signals and that they are statistically independent from each other. ICA is a special case of blind source separation. A common example application is the "cocktail party problem" of listening in on one person's speech in a noisy room.
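
A small sketch of blind source separation with FastICA, assuming NumPy and scikit-learn are available: two independent sources are mixed through a made-up mixing matrix, and ICA recovers them up to ordering and scaling, which is the essence of the cocktail party example.

```python
# Sketch of blind source separation with FastICA (requires numpy and scikit-learn).
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                        # source 1: a sine wave
s2 = np.sign(np.sin(3 * t))               # source 2: a square wave
S = np.c_[s1, s2]

A = np.array([[1.0, 0.5], [0.5, 2.0]])    # mixing matrix (the "room")
X = S @ A.T                               # two mixed "recordings"

ica = FastICA(n_components=2, random_state=0)
S_estimated = ica.fit_transform(X)        # recovered sources (order and scale may differ)
print(S_estimated.shape)                  # (2000, 2)
```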

Jitter is the deviation from true periodicity of a presumably periodic signal, often in relation to a reference clock signal.

Distortion is the alteration of the original shape (or other characteristic) of something, such as an object, image, sound or waveform. Distortion is usually unwanted, and so engineers strive to eliminate distortion, or minimize it. In some situations, however, distortion may be desirable. The important signal processing operation of heterodyning is based on nonlinear mixing of signals to cause intermodulation. Distortion is also used as a musical effect, particularly with electric guitars. - Vague Words.

Generation Loss is the loss of quality between subsequent copies or transcodes of data. Anything that reduces the quality of the representation when copying, and would cause further reduction in quality on making a copy of the copy, can be considered a form of generation loss. File size increases are a common result of generation loss, as the introduction of artifacts may actually increase the entropy of the data through each generation. Rumors.

Data Degradation is the gradual corruption of computer data due to an accumulation of non-critical failures in a data storage device. The phenomenon is also known as data decay, data rot or bit rot.

Backing up data every 10 years on a new memory storage medium could make data last forever.

Channel in communications refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking. A channel is used to convey an information signal, for example a digital bit stream, from one or several senders (or transmitters) to one or several receivers. A channel has a certain capacity for transmitting information, often measured by its bandwidth in Hz or its data rate in bits per second.

Signal Processing is an enabling technology that encompasses the fundamental theory, applications, algorithms, and implementations of processing or transferring information contained in many different physical, symbolic, or abstract formats broadly designated as signals. It uses mathematical, statistical, computational, heuristic, and linguistic representations, formalisms, and techniques for representation, modelling, analysis, synthesis, discovery, recovery, sensing, acquisition, extraction, learning, security, or forensics.

Data Loss is an error condition in information systems in which information is destroyed by failures or neglect in storage, transmission, or processing. Information systems implement backup and disaster recovery equipment and processes to prevent data loss or restore lost data. Data loss is distinguished from data unavailability, which may arise from a network outage. Although the two have substantially similar consequences for users, data unavailability is temporary, while data loss may be permanent. Data loss is also distinct from data breach, an incident where data falls into the wrong hands, although the term data loss has been used in those incidents.

Checksum is a digit representing the sum of the digits in an instance of digital data; used to check whether errors have occurred in transmission or storage.
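
A toy example (Python) of the idea: the checksum here is just the sum of all the bytes modulo 256, sent along with the data so the receiver can detect many, though not all, corruptions.

```python
# A very small checksum: add up all the bytes modulo 256. Matching checksums
# suggest (but do not guarantee) that the data was not corrupted in transit.
def checksum(data: bytes) -> int:
    return sum(data) % 256

message = b"preserve this knowledge"
print(checksum(message))                                           # value sent along with the data
print(checksum(message) == checksum(b"preserve this knowledge"))   # True: data intact
print(checksum(message) == checksum(b"preserve thus knowledge"))   # False: corruption detected
```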

Landauer's Principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment".
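
Plugging in the numbers (Python; room temperature assumed) gives the minimum energy Landauer's principle allows for erasing a single bit, k_B · T · ln 2:

```python
import math

# Landauer's limit: erasing one bit must dissipate at least k_B * T * ln(2) of energy.
k_B = 1.380649e-23             # Boltzmann constant, J/K
T = 300.0                      # room temperature, kelvin
print(k_B * T * math.log(2))   # ~2.9e-21 joules per erased bit
```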

Shadowing is the effect that the received signal power fluctuates due to objects obstructing the propagation path between transmitter and receiver. These fluctuations are experienced on local-mean powers, that is, short-term averages to remove fluctuations due to multipath fading.

Data Corruption refers to errors in computer data that occur during writing, reading, storage, transmission, or processing, which introduce unintended changes to the original data. Propaganda.

Error-Correcting Code Memory is a type of computer data storage that can detect and correct the most common kinds of internal data corruption. ECC memory is used in most computers where data corruption cannot be tolerated under any circumstances, such as for scientific or financial computing.

Error Detection and Correction are techniques that enable reliable delivery of digital data over unreliable communication channels. Many communication channels are subject to channel noise, and thus errors may be introduced during transmission from the source to a receiver. Error detection techniques allow detecting such errors, while error correction enables reconstruction of the original data in many cases.
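
Two of the simplest illustrations (a Python sketch with hypothetical function names): a single parity bit detects an odd number of flipped bits, and a three-fold repetition code corrects a single flipped bit by majority vote. Practical systems use far more efficient codes, such as Hamming or Reed-Solomon codes.

```python
# Parity detects a single flipped bit; a 3x repetition code corrects one by majority vote.
def add_parity(bits):
    return bits + [sum(bits) % 2]             # append an even-parity bit

def parity_ok(bits_with_parity):
    return sum(bits_with_parity) % 2 == 0

def encode_repetition(bits):
    return [b for b in bits for _ in range(3)]

def decode_repetition(coded):
    return [int(sum(coded[i:i+3]) >= 2) for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
sent = encode_repetition(data)
sent[4] ^= 1                                   # flip one bit in the channel
assert decode_repetition(sent) == data         # the error is corrected
print(parity_ok(add_parity(data)))             # True: no error detected in the parity-protected word
```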

DNA Repair - DNA Error Corrections - Mutations

New Algorithm Repairs Corrupted Digital Images in One Step.

New chip for mobile devices knocks out unwanted signals. The receiver chip efficiently blocks signal interference that slows device performance and drains batteries.

Faster computing results without fear of errors. Researchers developed a new system that can make computer programs run faster, while guaranteeing accuracy. A new technique can dramatically accelerate programs known as shell scripts, through a process called parallelization, while ensuring the programs return accurate results.

New Techniques Boost Performance of Non-Volatile Memory Systems. North Carolina State University have developed new software and hardware designs that should limit programming errors and improve system performance in devices that use non-volatile memory (NVM) technologies.

Anomalies - Broken Symmetry - Reasons - Synchronous and Asynchronous Communication

A physical qubit with built-in error correction. Generating a logical qubit from a single light pulse that has the inherent capacity to correct errors. To avoid qubit losses and other errors, it is necessary to couple several single-photon light pulses together to construct a logical qubit -- as in the case of the superconductor-based approach. Rather than using a single photon, the team employed a laser-generated light pulse that can consist of several photons.

Distance Decay is when the interaction between two locales declines as the distance between them increases. Once the distance is outside of the two locales' activity space, their interactions begin to decrease. Long-Distance Relationship is an intimate relationship between partners who are geographically isolated from one another. Partners in LDRs face geographic separation and lack of face-to-face contact.

Entropy in information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the channel. The channel modifies the message in some way. The receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. 'Messages' can be modeled by any flow of information. The amount of information of every event forms a random variable whose expected value, or average, is the Shannon entropy. Units of entropy are the shannon, nat, or hartley, depending on the base of the logarithm used to define it, though the shannon is commonly referred to as a bit.
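
A short Python sketch computing the Shannon entropy of a message from its symbol frequencies, giving the average information per symbol in bits (shannons):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

print(shannon_entropy("aaaa"))   # 0.0 -> completely predictable
print(shannon_entropy("abab"))   # 1.0 -> one bit per symbol
print(shannon_entropy("abcd"))   # 2.0 -> two bits per symbol
```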

Communication Noise refers to influences on effective communication that affect the interpretation of conversations. While often overlooked, communication noise can have a profound impact both on our perception of interactions with others and on our analysis of our own communication proficiency. Forms of communication noise include psychological noise, physical noise, physiological noise and semantic noise. All these forms of noise subtly, yet greatly, influence our communication with others and are vitally important to anyone’s skills as a competent communicator.

Adding noise for completely secure communication. The new protocol overcomes the hurdle of imperfect detectors with a trick -- the researchers add artificial noise to the actual information about the crypto key. Even if many of the information units are undetected, an "eavesdropper" receives so little real information about the crypto key that the security of the protocol remains guaranteed. In this way, the researchers lowered the requirement on the detection efficiency of the devices.

Interference is anything which modifies or disrupts a signal as it travels along a channel between a source and a receiver. The term typically refers to the addition of unwanted signals to a useful signal. Common examples are electromagnetic interference (EMI), co-channel interference (CCI, also known as crosstalk), adjacent-channel interference (ACI), intersymbol interference (ISI), inter-carrier interference (ICI) caused by Doppler shift in OFDM modulation (multitone modulation), common-mode interference (CMI), and conducted interference. Interference is typically but not always distinguished from noise, for example white thermal noise. Radio resource management aims at reducing and controlling the co-channel and adjacent-channel interference. See also: Distortion, Signal-to-Interference Ratio (SIR), Signal to noise plus interference (SNIR), Inter-flow interference and Intra-flow interference.

'Flipping' optical wavefront eliminates distortions in multimode fibers. Researchers have devised a novel technique to 'flip' the optical wavefront of an image for both polarizations simultaneously, so that it can be transmitted through a multimode fiber without distortion. The use of multimode optical fibers to boost the information capacity of the Internet is severely hampered by distortions that occur during the transmission of images because of a phenomenon called modal crosstalk.


Compression of Data


Data Compression involves encoding information using fewer bits than the original representation. Compression can be either lossy or lossless. Lossless Compression reduces bits by identifying and eliminating statistical redundancy. No information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. Filtering. The process of reducing the size of a data file is referred to as data compression. In the context of data transmission, it is called source coding (encoding done at the source of the data before it is stored or transmitted) in opposition to channel coding.

Data Decompression
is the action of reversing data compression. The act of expanding a compressed file back into its original form, so that the information can be read or extracted. Decompression (wiki)

Lossless Compression is a class of data compression algorithms that allows the original Data to be perfectly reconstructed from the compressed data. By contrast, lossy compression permits reconstruction only of an approximation of the original data, though this usually improves compression rates (and therefore reduces file sizes).
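
A minimal demonstration (Python standard library) of a lossless round trip: zlib compresses with DEFLATE, the decompressed output is byte-for-byte identical to the original, and highly redundant input gives a large compression ratio.

```python
import zlib

original = b"knowledge preservation " * 100       # highly redundant data compresses well
compressed = zlib.compress(original)               # DEFLATE (LZ77 + Huffman coding)

print(len(original), len(compressed))              # 2300 bytes vs. a few dozen bytes
print(len(original) / len(compressed))             # the compression ratio
assert zlib.decompress(compressed) == original     # lossless: perfectly reconstructed
```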

Lossy Compression is the class of data encoding methods that uses inexact approximations and partial data discarding to represent the content. These techniques are used to reduce data size for storage, handling, and transmitting content.

Audio Compression (data) is a type of lossy or lossless compression in which the amount of data in a recorded waveform is reduced, with or without some loss of quality, for transmission or storage. It is used in CD and MP3 encoding, Internet radio, and the like.

Dynamic Range Compression, also called audio level compression, in which the dynamic range, the difference between loud and quiet, of an audio waveform is reduced.

Entropy Encoding is a lossless Data compression scheme that is independent of the specific characteristics of the medium. One of the main types of entropy coding creates and assigns a unique prefix-free code to each unique symbol that occurs in the input. These entropy encoders then compress data by replacing each fixed-length input symbol with the corresponding variable-length prefix-free output codeword. The length of each codeword is approximately proportional to the negative logarithm of the probability. Therefore, the most common symbols use the shortest codes.

Asymmetric Numeral Systems is a family of entropy coding methods introduced by Jarosław (Jarek) Duda, used in data compression since 2014 due to improved performance compared to the previously used methods. ANS combines the compression ratio of arithmetic coding (which uses a nearly accurate probability distribution), with a processing cost similar to that of Huffman coding.

How Computers Compress Text: Huffman Coding and Huffman Trees (youtube)

Zip is an archive file format that supports lossless data compression. A .ZIP file may contain one or more files or directories that may have been compressed. The .ZIP file format permits a number of compression algorithms, though DEFLATE is the most common. Unzip in computing means to decompress a file that has previously been compressed.

Data Compression Ratio is a computer science term used to quantify the reduction in data-representation size produced by a data compression algorithm. The data compression ratio is analogous to the physical compression ratio used to measure physical compression of substances.

DEFLATE is a lossless data compression algorithm and associated file format that uses a combination of the LZ77 algorithm and Huffman coding.

Huffman Coding is a particular type of optimal prefix code that is commonly used for lossless data compression.
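
A compact sketch (Python; not production code) of Huffman coding: symbol frequencies are counted, the tree is built with a heap by repeatedly merging the two least frequent nodes, and the codes are read off the tree so that frequent symbols get shorter prefix-free codes.

```python
# Minimal Huffman coding sketch: build the tree with a heap, then read the
# codes off it. More frequent symbols receive shorter (prefix-free) codes.
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    heap = [[count, i, {symbol: ""}] for i, (symbol, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)                         # least frequent node
        hi = heapq.heappop(heap)                         # next least frequent node
        merged = {s: "0" + c for s, c in lo[2].items()}  # left branch gets a leading 0
        merged.update({s: "1" + c for s, c in hi[2].items()})  # right branch gets a leading 1
        heapq.heappush(heap, [lo[0] + hi[0], lo[1], merged])
    return heap[0][2]

codes = huffman_codes("preserve knowledge")
encoded = "".join(codes[ch] for ch in "preserve knowledge")
print(codes["e"], codes["k"])    # the frequent 'e' gets a shorter code than the rare 'k'
print(len(encoded), "bits instead of", 8 * len("preserve knowledge"))
```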

Prefix Code is a type of code system (typically a variable-length code) distinguished by its possession of the "prefix property", which requires that there is no whole code word in the system that is a prefix (initial segment) of any other code word in the system. For example, a code with code words {9, 55} has the prefix property; a code consisting of {9, 5, 59, 55} does not, because "5" is a prefix of "59" and also of "55". A prefix code is a uniquely decodable code: given a complete and accurate sequence, a receiver can identify each word without requiring a special marker between words. However, there are uniquely decodable codes that are not prefix codes; for instance, the reverse of a prefix code is still uniquely decodable (it is a suffix code), but it is not necessarily a prefix code.

Variable-Length Code is a code which maps source symbols to a variable number of bits. Variable-length codes can allow sources to be compressed and decompressed with zero error (lossless data compression) and still be read back symbol by symbol. With the right coding strategy an independent and identically-distributed source may be compressed almost arbitrarily close to its entropy. This is in contrast to fixed length coding methods, for which data compression is only possible for large blocks of data, and any compression beyond the logarithm of the total number of possibilities comes with a finite (though perhaps arbitrarily small) probability of failure.


Redundancy


Redundancy is the use of words or data that could be omitted without loss of meaning or function. Redundancy also means repeating information or the repetition or superfluity of information. Redundancy is the repetition of messages to reduce the probability of errors in transmission. Redundancy can also mean the repetition of needless information. Superfluous is more than is needed, desired, or required. Serving no useful purpose and having no excuse for being. Rote Learning.

Redundancy is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.

Redundant Code is source code or compiled code in a computer program that is unnecessary, such as: recomputing a value that has previously been calculated and is still available, code that is never executed (known as unreachable code), code which is executed but has no external effect (e.g., does not change the output produced by a program; known as dead code). Redundancy (information theory) (wiki).

Gene Redundancy is the existence of multiple genes in the genome of an organism that perform the same function.

Data Redundancy is the existence of data that is additional to the actual data and permits correction of errors in stored or transmitted data. The additional data can simply be a complete copy of the actual data, or only select pieces of data that allow detection of errors and reconstruction of lost or damaged data up to a certain level.

Redundancy in engineering is the duplication of critical components or functions of a system with the intention of increasing reliability of the system, usually in the form of a backup or fail-safe, or to improve actual system performance, such as in the case of GNSS receivers, or multi-threaded computer processing.



The Test of Time


Even if our information and knowledge is saved in digital format, on paper, or somehow saved in bacteria or in our DNA, there is still no guarantee that the information and knowledge will not be lost or destroyed. One idea would be to launch multiple unmanned spacecraft, like Voyager 1, into space, programmed to stay within the solar system and to return to Earth at 500-year intervals. If we are still here and the Earth is still inhabitable, then the space probe would land in a populated area so that its information and knowledge can be retrieved. People would then update the space pod and send it back out into space. And if the space pod returns to Earth's orbit and sees no life because the Earth has become uninhabitable for whatever reason, then the space pod would leave Earth's orbit and check other planets in the solar system for signs of life. The space pod would keep doing this for as long as it survives. I kind of get this feeling that this has happened before, besides what we have seen portrayed in some of our sci-fi movies with extraterrestrials, of course.

Seed ships could be entirely robotic, but might contain human embryos that could be delivered to distant star systems where they would be incubated and, presumably, raised by robo-caretakers. Knowledge Ark.

Data Degradation is the gradual corruption of computer data due to an accumulation of non-critical failures in a data storage device. The phenomenon is also known as data decay, data rot or bit rot.

Software Rot also known as code rot, bit rot, software erosion, software decay or software entropy is either a slow deterioration of software performance over time or its diminishing responsiveness that will eventually lead to software becoming faulty, unusable, or otherwise called "legacy" and in need of upgrade. This is not a physical phenomenon: the software does not actually decay, but rather suffers from a lack of being responsive and updated with respect to the changing environment in which it resides. The Jargon File, a compendium of hacker lore, defines "bit rot" as a jocular explanation for the degradation of a software program over time even if "nothing has changed"; the idea being this is almost as if the bits that make up the program were subject to Radioactive Decay.

Colonization of the Moon (wiki) - Failed Civilizations (knowledge lost) - Human Extinction

Maybe we could build a Monolith that could store our most valuable information and knowledge. We could make it out of a material that could survive deep space, survive entering a planet's atmosphere, and maybe even survive a Black Hole. So I'm thinking, maybe that's the reason our universe is here, because someone already thought of a way to preserve information in the previous universe.

Asgardia, a free and unrestricted society which holds knowledge, intelligence and science at its core, will launch a satellite later this year to test the concept of long-term data storage in orbit around the Earth.

Artificial Intelligence 

Human embryos and sperm would have to be cryogenically frozen and then raised during space flight by AI robots trained to raise children. Humans can also adapt to space travel. And when they find a new planet, they would use our original DNA to raise the original human species, adapted to the new planet's environment.

Deinococcus radiodurans is an extremophilic bacterium, one of the most radiation-resistant organisms known.

In order to travel in space for hundreds of years to reach a new habitable planet, humans would need to evolve into a different kind of human more suitable for space travel. Eventually humans would look like space aliens with big heads and little bodies. But as long as humans preserve their original DNA in eggs and sperm, they could raise original humans again to adapt to their new world. Again, this sounds like it has already happened. Déjà vu - Jamais vu.

Mammal Embryos can develop fully in Space

"If life already evolved on another planet before the earth was born, then maybe life on that planet launched a pod into space, like a seed from tree, hoping to land somewhere to grow again, and keep life moving forward."

John Adams Preserve Knowledge Quote "When I think about how to preserve our information and knowledge, I can't help but think that someone millions of years ago already solved that problem because we would not be here if they didn't."

"If we ever did lose all our knowledge, and we had to start all over again, we would most likely do the same things and make the same mistakes, all because we did not learn enough, or teach enough."

Understanding the meaning of words and knowing how to assemble words in a meaningful way is extremely important. But someone still has to be there to accurately interpret the message. You could write the most profound message in the world, but if no one is there to read it or understand it, then your message falls on deaf ears, or your message remains silent and floats in a sea of emptiness.



Legacies - What will be your Legacy


"All good men and women must take responsibility to create legacies that will take the next generation to a level we could only imagine."

Legacies - Digital Inheritance - Genetic Memory - DNA Storage - Time Travel - Impermanence - Legends - Bravery - Success - Find Your Calling - Afterglow

What will you be known for when you leave this earth? The most influential people, the ones who leave behind incredible legacies, will live on in the hearts of the people they touch. Physically, they will no longer be a part of society—but their principles, philosophies and achievements will become immortal, spreading from generation to generation.

“Carve your name on hearts, not tombstones. A legacy is etched into the minds of others and the stories they share about you.” —Shannon L. Alder

“If you would not be forgotten as soon as you are dead, either write something worth reading or do something worth writing.” —Benjamin Franklin

“No legacy is so rich as honesty.” —William Shakespeare

“I think the whole world is dying to hear someone say, ‘I love you.’ I think that if I can leave the legacy of love and passion in the world, then I think I’ve done my job in a world that’s getting colder and colder by the day.” —Lionel Richie

“That is your legacy on this Earth when you leave this Earth: how many hearts you touched.” —Patti Davis

“The great use of life is to spend it for something that will outlast it.” —William James

“Immortality is to live your life doing good things, and leaving your mark behind.” —Brandon Lee

“You make your mark by being true to who you are and letting that be your staple.” —Kat Graham

“The legacy of heroes is the memory of a great name and the inheritance of a great example.” —Benjamin Disraeli

“Your story is the greatest legacy that you will leave to your friends. It’s the longest-lasting legacy you will leave to your heirs.” —Steve Saint






The Thinker Man