
Knowledge Preservation


Preservation is the activity of protecting something from loss or danger; an improvement achieved by preventing loss, injury, or other change; a process that saves organic substances from decay; the condition of being preserved for future use. To preserve is to keep something intact or in a particular condition; to keep or maintain in an unaltered condition and cause it to remain or last; to keep constant through physical or chemical reactions or evolutionary change. Documentation

Keeping
is the responsibility of a guardian or keeper; the act of retaining something; supplying with necessities and support; maintaining for use and service; preserving information for retrieval; conformity or harmony. Knowledge Keeper

Protecting
is to shield from danger, injury, destruction, or damage. Shielding (or designed to shield) against harm or discomfort.
Backup is a copy of a file or directory kept on a separate storage device. Memory - storage on an electronic memory device.


Egyptian Hieroglyphs - Data Storage Types
Knowledge Organization
Knowledge Management

One of our biggest concerns is... "Will our most important information and knowledge stand The Test of Time?"

Information Destruction throughout History: Info-Graphic (not including the suppression of valuable information)

History
Academic Papers
Library Science

Digital Preservation is a formal endeavor to ensure that digital information of continuing value remains accessible and usable. It involves planning, resource allocation, and application of preservation methods and technologies, and it combines policies, strategies and actions to ensure access to reformatted and "born-digital" content, regardless of the challenges of media failure and technological change. The goal of digital preservation is the accurate rendering of authenticated content over time.

Computer Data Storage is a technology consisting of computer components and recording media used to retain digital data. It is a core function and fundamental component of computers. Compression.

Storage (bit). In the earliest non-electronic information processing devices, such as Jacquard's loom or Babbage's Analytical Engine, a bit was often stored as the position of a mechanical lever or gear, or the presence or absence of a hole at a specific point of a paper card or tape. The first electrical devices for discrete logic (such as elevator and traffic light control circuits, telephone switches, and Konrad Zuse's computer) represented bits as the states of electrical relays which could be either "open" or "closed". When relays were replaced by vacuum tubes, starting in the 1940s, computer builders experimented with a variety of storage methods, such as pressure pulses traveling down a mercury delay line, charges stored on the inside surface of a cathode-ray tube, or opaque spots printed on glass discs by photolithographic techniques.

In the 1950s and 1960s, these methods were largely supplanted by magnetic storage devices such as magnetic core memory, magnetic tapes, drums, and disks, where a bit was represented by the polarity of magnetization of a certain area of a ferromagnetic film, or by a change in polarity from one direction to the other. The same principle was later used in the magnetic bubble memory developed in the 1980s, and is still found in various magnetic strip items such as metro tickets and some credit cards.

In modern semiconductor memory, such as dynamic random-access memory, the two values of a bit may be represented by two levels of electric charge stored in a capacitor. In certain types of programmable logic arrays and read-only memory, a bit may be represented by the presence or absence of a conducting path at a certain point of a circuit. In optical discs, a bit is encoded as the presence or absence of a microscopic pit on a reflective surface. In one-dimensional bar codes, bits are encoded as the thickness of alternating black and white lines. Off Loading our Memories

History of Hard Disk Drives (wiki) - In 1956, IBM made a 5 MB storage device that was as big as a car and cost $160,000. Today in 2017, Samsung makes a 16-terabyte SSD (16,000 GB) that fits in your pocket.

Digital Curation is the selection, preservation, maintenance, collection and archiving of digital assets. Digital curation establishes, maintains and adds value to repositories of digital data for present and future use. This is often accomplished by archivists, librarians, scientists, historians, and scholars. Enterprises are starting to use digital curation to improve the quality of information and data within their operational and strategic processes. Successful digital curation will mitigate digital obsolescence, keeping the information accessible to users indefinitely.

Content Curation is the process of gathering information relevant to a particular topic or area of interest. Services or people that implement content curation are called curators. Curation services can be used by businesses as well as end users.

Knowledge Vault is a knowledge base created by Google. As of 2014, it contained 1.6 billion facts which had been collated automatically from the Internet. Knowledge Vault is a potential successor to Google's Knowledge Graph. The Knowledge Graph pulled in information from structured sources like Freebase, Wikidata and Wikipedia, while the Knowledge Vault is an accumulation of facts from across the entire web, including unstructured sources. "Facts" in Knowledge Vault also include a confidence value, giving the capability of distinguishing between knowledge statements that have a high probability of being true from others that may be less likely to be true (based on the source that Google obtained the data from and other factors).
The concept behind the Knowledge Vault was presented in a paper authored by a Google Research team. Google has indicated that Knowledge Vault is a research paper and not an active product in development, as of August 2014.

Planning Far into the Future
Long Now Long-Term Thinking

Manual for Civilization

Seven Generation Sustainability is the Great Law of the Iroquois, which holds it appropriate to think seven generations ahead (about 140 years into the future) and to decide whether the decisions made today would benefit their children seven generations into the future.

Transgenerational is acting across multiple generations.

Futures Studies (also called futurology) is the study of postulating possible, probable, and preferable futures and the worldviews and myths that underlie them. It seeks to understand what is likely to continue and what could plausibly change. Part of the discipline thus seeks a systematic and pattern-based understanding of past and present, and to determine the likelihood of future events and trends. Foresight Organizations

Strategic Foresight is a planning-oriented discipline related to futures studies, the study of the future. Strategy is a high-level plan to achieve one or more goals under conditions of uncertainty. Strategic foresight happens when any planner uses scanned inputs, forecasts, alternative futures exploration, analysis and feedback to produce or alter the plans and actions of the organization.

"When you look down you can only see a few feet ahead. When you look straight forward you can see miles ahead. Feet = Days and Miles = Years. Don't just look down, look ahead. Live in the moment but also live for the future."

Archive Team is a loose collective of rogue archivists, programmers, writers and loudmouths dedicated to saving our digital heritage.

Rosetta Disk is the physical companion of the Rosetta Digital Language Archive, and a prototype of one facet of The Long Now Foundation's 10,000-Year Library.

Internet Archive is a non-profit library of millions of free books, movies, software, music, websites, and more.

Preservation (library and archival science) refers to the set of activities that aims to prolong the life of a record and relevant metadata, or enhance its value, or improve access to it through non-interventive means. This includes actions taken to influence records creators prior to selection and acquisition.

Print Wikipedia printed 106 of the 7,473 volumes of English Wikipedia as it existed on April 7, 2015, and also included wallpaper displaying 1,980 additional volumes. A 36-volume index of all of the 7.5 million contributors to English Wikipedia is also part of the project. The table of contents takes up 91 700-page volumes. The printed volumes only include the text of the articles; images and references are not included.

Wikipedia Terminal Event Management Policy is an official policy of Wikipedia detailing the procedures to be followed to safeguard the content of the encyclopedia in the event of a non-localized event that would render the continuation of Wikipedia in its current form untenable. The policy is designed to facilitate the preservation of the encyclopedia by a transition to non-electronic media in an orderly, time-sensitive manner or, if events dictate otherwise, the preservation of the encyclopedia by other means. Editors are asked to familiarize themselves with the procedures and in the unlikely event that the implementation of these procedures proves necessary, act in accordance with the procedural guidelines, inasmuch as circumstances allow.

Encyclopedia Galactica is a fictional or hypothetical encyclopedia containing all the knowledge accumulated by a galaxy-spanning civilization. The name evokes the exhaustive aspects of the real-life Encyclopædia Britannica.

Cryptosteel Cold Storage Wallet

Biogenesis is the production of new living organisms or organelles.

Cells - Evolution

Digital Amnesia (Bregtje van der Haak, VPRO) (youtube)

The End of Memory: a film directed by Vincent Amouroux and produced by ARTE France - ZED / Diff: ARTE. Documentary, France, 2014, 52 min. This film is a scientific investigation into the challenges of memory storage and the short lifespan of current storage formats.

Big Data

File Format is a standard way that information is encoded for storage in a computer file. It specifies how bits are used to encode information in a digital storage medium.
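
Many formats announce themselves in their first few bytes. As a minimal sketch of the idea (the helper name and table structure below are illustrative; only the signatures themselves are published values), a reader can identify a file's format from its "magic number":

```python
# Identify a file format from its magic number - the opening bytes a
# format's specification reserves as a signature. Table and function
# name are illustrative, not from any particular library.

MAGIC_NUMBERS = {
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"\xff\xd8\xff": "JPEG image",
    b"PK\x03\x04": "ZIP archive",
    b"%PDF": "PDF document",
    b"GIF87a": "GIF image",
    b"GIF89a": "GIF image",
}

def identify_format(path: str) -> str:
    """Read the first bytes of a file and match them against known signatures."""
    with open(path, "rb") as f:
        header = f.read(16)
    for magic, name in MAGIC_NUMBERS.items():
        if header.startswith(magic):
            return name
    return "unknown format"

# Usage: identify_format("photo.jpg") -> "JPEG image"
```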

Encoding (memory) is the ability to encode, store and recall information. Memories give an organism the capability to learn and adapt from previous experiences as well as build relationships. Encoding allows the perceived item of use or interest to be converted into a construct that can be stored within the brain and recalled later from short-term or long-term memory.

Working memory stores information for immediate use or manipulation which is aided through hooking onto previously archived items already present in the long-term memory of an individual.

Knowledge Ark is a collection of knowledge preserved in such a way that future generations would have access to said knowledge if current means of access were lost. Scenarios where availability to information (such as the Internet) would be lost could be described as Existential Risks or Extinction Level Events. A knowledge ark could take the form of a traditional Library or a modern computer database. It could also include images only (such as photographs of important information, or diagrams of critical processes). A knowledge ark would have to be resistant to the effects of natural or man-made disasters to be viable. Such an ark should include, but would not be limited to, information or material relevant to the survival and prosperity of human civilization. Current examples include the Svalbard Global Seed Vault, a seedbank which is intended to preserve a wide variety of plant seeds (such as important crops) in case of their extinction.

Sending our Knowledge into Outer Space

Norway World Arctic Archive is built in “Mine 3,” an abandoned coal mine close to the Global Seed Vault. Countries are being
encouraged to submit data that is particularly significant to their culture. Grønland 56, 3045 Drammen, NORWAY, +47 905 33 432, office@piql.com

A Lunar Ark has been proposed which would store and transmit valuable information to receiver stations on Earth. The success of this would also depend on the availability of compatible receiver equipment on Earth, and adequate knowledge of that equipment's operation. Other types of knowledge arks might include genetic material. With the potential for widespread personal DNA sequencing becoming a reality, an individual might agree to store their genetic code in a digital or analog storage format which would enable later retrieval of that code. If a species was sequenced before extinction, its genome would remain available for study even in the case of extinction.

Noah's Ark

Knowledge Storage Types


Writing on Stone could last 10,000 years. Stone Carving

Writing on Paper could last 1,000 years (permanent paper without bleaches, using acrylic ink).

Writing on Vinyl could last 50 years.

Writing on a CD could last 20 years.

Writing on Quartz Stone could last millions of years.
Laser-etched quartz glass will store data for millions of years (Hitachi and Kyoto University's Kiyotaka Miura).

Magnetic Tape is a medium for magnetic recording, made of a thin, magnetizable coating on a long, narrow strip of plastic film. It could last 15-30 years.

Magnetic Storage is the storage of data on a magnetised medium. Magnetic storage uses different patterns of magnetisation in a magnetisable material to store data and is a form of non-volatile memory. The information is accessed using one or more read/write heads.

Non-Volatile Memory is a type of computer memory that can retrieve stored information even after having been power cycled (turned off and back on). The opposite of non-volatile memory is volatile memory, which needs constant power in order to prevent data from being erased. Examples of non-volatile memory include read-only memory, flash memory, ferroelectric RAM, most types of magnetic computer storage devices (e.g. hard disk drives, floppy disks, and magnetic tape), optical discs, and early computer storage methods such as paper tape and punched cards.

Wire Recording was the first early magnetic recording technology, an analog type of audio storage in which a magnetic recording is made on thin steel or stainless steel wire. The first crude magnetic recorder was invented in 1898 by Valdemar Poulsen.

Nearline Magnetic Tape

Magnetic hard drives go atomic. Physicists demonstrate the first single-atom magnetic storage. Existing hard drives use magnets made of about 1 million atoms to store a single bit of data. Chop a magnet in two, and it becomes two smaller magnets. Slice again to make four. But the smaller magnets get, the more unstable they become; their magnetic fields tend to flip polarity from one moment to the next. Now, however, physicists have managed to create a stable magnet from a single atom.

Rewritable Atomic-Scale Memory Storage Device: Little patterns of atoms can be arranged to represent English characters, fitting the content of more than a billion books onto the surface of a stamp.

A kilobyte rewritable atomic memory
Nano Technology

Data shrunk to a microscopic size is encapsulated between two sapphire disks. It can preserve 10,000 letter-size pages at 150 dpi, or 2,700 650×850 pictures, keeping personal data safe for 1,000 years. Any magnifying device (200x) is sufficient to access the saved data. (Nanoform)

Li-Fi

DNA

Writing on DNA could last over 100,000 years. (maybe someone has already done this millions of years ago?)

ETH Zurich is writing digital information on DNA and then encapsulating it in a protective layer of glass. DNA

DNA Digital Data Storage (wiki) - instead of zeros and ones, we use the 4 letters C, T, A, G. (CTAG, SeeTag!)
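
As a rough sketch of that idea, each base can carry two bits, so any byte stream maps to a strand of A/C/G/T and back. The 2-bit-per-base mapping below is one arbitrary convention chosen for illustration; real schemes (such as DNA Fountain, described below) add error correction and avoid error-prone runs of the same base:

```python
# Map binary data to DNA bases and back: one base encodes two bits.
# The particular bit-to-base assignment here is an illustrative choice.

ENCODE = {"00": "A", "01": "C", "10": "G", "11": "T"}
DECODE = {base: bits for bits, base in ENCODE.items()}

def bytes_to_dna(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(ENCODE[bits[i:i+2]] for i in range(0, len(bits), 2))

def dna_to_bytes(strand: str) -> bytes:
    bits = "".join(DECODE[base] for base in strand)
    return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

message = b"Preserve this."
strand = bytes_to_dna(message)
assert dna_to_bytes(strand) == message   # round-trip with no loss
print(strand)  # four bases per byte, e.g. "CCAA..." for the letter P
```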

DNA Knowledge

Researchers Store Computer Operating System and Short Movie on DNA. DNA is an ideal storage medium because it's ultra-compact and can last hundreds of thousands of years if kept in a cool, dry place, as demonstrated by the recent recovery of DNA from the bones of a 430,000-year-old human ancestor found in a cave in Spain.

They compressed the files into a master file, and then split the data into short strings of binary code made up of ones and zeros. Using an erasure-correcting algorithm called fountain codes, they randomly packaged the strings into so-called droplets, and mapped the ones and zeros in each droplet to the four nucleotide bases in DNA: A, G, C and T. The algorithm deleted letter combinations known to create errors, and added a barcode to each droplet to help reassemble the files later. They generated a digital list of 72,000 DNA strands, each 200 bases long. To retrieve their files, they used modern sequencing technology to read the DNA strands, followed by software to translate the genetic code back into binary. They recovered their files with zero errors, the study reports. They also demonstrated that a virtually unlimited number of copies of the files could be created with their coding technique by multiplying their DNA sample through polymerase chain reaction (PCR), and that those copies, and even copies of their copies, and so on, could be recovered error-free.

Finally, the researchers show that their coding strategy packs 215 petabytes of data on a single gram of DNA. The capacity of DNA data storage is theoretically limited to two binary digits for each nucleotide, but the biological constraints of DNA itself and the need to include redundant information to reassemble and read the fragments later reduce its capacity to 1.8 binary digits per nucleotide base. The team's insight was to apply fountain codes, a technique Erlich remembered from graduate school, to make the reading and writing process more efficient. With their DNA Fountain technique, Erlich and Zielinski pack an average of 1.6 bits into each base nucleotide. That's at least 60 percent more data than previously published methods, and close to the 1.8-bit limit. The researchers spent $7,000 to synthesize the DNA they used to archive their 2 megabytes of data, and another $2,000 to read it.

For 3 billion years, one of the major carriers of information needed for life, RNA, has had a glitch that creates errors when making copies of genetic information. Researchers at The University of Texas at Austin have developed a fix that allows RNA to accurately proofread for the first time. Certain viruses called retroviruses can cause RNA to make copies of DNA, a process called reverse transcription. This process is notoriously prone to errors because an evolutionary ancestor of all viruses never had the ability to accurately copy genetic material. The new innovation engineered at UT Austin is an enzyme that performs reverse transcription but can also "proofread," or check its work while copying genetic code. The enzyme allows, for the first time, for large amounts of RNA information to be copied with near perfect accuracy.

Molecular recordings by directed CRISPR spacer acquisition

Abstract: The ability to write a stable record of identified molecular events into a specific genomic locus would enable the examination of long cellular histories and have many applications, ranging from developmental biology to synthetic devices. We show that the type I-E CRISPR-Cas system of E. coli can mediate acquisition of defined pieces of synthetic DNA. We harnessed this feature to generate records of specific DNA sequences into a population of bacterial genomes. We then applied directed evolution to alter the recognition of a protospacer adjacent motif by the Cas1-Cas2 complex, which enabled recording in two modes simultaneously. We used this system to reveal aspects of spacer acquisition, fundamental to the CRISPR-Cas adaptation process. These results lay the foundations of a multimodal intracellular recording device.

Data Storage Device is a device for recording (storing) information (data). Recording can be done using virtually any form of energy, spanning from manual muscle power in handwriting, to acoustic vibrations in phonographic recording, to electromagnetic energy modulating magnetic tape and optical discs. A storage device may hold information, process information, or both. A device that only holds information is a recording medium. Devices that process information (data storage equipment) may either access a separate portable (removable) recording medium or a permanent component to store and retrieve data. Electronic data storage requires electrical power to store and retrieve that data. Most storage devices that do not require vision and a brain to read data fall into this category.

Electromagnetic data may be stored in either an analog data or digital data format on a variety of media. This type of data is considered to be electronically encoded data, whether it is electronically stored in a semiconductor device, for it is certain that a semiconductor device was used to record it on its medium. Most electronically processed data storage media (including some forms of computer data storage) are considered permanent (non-volatile) storage, that is, the data will remain stored when power is removed from the device. In contrast, most electronically stored information within most types of semiconductor (computer chips) microcircuits are volatile memory, for it vanishes if power is removed. Except for barcodes, optical character recognition (OCR), and magnetic ink character recognition (MICR) data, electronic data storage is easier to revise and may be more cost effective than alternative methods due to smaller physical space requirements and the ease of replacing (rewriting) data on the same medium.

Plastination is a technique or process used in anatomy to preserve bodies or body parts. The water and fat are replaced by certain plastics, yielding specimens that can be touched, do not smell or decay, and even retain most properties of the original sample.


Noise Filtering


Analog Signal has a theoretically infinite resolution. In practice an analog signal is subject to electronic noise and distortion introduced by communication channels and signal processing operations, which can progressively degrade the Signal-to-Noise Ratio (SNR), which is a measure used in science and engineering that compares the level of a desired signal to the level of background noise. It is defined as the ratio of signal power to the noise power, often expressed in decibels. A ratio higher than 1:1 (greater than 0 dB) indicates more signal than noise. While SNR is commonly quoted for electrical signals, it can be applied to any form of signal (such as isotope levels in an ice core or biochemical signaling between cells).
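
As a small worked example of the ratio defined above, the sketch below computes SNR in decibels from measured signal power and noise power; the sine wave and noise level are arbitrary illustrative values:

```python
import numpy as np

# SNR in dB = 10 * log10(signal power / noise power).
# A ratio above 0 dB means more signal than noise.

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 5 * t)       # desired signal
noise = rng.normal(0, 0.1, t.shape)      # background noise

snr_db = 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))
print(f"SNR: {snr_db:.1f} dB")           # roughly 17 dB for these values
```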

Analog-to-Digital Converter is a system that converts an analog signal, such as a sound picked up by a microphone or light entering a digital camera, into a digital signal. An ADC may also provide an isolated measurement such as an electronic device that converts an input analog voltage or current to a digital number proportional to the magnitude of the voltage or current. Typically the digital output is a two's complement binary number that is proportional to the input, but there are other possibilities. There are several ADC architectures. Due to the complexity and the need for precisely matched components, all but the most specialized ADCs are implemented as integrated circuits (ICs). A digital-to-analog converter (DAC) performs the reverse function; it converts a digital signal into an analog signal.
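
A minimal sketch of the sampling-and-quantizing step an ADC performs follows; the 3-bit resolution and voltage range are illustrative choices, not any particular device's specification:

```python
import numpy as np

def quantize(samples: np.ndarray, bits: int, v_min: float, v_max: float) -> np.ndarray:
    """Map each analog sample to the nearest of 2**bits digital codes."""
    levels = 2 ** bits
    step = (v_max - v_min) / (levels - 1)
    codes = np.round((samples - v_min) / step).astype(int)
    return np.clip(codes, 0, levels - 1)

t = np.linspace(0, 1, 16)                # 16 sample instants
analog = np.sin(2 * np.pi * t)           # "analog" input in [-1, 1]
digital = quantize(analog, bits=3, v_min=-1.0, v_max=1.0)
print(digital)                           # integers 0..7: a 3-bit approximation
```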

Noise (electronics) is a random fluctuation in an electrical signal, a characteristic of all electronic circuits. Noise generated by electronic devices varies greatly as it is produced by several different effects. Thermal noise is unavoidable at non-zero temperature (see fluctuation-dissipation theorem), while other types depend mostly on device type (such as shot noise, which needs a steep potential barrier) or manufacturing quality and semiconductor defects, such as conductance fluctuations, including 1/f noise.

Noise is unwanted sound judged to be unpleasant, loud or disruptive to hearing. From a physics standpoint, noise is indistinguishable from sound, as both are vibrations through a medium, such as air or water. The difference arises when the brain receives and perceives a sound.

Jitter is the deviation from true periodicity of a presumably periodic signal, often in relation to a reference clock signal.

Distortion is the alteration of the original shape (or other characteristic) of something, such as an object, image, sound or waveform. Distortion is usually unwanted, and so engineers strive to eliminate distortion, or minimize it. In some situations, however, distortion may be desirable. The important signal processing operation of heterodyning is based on nonlinear mixing of signals to cause intermodulation. Distortion is also used as a musical effect, particularly with electric guitars.

Channel (communications) refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer networking. A channel is used to convey an information signal, for example a digital bit stream, from one or several senders (or transmitters) to one or several receivers. A channel has a certain capacity for transmitting information, often measured by its bandwidth in Hz or its data rate in bits per second.

Signal Processing is an enabling technology that encompasses the fundamental theory, applications, algorithms, and implementations of processing or transferring information contained in many different physical, symbolic, or abstract formats broadly designated as signals. It uses mathematical, statistical, computational, heuristic, and linguistic representations, formalisms, and techniques for representation, modelling, analysis, synthesis, discovery, recovery, sensing, acquisition, extraction, learning, security, or forensics.

Data Corruption refers to errors in computer data that occur during writing, reading, storage, transmission, or processing, which introduce unintended changes to the original data.

Error-Correcting Code Memory is a type of computer data storage that can detect and correct the most common kinds of internal data corruption. ECC memory is used in most computers where data corruption cannot be tolerated under any circumstances, such as for scientific or financial computing.

Error Detection and Correction are techniques that enable reliable delivery of digital data over unreliable communication channels. Many communication channels are subject to channel noise, and thus errors may be introduced during transmission from the source to a receiver. Error detection techniques allow detecting such errors, while error correction enables reconstruction of the original data in many cases.
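
To make this concrete, here is a minimal sketch of the classic Hamming(7,4) code, which protects 4 data bits with 3 parity bits and can locate and repair any single flipped bit (function names are illustrative):

```python
def hamming_encode(d: list[int]) -> list[int]:
    """Encode 4 data bits as 7 bits; positions 1, 2 and 4 hold parity."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c: list[int]) -> list[int]:
    """Locate a single-bit error via the syndrome, fix it, return the data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3      # 1-based position of the corrupted bit
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = hamming_encode([1, 0, 1, 1])
word[4] ^= 1                             # simulate one corrupted bit in storage
assert hamming_decode(word) == [1, 0, 1, 1]   # original data recovered
```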

New techniques boost performance of non-volatile memory systems. Researchers at North Carolina State University have developed new software and hardware designs that should limit programming errors and improve system performance in devices that use non-volatile memory (NVM) technologies.

Anomalies - Broken Symmetry

Entropy (information theory) In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the channel. The channel modifies the message in some way. The receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. 'Messages' can be modeled by any flow of information. The amount of information of every event forms a random variable whose expected value, or average, is the Shannon entropy. Units of entropy are the shannon, nat, or hartley, depending on the base of the logarithm used to define it, though the shannon is commonly referred to as a bit.
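
A small sketch of computing Shannon entropy from symbol frequencies follows; the example strings are arbitrary:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits of information per symbol, from observed frequencies."""
    counts = Counter(message)
    total = len(message)
    probs = [n / total for n in counts.values()]
    return sum(p * log2(1 / p) for p in probs)

print(shannon_entropy("aaaa"))       # 0.0 - no uncertainty at all
print(shannon_entropy("abab"))       # 1.0 - one bit per symbol
print(shannon_entropy("abcdefgh"))   # 3.0 - eight equally likely symbols
```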

Communication Noise refers to influences on effective communication that influence the interpretation of conversations. While often looked over, communication noise can have a profound impact both on our perception of interactions with others and our analysis of our own communication proficiency. Forms of communication noise include psychological noise, physical noise, physiological and semantic noise. All these forms of noise subtly, yet greatly influence our communication with others and are vitally important to anyone’s skills as a competent communicator.

Interference is anything which modifies or disrupts a signal as it travels along a channel between a source and a receiver. The term typically refers to the addition of unwanted signals to a useful signal. Common examples are: electromagnetic interference (EMI); co-channel interference (CCI), also known as crosstalk; adjacent-channel interference (ACI); intersymbol interference (ISI); inter-carrier interference (ICI), caused by Doppler shift in OFDM modulation (multitone modulation); common-mode interference (CMI); and conducted interference. Interference is typically but not always distinguished from noise, for example white thermal noise. Radio resource management aims at reducing and controlling the co-channel and adjacent-channel interference. See also: Distortion, Signal-to-Interference Ratio (SIR), Signal to noise plus interference (SNIR), Inter-flow interference and Intra-flow interference.


Compression

Data Compression involves encoding information using fewer bits than the original representation. Compression can be either lossy or lossless. Lossless Compression reduces bits by identifying and eliminating statistical redundancy. No information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information. Filtering. The process of reducing the size of a data file is referred to as data compression. In the context of data transmission, it is called source coding (encoding done at the source of the data before it is stored or transmitted) in opposition to channel coding.
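
As a quick hands-on illustration, Python's built-in zlib module, which implements DEFLATE (described below), demonstrates lossless compression; the sample text is arbitrary:

```python
import zlib

# Redundant text compresses well, and the original is recovered
# bit-for-bit - which is exactly what "lossless" means.

original = b"knowledge preservation " * 100
compressed = zlib.compress(original)

print(len(original), "->", len(compressed), "bytes")
assert zlib.decompress(compressed) == original   # no information lost
```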

Lossless Compression is a class of data compression algorithms that allows the original Data to be perfectly reconstructed from the compressed data. By contrast, lossy compression permits reconstruction only of an approximation of the original data, though this usually improves compression rates (and therefore reduces file sizes).

Lossy Compression is the class of data encoding methods that uses inexact approximations and partial data discarding to represent the content. These techniques are used to reduce data size for storage, handling, and transmitting content.

Entropy Encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium. One of the main types of entropy coding creates and assigns a unique prefix-free code to each unique symbol that occurs in the input. These entropy encoders then compress data by replacing each fixed-length input symbol with the corresponding variable-length prefix-free output codeword. The length of each codeword is approximately proportional to the negative logarithm of the probability. Therefore, the most common symbols use the shortest codes.

Asymmetric Numeral Systems is a family of entropy coding methods introduced by Jarosław (Jarek) Duda, used in data compression since 2014 due to improved performance compared to previously used methods. ANS combines the compression ratio of arithmetic coding (which uses a nearly accurate probability distribution) with a processing cost similar to that of Huffman coding.

How Computers Compress Text: Huffman Coding and Huffman Trees (youtube)

Zip (file format) is an archive file format that supports lossless data compression. A .ZIP file may contain one or more files or directories that may have been compressed. The .ZIP file format permits a number of compression algorithms, though DEFLATE is the most common.

DEFLATE is a lossless data compression algorithm and associated file format that uses a combination of the LZ77 algorithm and Huffman coding.

Huffman Coding is a particular type of optimal prefix code that is commonly used for lossless data compression.
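
A minimal sketch of constructing a Huffman code with a priority queue, repeatedly merging the two least frequent subtrees so that common symbols end up with the shortest codewords (the helper name and sample sentence are illustrative):

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict[str, str]:
    """Build a prefix-free code; a counter breaks ties so heap tuples compare cleanly."""
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

sentence = "this is an example of a huffman tree"
codes = huffman_code(sentence)
encoded = "".join(codes[ch] for ch in sentence)
print(len(sentence) * 8, "->", len(encoded), "bits")  # frequent symbols got short codes
```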

Prefix Code is a type of code system (typically a variable-length code) distinguished by its possession of the "prefix property", which requires that there is no whole code word in the system that is a prefix (initial segment) of any other code word in the system. For example, a code with code words {9, 55} has the prefix property; a code consisting of {9, 5, 59, 55} does not, because "5" is a prefix of "59" and also of "55". A prefix code is a uniquely decodable code: given a complete and accurate sequence, a receiver can identify each word without requiring a special marker between words. However, there are uniquely decodable codes that are not prefix codes; for instance, the reverse of a prefix code is still uniquely decodable (it is a suffix code), but it is not necessarily a prefix code.

Variable-Length Code is a code which maps source symbols to a variable number of bits. Variable-length codes can allow sources to be compressed and decompressed with zero error (lossless data compression) and still be read back symbol by symbol. With the right coding strategy an independent and identically-distributed source may be compressed almost arbitrarily close to its entropy. This is in contrast to fixed length coding methods, for which data compression is only possible for large blocks of data, and any compression beyond the logarithm of the total number of possibilities comes with a finite (though perhaps arbitrarily small) probability of failure.



The Test of Time


Even if our information and knowledge is saved in digital format, on paper, or somehow saved in bacteria or in our DNA, there is still no guarantee that the information and knowledge will not be lost or destroyed. One idea would be to launch multiple unmanned spacecraft, like Voyager 1, into space, programmed to stay within the solar system and to return to earth at 500-year intervals. If we are still here and the earth is still inhabitable, then the space probe would land in a populated area so that its information and knowledge can be retrieved. People would then update the space pod and send it back out into space. And if the space pod returns to earth's orbit and sees no life because earth has become uninhabitable for whatever reason, then the space pod would leave earth's orbit and check other planets in the solar system for signs of life. The space pod would keep doing this as long as it survives. I kind of get this feeling that this has happened already before, besides what we have seen portrayed in some of our sci-fi movies with extraterrestrials, of course.

Seed ships could be entirely robotic, but might contain human embryos that could be delivered to distant star systems where they would be incubated and, presumably, raised by robo-caretakers. Knowledge Ark  

Colonization of the Moon (wiki)

Maybe we could build a Monolith that could store our most valuable information and knowledge. We could make it out of the same material that could survive deep space and also survive entering a planet's atmosphere, and maybe even survive a Black Hole. So I'm thinking, maybe that's the reason our universe is here, because someone had already thought of a way to preserve information in the previous universe.

Asgardia, a free and unrestricted society which holds knowledge, intelligence and science at its core, will launch a satellite later this year to test the concept of long-term data storage in orbit around the Earth.

Artificial Intelligence 

Human embryos and sperm would have to be cryogenically frozen and then raised during space flight by AI robots trained to raise children. Humans can also adapt to space travel. And when they find a new planet, they will use our original DNA to raise the original human species, adapted to the new planet's environment.

Deinococcus radiodurans is an extremophilic bacterium, one of the most radiation-resistant organisms known.

In order to travel in space for hundreds of years to reach a new habitable planet, humans would need to evolve into a different kind of human more suitable for space travel. Eventually humans would look like space aliens, with big heads and little bodies. But as long as humans preserve their original DNA in eggs and sperm, they could raise original humans again to adapt to their new world. Again, this sounds like it has already happened. Déjà vu - Jamais vu

Mammal embryos can develop fully in space

"If life already evolved on another planet before the earth was born, then maybe life on that planet launched a pod into space, like a seed from tree, hoping to land somewhere to grow again, and keep life moving forward."

"When I think about how to preserve our information and knowledge, I can't help but think that someone millions of years ago already solved that problem because we would not be here if they didn't."

"If we ever did lose all our knowledge, and we had to start all over again, we would most likely do the same things and make the same mistakes, all because we did not learn enough, or teach enough."


John Adams Preserve Knowledge Quote


For 20 years, beginning in the 1950s, states laminated documents to try to protect them. But it caused a chemical reaction. The natural acids from the paper mixed with the degrading laminate to create a noxious vinegar. Each passing year will further degrade the document until it's gone. There are as many as 6 million laminated historical documents.


