mmt | 7 years ago
This struck me as a strange analogy, considering DNA's inherent fragility, but it makes more sense as an analogy to modern software than to hardware.
Alternatively, it also makes sense if "hardware" means a particular model/architecture, with DNA corresponding to an HDL, rather than an instance of hardware (e.g. single CPU, server, or smartphone). A frequent enough topic on HN is the challenge archivists have with archaic software and data formats, even if all the original collections-of-bits are faithfully preserved.
Real_S | 7 years ago
DNA in living cells has some fragility because it is in an aqueous solution and is actively used to generate RNA. DNA outside of cells is even more fragile; it will inevitably be eaten by bacteria. But put DNA in the right sterile environment, and it can last for thousands of years, with great resistance to electromagnetic interference.
mmt | 7 years ago
This has echoes of No True Scotsman. Computer storage media also have ideal conditions that can be used to extend their lifetimes (though, granted, not indefinitely, AFAIK).
What about in real conditions, subject to things like temperature variations (including "extreme" heat that non-operating computer hardware handles just fine), exposure to light, humidity from the air (to put it back into aqueous solution occasionally), and common oxidizers found floating around in the air?
Could one rely on an arbitrary single strand to last even 5 years in an office environment, or are numerous RAID1-style copies required to maintain fidelity?
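To make the RAID1 analogy concrete: with independent errors across redundant copies, a per-position majority vote recovers the original sequence as long as most copies agree at each position. This is a minimal illustrative sketch (the `consensus` function and the example strands are hypothetical, not from any real DNA-storage system, which would use proper error-correcting codes on top of replication):

```python
from collections import Counter

def consensus(copies):
    """Recover a sequence by per-base majority vote across redundant copies.

    Assumes all copies have equal length and that degradation errors are
    independent, so the majority base at each position is most likely correct.
    """
    return "".join(
        Counter(bases).most_common(1)[0][0]
        for bases in zip(*copies)
    )

# Three degraded copies of the same hypothetical strand, each with one
# independent single-base error at a different position.
copies = [
    "ACGTACGT",
    "ACGTACCT",  # error at position 6
    "ACCTACGT",  # error at position 2
]
print(consensus(copies))  # prints "ACGTACGT"
```

With only a single copy, any one of those errors would be unrecoverable, which is the crux of the question above.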