bit decay ==> bit rot

bit rot
<jargon> A hypothetical disease whose existence has been deduced from the
observation that unused programs or features will often stop working after
sufficient time has passed, even if "nothing has changed". The theory explains
that bits decay as if they were radioactive. As time passes, the contents of a
file or the code in a program will become increasingly garbled.

People with a physics background tend to prefer the variant "bit decay" for the
analogy with particle decay.

There actually are physical processes that produce such effects (alpha particles
generated by trace radionuclides in ceramic chip packages, for example, can
change the contents of a computer memory unpredictably, and various kinds of
subtle media failures can corrupt files in mass storage), but they are quite
rare (and computers are built with error detection circuitry to compensate for
them). The notion long favoured among hackers that cosmic rays are among the
causes of such events turns out to be a myth.
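The compensation works by redundancy: the system keeps extra check bits
alongside the data and recomputes them on every read. The sketch below is a
minimal software analogy in Python, not a description of how hardware ECC is
actually built; it pairs a payload with a CRC-32 checksum so that a later read
can at least detect that a bit has rotted.

# Software analogy of error detection: keep a checksum with the data and
# recompute it on read. (Real memories use hardware ECC such as SECDED
# Hamming codes; this only illustrates the same redundancy idea.)
import zlib

def store(data: bytes) -> tuple[bytes, int]:
    # Save the payload together with its CRC-32 check value.
    return data, zlib.crc32(data)

def load(data: bytes, checksum: int) -> bytes:
    # Recompute the CRC and refuse to return silently corrupted data.
    if zlib.crc32(data) != checksum:
        raise IOError("bit rot detected: checksum mismatch")
    return data

payload, crc = store(b"important bits")
load(payload, crc)                                   # intact: returns the data

flipped = bytes([payload[0] ^ 0x01]) + payload[1:]   # simulate a single flipped bit
try:
    load(flipped, crc)
except IOError as exc:
    print(exc)                                       # bit rot detected: checksum mismatch

Detecting the flip is the easy half; correcting it, as ECC memory does,
requires a code with more redundancy than a bare checksum.
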
Bit rot is the notional cause of software rot.
See also computron, quantum bogodynamics.