Monday, February 28, 2011

A Short History of Information

How We Know, by Freeman Dyson (h/t Ritholtz):

In 1949, one year after Shannon published the rules of information theory, he drew up a table of the various stores of memory that then existed. The biggest memory in his table was the US Library of Congress, which he estimated to contain one hundred trillion bits of information. That was at the time a fair guess at the sum total of recorded human knowledge. Today a memory disc drive storing that amount of information weighs a few pounds and can be bought for about a thousand dollars. Information, otherwise known as data, pours into memories of that size or larger, in government and business offices and scientific laboratories all over the world. Gleick quotes the computer scientist Jaron Lanier describing the effect of the flood: "It's as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet."

I still remember the family Radio Shack TRS-80 Model III, which had 48k of RAM and something like a 64k hard drive, and cost several thousand dollars for a used model.