Life After Capitalism: The Great Challenge of the Coming Era

The invention we now call “information theory” came from Claude Shannon’s 1948 paper “A Mathematical Theory of Communication.”

It defined information as “unexpected bits,” or “surprisal,” measured by the base-2 logarithm of the probability of each element in a message. Adding these logarithms together allows the calculation of the carrying capacity of a communications channel, whether a wire or the world.

Because probabilities are fractions between 0 and 1, their logarithms are negative, so Shannon put a negative sign before the sum. A negative times a negative is a positive. Thus, the information content is always positive.
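In modern terms, the computation is short enough to run. Here is a minimal Python sketch of the measure (the function names are mine, for illustration): the surprisal of an outcome is the negative base-2 logarithm of its probability, and the entropy of a source is its average surprisal.

```python
import math

def surprisal(p: float) -> float:
    """Information of one outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Shannon entropy: the probability-weighted average surprisal, in bits."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A fair coin is maximally surprising: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A loaded coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # ~0.47
```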

The Concept of Thermodynamic Entropy… And Why It Matters

The great John von Neumann, who formalized the mathematics of quantum theory and gave his name to the prevailing computer architecture, knew Shannon from Princeton’s Institute for Advanced Study, where Shannon spent a fellowship year. It’s alleged that von Neumann suggested naming the information gauge “entropy,” since Shannon’s information index shared a formal equation with the concept of thermodynamic entropy.

Here began a gigantic confusion that has persisted for nearly three quarters of a century. Believing that information inheres in the identifiable patterns of a code — such as an orderly language, physical arrangement, or DNA helix — brilliant scholars around the world have jumped to define information as order. And they are correct that order is necessary to intelligible information. To bear unpredictably informative bits, it takes a predictable code or language or framework of regularity.

As I have written, it takes a low-entropy (predictable) carrier to bear high-entropy (unpredictable) information. Unless you can separate the message from the carrier at the other end of the line, no information can be received. But the fundamental error afflicting most economic thought is the idea that the essence of information is a patterned carrier… rather than the unexpected or surprising modulation of the predictable carrier.
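To make the carrier-and-modulation point concrete, here is a minimal Python sketch using on-off keying, the simplest modulation scheme (the frequency, sample rate, and threshold are arbitrary illustrations of mine, not anything from Shannon’s paper). The carrier sine wave is perfectly predictable; the bits riding on it are not; and the receiver recovers them precisely because it knows the carrier in advance.

```python
import math

CARRIER_HZ = 10.0       # the predictable, low-entropy carrier frequency
SAMPLES_PER_BIT = 100   # samples per bit interval (one time unit per bit)

def modulate(bits: list[int]) -> list[float]:
    """On-off keying: the surprising bits switch a perfectly regular sine wave."""
    wave = []
    for i, bit in enumerate(bits):
        for s in range(SAMPLES_PER_BIT):
            t = (i * SAMPLES_PER_BIT + s) / SAMPLES_PER_BIT
            wave.append(bit * math.sin(2 * math.pi * CARRIER_HZ * t))
    return wave

def demodulate(wave: list[float]) -> list[int]:
    """Knowing the carrier, the receiver strips it off and keeps the news."""
    bits = []
    for i in range(0, len(wave), SAMPLES_PER_BIT):
        energy = sum(x * x for x in wave[i:i + SAMPLES_PER_BIT])
        bits.append(1 if energy > 0.1 else 0)
    return bits

message = [1, 0, 1, 1, 0]
assert demodulate(modulate(message)) == message
```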

Because the electromagnetic spectrum is an infinite expanse of frequencies regulated everywhere by the ultimate physical constant of the velocity of light, most of the world’s information is flowing toward the spectrum as a carrier. It is always possible to separate the message from the speed-of-light carrier.

Both forms of entropy measure degrees of disorder. Thermodynamic entropy measures the breakdown of orderly systems of physics and chemistry — such as organic or inorganic structures — into disorderly and random arrangements of atoms and molecules. Information entropy measures the degree of “surprisal” in a message, which is also a kind of disorder.

For example, one bit, with two equally likely outcomes, carries a base-2 logarithm of 1 (2^1 equals 2; the base-2 log of 2 is 1). Three bits yield eight equally likely outcomes (2x2x2), for a base-2 log of 3; a full byte of 8 bits yields 2^8 = 256 possible values, for a base-2 log of 8.
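The arithmetic is easy to verify; a quick check in Python, with the outcome counts spelled out:

```python
import math

print(math.log2(2))        # 1.0 -- one bit: two possible outcomes
print(math.log2(8))        # 3.0 -- three bits: 2 x 2 x 2 = 8 outcomes
print(math.log2(2 ** 8))   # 8.0 -- one byte: 256 possible values
```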

This index of information enables network architects and telecommunications engineers to calculate the information-bearing capacity of any wire, fiber optic thread, expanse of “air,” or satellite link to move information from one point to another in space. It also measures the capacity of a memory chip or disk to store and transfer information.
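The classic instance is Shannon’s own capacity theorem, which gives the maximum error-free rate of a noisy channel from just its bandwidth and signal-to-noise ratio. A minimal Python sketch (the phone-line numbers are a textbook illustration, not from the text above):

```python
import math

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley theorem: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz voice line with a 30 dB signal-to-noise ratio (a factor of 1000):
print(capacity_bps(3_000, 1_000))   # ~29,900 bits per second
```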

By basing information on surprisal, Shannon’s index can also measure creativity, which is likewise gauged by its surprise. Providentially, since surprisal is calculated from the degrees of freedom of a message’s creator (the size of his portfolio of possibilities), Shannon’s theory ties the economics of growth and creativity inexorably to the defense of liberty.

As Shannon anticipated, it further enables biologists to measure the capacity of the human genome to transmit the information that characterizes life. In his 1940 doctoral dissertation, “An Algebra for Theoretical Genetics,” he showed how such a capacity could be calculated. The communication of the intricate codes of life-bearing information over the millennia is incomparably the most impressive feat of information storage and transmission in the universe.

How DNA Could Solve Our Information Storage Problem

DNA storage and communication are not only matters of the past. The future of information storage may lie not in silicon chips but in carbon DNA molecules. As Robert Service observed in the March 2, 2017 issue of Science, DNA has the potential to solve the information storage problem of a knowledge economy.

DNA has many advantages for storing digital data. It’s ultracompact, and it can last hundreds of thousands of years if kept in a cool, dry place. And as long as human societies are reading and writing DNA, they will be able to decode it.

“DNA won’t degrade over time like cassette tapes and CDs, and it won’t become obsolete,” says Yaniv Erlich, a computer scientist at Columbia University.

Several startups are pursuing this grail.
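The principle is simple even if the chemistry is not: DNA’s four bases can carry two bits each. Here is a toy Python sketch of the naive mapping (real systems, such as the “DNA Fountain” scheme Erlich published in that same issue of Science, add error correction and avoid repetitive base runs; this mapping is mine, for illustration only):

```python
# Map each 2-bit pair to one of DNA's four bases: 4 bases = 2 bits per base.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Write digital data as a strand of bases, two bits at a time."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Read the strand back into bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)                 # CAGACGGC -- one base per two bits
assert decode(strand) == b"Hi"
```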

Although Shannon was skeptical of efforts to extend his concept of information into other fields, the relationship between low-entropy carriers and high-entropy messages is relevant to music, art, literature, and many other fields, such as economics.

Shannon’s entropy of surprisal is the foundation of the information theory of economics, in which progress is measured by the surprisal inherent in entrepreneurial creativity.

Little known is the fact that Shannon first broached the term “information theory” during World War II. At Bell Labs in New York City, he was working on secret codes for the military in a paper called “A Mathematical Theory of Cryptography.” The paper was immediately classified. But it introduced the ideas that he later enshrined in 1948 in “A Mathematical Theory of Communication.”

Cryptography is a kind of inverted or negative form of information theory, a theory of disinformation; as Shannon remarked, “They are so close together you cannot separate them.”
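The classified paper proved, among other things, the perfect secrecy of the one-time pad: a properly encrypted message is indistinguishable from maximum-entropy noise, which is why the two theories cannot be pried apart. A minimal Python sketch of the scheme:

```python
import secrets

def xor_pad(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR with a truly random key as long as the message."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))     # fresh random key, used exactly once

ciphertext = xor_pad(message, key)          # pure surprisal without the key
assert xor_pad(ciphertext, key) == message  # the same XOR decrypts
```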

Nonetheless, the creators of the internet separated them. They focused almost entirely on communication, modulation, and mutability rather than on security, factuality, and immutability. The internet model maximized propagation and copying rather than ground states and secure identities. Security became a post-hack patchwork rather than an architectural endowment.

Today’s Prophecy

The great challenge of the new LAC era (Life After Capitalism) will be to endow the architecture of the internet and the world economy with this complement of Shannon’s high-entropy information: the low-entropy carriers of security and factuality, true money and measurement.

In the new era, the separation of Shannon’s inseparables — carriers and content, cryptography and information theory, measurement and money, order and disorder — will be reversed. We will transcend the sterile categories of a capitalism defined and stigmatized by Marxian materialism and achieve an understanding of human progress as the surprising creations of entrepreneurs.

Bringing the infocosm back together with the cryptocosm, we can remedy the tragic separation of communication from truth that has debauched the world’s money and its information as well.

Regards,

George Gilder
Editor, Gilder’s Daily Prophecy
