
How are information, mass and energy related to each other?

8 Answers
Gregory Bloom
Gregory Bloom, Aspiring Curmudgeon
Mass and energy are fungible - each can be converted to the other.  So your question boils down to "How is information related to energy?"

The short answer is: thermodynamics.

The slightly longer but still very 'hand-wavy' answer is that every change in the universe involves a change in energy that increases entropy.  Let's say that again, shorter and punchier: each new state of the universe requires an increase in entropy.  Entropy involves a greater distribution of "stuff" across an ever-increasing number of "degrees of freedom".  Whether that means electrons ("stuff") occupying more possible energy levels ("degrees of freedom"), or sodium and chlorine ions occupying different configurations of ionic bonding with protons and OH- ions in solution, the "stuff" is always getting spread across more "degrees of freedom".

The first obvious tie to information is that a full description of any new state of the universe will always require more information, since the description must list all the new degrees of freedom that can be occupied and the probabilities that they are.  The universe always becomes more complex, because it always moves into a state that requires more information to completely describe.

To illustrate: energy and information are tied through the energy and entropy required to "care" about a specific configuration.  For example, we can adequately describe the contents of uninitialized RAM by saying something like "this memory has 1024 gigabits, each set to either one or zero, but we don't particularly care which, so we'll just list them all having a value of X, and each bit is free to be either value".  If we decide to enforce a particular pattern of ones and zeroes onto that RAM, our description becomes vastly longer, like "this memory has 1024 gigabits, and the first bit is 1, the second bit is 1, the third bit is 0..." and so on.

Now, given that the universe appears to prefer states that require longer descriptions, you might think that imposing particular patterns on things would be something the universe prefers to do.  Alas, it is not so.  Since the universe always requires more degrees of freedom to become occupied, and since changing an 'X' (don't care) value into either a '1' or a '0' removes a degree of freedom (since 'X' is free to be either '1' or '0'), we must increase the entropy somewhere else in the universe.

And that somewhere else is, in the case of RAM, in the increased distribution of the motion of the molecules making up the memory and the matter surrounding it as electrical energy is converted to heat.  So the description would go from "a bunch of electrons packed together close enough that their repulsion is like a bunch of springs squeezed into a box" to "a bunch of molecules wiggling around in a bunch of different ways - some stretching, some rotating, some bending".  A full description of all the ways these molecules are wiggling would require much more information than the loss of degrees of freedom in our RAM, so the universe allows it.
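The RAM example above can be sketched numerically. This is a toy illustration (the 8-bit memory size and the uniform don't-care assumption are mine, not the answerer's): a "don't care" bit contributes a full bit of entropy, while a bit forced to a definite value contributes none.

```python
import math

def bits_to_specify(n_bits, p_one=0.5):
    """Entropy (in bits) of n independent memory bits.
    A 'don't care' bit is free to be 0 or 1 with equal probability,
    so it contributes one full bit; a forced bit contributes nothing."""
    if p_one in (0.0, 1.0):
        return 0.0  # value is forced: no remaining freedom
    h = -(p_one * math.log2(p_one) + (1 - p_one) * math.log2(1 - p_one))
    return n_bits * h

ram = 8  # a tiny 8-bit "memory" for illustration
print(bits_to_specify(ram, 0.5))  # uninitialized: 8.0 bits of freedom
print(bits_to_specify(ram, 1.0))  # forced to all ones: 0.0 bits of freedom
```

Writing a definite pattern removes those 8 bits of freedom from the memory, which is exactly the entropy that must reappear somewhere else, e.g. as heat.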
Abhishek Kumar
Abhishek Kumar, played with physics of neural computation
One can't overstate the importance of "entropy" in connecting these three nodes. The concept of entropy is the common thread that connects the physical world (mass, energy) to the world of uncertainty about events (information).

Confused? Stay with me a little longer, and let's introduce this chain:

mass - energy - entropy - information


Part 1. Mass - Energy: the equivalence is stated as E = mc²

Intuitively, it may be easy to see how this plays out in daily life. We take some gas (mass) and ignite it to create fire (energy): energy is not created; rather, mass is converted into energy. (Strictly speaking, combustion releases stored chemical energy, and the corresponding loss of mass is minuscule, but the bookkeeping is the same.)
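To get a sense of scale for E = mc², here is a quick sketch (the one-gram figure is my own illustrative choice; the speed of light is the standard exact value):

```python
# Mass-energy equivalence: E = m * c^2
c = 299_792_458.0   # speed of light in m/s (exact, by definition)
m = 0.001           # one gram of mass, in kg

E = m * c ** 2      # energy in joules
print(f"{E:.3e} J") # ~9.0e13 J, on the order of a 1 GW power plant's daily output
```

The enormity of c² is why everyday combustion barely changes mass at all.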

FIRST INTRODUCED: 1905: Einstein formulated this relationship, which became the basis of how mass is converted into energy. To understand the details, you may visit this paper - http://www.fourmilab.ch/etexts/e...

Part 2. Information - Energy: Allow me to introduce a term called "entropy" that connects Information and Energy: Information - Entropy - Energy.

Entropy (IMHO) is probably the most intriguing concept in physical science that connects the physical world (energy, mass) to the information world.

  • Energy - Entropy:
Entropy is a measure of the disorder of a system, and it takes energy to reduce that disorder.

In the case of classical systems, this relationship is expressed as: increase in entropy (disorder) = heat absorbed / absolute temperature, i.e. ΔS = Q / T.

Intuitively, it takes heat absorption at constant temperature to change ice into water. When ice changes into water, entropy (disorder) increases.
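The ice example can be made concrete with a back-of-the-envelope sketch (the 10 g mass is my own choice; the latent heat of fusion and melting point are standard textbook values):

```python
# Entropy change when melting ice at its melting point: dS = Q / T
m = 0.010        # 10 g of ice, in kg
L_f = 334_000.0  # latent heat of fusion of water, J/kg
T = 273.15       # melting point of ice, in kelvin

Q = m * L_f      # heat absorbed at constant temperature
dS = Q / T       # entropy increase of the ice as it becomes water
print(f"Q = {Q:.0f} J, dS = {dS:.2f} J/K")  # Q = 3340 J, dS = 12.23 J/K
```

The temperature stays fixed during melting, so all of the absorbed heat goes into increasing disorder rather than raising the temperature.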


FIRST INTRODUCTION: 1865: This concept is more than 100 years old, introduced in http://gallica.bnf.fr/ark:/12148... by the German scientist Rudolf Clausius (http://en.wikipedia.org/wiki/Rud...)
  • Information-Entropy :

The most likely state of an object is the one we expect, and the quantity of information conveyed by observing the object in a state is precisely the inverse of that state's likelihood. That quantity is what you would want in order to size the amount of information. In other words, information is the inverse of certainty.

For example, imagine a light bulb that always glows red: there is no information there, only certainty. If, on the other hand, the bulb glows in one of several colors according to some probability distribution, there is uncertainty, and that uncertainty is information - the unknown.

Information = log₂(1 / P(state of object)) = the uncertainty that the object is in that state; the less probable the state, the more information its observation carries
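The light-bulb example can be put in code. This is a minimal sketch using Shannon's self-information, measured in bits (the 4-color uniform bulb is my own illustrative assumption):

```python
import math

def self_information_bits(p):
    """Shannon self-information of an event with probability p, in bits."""
    return math.log2(1.0 / p)

# A bulb that always glows red carries no surprise:
print(self_information_bits(1.0))   # 0.0 bits

# A bulb that picks one of 4 colors uniformly; seeing any one color:
print(self_information_bits(0.25))  # 2.0 bits
```

A certain event (p = 1) carries zero information; halving the probability adds exactly one bit.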


Now you can see how entropy, which is a measure of disorder, is closely connected with information. The precise definition is:

Entropy (the measure of disorder) is the expected value of information (uncertainty) taken across the different possible states.
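That definition, "expected value of information across states", translates directly into a few lines of code (the example distributions are my own illustrations):

```python
import math

def shannon_entropy_bits(probs):
    """Expected self-information: H = sum(p * log2(1/p)), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Always-red bulb: one certain state, zero entropy.
print(shannon_entropy_bits([1.0]))            # 0.0

# Bulb that is red half the time, green or blue a quarter each:
print(shannon_entropy_bits([0.5, 0.25, 0.25]))  # 1.5
```

The more evenly spread the probabilities, the higher the entropy, which is exactly the "disorder" reading of the physical definition.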

FIRST INTRODUCTION: 1948: For further reference, read this landmark paper http://cm.bell-labs.com/cm/ms/wh... by Claude Shannon http://en.wikipedia.org/wiki/Cla...

--------------------------------------------------------------------------------------
In short:

There is an imaginary object in the world called M.

M burns =>
Converts into energy E =>
E becomes a part of the surroundings S and increases their disorder =>
E just caused uncertainty =>
E converted itself into something unknown - information I

We went from physical space to abstract space: a completely different space.

We covered Clausius, Einstein, and Shannon and went from 1865 to 1948.

Boom!
--------------------------------------------------------------------------------------
The last answer was so right that I was inspired to find some quotes to add to the constructed model.

"Pure information, in the mathematical sense, does not require energy; it is that which orders energy. It is the negative of entropy, that which brings disorder to energy systems."

-- Robert Anton Wilson (I highly recommend reading Prometheus Rising for more information on related topics)


"Imagine that your brain is a computer, as modern neurology suggests. Now imagine that the whole universe is a big computer, a mega-computer, as John Lilly has proposed. Then imagine that the sub-quantum realm, the realm of what Dr. David Bohm calls 'hidden variables' is made up of mini-mini- computers. Now, the hardware of each 'computer'—the universe, your brain, the sub-quantum mechanisms—is localized. Each part of it is somewhere in spacetime, here not there, now not then. But the software—the information—is non-local. It is here, there and everywhere; now, then and everywhen."

-- Jack Sarfatti
Har Sukhdeep Singh
All three concepts are very good for doing calculations, but it is almost impossible to elaborate on them. On gut feeling they seem to be intimately connected, entangled. In a very rudimentary way, one can say that information is the blueprint of any change, plus the spec sheet of the previous and final states. Energy is what allows the transition from the previous state to the next. Mass is what adds a brute physical aspect to states. Entropy is what tells us about the disorder (lack of information, a sort of homogeneity) in the states. Mass and energy are very well correlated, but information is still not so well unified with mass and energy. Maybe more about information is yet to come: information (potential, field, current, conservation). It is my personal belief that information is the basis of all that is created, and that humans connect with creation through information alone.

This angle may help the dialog:

When we have an original thought, do we add new information to this universe? How can we view this given the supposition that energy can't be created or destroyed? If we invent a new concept, have we changed the amount of information within the universe, or just translated existing information?

Franz Plochberger
Franz Plochberger, IT-Orientation-on-Human-Being

From the standpoint of information science, mass and energy are attributes of physics.

Information needs a human connection, for instance the mass or energy of a human being as biological, physical attributes.

But that is all!