Information and Energy, and Entropy!

What follows is quite incomprehensible if you do not have some idea of maths/physics, but read on anyway :) The question I have been asking for quite some time is: “What is information?”, from an abstract point of view. I would hardly have expected a physicist to answer this, but they did. It turns out there is a connection between the energy of a system and the information it contains. More precisely, the more states a system can take, the more information it contains (the relation is logarithmic, as we will see below). This idea goes back to Boltzmann, who is rightly well known for his huge contributions to physics. Take for example two people: A, who sits on a chair the whole day, and B, who keeps running around the whole day. If someone tells you A is sitting on the chair, you already knew it, so it is nothing new to you. However, news about B’s whereabouts will always be new information to you. Therefore, an energetic system tends to contain more information than a static one.

One could also understand this by looking at a storage cell that can hold n bits. As n increases, the amount of information stored increases, and so does the amount of energy the cell contains. (This is not to be confused with present-day memories like RAM, where most power is consumed by resistance and (silicon) crystal imperfections.) Boltzmann stated that if a system can have M mutually distinguishable states, then its entropy is \log{M} . Entropy can be called a measure of the randomness in a system: that part of the system’s energy that is unavailable for useful work.
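To make Boltzmann’s counting concrete (a small worked example of my own, taking logarithms to base 2): an n-bit cell has M = 2^n mutually distinguishable states, so its entropy is \log{M} = \log{2^n} = n bits, which is exactly its storage capacity.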

Claude Shannon, founder of practically everything we know (if we exaggerate a bit), generalized Boltzmann’s hypothesis to the case where we have some idea of the probability of each state of the system occurring, i.e., we have a probability distribution over the states of the system. In Boltzmann’s view the distribution was uniform, and hence the \log{M} result. In a probabilistic system one can only talk of the expected information, and that is what we get: the information entropy is I = -\sum_x p_x \log{p_x} . Note that this reduces to the previous form if p_x = \frac{1}{M} for every state x. Here, since we know something about how the system behaves, the entropy drops from the maximum \log{M} to I . If the logarithm is taken to base 2, the unit of the value is called a bit; if natural logarithms are used, it is called a nat. Interestingly, one nat corresponds to the Boltzmann constant when entropy is expressed in thermodynamic units. Thus we see the beginning of a link between entropy and information: we reduce the entropy of a system by gaining more information about the states in which it can be present. If we know exactly what state it is in, the entropy is effectively zero.
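Here is a minimal Python sketch of how the formula behaves (the function name and the example distributions are mine, purely for illustration):

import math

def shannon_entropy(probs, base=2):
    """I = -sum_x p_x * log(p_x); terms with p_x == 0 contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

M = 8
uniform = [1 / M] * M
print(shannon_entropy(uniform))            # 3.0 bits == log2(8): Boltzmann's uniform case
print(shannon_entropy([0.9, 0.05, 0.05]))  # ~0.57 bits: a predictable system carries less surprise
print(shannon_entropy([1.0]))              # zero bits: the state is known exactly

The more lopsided the distribution, the less we learn on average from seeing the outcome, which is exactly the A-on-the-chair versus B-running-around intuition from above.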

How do we find the distribution of states? We have to measure. Therefore, what is being implied is that measurement reduces the entropy of a system, or in other words, that measurement reduces the uncertainty we have about it, which seems intuitively correct. If system A measures system B, B’s entropy (as seen by A) is reduced, whereas the entropy of the combined system of A and B is not (from the viewpoint of a system C which has not measured either A or B). The information gained via measurement must be stored (and/or erased) somewhere, and this requires energy. This can be seen as the resolution of the famous Maxwell’s Demon paradox, which appeared to violate the second law of thermodynamics.
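A toy illustration of the viewpoints involved (the four-state system and the one-bit measurement are assumptions of mine, chosen only to keep the numbers simple):

import math

# B can be in one of 4 equally likely states: entropy = log2(4) = 2 bits.
H_before = math.log2(4)

# A measures one bit of B (say, whether the state index is even).
# From A's viewpoint only 2 equally likely states remain: 1 bit.
H_after_for_A = math.log2(2)

# C has seen neither A's apparatus nor its result, so from C's viewpoint
# B (and the combined A + B system) is no less uncertain than before.
H_for_C = math.log2(4)

print(H_before, H_after_for_A, H_for_C)  # 2.0 1.0 2.0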

These views matter in the theory of computation, especially for the lower limits on the energy required for computation. Say I have a system that takes in 2 bits and gives out 1 bit (like an OR gate); then the energy expended must be at least equal to the difference in information entropy, which here is 1 bit. Similarly, if the information entropy increases, the system must take in energy. You can read all this and more (especially the qualifications) in this paper.
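To put a number on that lower bound: converting bits into joules brings in the Boltzmann constant and the temperature, which is usually stated as Landauer’s principle (at least k_B T \ln{2} joules dissipated per bit erased). A quick back-of-the-envelope calculation, assuming room temperature:

import math

K_B = 1.380649e-23  # Boltzmann constant, in J/K

def landauer_limit(bits_erased, temperature_kelvin):
    """Minimum energy (in joules) dissipated when erasing the given
    number of bits at the given temperature: E >= k_B * T * ln(2) per bit."""
    return bits_erased * K_B * temperature_kelvin * math.log(2)

# An OR gate maps 2 input bits to 1 output bit, so each operation
# erases at least 1 bit; at room temperature (300 K) that costs at least:
print(landauer_limit(1, 300))  # ~2.87e-21 J

This is many orders of magnitude below what real gates dissipate today, which is why the resistive and material losses mentioned earlier dominate in practice.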
