The Second Law of Thermodynamics says that in a closed system the entropy will either stay the same (for reversible processes) or increase over time (for irreversible processes). The entropy of a closed system can never decrease, except for short-lived random fluctuations. But an open system that can interact with its environment can have its entropy decrease.
The word entropy can be understood in several ways, and one such way is in terms of order. Highly ordered systems have low entropy while highly disordered systems have high entropy. So the Second Law can also be stated as the tendency of closed systems to move towards increasing disorder.
All this is pretty well known, with confusion arising mainly from the willful obtuseness of some religious people who ignore the distinction between open and closed systems in order to argue that evolution violates the Second Law, since evolution seems to produce more order. But such an increase in order is possible because the Earth is an open system that interacts with the rest of the universe, most importantly the Sun.
Less well known is that there is a negative correlation between entropy and information, where a decrease in information leads to an increase in entropy and vice versa. Seen that way, the relationship between entropy and order becomes a little more transparent since we are likely to have more information about ordered systems (that have less entropy) than about disordered systems (that have more entropy).
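To make the information side of this a little more concrete, here is a rough sketch in Python (my own illustration, not anything from the original discussion) that computes the Shannon entropy of a string from its symbol frequencies. Ordered, predictable data scores low; data with no discernible pattern scores high.

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Empirical Shannon entropy in bits per symbol: H = sum of p * log2(1/p)
    over the observed symbol frequencies p in the string."""
    counts = Counter(s)
    n = len(s)
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("aaaaaaaaaaaaaaaa"))  # one repeated symbol: 0.0 bits/symbol
print(shannon_entropy("abababababababab"))  # two symbols, 50/50: 1.0 bits/symbol
print(shannon_entropy("q7f!kz0pw3mx9vbn"))  # 16 distinct symbols: 4.0 bits/symbol
```

One caveat worth noting: symbol frequencies alone miss sequential structure, which is why the strictly alternating string scores above zero even though it is highly ordered in another sense. How much 'information' a system contains depends on how you choose to describe it.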
To lift this out of a hand-waving level of argument and into more rigorous science, we have to quantify the concepts of entropy and order so that we can measure or calculate them and establish the appropriate relationships. This has been done over many years, ever since Ludwig Boltzmann identified the key relationship between entropy S and the number W of microscopic arrangements (microstates) compatible with a system's macroscopic state, written as S = k ln W, where k is a universal constant known as the Boltzmann constant and ‘ln’ is the abbreviation for ‘natural logarithm’. (This equation appears on his tombstone.)
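As a toy illustration of Boltzmann's formula (my own example; the system of N two-state 'coins' is not in the original text), the perfectly ordered macrostate 'all heads' can be realized in only one way and so has zero entropy, while the fifty-fifty macrostate can be realized in an enormous number of ways:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(W: int) -> float:
    """S = k ln W, where W counts the microstates of a macrostate."""
    return k_B * log(W)

# Toy system: N two-state particles, like N coins that are heads or tails.
N = 100
W_ordered = 1                   # "all heads": exactly one arrangement
W_disordered = comb(N, N // 2)  # "half heads": the most probable macrostate

print(boltzmann_entropy(W_ordered))     # 0.0 J/K -- perfect order
print(boltzmann_entropy(W_disordered))  # ~9.2e-22 J/K -- maximal disorder
```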
The work on information theory came later and was pioneered by people like Harry Nyquist, R. V. L. Hartley, and Claude Shannon, who were all involved in telecommunications engineering where loss of information in transmission along telephone wires is an important consideration in designing such systems.
The key insight was that of physicist Rolf Landauer, who showed that the irreversible loss of information by the erasure of a single bit resulted in heat in the amount of at least kT ln 2 being released, where T is the absolute temperature, something that is now called the Landauer Principle.
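The amounts of energy involved are tiny. Here is a quick back-of-the-envelope calculation of the Landauer cost at room temperature (the grocery-list size below is a made-up figure, purely for illustration):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

E_per_bit = k_B * T * log(2)    # minimum heat released per erased bit
print(f"{E_per_bit:.2e} J/bit")  # ~2.87e-21 J

# Invented rough size for a written grocery list: ~2000 characters at 8 bits each.
n_bits = 2000 * 8
print(f"{E_per_bit * n_bits:.2e} J total")  # ~4.6e-17 J, an utterly negligible amount of heat
```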
Shannon warned against the glib use of these terms to arrive at spurious conclusions by conflating scientific terms with their vernacular use (the kind of thing that Deepak Chopra does), saying in 1956:
“I personally believe that many of the concepts of information theory will prove useful in these other fields—and, indeed, some results are already quite promising—but the establishing of such applications is not a trivial matter of translating words to a new domain, but rather the slow tedious process of hypothesis and experimental verification.”
This idea of ‘losing’ information is an intriguing one. Suppose I write a grocery list on a piece of paper. If I then erase it, I have ‘lost’ that information. But how have I actually increased the entropy of the universe by doing so? Sean Carroll has a nice post where he discusses in a non-technical way how the irreversible loss of information (such as by erasing words on a piece of paper) leads to increased entropy.
The issue of when information is irreversibly lost becomes trickier when you include the human mind in the system. If I commit the grocery list to memory before erasing it, is the information irreversibly lost, since I can in principle recreate the list? As far as I know (and I am not certain of this since I haven’t done the detailed analysis), as long as the information remains in my mind it is not irreversibly lost, but as soon as I forget it, it is. In order to calculate the change in entropy due to the act of forgetting, we would have to know the detailed structure of my brain while it contains the list information and again after I have forgotten it. Since the brain with the list is presumably more ordered than the brain without it, forgetting results in a loss of order and an increase of entropy.
Of course, the act of committing the list to memory and physically erasing the words on the paper also involves the expenditure of energy and has entropy implications. Erasing the words on the paper involves the irreversible transfer of muscle energy into heat, and that too results in an increase of entropy that is separate from the entropy changes due to information loss.
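To get a feel for the relative sizes of these two contributions, here is a back-of-the-envelope comparison using the thermodynamic relation ΔS = Q/T. The one-joule figure for the muscular effort of erasing is pure guesswork on my part, and the list size is the same rough assumption as before:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed ambient temperature, K

# Thermodynamic entropy from dumping heat Q into the surroundings: dS = Q / T.
Q_muscle = 1.0            # J -- an invented figure for the effort of rubbing out the list
dS_muscle = Q_muscle / T  # ~3.3e-3 J/K

# Landauer entropy from irreversibly erasing the list's information content.
n_bits = 2000 * 8                # same rough list size as before
dS_info = n_bits * k_B * log(2)  # ~1.5e-19 J/K

print(f"muscle heat: {dS_muscle:.1e} J/K vs information erasure: {dS_info:.1e} J/K")
# The mechanical contribution swamps the informational one by about 16 orders of magnitude.
```

If these rough numbers are even close to right, the entropy generated by the physical act of erasing dwarfs the entropy change attributable to the lost information itself.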