Entropy is a measure of the number of ways a thermodynamic system can be arranged, commonly described as the "disorder" of a system. This concept is fundamental to physics and chemistry, and is used in the Second law of thermodynamics, which states that the entropy of an isolated system (one that exchanges neither matter nor energy with its surroundings) can never decrease. In other words, the "multiplicity", or number of ways the system can be arranged, will never decrease, and the system will naturally tend toward greater disorder. A system's disorder is at its maximum at thermal equilibrium, which is therefore the state that all isolated systems tend toward over time.
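The link between entropy and multiplicity can be made concrete with Boltzmann's relation, S = k·ln(W), where W is the number of microscopic arrangements consistent with a macrostate. The sketch below (illustrative, using a hypothetical toy model of two-state particles rather than any real material) shows that the macrostate with the most arrangements, the most "disordered" one, has the highest entropy:

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(multiplicity: int) -> float:
    """Boltzmann entropy S = k_B * ln(W), in J/K, for multiplicity W."""
    return K_B * log(multiplicity)

# Toy model (an assumption for illustration): N two-state particles,
# like coins that are either heads or tails. The macrostate "n heads"
# has multiplicity W = C(N, n), the number of ways to choose which
# particles are heads.
N = 100
for n in (0, 25, 50):
    W = comb(N, n)
    print(f"n = {n:3d}  multiplicity = {W}  entropy = {entropy(W):.3e} J/K")
```

Multiplicity, and hence entropy, peaks at n = N/2: the evenly mixed macrostate can be arranged in by far the most ways, which is why it is the equilibrium state the system tends toward.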
Entropy can also be described as the amount of a system's thermal energy per unit temperature that is unavailable for doing useful work. Entropy can therefore be regarded as a measure of the effectiveness of a given amount of energy. As shown in Figure 1, this is represented as the "energy quality", which decreases as the entropy of a system increases. Heat has a lower energy quality than mechanical energy or electricity, which helps explain why an amount of heat cannot be converted completely into an equal amount of these higher-quality forms of energy.
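One standard way to quantify this limit (a sketch, using the Carnot bound rather than anything stated in the text above) is the maximum work extractable from heat Q available at temperature T_hot when the surroundings are at T_cold: W_max = Q·(1 − T_cold/T_hot). The remainder must be rejected as lower-quality heat:

```python
# Carnot bound on converting heat into work (an illustrative sketch).
# Of heat Q at temperature T_hot, at most Q * (1 - T_cold / T_hot) can
# become work when the environment is at T_cold; the rest stays as heat.

def max_work_from_heat(q_joules: float, t_hot: float, t_cold: float) -> float:
    """Upper limit on work extractable from heat Q (temperatures in kelvin)."""
    if not (0 < t_cold < t_hot):
        raise ValueError("require 0 < t_cold < t_hot, in kelvin")
    return q_joules * (1.0 - t_cold / t_hot)

# Example with assumed numbers: 1000 J of heat at 600 K, environment at 300 K.
print(max_work_from_heat(1000.0, 600.0, 300.0))  # at most 500.0 J of work
```

Even in this ideal case only half of the heat can become work, while mechanical or electrical energy of the same amount could, in principle, be converted entirely.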
There are many examples of complexity in the world around us: extremely complex systems that appear to violate the Entropy Statement of the Second law of thermodynamics. Since entropy is always increasing, and the Second law associates this increase with disorder and randomness, how is it that there is so much order and complexity around us?
This is a common source of confusion in understanding entropy, and it is important to distinguish between an isolated system and a non-isolated (open) system. Open systems are free to interact with their environment, so energy can be added to or removed from them. Systems that become more ordered as time passes are called self-organizing systems. For this decrease in entropy to be possible, they must take in energy from an outside source, and because the open system's entropy is decreasing, there must be a corresponding increase in entropy outside the system.
For example, this is why water can freeze into complex structures. As it freezes, water forms a highly organized crystal, and its entropy decreases. This is possible because heat energy is transferred from the water to the surrounding air, increasing the entropy of the air. The increase in the air's entropy must be greater than the decrease in the water's, because the entropy of the system as a whole must increase. This is analogous to refrigeration: work must be input to the refrigerator in order to remove heat from its interior, decreasing the entropy inside while increasing it in the surroundings.
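The entropy bookkeeping for the freezing example can be sketched numerically. The numbers below are illustrative assumptions (1 kg of water, air at 263.15 K) combined with the standard latent heat of fusion; the entropy changes use ΔS = Q/T for heat Q transferred at temperature T:

```python
# Entropy bookkeeping for freezing water (illustrative sketch).
# Freezing releases the latent heat of fusion Q = m * L_f at the freezing
# point; the colder surrounding air absorbs that heat at a lower temperature,
# so its entropy gain outweighs the water's entropy loss.

L_F = 334e3        # latent heat of fusion of water, J/kg (approximate)
T_FREEZE = 273.15  # freezing point of water, K
T_AIR = 263.15     # temperature of the surrounding air, K (assumed)

m = 1.0            # kg of water (assumed)
q = m * L_F        # heat released by the water as it freezes, J

dS_water = -q / T_FREEZE  # water loses entropy as it orders into a crystal
dS_air = +q / T_AIR       # air gains entropy by absorbing heat at a lower T

dS_total = dS_water + dS_air
print(f"water: {dS_water:+.1f} J/K, air: {dS_air:+.1f} J/K, total: {dS_total:+.1f} J/K")
```

Because the air is colder than the freezing point, the same heat Q produces a larger Q/T gain in the air than the Q/T loss in the water, so the total entropy still increases, exactly as the Second law requires.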
This concept of decreasing a non-isolated system's entropy can be visualized in Figure 3 below. (The arrangement of the bricks is not a literal representation of the bricks' entropy; rather, it is just meant to demonstrate the idea of "multiplicity". The actual entropy of the bricks has to do with their internal temperature.)
Since the entropy of a system tends toward more disorder over time, and never the reverse, entropy is said to give us "time's arrow". If snapshots of a system at two different times show that one state is more disordered, it can be inferred that this state came later in time.