From Reddit:

Here we have a cardboard box, about the size of a shoebox, filled with coins. Not a ton of them, just enough that they all lie flat on the bottom of the box.

There are three possible states for that box to be in: all coins heads-side-up, all coins heads-side-down, and the other thing where some of the coins are heads-up and some are heads-down.

Imagine the box starts out in the all-coins-up state. We put the lid on and shake it. Without opening the lid, what state do you expect the box to be in?

The answer is obvious: Some of the coins will be heads-up and some will be heads-down. Why? Because the all-heads-up and all-heads-down states correspond to *exactly one arrangement of coins* each, while there are *many* arrangements of coins that correspond to the some-up-some-down state.

The “all-up, all-down, some-of-both” states are what we call *macrostates.* They’re the states we care about, the ones we can easily observe. The individual positions and heads-up-or-down-ness of all the coins make up what we call a *microstate.* It’s a state that is normally invisible to us, hidden from view, either because we just don’t care about that much detail, or because that much detail is practically impossible for us to measure.

Entropy is, in a sense, a measure of how many microstates correspond to a particular macrostate. In this example, the all-heads-up and all-heads-down macrostates each correspond to just a single microstate; that’s a very low-entropy condition. But the some-of-each macrostate corresponds to *many* microstates, making that a high-entropy condition.
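You can make this counting concrete with a few lines of code. Here’s a minimal sketch (the choice of 10 coins is mine, just for illustration): for a box of n coins, the macrostate “k coins heads-up” corresponds to “n choose k” microstates, so the two extreme macrostates get exactly one microstate each and the mixed macrostate gets everything else.

```python
from math import comb

n = 10  # number of coins; an illustrative choice, not from the post

total = 2 ** n                      # every possible heads/tails assignment
all_up = comb(n, n)                 # exactly one way: every coin heads-up
all_down = comb(n, 0)               # exactly one way: every coin heads-down
mixed = total - all_up - all_down   # every other assignment is "some of each"

print(all_up, all_down, mixed)      # → 1 1 1022
```

Even with just 10 coins, the mixed macrostate owns 1022 of the 1024 possible microstates, which is why shaking the box almost always lands you there.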

When we started out, all the coins were heads-side-up, but when we put the lid on the box and shook it, the system moved from a low-entropy state to a high-entropy state.

In nature, systems always tend to move from low-entropy to high-entropy states. In the most abstract sense, this is just because of pure dumb luck: There are more combinations of coins that add up to “some of each” than either “all up” or “all down,” so *pure random chance* dictates that we’re far more likely to go from the all-up state to the some-of-each state than the other way around … and furthermore, that as we continue to shake the box, we’re far more likely to *stay* in the some-of-each state, because the odds against getting all the coins to land heads-side-up are enormous.

In reality, this use of pure-dumb-luck-based statistics to describe complex systems is a mathematical approximation. After all, things like the motions of molecules in a bathtub of water aren’t really random. They’re actually the product of a *huge* number of very simple interactions … but that’s the thing. When you take something that’s fundamentally simple but that becomes vastly complex because of sheer scale, that thing tends to behave very much like a purely random system governed by dumb luck. So it turns out those dumb-luck-based statistical approximations are actually incredibly useful and predictive.

So basically, entropy can be thought of as a way of quantifying just how likely or unlikely it is that a complex system will evolve in a particular way. If the evolution you’re imagining is from a low-entropy state to a high-entropy state, in general that’s pretty likely. If it’s the other way around, from a high-entropy state to a low-entropy state, then in general that’s probably not going to happen. The more complex the system you’re thinking about, the better statistical methods tend to be for predicting the evolution of that system over time.
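For completeness, the microstate-counting view is exactly what Boltzmann’s entropy formula quantifies: S = k_B · ln(W), where W is the number of microstates for a macrostate. Here’s a minimal sketch for the coin box (again assuming 10 coins, with k_B set to 1 to keep the numbers simple):

```python
from math import comb, log

n = 10  # an illustrative number of coins, not from the post

# Boltzmann's formula S = k_B * ln(W), with k_B = 1 for simplicity
for heads in (n, n // 2):
    W = comb(n, heads)   # number of microstates for this macrostate
    S = log(W)           # entropy of the macrostate
    print(f"{heads} heads up: W = {W}, S = {S:.2f}")

# prints:
# 10 heads up: W = 1, S = 0.00
# 5 heads up: W = 252, S = 5.53
```

The all-up macrostate has W = 1 and therefore zero entropy, while the half-and-half macrostate has hundreds of microstates and correspondingly higher entropy, which is the whole shake-the-box argument in one formula.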