Entropy is not chaos
Mediocre physics teachers who are trying to explain the concept of entropy often say that entropy is a sort of measure of chaos, with increases in entropy meaning increased chaos. I found that claim confusing from the first time I heard it; once I got a grip on the concept of entropy, I realized that it’s simply false: entropy has little to do with chaos. Consider, for instance, a bucket into which different-color paints have been slopped, forming a chaotic mess of colors. That mess has less entropy than it will after you mix it to an orderly uniform color, which is the opposite of the way the entropy-means-chaos idea would have it. Likewise, a room filled with a chaotic mixture of air at different temperatures has less entropy than it will after the temperatures all equilibrate to the same value. Or take a situation in which you have two cylinders, one filled with air and the other evacuated, connected by a pipe with a valve. Once you open the valve, half the air will rush from the full cylinder to the empty one; this will increase the entropy. But which situation is the more chaotic? Relative to the everyday meaning of chaos, it’d be hard to say.
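The two-cylinder example can even be put in numbers. For an ideal gas expanding into a vacuum at constant temperature, the entropy increase is nR ln(V2/V1); the sketch below assumes one mole and a doubling of volume (the essay itself gives no figures, so these values are purely illustrative):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def free_expansion_entropy(n_moles, v_ratio):
    """Entropy change (J/K) for isothermal free expansion of an ideal gas.

    dS = n * R * ln(V2/V1); no heat flows and no work is done,
    yet the entropy still goes up.
    """
    return n_moles * R * math.log(v_ratio)

# One mole doubling its volume when the valve is opened:
dS = free_expansion_entropy(1.0, 2.0)
print(f"dS = {dS:.2f} J/K")  # about 5.76 J/K
```

The point matches the text: nothing about either end state looks more or less "chaotic", yet the number comes out unambiguously positive.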
As for what entropy is, if it’s not chaos — well, as with other things in physics, a definition could be given simply enough, but wouldn’t mean much to anyone who didn’t already know how to put it in context. (“The logarithm of what?”) The concept takes a lot of understanding; I didn’t really get a grip on it until I spent a lot of quality time with Enrico Fermi’s book Thermodynamics. That book explains it probably as simply as it can be explained, but it’s still not easy.
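For the curious, the definition the parenthetical alludes to is Boltzmann’s: S = k ln Ω, the logarithm of the number of microstates consistent with a macrostate. The toy system below (my own illustration, not from the text) is N two-state particles, with the macrostate given by how many are "up":

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_particles, n_up):
    """S = k_B * ln(Omega) for a toy system of two-state particles.

    Omega counts the microstates (which particular particles are up)
    consistent with the macrostate (how many are up).
    """
    omega = math.comb(n_particles, n_up)
    return K_B * math.log(omega)

# The evenly split macrostate has by far the most microstates, hence
# the highest entropy -- which is why systems drift toward it.
print(boltzmann_entropy(100, 50) > boltzmann_entropy(100, 10))  # True
```

Which, as the essay says, means little on its own; the formula only becomes useful once one understands what counting microstates has to do with heat and temperature.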
It’s a worthwhile concept, though. One can get the impression from casual physics talk that entropy is only good for making gloomy statements about the heat death of the universe, and how everything is doomed to run down and deteriorate. (Or, in the above case, how it’s easier to mix paints than to unmix them.) There is that aspect of it, but entropy is also a practical tool. Using it, one can, for instance, derive the Clausius-Clapeyron equation, which relates the vapor pressure of a liquid to its heat of vaporization. Or one can use it to calculate the exhaust velocity of a rocket engine, under the assumption of shifting equilibrium.
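As a sketch of that practicality: the integrated Clausius-Clapeyron relation, ln(P2/P1) = -(ΔHvap/R)(1/T2 - 1/T1), lets one estimate a vapor pressure from a single known point. The numbers below are standard handbook values for water, not anything from the text, and the usual simplifications (ideal vapor, constant heat of vaporization) apply:

```python
import math

R = 8.314         # gas constant, J/(mol*K)
DH_VAP = 40700.0  # heat of vaporization of water near boiling, J/mol

def vapor_pressure(p1, t1, t2, dh_vap=DH_VAP):
    """Estimate vapor pressure at T2 from a known point (p1 at T1),
    via the integrated Clausius-Clapeyron equation."""
    return p1 * math.exp(-(dh_vap / R) * (1.0 / t2 - 1.0 / t1))

# Anchor at water's normal boiling point (101325 Pa at 373.15 K) and
# estimate its vapor pressure at room temperature (298.15 K):
p = vapor_pressure(101325.0, 373.15, 298.15)
print(f"{p:.0f} Pa")  # roughly 3.7 kPa; measured is about 3.2 kPa
```

The overestimate comes from treating the heat of vaporization as constant over a 75 K span; even so, a single equilibrium point plus one thermodynamic quantity pins the answer to the right ballpark.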
While on the subject of chaos, it’s also worth mentioning that the “chaos” defined in the branch of mathematics known as “chaos theory” also isn’t chaos in the usual sense of the English language. In chaos theory, water dripping from a faucet is a “chaotic process”. That’s because the exact size of each drip and the exact interval between drips are hard to predict, even though to the eye it looks like a steady drip, drip, drip, and though the average person would say you were nuts to call it chaotic. This has rendered scientific papers a bit more difficult to read, since it can be hard to tell whether “chaotic” is meant in the ordinary sense or in the chaos-theory sense. Unlike in the case of entropy, I have difficulty labeling this technical concept of “chaotic” worthwhile, since I’ve never encountered anyone making any practical use of it, and since I don’t know why labeling something “chaotic” would help with anything: you couldn’t predict it precisely before, and you still can’t predict it precisely.
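What the chaos theorists mean by “chaotic” is sensitive dependence on initial conditions. The faucet isn’t easy to simulate, but the standard textbook stand-in (my choice of example, not the essay’s) is the logistic map at r = 4, where two trajectories starting a hair apart end up completely uncorrelated:

```python
def logistic_trajectory(x0, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), a standard chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2, 50)
b = logistic_trajectory(0.2 + 1e-10, 50)  # differs in the 10th decimal place

# The tiny initial gap grows roughly exponentially; within a few dozen
# steps the two trajectories bear no resemblance to each other.
print(max(abs(x - y) for x, y in zip(a, b)))
```

Which illustrates the essay’s complaint nicely: the label tells you the process is unpredictable in detail, but it doesn’t make it any more predictable.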