Entropy

Entropy is often interpreted as a measure of the disorder or randomness of a system. It is more precise to say that it is a measure of the number of microstates available to a system; quantitatively, Boltzmann's formula S = k_B ln W relates the entropy S to the number of available microstates W. The second law of thermodynamics states that every spontaneous process in an isolated system moves towards greater entropy, that is, towards a state with more available microstates.
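As a toy illustration of counting microstates (a sketch, not part of the original text), the following Python snippet counts the number of ways W of arranging N identical particles with n of them in the left half of a box, and converts W to an entropy using Boltzmann's formula. The particle number and the function names are illustrative assumptions.

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant in J/K

def microstates(n_particles: int, n_left: int) -> int:
    """Number of microstates with n_left of n_particles in the left half of a box."""
    return comb(n_particles, n_left)

def boltzmann_entropy(w: int) -> float:
    """Boltzmann's formula S = k_B ln W for W available microstates."""
    return K_B * log(w)

n = 100
for n_left in (0, 25, 50):
    w = microstates(n, n_left)
    print(f"{n_left:3d} of {n} particles on the left: "
          f"W = {w:.3e}, S = {boltzmann_entropy(w):.3e} J/K")
```

The even 50/50 split maximizes W, and therefore the entropy, which is why a gas released in one half of a box spontaneously spreads to fill both halves: the mixed state simply has far more microstates.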

You can read how the concept of entropy can be explained in terms of probability here.

You can read some examples of processes in which the entropy increases here.