Statistical entropy is probability theory applied to the principle of entropy, showing entropy to be a measure of the disorder in a system. It explains, mainly through the probabilities of the positions of molecules, the tendency of entropy to increase seen in the second law of thermodynamics. This tendency arises because configurations with high entropy are more probable than configurations with low entropy.
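The claim that high-entropy configurations are more probable can be illustrated with a standard toy model (my own example, not from the text): N molecules, each independently on the left or right half of a box. Counting microstates with binomial coefficients shows that the balanced, "disordered" arrangement vastly outnumbers the ordered one.

```python
from math import comb, log

# Toy model (an illustrative assumption): N gas molecules, each
# independently in the left or right half of a box. A macrostate is
# the count n of molecules on the left; its multiplicity
# W(n) = C(N, n) is the number of microstates that realize it.
N = 100

def multiplicity(n):
    """Number of microstates with exactly n molecules on the left."""
    return comb(N, n)

def entropy(n):
    """Boltzmann entropy S = ln W, in units of Boltzmann's constant k."""
    return log(multiplicity(n))

# The fully ordered macrostate (all molecules on one side) is realized
# by a single microstate, while the balanced macrostate is realized by
# about 10^29 of them, so it is overwhelmingly more likely to be
# observed -- which is why entropy tends to increase.
print(multiplicity(0))   # 1 microstate: minimal entropy
print(multiplicity(50))  # roughly 1e29 microstates: maximal entropy
print(entropy(50))       # entropy of the balanced macrostate, in units of k
```

With only 100 molecules the balanced state is already about 10^29 times more probable than the ordered one; for realistic molecule counts the ratio is so large that spontaneous ordering is never observed.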
The biggest problem with entropy is its tendency to increase, so understanding how to decrease it is very important. The most common answer is to add energy to the system, as if that were all that is needed. But this answer is overly simplistic: when energy is applied to a system, its effect on the system's entropy depends on how the energy is applied. Consider the difference between construction work and a bomb. Construction work decreases the entropy of a building under construction. On the other hand, a bomb with the same amount of energy, set off on the same site, will inevitably increase the site's entropy.
This shows that the manner in which energy is applied to a system determines how that energy changes the system's entropy. What is needed is a general principle that describes this difference, and statistical entropy provides it: it shows exactly how and when entropy can be decreased, that is, how to produce order from disorder.