Entropy

In statistical mechanics and thermodynamics, entropy (from the ancient Greek ἐν en, "within", and τροπή tropē, "transformation") is a quantity interpreted as a measure of the disorder present in a physical system. It is generally represented by the letter S and, in the International System of Units, is measured in joules per kelvin (J/K).

Thermodynamics was the first field in which entropy was introduced, during the 19th century, starting from studies of the relationship between heat and work by William Rankine and Rudolf Clausius. Entropy is a state function of a system in thermodynamic equilibrium which, by quantifying the unavailability of a system's energy to produce work, is introduced together with the second law of thermodynamics. In a non-rigorous but illustrative sense, one can say that when a system passes from an ordered to a disordered equilibrium state, its entropy increases; this indicates the direction in which a system spontaneously evolves, and hence the arrow of time (as Arthur Eddington had already noted).
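
As a brief illustration (the formulas are not in the original text, but are the standard expressions of these ideas), the Clausius definition of the entropy change between two equilibrium states A and B, together with the inequality expressing the second law, can be written as follows, where δQ_rev is the heat exchanged along a reversible path and T is the absolute temperature:

\[
\Delta S = \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T},
\qquad
\mathrm{d}S \ge \frac{\delta Q}{T}
\]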

It should be noted, however, that there is a class of phenomena, known as nonlinear phenomena (for example, chaotic phenomena), for which the laws of thermodynamics (and hence entropy) must be thoroughly revised and no longer hold in full generality.

The molecular approach of statistical mechanics generalizes entropy to non-equilibrium states by relating it more closely to the concept of order, specifically to the number of different possible arrangements of the molecules, and thus to the different probabilities of the macroscopic states in which a system can be found.
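
As an illustrative aside (these relations are standard in statistical mechanics rather than quoted from the original text), Boltzmann's relation and its generalization due to Gibbs express this idea quantitatively, where W is the number of microstates compatible with a given macrostate, p_i is the probability of microstate i, and k_B is Boltzmann's constant:

\[
S = k_{\mathrm{B}} \ln W,
\qquad
S = -k_{\mathrm{B}} \sum_{i} p_i \ln p_i
\]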

The concept of entropy has been extended to non-physical fields, such as the social sciences, signal theory, and information theory, and has gained wide popularity.
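
In information theory, for example, the analogous quantity is the Shannon entropy of a discrete random variable X with probability distribution p(x); the standard formula is given here for context rather than taken from the original text:

\[
H(X) = -\sum_{x} p(x) \log_2 p(x)
\]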
