explain entropy... properly
Answers
Answered by
Entropy is also a measure of the number of possible arrangements the atoms in a system can have. In this sense, entropy is a measure of uncertainty or randomness. ... Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the spreading of energy until it is evenly spread.
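The "number of possible arrangements" view corresponds to Boltzmann's relation S = k·ln(W), where W counts the equally likely microstates. As a minimal illustrative sketch (not part of the original answer), the Python below counts arrangements of particles between two halves of a box and shows that the most "spread out" split has the highest entropy:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in J/K

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k_B * ln(W) for W equally likely arrangements."""
    return K_B * math.log(microstates)

# Example: 10 particles distributed between the left and right halves
# of a box. W = C(10, left) counts the arrangements for each split.
# The even 5-5 split has the most arrangements, hence the most entropy,
# which is why energy (and matter) tends to spread evenly.
for left in (0, 2, 5):
    w = math.comb(10, left)
    print(f"{left} particles on the left: W = {w}, S = {boltzmann_entropy(w):.3e} J/K")
```

A single arrangement (all particles on one side, W = 1) gives S = 0; entropy grows as the particles spread out.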
madhu83:
thanks a lot
Answered by
1. A thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.
"the second law of thermodynamics says that entropy always increases with time"
2.
lack of order or predictability; gradual decline into disorder.
"a marketplace where entropy reigns supreme"
3.
(in information theory) a logarithmic measure of the rate of transfer of information in a particular message or language.
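The information-theoretic definition (sense 3) is usually written as Shannon entropy, H = −Σ pᵢ·log₂(pᵢ), measured in bits per symbol. A small self-contained sketch, added here for illustration, computes it from the symbol frequencies of a message:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy H = -sum(p_i * log2(p_i)) in bits per symbol,
    where p_i is the relative frequency of each symbol in the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly predictable message carries no information per symbol;
# a uniform mix of symbols maximizes the uncertainty.
print(shannon_entropy("aaaa"))  # 0.0 bits: fully predictable
print(shannon_entropy("abab"))  # 1.0 bit: two equally likely symbols
```

This mirrors the thermodynamic picture: more possible "arrangements" of symbols means higher entropy, i.e. more uncertainty per symbol received.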