Explain entropy and the entropy formula
Answers
Information entropy is the average rate at which information is produced by a stochastic source of data.
The measure of information entropy associated with each possible data value is the negative logarithm of the probability mass function for the value:
S = -\sum_{i} P_{i} \log P_{i}. [1]
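As a quick numerical check of the sum above, here is a minimal Python sketch (the function name `shannon_entropy` is my own; I take the logarithm base 2, so the result is in bits):

```python
import math

def shannon_entropy(probs):
    """Entropy S = -sum_i P_i * log2(P_i), in bits.
    Terms with P_i == 0 contribute nothing (p * log p -> 0 as p -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes:
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
# A biased coin is more predictable, so its entropy is lower:
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits
```

Using the natural logarithm instead of `log2` gives the same quantity in nats rather than bits; only the unit changes.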
When the data source produces a low-probability value (i.e., when a low-probability event occurs), the event carries more "information" ("surprisal") than when the source data produces a high-probability value. The amount of information conveyed by each event defined in this way becomes a random variable whose expected value is the information entropy. Generally, entropy refers to disorder or uncertainty, and the definition of entropy used in information theory is directly analogous to the definition used in statistical thermodynamics. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".[2]
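The "surprisal" of a single event mentioned above is just the negative logarithm of that event's probability, so rarer events carry more bits of information. A small illustration (the function name `surprisal` is mine, again in base 2):

```python
import math

def surprisal(p):
    """Information content -log2(p) of one event with probability p, in bits."""
    return -math.log2(p)

# A 50/50 event carries 1 bit; a 1-in-100 event carries far more:
print(surprisal(0.5))    # 1.0 bit
print(surprisal(0.01))   # ~6.64 bits
```

The entropy S is then the expected value of this surprisal over the whole distribution, which is exactly the weighted sum in the formula above.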
Aman
In statistical mechanics, entropy is an extensive property of a thermodynamic system. … In this viewpoint, thermodynamic properties are defined in terms of the statistics of the … The fact that entropy is a function of state is one reason it is useful.