Asked by pratikshamane1612, 1 month ago

What is entropy? Show that entropy is maximum when all the symbols are equi-probable. Assume M = 2.

Answers

Answered by soniatiwari214

Answer:

In information theory, entropy is the average amount of information (equivalently, uncertainty) conveyed per symbol of a source. (Its thermodynamic analogue is the amount of thermal energy per unit of temperature in a system that cannot be used to carry out useful work.)

Explanation:

  • Entropy can be defined as a measure of the randomness (uncertainty) of a process or variable. For a random variable X taking values in a set A, H(X) = −∑_{xᵢ ∈ A} p(xᵢ) log p(xᵢ). This definition appears in Ch. 2 of MacKay's book Information Theory, Inference, and Learning Algorithms.
  • Entropy is maximized when p is uniform. For M = 2, with symbol probabilities p and 1 − p and logs taken to base 2, H(p) = −p log₂ p − (1 − p) log₂(1 − p). Setting dH/dp = log₂((1 − p)/p) = 0 gives p = 1/2, and the second derivative is negative there, so the maximum is H(1/2) = log₂ 2 = 1 bit, attained exactly when both symbols are equi-probable (see the numerical sketch after this list).
  • Intuitively, if all data points in set A are chosen with equal probability 1/m (m being the cardinality of A), the randomness, and hence the entropy, is at its greatest.
  • Conversely, if we know that some points in set A occur more frequently than others (as in a normal distribution, where data points concentrate near the mean within a small region of one standard deviation), the randomness, and hence the entropy, decreases.
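
A minimal numerical sketch of the M = 2 case (plain Python, no external libraries; the helper name binary_entropy is just an illustrative choice, not from the original answer). It evaluates H(p) on a grid and confirms the maximum sits at p = 0.5 with H = 1 bit:

```python
import math

def binary_entropy(p):
    """Entropy H(p) in bits of a binary source with symbol
    probabilities p and 1 - p; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Evaluate H(p) on a grid over [0, 1] and locate its maximum.
grid = [i / 1000 for i in range(1001)]
best_p = max(grid, key=binary_entropy)

print(f"H(0.5) = {binary_entropy(0.5):.4f} bits")  # 1.0000
print(f"H(0.9) = {binary_entropy(0.9):.4f} bits")  # 0.4690
print(f"argmax = {best_p}")                        # 0.5
```

The skewed example H(0.9) ≈ 0.469 bits illustrates the last bullet: once one symbol is much more likely than the other, the uncertainty per symbol drops well below the 1-bit maximum.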
