In information theory, the entropy of a variable is the amount of information contained in the variable. One way to understand the amount of information is to tie it to how difficult or easy it is to guess the variable's value: the easier the value is to guess, the less "surprise" there is in the variable, and so the less information it carries.

The Rényi entropy of order q is defined for q ≥ 0, q ≠ 1 by

S_q = (1/(1 − q)) log(Σ_i p_i^q),

and in the limit q → 1 it reduces to the Shannon entropy. As the order q increases, the entropy is non-increasing, since larger q weights the most probable outcomes more heavily. Why are we concerned with higher orders? What is the physical significance of the order when calculating the entropy?
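For illustration, here is a minimal Python sketch (the function name renyi_entropy and the choice of numpy are my own, not from any particular source) that evaluates the formula above for a fixed distribution at several orders; running it shows the value shrinking as q grows and matching the Shannon entropy near q = 1.

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q (q >= 0, q != 1) of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability outcomes contribute nothing
    if np.isclose(q, 1.0):
        # q -> 1 limit: Shannon entropy
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

# A skewed distribution: the value is easy to guess, so entropy is modest.
p = [0.7, 0.2, 0.05, 0.05]
for q in [0.5, 0.99, 2.0, 10.0]:
    print(f"q = {q:5.2f}: S_q = {renyi_entropy(p, q):.4f}")
# Output decreases as q grows (about 1.10, 0.87, 0.63, 0.40 in nats),
# because higher orders weight the most probable outcome (0.7) more heavily.
```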