Explain the relation between entropy and information.
Answers
Answered by
Information entropy is a concept from information theory. It measures how much information an event carries. In general, the more certain or deterministic an event is, the less information it contains. Put another way, the amount of information an event carries grows with its uncertainty or surprise.
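As a minimal Python sketch (not part of the original answer), the standard definition of the information content of a single event is h(x) = log2(1/p(x)) bits, so a certain event carries no information and a rare event carries a lot:

```python
from math import log2

def information(p):
    """Information content in bits of an event with probability p."""
    return log2(1 / p)

print(information(1.0))    # 0 bits: a certain event carries no surprise
print(information(0.5))    # 1 bit: a fair coin flip
print(information(0.01))   # ~6.64 bits: a rare event is highly informative
```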
Answered by
Hi Mate ✌️
Information provides a way to quantify the amount of surprise in an event, measured in bits. Entropy provides a measure of the average amount of information needed to represent an event drawn from a probability distribution for a random variable.
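To make the "average amount of information" concrete, here is a small Python sketch (an illustration, not from the original answer) of the Shannon entropy H(X) = sum over x of p(x) * log2(1/p(x)):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: fair coin, maximum uncertainty
print(entropy([0.9, 0.1]))   # ~0.47 bits: biased coin, more predictable
print(entropy([1.0]))        # 0 bits: deterministic outcome
```

The fair coin has the highest entropy because every outcome is equally surprising, while the deterministic distribution needs no bits at all.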
-StunningBabe27 ❤️