Explain the concept of entropy (information theory) with respect to digital communications.
Answers
Answer:
Entropy. When we consider how surprising or uncertain the occurrence of an event is, we are really asking how much information, on average, the source of that event provides. Entropy is defined as a measure of the average information content per source symbol; in digital communications it tells us the minimum average number of bits needed to represent each symbol the source emits.
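As a rough sketch of the idea (assuming a discrete memoryless source, the usual model in digital communications): if the source emits symbols x1, x2, ..., xn with probabilities p1, p2, ..., pn, its entropy is

H(X) = - Σ p_i · log₂(p_i)   (bits per symbol, summed over all source symbols)

For example, a binary source with equally likely symbols (p0 = p1 = 0.5) has H(X) = 1 bit per symbol, the maximum possible for a binary source, while a biased source with p0 = 0.9 and p1 = 0.1 has H(X) ≈ 0.47 bits per symbol. The more predictable source carries less information per symbol, so on average it can be represented with fewer bits.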
Explanation:
Entropy refers to disorder or uncertainty, and the definition of entropy used in information theory is directly analogous to the definition used in statistical thermodynamics. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication."