Is there a relation between complexity of a system and entropy?
Jiří Kroc, Complex Systems - Computational Biology - Quora Writer
Answered Aug 30, 2017 · Author has 648 answers and 678.9k answer views
There is one aspect of the entropy of complex systems that has not been mentioned in the previous answers. Let us try to analyse it.
From a mathematical point of view, entropy can be understood as a measure of the state of the complex system under observation. That seems a trivial statement at first sight, but a deeper analysis reveals its power. Entropy is actually a very simple measure of complexity; despite this simplicity, it is still capable of clearly distinguishing different states of the system under observation.

Very briefly, entropy is obtained in the following way. The current state of the system is assigned to one of a set of bins (usually a small, finite number of them) according to its value, and the counter belonging to that bin is increased by one. After a certain number of measurements, each bin contains some number of counts; this gives a histogram. The histogram is converted into probabilities, and those probabilities are inserted into the entropy formula. That's it. :-)

To sum up: we trace the state of the system for some period of time, build a histogram, and evaluate the entropy.

We know that if the system is fixed, its response stays constant. A constant value fills only one bin and all the others get zero counts. Because the entropy formula contains a logarithm, and the probability of that single event (a flat line) is equal to one, the entropy is equal to zero (log(1) = 0). If the system's response is uniformly random, the entropy reaches the highest possible value. All other system responses give an entropy lying between these two extremes (zero and the maximum).

There are many types of entropy: Shannon (information), approximate, Kolmogorov-Sinai, sample, multiscale, and so on. Each of them gives better results for some systems and fails for others.

It is good to keep in mind that entropy is just a measure that dramatically simplifies the responses of the system under observation. Entropy has proven itself a powerful measure enabling classification and even prediction of biosignals; see my other post about prediction of arrhythmias. It is important to keep the advantages and disadvantages of this method in mind during any research.
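To make the bin-count-and-histogram procedure concrete, here is a minimal sketch in Python. The original answer does not fix a particular entropy formula, logarithm base, or number of bins, so this uses Shannon entropy with a base-2 logarithm and 10 bins purely as illustrative assumptions; the function and variable names are mine, not the author's.

```python
import numpy as np

def shannon_entropy(signal, n_bins=10):
    """Bin a 1-D signal into a histogram, convert the counts to
    probabilities, and evaluate the Shannon entropy (in bits)."""
    counts, _ = np.histogram(signal, bins=n_bins)
    probs = counts / counts.sum()          # histogram -> probabilities
    probs = probs[probs > 0]               # drop empty bins (log(0) is undefined)
    h = -np.sum(probs * np.log2(probs))    # Shannon formula: -sum_i p_i * log2(p_i)
    return float(h) + 0.0                  # +0.0 normalises the -0.0 of a one-bin histogram

# Constant (fixed) response: every sample lands in one bin -> entropy = 0, since log(1) = 0
flat = np.ones(1000)
print(shannon_entropy(flat))               # 0.0

# Uniformly random response: counts spread over all bins -> entropy near the maximum
rng = np.random.default_rng(0)
uniform = rng.uniform(0.0, 1.0, 1000)
print(shannon_entropy(uniform))            # close to log2(10) ~ 3.32 bits
```

The two demo signals correspond to the two extreme cases described above: the flat line gives zero entropy, and the uniformly random signal approaches the maximum possible value for the chosen number of bins.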