# Joint entropy

Measure of information in probability and information theory / From Wikipedia, the free encyclopedia
In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables.[2]
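For two discrete random variables, the joint entropy is defined as $H(X,Y) = -\sum_{x,y} P(x,y) \log_2 P(x,y)$, with terms where $P(x,y)=0$ taken to contribute zero. As a minimal sketch (the example distribution is illustrative, not from the article), this can be computed directly from a joint probability table:

```python
import numpy as np

def joint_entropy(p):
    """H(X, Y) = -sum over (x, y) of P(x, y) * log2 P(x, y).

    `p` is an array of joint probabilities summing to 1; zero-probability
    cells are skipped, since lim p->0 of p*log2(p) is 0.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Illustrative joint distribution of two fair, independent bits:
# each of the four (x, y) outcomes has probability 1/4.
p_xy = [[0.25, 0.25],
        [0.25, 0.25]]
print(joint_entropy(p_xy))  # 2.0 bits: uncertainty of two independent fair coin flips
```

For independent variables the joint entropy is the sum of the individual entropies (here 1 bit + 1 bit); any dependence between the variables can only lower it.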
![Venn diagram showing the relationships between entropy, joint entropy, mutual information and conditional entropy for two variables](https://upload.wikimedia.org/wikipedia/commons/thumb/d/d4/Entropy-mutual-information-relative-entropy-relation-diagram.svg/640px-Entropy-mutual-information-relative-entropy-relation-diagram.svg.png)