Binary entropy function
Entropy of a process with only two possible values
In information theory, the binary entropy function, denoted $\operatorname{H}(p)$ or $\operatorname{H}_\text{b}(p)$, is defined as the entropy of a Bernoulli process (i.i.d. binary variable) with probability $p$ of one of two values, and is given by the formula:

$$\operatorname{H}_\text{b}(p) = -p \log p - (1 - p) \log (1 - p).$$
![Plot of the binary entropy function $\operatorname{H}_\text{b}(p)$ versus the probability $p$](http://upload.wikimedia.org/wikipedia/commons/thumb/2/22/Binary_entropy_plot.svg/200px-Binary_entropy_plot.svg.png)
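As a concrete illustration of the definition, here is a minimal Python sketch; the function name `binary_entropy` and the `base` parameter are our own choices for this example, not part of any standard library:

```python
import math

def binary_entropy(p: float, base: float = 2.0) -> float:
    """Entropy of a Bernoulli(p) variable: -p*log(p) - (1-p)*log(1-p).

    base=2 gives shannons (bits); base=math.e gives nats.
    """
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be a probability in [0, 1]")
    # At p = 0 or p = 1 the formula is read as 0, using the limit
    # convention 0 log 0 := 0 discussed below.
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * math.log(p, base) - (1 - p) * math.log(1 - p, base)
```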
The base of the logarithm corresponds to the choice of units of information; base $e$ corresponds to nats and is mathematically convenient, while base 2 (binary logarithm) corresponds to shannons and is conventional (as shown in the graph); explicitly:

$$\operatorname{H}_\text{b}(p) = -p \log_2 p - (1 - p) \log_2 (1 - p).$$
Note that the values at 0 and 1 are given by the limit $0 \log 0 := \lim_{x \to 0^+} x \log x = 0$ (by L'Hôpital's rule); and that "binary" refers to two possible values for the variable, not the units of information.
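One way to see this limit is to rewrite the product as a quotient and apply L'Hôpital's rule:

```latex
\lim_{x \to 0^+} x \log x
  = \lim_{x \to 0^+} \frac{\log x}{1/x}
  = \lim_{x \to 0^+} \frac{1/x}{-1/x^2}
  = \lim_{x \to 0^+} (-x)
  = 0.
```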
When $p = \tfrac{1}{2}$, the binary entropy function attains its maximum value, 1 shannon (1 binary unit of information); this is the case of an unbiased coin flip. When $p = 0$ or $p = 1$, the binary entropy is 0 (in any units), corresponding to no information, since there is no uncertainty in the variable.
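Using the sketch above (again, the `binary_entropy` helper is our own), these midpoint and endpoint values are easy to check:

```python
print(binary_entropy(0.5))              # 1.0 shannon: unbiased coin flip
print(binary_entropy(0.0))              # 0.0: no uncertainty
print(binary_entropy(1.0))              # 0.0: no uncertainty
print(binary_entropy(0.5, base=math.e)) # ~0.693 nats (= ln 2)
```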