
2018-09-19

Calculating the entropy of the Poisson distribution

The entropy of the Poisson distribution has no closed form. In nats, it is:

$$
H(\lambda)
= -\sum_{k=0}^{\infty} \Pr(k)\log \Pr(k)
= -\sum_{k=0}^{\infty} \frac{\lambda^k e^{-\lambda}}{k!}\left[k\log(\lambda)-\lambda-\log(k!)\right]
= \lambda\left[1-\log(\lambda)\right] + e^{-\lambda}\sum_{k=0}^{\infty} \frac{\lambda^k \log(k!)}{k!}
$$

These notes evaluate some numerical approximations to the entropy.
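As a baseline, the infinite sum can be evaluated numerically to high accuracy by truncating it far out in the tail. A minimal sketch in Python, assuming SciPy is available for the Poisson pmf (the function name `poisson_entropy_direct` and the cutoff `kmax` are my own choices):

```python
import numpy as np
from scipy.stats import poisson

def poisson_entropy_direct(lam, kmax=1000):
    """Entropy (in nats) of a Poisson distribution via direct summation.

    Sums -Pr(k) log Pr(k) for k = 0..kmax-1; for moderate lam the
    tail beyond kmax is negligible.
    """
    k = np.arange(kmax)
    p = poisson.pmf(k, lam)
    p = p[p > 0]          # drop zero-probability terms: they contribute 0
    return -np.sum(p * np.log(p))
```

This serves as the reference against which the approximations below can be checked.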

For high firing rates, the Poisson distribution becomes approximately Gaussian with μ=σ²=λ. Using the formula for the entropy of the Gaussian, this implies

$$
H(\lambda) = \tfrac{1}{2}\log(2\pi e \lambda) + O\!\left(\tfrac{1}{\lambda}\right)
$$

Compare this to the first few terms of the series expansion, which is increasingly accurate for large λ, but diverges for small λ:

$$
H(\lambda) = \tfrac{1}{2}\log(2\pi e \lambda)
- \frac{1}{12\lambda}
- \frac{1}{24\lambda^2}
- \frac{19}{360\lambda^3}
+ O\!\left(\frac{1}{\lambda^4}\right)
$$
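The series is a one-liner to implement; a sketch (the function name is my own):

```python
import numpy as np

def poisson_entropy_series(lam):
    """Third-order asymptotic approximation to the Poisson entropy (nats).

    Accurate for large lam; diverges as lam -> 0 because of the
    negative powers of lam.
    """
    return (0.5 * np.log(2 * np.pi * np.e * lam)
            - 1 / (12 * lam)
            - 1 / (24 * lam**2)
            - 19 / (360 * lam**3))
```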

Another way to calculate the entropy for low rates is to evaluate a finite number of terms of the series directly. For low rates, the probability of k > λ+4√λ events is negligible, so we only need to sum a small number of terms.

$$
H(\lambda) \approx -\sum_{k=0}^{\lambda+4\sqrt{\lambda}} \frac{\lambda^k e^{-\lambda}}{k!}\left[k\log(\lambda)-\lambda-\log(k!)\right]
$$

Numerically, you can use the above calculation for λ<1.78, taking terms out to k < λ+4√λ. For λ≥1.78, the third-order approximation for large λ becomes accurate.

$$
H(\lambda) \approx \begin{cases}
-\sum_{k=0}^{\lambda+4\sqrt{\lambda}} \frac{\lambda^k e^{-\lambda}}{k!}\left[k\log(\lambda)-\lambda-\log(k!)\right] & \lambda < 1.78 \\[1ex]
\tfrac{1}{2}\log(2\pi e \lambda) - \frac{1}{12\lambda} - \frac{1}{24\lambda^2} - \frac{19}{360\lambda^3} & \lambda \ge 1.78
\end{cases}
$$
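The piecewise rule above can be sketched in Python as follows. This is my own translation, not a tested reference implementation; `gammaln(k+1)` computes log(k!) stably, and the truncation point follows the λ+4√λ rule from the text:

```python
import numpy as np
from scipy.special import gammaln

def poisson_entropy(lam):
    """Hybrid approximation to the Poisson entropy (nats) for lam > 0.

    Low rates: truncated direct sum over k = 0..lam+4*sqrt(lam).
    High rates: third-order asymptotic series.
    """
    if lam < 1.78:
        k = np.arange(int(np.ceil(lam + 4 * np.sqrt(lam))) + 1)
        logp = k * np.log(lam) - lam - gammaln(k + 1)   # log Pr(k)
        return -np.sum(np.exp(logp) * logp)
    return (0.5 * np.log(2 * np.pi * np.e * lam)
            - 1 / (12 * lam)
            - 1 / (24 * lam**2)
            - 19 / (360 * lam**3))
```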

The relative error in this approximation is $|\hat{H}-H|/|H| < 0.0016$.
