The entropy of a Poisson distribution has no closed form. In nats, it is:
\[
H(\lambda) = -\sum_{k=0}^{\infty} \Pr(k)\log \Pr(k)
= -\sum_{k=0}^{\infty} \frac{\lambda^k e^{-\lambda}}{k!}\left[k\log(\lambda) - \lambda - \log(k!)\right]
= \lambda\left[1-\log(\lambda)\right] + e^{-\lambda}\sum_{k=0}^{\infty} \frac{\lambda^k \log(k!)}{k!}
\]
These notes evaluate some numerical approximations to the entropy.
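As a reference, the entropy can be evaluated by brute force, summing \(-\Pr(k)\log\Pr(k)\) until the terms become negligible. Here is a minimal sketch in Python (the function name and the numpy/scipy dependency are my own choices, not from the original notes):

```python
import numpy as np
from scipy.special import gammaln  # log(k!) = gammaln(k + 1)

def poisson_entropy_exact(lam, tol=1e-15):
    """Entropy (in nats) of a Poisson(lam) distribution by direct summation."""
    H, k = 0.0, 0
    while True:
        log_p = k * np.log(lam) - lam - gammaln(k + 1)  # log Pr(k)
        term = -np.exp(log_p) * log_p                   # -Pr(k) log Pr(k)
        H += term
        # stop once we are past the mean and the terms have become negligible
        if k > lam and abs(term) < tol:
            return H
        k += 1
```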
For high firing rates, the Poisson distribution becomes approximately Gaussian with \(\mu = \sigma^2 = \lambda\). Using the formula for the entropy of a Gaussian, this implies
\[
H(\lambda) = \tfrac{1}{2}\log(2\pi e\lambda) + O\!\left(\tfrac{1}{\lambda}\right)
\]
Compare this to the first few terms of the asymptotic series, which is increasingly accurate for large \(\lambda\) but diverges for small \(\lambda\):
\[
H(\lambda) = \tfrac{1}{2}\log(2\pi e\lambda) - \frac{1}{12\lambda} - \frac{1}{24\lambda^2} - \frac{19}{360\lambda^3} + O\!\left(\frac{1}{\lambda^4}\right)
\]
Another way to calculate the entropy at low rates is to keep only a finite number of terms in the exact series. For low rates, the probability of more than \(\lambda + 4\sqrt{\lambda}\) events is negligible, so we only need to sum a small number of terms.
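A direct transcription of this expansion, as a sketch (the function name is mine; it reuses the imports above):

```python
def poisson_entropy_asymptotic(lam):
    """Gaussian entropy plus the 1/lam, 1/lam^2, 1/lam^3 corrections; for large lam."""
    return (0.5 * np.log(2 * np.pi * np.e * lam)
            - 1.0 / (12 * lam)
            - 1.0 / (24 * lam**2)
            - 19.0 / (360 * lam**3))
```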
\[
H(\lambda) \approx -\sum_{k=0}^{\lceil \lambda + 4\sqrt{\lambda}\, \rceil} \frac{\lambda^k e^{-\lambda}}{k!}\left[k\log(\lambda) - \lambda - \log(k!)\right]
\]
Numerically, you can use this truncated sum for \(\lambda < 1.78\), taking terms out to \(k = \lceil \lambda + 4\sqrt{\lambda}\, \rceil\). For \(\lambda \ge 1.78\), the third-order approximation for large \(\lambda\) becomes accurate, and the two regimes can be combined as below.
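The truncated sum might look like the following sketch (again, the function name is mine and it reuses the imports above):

```python
def poisson_entropy_truncated(lam):
    """Entropy estimate from a finite number of terms; intended for small lam."""
    kmax = int(np.ceil(lam + 4 * np.sqrt(lam)))
    k = np.arange(kmax + 1)
    log_p = k * np.log(lam) - lam - gammaln(k + 1)  # log Pr(k) for k = 0..kmax
    return -np.sum(np.exp(log_p) * log_p)
```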
\[
H(\lambda) \approx
\begin{cases}
-\displaystyle\sum_{k=0}^{\lceil \lambda + 4\sqrt{\lambda}\, \rceil} \dfrac{\lambda^k e^{-\lambda}}{k!}\left[k\log(\lambda) - \lambda - \log(k!)\right] & \lambda < 1.78 \\[2ex]
\tfrac{1}{2}\log(2\pi e\lambda) - \dfrac{1}{12\lambda} - \dfrac{1}{24\lambda^2} - \dfrac{19}{360\lambda^3} & \lambda \ge 1.78
\end{cases}
\]
The relative error in this approximation is \(|\hat{H} - H|/|H| < 0.0016\).
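Putting the two regimes together (a sketch; the 1.78 cutoff and the error bound come from the text above, while the helper names and test values are my own illustration), with a quick spot-check against the brute-force reference:

```python
def poisson_entropy(lam):
    """Piecewise approximation: truncated sum for small lam, asymptotic series otherwise."""
    if lam < 1.78:
        return poisson_entropy_truncated(lam)
    return poisson_entropy_asymptotic(lam)

# Spot-check the relative error |H_hat - H| / |H| against the brute-force reference.
for lam in (0.5, 1.0, 1.78, 5.0, 20.0):
    H_ref = poisson_entropy_exact(lam)
    print(lam, abs(poisson_entropy(lam) - H_ref) / abs(H_ref))
```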