Kernel Papers

- The entropy of a discrete distribution over \(N\) elements is bounded above by \(\log N\), with equality for the uniform distribution.

Proof: Apply Gibbs' inequality, \(H(p) \leq -\sum_n p_n \log q_n\), with \(q_n = 1/N\). Then \(H(p) \leq -\sum_n p_n \log \frac{1}{N} = \log N \sum_n p_n = \log N\), since \(\sum_n p_n = 1\).
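The bound is easy to check numerically. Here is a minimal sketch (names like `entropy` are my own, not from any particular library): compute the entropy of a random distribution over \(N\) elements and verify it never exceeds \(\log N\), while the uniform distribution attains the bound exactly.

```python
import math
import random

def entropy(p):
    """Shannon entropy (natural log) of a discrete distribution p."""
    return -sum(x * math.log(x) for x in p if x > 0)

N = 8

# A random distribution over N elements: normalize random weights.
weights = [random.random() for _ in range(N)]
total = sum(weights)
p = [w / total for w in weights]

# The bound H(p) <= log N holds (small tolerance for float rounding).
assert entropy(p) <= math.log(N) + 1e-12

# The uniform distribution q_n = 1/N attains the bound exactly.
uniform = [1 / N] * N
assert abs(entropy(uniform) - math.log(N)) < 1e-12
```

Running this with any \(N\) and any random seed passes both assertions, matching the statement of the bound.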