Let \(X_{1}, \dots, X_{n}\) be independent random variables, each taking the value \(1\) with probability \(p\) and the value \(0\) with probability \(1-p\).
Let
\[S = \sum_{i=1}^{n} X_{i} \]
Then, by linearity of expectation, \(E[S] = \sum_{i=1}^{n} E[X_{i}] = np\).
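As a quick numerical sanity check of \(E[S] = np\), the following sketch averages \(S\) over repeated trials (the values of \(n\), \(p\), and the trial count are illustrative assumptions, not from the text):

```python
import random

def empirical_mean(n, p, trials, seed=0):
    """Average of S = X_1 + ... + X_n over repeated trials,
    where each X_i is an independent Bernoulli(p) draw."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += sum(1 for _ in range(n) if rng.random() < p)
    return total / trials

n, p = 1000, 0.3  # illustrative values; E[S] = np = 300
print(empirical_mean(n, p, trials=500))  # close to 300
```

The empirical average concentrates near \(np\), which is exactly the behavior the tail bounds below quantify.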
A schematic plot of the upper and lower tails as functions of \(\delta\) and \(\gamma\) is shown in the figure below.
Fig. 11 A schematic of the upper and lower tails of the Chernoff bound
Note
The upper tail is bounded by a factor of \(e^{-\frac{\delta^2}{2+\delta}np}\): for small \(\delta\) this is approximately \(e^{-\frac{\delta^2}{2}np}\), while for large \(\delta\) it is approximately \(e^{-\delta np}\). Therefore the upper tail decreases more slowly for large \(\delta\). The lower tail always decreases by a factor of \(e^{-\gamma^2 \frac{np}{2}}\), which matches the tail decay of a Gaussian distribution.
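The two regimes of the upper-tail exponent \(\frac{\delta^2}{2+\delta}np\) can be checked numerically. This sketch compares it against the Gaussian-like approximation \(\frac{\delta^2}{2}np\) for small \(\delta\) and the linear approximation \(\delta np\) for large \(\delta\) (the values of \(n\), \(p\), and \(\delta\) are illustrative assumptions):

```python
import math

def chernoff_exponent(delta, n, p):
    """Exponent in the upper-tail bound factor e^{-delta^2 * np / (2 + delta)}."""
    return delta**2 * n * p / (2 + delta)

n, p = 100, 0.5  # illustrative values, so np = 50

# Small delta: exponent is close to the Gaussian-like delta^2 * np / 2
d = 0.01
print(chernoff_exponent(d, n, p), d**2 * n * p / 2)

# Large delta: exponent is close to the linear delta * np
d = 100
print(chernoff_exponent(d, n, p), d * n * p)
```

Because \(\frac{\delta^2}{2+\delta} \to \frac{\delta^2}{2}\) as \(\delta \to 0\) and \(\frac{\delta^2}{2+\delta} \to \delta\) as \(\delta \to \infty\), the two printed pairs agree to within a few percent in their respective regimes.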
Proof. Applying Markov’s inequality, for any \(t > 0\) we have