“Entropy is the measure of unpredictability of information content.” — Wikipedia

```
from math import log
probabilities_for_certain_events = [
    0.01417076, 0.60479652,
    0.71056009, 0.07446565,
    0.47562493, 0.04281425,
    0.17884241, 0.8271124,
    0.32106817, 0.98718695
]
# Shannon entropy in bits (base-2 logarithm)
entropy = sum(-p * log(p, 2) for p in probabilities_for_certain_events)
print(entropy)
# 3.07488235236
# or, with NumPy:
# import numpy as np
# number_of_events = 10
# probabilities = np.random.rand(number_of_events)
# entropy = sum(-p * np.log2(p) for p in probabilities)
```
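Note that the values above are arbitrary and do not sum to 1, so they are not a true probability distribution. For a proper distribution the result has an intuitive reading: a fair coin carries exactly 1 bit of uncertainty, a fair eight-sided die exactly 3 bits. A small sanity-check sketch (the helper name `shannon_entropy` is my own, not from the snippet above):

```python
from math import log

def shannon_entropy(distribution):
    # Shannon entropy in bits; zero-probability outcomes
    # contribute nothing and are skipped to avoid log(0).
    return sum(-p * log(p, 2) for p in distribution if p > 0)

# A fair coin: two equally likely outcomes -> 1 bit
fair_coin = [0.5, 0.5]
print(shannon_entropy(fair_coin))  # 1.0

# A fair eight-sided die: log2(8) = 3 bits
fair_die = [1.0 / 8] * 8
print(shannon_entropy(fair_die))   # ~3.0
```

Concentrating the probability mass lowers the entropy: a coin biased to land heads 99% of the time yields only about 0.08 bits per toss.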