Entropy measures the 'information content' a random outcome gives us. Let S be the set of possible states and P be a probability function on those states that satisfies the axioms of probability. Then the entropy H of a set of outcomes X in S, conditioned on some domain Y, is given by
$$\begin{align} H(X|Y) &= -\sum_{x\in X} P(x|Y) \lg(P(x|Y)) \\ &= \mathbb{E}(\lambda x: X.\, -\lg(P(x|Y))) \end{align}$$
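The sum above is easy to compute directly. Below is a minimal sketch (the function name `entropy` is an illustrative choice, not from the source) of the unconditional case, where the conditioning on Y is folded into the probabilities passed in; terms with zero probability are skipped, following the convention that 0 · lg 0 = 0.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_x p(x) * lg(p(x)), in bits (lg = log base 2)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin yields exactly 1 bit of information per flip:
print(entropy([0.5, 0.5]))  # 1.0

# A certain outcome carries no information:
print(entropy([1.0]))  # 0.0
```

Note how a skewed distribution such as `[0.9, 0.1]` gives an entropy strictly below 1 bit: less uncertain outcomes carry less information on average.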