- Understand the concept of expectation in probability theory.
- Compute the expectation of discrete and continuous random variables.
- Learn the properties of expectation, including linearity.
- Explore the expectation of functions of random variables.
- Visualize expectation using probability distributions.
Definition of Expectation
The expected value (or mean) of a random variable is a measure of its central tendency.
- For a discrete random variable \(X\) with probability mass function (PMF) \( P(X = x) \), the expectation is defined as: \[ E[X] = \sum_x x\, P(X = x) \]
- For a continuous random variable \(X\) with probability density function (PDF) \( f(x) \), the expectation is given by: \[ E[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx \]
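Both definitions can be checked numerically; below is a minimal sketch in Python, where the fair-die PMF and the rate-1 exponential density are illustrative choices, not examples from the text:

```python
import math

# Discrete: E[X] = sum of x * P(X = x) over the support.
# Illustrative PMF: a fair six-sided die, P(X = x) = 1/6 for x = 1..6.
pmf = {x: 1 / 6 for x in range(1, 7)}
e_discrete = sum(x * p for x, p in pmf.items())
print(e_discrete)  # 3.5

# Continuous: E[X] = integral of x * f(x) dx, approximated here by a
# Riemann sum.  Illustrative PDF: exponential with rate 1, f(x) = e^{-x},
# whose exact mean is 1; the tail beyond x = 20 is negligible.
def f(x):
    return math.exp(-x)

dx = 1e-4
e_continuous = sum(i * dx * f(i * dx) * dx for i in range(1, 200_000))
print(round(e_continuous, 3))  # close to 1.0
```

The continuous case trades exactness for a truncated numerical integral; shrinking `dx` and widening the truncation point tightens the approximation.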
Properties of Expectation
Expectation has several useful properties:
| Property | Formula | Description |
|---|---|---|
| Linearity | \( E[aX + bY] = aE[X] + bE[Y] \) | Expectation is linear. |
| Expectation of a constant | \( E[c] = c \) | The expectation of a constant is the constant itself. |
| Expectation of an indicator | \( E[I_A] = P(A) \) | For an indicator function \( I_A \), expectation equals the probability of the event. |
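All three properties can be sanity-checked by simulation; the sketch below uses arbitrary distributions and coefficients chosen for illustration:

```python
import random

random.seed(0)
N = 100_000

# Illustrative choices: X ~ Uniform{1..6}, independent Y ~ Bernoulli(0.3).
xs = [random.randint(1, 6) for _ in range(N)]
ys = [1 if random.random() < 0.3 else 0 for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

a, b = 2.0, -3.0

# Linearity: the sample mean of aX + bY equals a*mean(X) + b*mean(Y)
# exactly (up to float rounding), mirroring E[aX + bY] = aE[X] + bE[Y].
lhs = mean([a * x + b * y for x, y in zip(xs, ys)])
rhs = a * mean(xs) + b * mean(ys)
print(abs(lhs - rhs) < 1e-6)

# Constant: E[c] = c.
print(mean([7.0] * N))  # 7.0

# Indicator: E[I_A] = P(A), here A = {X >= 5} with P(A) = 1/3.
ind = mean([1 if x >= 5 else 0 for x in xs])
print(ind)  # close to 1/3
```

Note that linearity holds exactly for sample means by algebra, while the indicator check only converges to \( P(A) \) as the sample grows.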
Proof of Linearity
By definition, for a discrete random variable:
\[ E[X] = \sum_x x\, P(X = x) \]
For two random variables \(X\) and \(Y\) with joint PMF \( P(X = x, Y = y) \), and constants \(a, b\):
\[ E[aX + bY] = \sum_{x, y} (ax + by)\, P(X = x, Y = y) \]
Splitting the sum and marginalizing the joint PMF over the other variable:
\[ E[aX + bY] = a \sum_x x\, P(X = x) + b \sum_y y\, P(Y = y) = aE[X] + bE[Y] \]
Expectation of Functions of a Random Variable
For a function \( g(X) \) of a random variable \( X \), expectation is:
\[ E[g(X)] = \sum_x g(x)\, P(X = x) \quad \text{(discrete)} \]
\[ E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx \quad \text{(continuous)} \]
Special cases:
- \( E[X^2] \) is used to compute variance.
- \( E[e^X] \) is important in financial models.
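The formula above lets us compute \( E[g(X)] \) directly from the PMF of \( X \), without deriving the distribution of \( g(X) \). A sketch, reusing the fair-die PMF as an illustrative choice:

```python
import math

pmf = {x: 1 / 6 for x in range((1), 7)}  # fair six-sided die

def expect(g, pmf):
    # E[g(X)] = sum over x of g(x) * P(X = x).
    return sum(g(x) * p for x, p in pmf.items())

e_x2 = expect(lambda x: x ** 2, pmf)  # E[X^2] = 91/6, used for variance
e_ex = expect(math.exp, pmf)          # E[e^X], the MGF evaluated at t = 1
print(round(e_x2, 4))  # 91/6 is about 15.1667
print(round(e_ex, 2))
```

Note that \( E[g(X)] \ne g(E[X]) \) in general: here \( E[X^2] \approx 15.17 \) while \( (E[X])^2 = 12.25 \).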
Expectation and Variance
The variance measures how much \( X \) deviates from its expected value:
\[ Var(X) = E\left[(X - E[X])^2\right] = E[X^2] - (E[X])^2 \]
Higher moments, such as skewness and kurtosis, measure asymmetry and tail behavior respectively.
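The shortcut \( E[X^2] - (E[X])^2 \) can be checked against the defining formula \( E[(X - E[X])^2] \); a sketch using the fair-die PMF as an illustrative choice:

```python
pmf = {x: 1 / 6 for x in range(1, 7)}  # fair six-sided die

mu = sum(x * p for x, p in pmf.items())         # E[X] = 3.5
e_x2 = sum(x ** 2 * p for x, p in pmf.items())  # E[X^2] = 91/6

var_shortcut = e_x2 - mu ** 2                   # E[X^2] - (E[X])^2
var_defn = sum((x - mu) ** 2 * p for x, p in pmf.items())

# Both equal 35/12, approximately 2.9167.
print(round(var_shortcut, 4), round(var_defn, 4))
```

The shortcut needs only one pass over the PMF when \( E[X] \) is already known, which is why it is the form usually quoted.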
Visualization of Expectation
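This section presumably accompanied a figure. As a stand-in, the sketch below draws a text bar chart of a Binomial(10, 0.3) PMF (an illustrative choice) and marks the bar nearest the mean \( np = 3 \), showing expectation as the balance point of the distribution:

```python
import math

n, p = 10, 0.3
# Binomial PMF: P(X = k) = C(n, k) p^k (1-p)^(n-k).
pmf = {k: math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}
mean = sum(k * pk for k, pk in pmf.items())  # equals n * p = 3.0

for k, pk in pmf.items():
    bar = "#" * round(pk * 100)           # bar length proportional to P(X = k)
    mark = "  <- E[X]" if k == round(mean) else ""
    print(f"{k:2d} | {bar}{mark}")
```

With a plotting library available, the same PMF could be drawn as a bar chart with a vertical line at \( E[X] \).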
Examples
Example 1: A fair die is rolled. Compute \( E[X] \) and \( Var(X) \).
\[ E[X] = \sum_x x\, P(X = x) = \frac{1+2+3+4+5+6}{6} = 3.5 \]
\[ Var(X) = \frac{1^2 + 2^2 + 3^2 + 4^2 + 5^2 + 6^2}{6} - (3.5)^2 = \frac{91}{6} - 12.25 \approx 2.92 \]
Exercises
- Question 1: Compute \( E[X] \) for a Bernoulli random variable \( X \) with \( P(X=1) = p \).
- Question 2: If \( X \) follows an exponential distribution with \( \lambda = 0.5 \), compute \( E[X] \).
- Question 3: Compute \( Var(X) \) for a Poisson random variable with \( \lambda = 4 \).
- Question 4: If \( E[X] = 5 \) and \( E[Y] = 2 \), compute \( E[3X - 2Y] \).
- Question 5: A uniform random variable \( X \) is defined over \( [0, 10] \). Compute \( E[X] \).
- Answer 1: \( E[X] = p \).
- Answer 2: \( E[X] = \frac{1}{\lambda} = 2 \).
- Answer 3: \( Var(X) = \lambda = 4 \).
- Answer 4: \( E[3X - 2Y] = 3(5) - 2(2) = 11 \).
- Answer 5: \( E[X] = \frac{a + b}{2} = \frac{0 + 10}{2} = 5 \).
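The closed-form answers above can be cross-checked by Monte Carlo simulation; a sketch in Python (the sample size and seed are arbitrary choices):

```python
import random

random.seed(1)
N = 200_000

# Answer 2: an exponential with rate 0.5 has mean 1/0.5 = 2.
exp_mean = sum(random.expovariate(0.5) for _ in range(N)) / N
print(exp_mean)  # close to 2

# Answer 4: by linearity, E[3X - 2Y] = 3 E[X] - 2 E[Y] = 3*5 - 2*2 = 11.
linear = 3 * 5 - 2 * 2
print(linear)  # 11

# Answer 5: Uniform(0, 10) has mean (0 + 10)/2 = 5.
uni_mean = sum(random.uniform(0, 10) for _ in range(N)) / N
print(uni_mean)  # close to 5
```

The simulated means converge to the exact answers at rate \( O(1/\sqrt{N}) \), so a sample of this size agrees to about two decimal places.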