Random variable
A rule that assigns a numerical value to each outcome of a chance process.
Discrete random variable
A random variable that takes on a countable set of possible values (often integers like 0, 1, 2, …).
Outcome vs. random variable value
The outcome is the raw result of the chance process (e.g., HHTHT); the random variable value is the number computed from the outcome (e.g., 3 heads).
Probability distribution (discrete)
A list of every possible value of a discrete random variable and the probability that each value occurs.
Valid discrete probability distribution
A distribution where (1) each probability is between 0 and 1 and (2) all probabilities sum to 1.
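Both conditions can be checked in a few lines of Python; the four probabilities below are a made-up example, not from the cards:

```python
# Check the two conditions for a valid discrete probability distribution:
# (1) every probability is between 0 and 1, (2) the probabilities sum to 1.
probs = [0.1, 0.2, 0.3, 0.4]  # hypothetical distribution, for illustration

each_in_range = all(0 <= p <= 1 for p in probs)
sums_to_one = abs(sum(probs) - 1) < 1e-9  # small tolerance for float rounding
print(each_in_range and sums_to_one)  # True
```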
Probability notation P(X = x)
The probability that the random variable X takes the specific value x.
Probability histogram
A graph with bars at each possible discrete value whose heights equal the corresponding probabilities; it represents a model (long-run pattern), not a dataset.
Cumulative probability
A probability of the form P(X≤a) (or similar) found by adding probabilities of all values meeting the condition.
Cumulative distribution function (CDF)
The function F(x)=P(X≤x), giving cumulative probabilities up to x.
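A minimal sketch of the CDF idea, using the distribution of X = number of heads in two fair coin flips (an assumed example):

```python
# F(a) = P(X <= a): add up the probabilities of all values at most a.
dist = {0: 0.25, 1: 0.50, 2: 0.25}  # X = heads in two fair flips

def cdf(a):
    return sum(p for x, p in dist.items() if x <= a)

print(cdf(1))  # P(X <= 1) = 0.25 + 0.50 = 0.75
print(cdf(2))  # P(X <= 2) = 1.0
```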
Expected value
The mean of a random variable; the long-run average value over many repetitions of the chance process.
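The "long-run average" reading can be illustrated by simulation; the fair die here is an assumption chosen for the sketch:

```python
import random

random.seed(1)  # reproducible run
# Simulate many rolls of a fair six-sided die; the average of the results
# settles near the theoretical expected value (1 + 2 + ... + 6) / 6 = 3.5.
rolls = [random.randint(1, 6) for _ in range(100_000)]
long_run_average = sum(rolls) / len(rolls)
print(long_run_average)  # close to 3.5
```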
Mean of a discrete random variable
μ_X = E(X) = Σ x_i·p_i, the probability-weighted average of the possible values.
Probability-weighted average
An average where each value is multiplied by its probability before summing; more likely outcomes count more.
Variance of a discrete random variable
σ_X² = Σ (x_i − μ_X)²·p_i, the long-run average of squared distance from the mean (weighted by probabilities).
Standard deviation of a random variable
σ_X = √(σ_X²), describing the typical distance of X from its mean in the long run.
Variance–expectation shortcut
σ_X² = E(X²) − (E(X))², where E(X²) = Σ x_i²·p_i.
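Both routes to the variance (the definition and the shortcut) can be checked numerically; the distribution below is the assumed two-coin-flip example:

```python
dist = {0: 0.25, 1: 0.50, 2: 0.25}  # X = heads in two fair flips

mu = sum(x * p for x, p in dist.items())                   # E(X) = 1.0
var_def = sum((x - mu) ** 2 * p for x, p in dist.items())  # definition
e_x2 = sum(x ** 2 * p for x, p in dist.items())            # E(X^2) = 1.5
var_shortcut = e_x2 - mu ** 2                              # E(X^2) - (E(X))^2
print(var_def, var_shortcut)  # 0.5 0.5 — same answer either way
```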
E(X²)
The expected value of X²: E(X²) = Σ x_i²·p_i.
Fair game (expected value idea)
A game is “fair” when expected profit is 0 (fairness depends on expected value, not on winning half the time).
Profit random variable
If W is winnings and c is cost, profit can be defined as P = W − c, so E(P) = E(W) − c.
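A quick numeric sketch of E(P) = E(W) − c; the prize table and ticket price here are invented:

```python
# Expected profit = expected winnings minus cost.
winnings = {0: 0.5, 2: 0.3, 10: 0.2}  # hypothetical payout distribution
cost = 3                              # hypothetical ticket price

e_w = sum(w * p for w, p in winnings.items())
e_profit = e_w - cost
print(e_w, round(e_profit, 10))  # 2.6 -0.4 → negative, so the game is not fair
```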
Linear transformation
A new variable formed by Y = a + bX (shift by a, scale by b).
Mean under linear transformation
If Y = a + bX, then μ_Y = a + bμ_X.
Standard deviation under linear transformation
If Y = a + bX, then σ_Y = |b|·σ_X (adding a constant does not change spread).
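Both linear-transformation rules can be verified directly on a small example; the shift, scale, and distribution below are assumptions:

```python
import math

dist = {0: 0.25, 1: 0.50, 2: 0.25}  # hypothetical distribution of X
a, b = 3, -2                        # hypothetical shift and scale

def mean(d):
    return sum(x * p for x, p in d.items())

def sd(d):
    mu = mean(d)
    return math.sqrt(sum((x - mu) ** 2 * p for x, p in d.items()))

dist_y = {a + b * x: p for x, p in dist.items()}  # distribution of Y = a + bX

print(math.isclose(mean(dist_y), a + b * mean(dist)))  # True: mean shifts and scales
print(math.isclose(sd(dist_y), abs(b) * sd(dist)))     # True: spread only scales by |b|
```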
Additivity of expected value
For any random variables X and Y: E(X+Y)=E(X)+E(Y) and E(X−Y)=E(X)−E(Y) (does not require independence).
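The "no independence needed" point can be seen with a deliberately dependent pair; the joint table below is invented, with Y always equal to X:

```python
import math

# Joint distribution over (x, y) pairs; Y = X, so X and Y are maximally dependent.
joint = {(0, 0): 0.4, (1, 1): 0.6}

e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
e_sum = sum((x + y) * p for (x, y), p in joint.items())

print(math.isclose(e_sum, e_x + e_y))  # True even though X and Y are dependent
```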
Independence (random variables)
X and Y are independent if knowing the value of one provides no information about the other; often from separate trials or independently selected individuals.
Variance of a sum/difference (independent case)
If X and Y are independent: Var(X ± Y) = Var(X) + Var(Y), so σ_(X±Y) = √(σ_X² + σ_Y²).
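The rule can be checked by building the distribution of X + Y from two independent marginals; both distributions below are made up for the sketch:

```python
import itertools
import math

dx = {0: 0.5, 1: 0.5}             # e.g. heads in one fair flip
dy = {0: 0.25, 1: 0.50, 2: 0.25}  # e.g. heads in two fair flips

def var(d):
    mu = sum(x * p for x, p in d.items())
    return sum((x - mu) ** 2 * p for x, p in d.items())

# Under independence, P(X = x and Y = y) = P(X = x) * P(Y = y);
# accumulate those joint probabilities into the distribution of S = X + Y.
ds = {}
for (x, px), (y, py) in itertools.product(dx.items(), dy.items()):
    ds[x + y] = ds.get(x + y, 0) + px * py

print(math.isclose(var(ds), var(dx) + var(dy)))  # True: 0.75 = 0.25 + 0.5
```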
Linear combination (AP-level)
A form like T = a + bX + cY; if X and Y are independent, then Var(T) = b²·Var(X) + c²·Var(Y) and μ_T = a + bμ_X + cμ_Y.
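Both linear-combination formulas can be verified the same way; the coefficients and the two independent distributions below are assumptions:

```python
import itertools
import math

a, b, c = 1, 2, -3     # hypothetical coefficients
dx = {0: 0.5, 1: 0.5}  # made-up distribution of X
dy = {0: 0.5, 1: 0.5}  # made-up distribution of Y, independent of X

def mean(d):
    return sum(x * p for x, p in d.items())

def var(d):
    mu = mean(d)
    return sum((x - mu) ** 2 * p for x, p in d.items())

# Build the distribution of T = a + bX + cY under independence.
dt = {}
for (x, px), (y, py) in itertools.product(dx.items(), dy.items()):
    t = a + b * x + c * y
    dt[t] = dt.get(t, 0) + px * py

print(math.isclose(mean(dt), a + b * mean(dx) + c * mean(dy)))  # True
print(math.isclose(var(dt), b**2 * var(dx) + c**2 * var(dy)))   # True
```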