Unit 4: Probability, Random Variables, and Probability Distributions

Last updated 2:11 AM on 3/12/26
50 Terms

1
New cards

Probability (long-run model)

A way to model random processes by describing how often outcomes occur in the long run, not by predicting individual results.

2
New cards

Random process

A process whose individual outcomes can’t be predicted with certainty, but whose long-run pattern (regularity) is predictable.

3
New cards

Relative frequency

The proportion of times an event occurs in repeated trials; used as an estimate of probability.

4
New cards

Law of Large Numbers (LLN)

As an experiment is repeated many times (with an unchanging chance process), the relative frequency of an event tends to get closer to its true probability.
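
The convergence described here can be sketched with a short simulation (the fair-coin probability and flip count are illustrative choices):

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

N = 100_000
heads = 0
for _ in range(N):
    heads += random.random() < 0.5  # one fair-coin flip

# After many flips, the relative frequency settles near the true probability.
rel_freq = heads / N
print(rel_freq)
```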

5
New cards

Short-run variability (streaks)

In the short run, random results can look unusual (including streaks), even when the long-run proportion is stable.

6
New cards

Gambler’s fallacy

The mistaken belief that after a long streak, the opposite outcome is “due”; random processes have no memory of past results (e.g., a fair coin has probability 0.5 of heads on every flip, regardless of what came before).

7
New cards

Simulation

A method that imitates a random process (using random digits, random number generators, or physical randomizers) to estimate probabilities when exact calculations are difficult.

8
New cards

Simulation setup

The planning stage of a simulation: define outcomes, assign random numbers/digits to outcomes with the correct probabilities, and specify what counts as “success” in each trial.

9
New cards

Simulation trial

One run of the simulated chance process that mimics one repetition of the real situation (e.g., reading three two-digit numbers to represent three free throws).

10
New cards

Simulation estimate

An estimated probability found by repeating many simulation trials and using the proportion of trials in which the event occurs.
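
For example, the three-free-throw setup mentioned above can be estimated this way (the 70% shooter and trial count are hypothetical choices):

```python
import random

random.seed(0)
P_MAKE = 0.7       # hypothetical free-throw success rate
TRIALS = 100_000

# One trial = three simulated shots; count trials where all three are made.
successes = sum(
    all(random.random() < P_MAKE for _ in range(3)) for _ in range(TRIALS)
)
estimate = successes / TRIALS
print(estimate)  # close to the exact answer 0.7 ** 3 = 0.343
```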

11
New cards

Sample space

The set of all possible outcomes of a random process (e.g., {1,2,3,4,5,6} for one die roll).

12
New cards

Event

A subset of the sample space; a collection of outcomes of interest (e.g., “roll an even number” = {2,4,6}).

13
New cards

Equally likely outcomes

A model assumption where each outcome in the sample space has the same probability, allowing P(A) = (number of outcomes in A)/(number of outcomes in the sample space).

14
New cards

Complement rule

For an event A, the complement A^c is “A does not happen,” and P(A^c) = 1 − P(A).

15
New cards

Union (inclusive OR)

A ∪ B means “A or B or both” occur (inclusive OR).

16
New cards

Intersection (AND)

A ∩ B means both A and B occur.

17
New cards

Mutually exclusive (disjoint) events

Events that cannot happen at the same time; P(A ∩ B) = 0.

18
New cards

Addition rule (general)

For any events A and B: P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
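
A quick check of the rule by enumerating one fair die (the events are chosen for illustration):

```python
from fractions import Fraction

space = {1, 2, 3, 4, 5, 6}   # equally likely outcomes
A = {2, 4, 6}                # "roll an even number"
B = {4, 5, 6}                # "roll greater than 3"

def P(event):
    return Fraction(len(event), len(space))

# General addition rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
print(lhs, rhs)  # both 2/3
```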

19
New cards

Multiplication rule (general)

For any events A and B: P(A ∩ B) = P(A)P(B|A) (equivalently, P(B)P(A|B)).

20
New cards

Complement strategy (“at least one”)

Compute “at least one” by subtracting the complement: P(at least one) = 1 − P(none).

21
New cards

Conditional probability

P(A|B) is the probability A occurs given B occurred: P(A|B) = P(A ∩ B)/P(B), for P(B) > 0.

22
New cards

Independence

Events A and B are independent if knowing one occurred does not change the probability of the other (e.g., P(A|B)=P(A) or P(A∩B)=P(A)P(B)).

23
New cards

Mutually exclusive vs. independent

Mutually exclusive events (with positive probabilities) cannot be independent because their intersection probability is 0, not P(A)P(B).

24
New cards

Joint probability

The probability two events happen together, written as an intersection (e.g., P(A ∩ B)).

25
New cards

Two-way table

A table of counts (or proportions) for two categorical variables; used to compute marginal, joint, and conditional probabilities clearly.

26
New cards

Conditional probability from a two-way table

A conditional probability computed using the appropriate row/column total as the denominator, determined by the “given” condition.
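
A sketch with made-up counts, showing how the “given” condition selects the denominator:

```python
# Hypothetical two-way table of counts (completely made-up data).
table = {
    ("junior", "bus"): 30, ("junior", "car"): 20,
    ("senior", "bus"): 15, ("senior", "car"): 35,
}

# P(bus | junior): the denominator is the junior ROW total, not the grand total.
junior_total = table[("junior", "bus")] + table[("junior", "car")]
p_bus_given_junior = table[("junior", "bus")] / junior_total
print(p_bus_given_junior)  # 0.6
```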

27
New cards

Tree diagram

A diagram for multistage probability where branches represent outcomes with conditional probabilities at each stage.

28
New cards

Total probability with a tree

Multiply probabilities along each path for sequence probabilities, then add path probabilities to find an overall probability (e.g., sum across groups).

29
New cards

Bayes’ rule

A method to reverse a conditional probability: P(A|B) = [P(B|A)P(A)]/P(B), often implemented more safely using a table of counts.

30
New cards

Base-rate effect

When the underlying rate of a condition is low, even a good test can produce many false positives; the prior probability heavily influences P(condition|positive).
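
A numeric sketch of this effect using Bayes’ rule (the 1% base rate, 95% sensitivity, and 90% specificity are illustrative values):

```python
prior = 0.01   # P(condition): low base rate
sens = 0.95    # P(positive | condition)
spec = 0.90    # P(negative | no condition)

# Total probability of testing positive (true positives + false positives).
p_pos = prior * sens + (1 - prior) * (1 - spec)

# Bayes' rule: P(condition | positive).
posterior = prior * sens / p_pos
print(round(posterior, 3))  # only about 0.088, despite a "good" test
```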

31
New cards

Random variable

A numerical variable whose values come from a random process (turns outcomes into numbers for analysis).

32
New cards

Discrete random variable

A random variable that takes a countable set of possible values (e.g., 0,1,2,… or 0–5).

33
New cards

Probability distribution (discrete)

A list or rule giving P(X=x) for each possible value x of a discrete random variable X.

34
New cards

Valid probability distribution criteria

A discrete distribution is valid if every probability is between 0 and 1 and the probabilities sum to 1 (within rounding).

35
New cards

Cumulative distribution function (CDF)

A function/table giving cumulative probabilities such as P(X ≤ x), used for “at most,” “no more than,” and “less than or equal to” questions.

36
New cards

Expected value E(X) (mean)

The long-run average value of a random variable: μ = E(X) = Σ x·P(X=x).
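
For a small made-up distribution, the computation looks like:

```python
# Hypothetical distribution of X (values are illustrative).
dist = {0: 0.2, 1: 0.5, 2: 0.3}

# μ = E(X) = Σ x · P(X = x)
mean = sum(x * p for x, p in dist.items())
print(round(mean, 10))  # 1.1
```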

37
New cards

Expected net gain / fair game

Using expected value to judge long-run profitability/fairness; a game is fair when expected net gain is 0 (positive favors the player, negative favors the house).

38
New cards

Variance and standard deviation (discrete)

Measures of spread for a discrete random variable: variance σ² = Σ (x−μ)²P(X=x) and standard deviation σ = √σ².

39
New cards

Variance shortcut

An equivalent formula: Var(X)=σ²=E(X²)−(E(X))², where E(X²)=Σ x²P(X=x).
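
Both formulas give the same answer on a small hypothetical distribution:

```python
dist = {0: 0.2, 1: 0.5, 2: 0.3}  # hypothetical distribution
mu = sum(x * p for x, p in dist.items())

# Definition: σ² = Σ (x − μ)² P(X = x)
var_def = sum((x - mu) ** 2 * p for x, p in dist.items())

# Shortcut: σ² = E(X²) − (E(X))²
var_short = sum(x ** 2 * p for x, p in dist.items()) - mu ** 2

print(round(var_def, 10), round(var_short, 10))  # both 0.49
```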

40
New cards

z-score

A standardized value measuring how many standard deviations an observation is from the mean: z = (value − mean)/SD; large |z| indicates a surprising outcome.

41
New cards

Linear transformation Y=a+bX

A transformation of a random variable where μY = a + bμX and σY = |b|σX (adding a constant shifts center; multiplying scales spread).
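
A numeric check of both facts, using an illustrative distribution and constants (note b is negative, so |b| matters for the spread):

```python
dist = {0: 0.2, 1: 0.5, 2: 0.3}  # hypothetical distribution of X
a, b = 3, -2                     # illustrative constants

def mean_sd(d):
    mu = sum(x * p for x, p in d.items())
    var = sum((x - mu) ** 2 * p for x, p in d.items())
    return mu, var ** 0.5

mu_x, sd_x = mean_sd(dist)

# Y = a + bX has the same probabilities attached to transformed values.
dist_y = {a + b * x: p for x, p in dist.items()}
mu_y, sd_y = mean_sd(dist_y)

print(abs(mu_y - (a + b * mu_x)) < 1e-9)  # True: μ_Y = a + bμ_X
print(abs(sd_y - abs(b) * sd_x) < 1e-9)   # True: σ_Y = |b|σ_X
```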

42
New cards

Mean of sums/differences

For random variables X and Y: μ_(X+Y) = μ_X + μ_Y and μ_(X−Y) = μ_X − μ_Y (means always add/subtract).

43
New cards

Variance of sums/differences (independent case)

If X and Y are independent: Var(X+Y)=Var(X−Y)=Var(X)+Var(Y); add variances (not standard deviations).
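
A check by direct enumeration with two small independent distributions (the values are illustrative):

```python
# Two independent hypothetical random variables.
dX = {0: 0.5, 1: 0.5}
dY = {0: 0.5, 2: 0.5}

def var(d):
    mu = sum(x * p for x, p in d.items())
    return sum((x - mu) ** 2 * p for x, p in d.items())

# Independence lets us multiply probabilities to build X + Y's distribution.
dSum = {}
for x, px in dX.items():
    for y, py in dY.items():
        dSum[x + y] = dSum.get(x + y, 0) + px * py

print(var(dSum), var(dX) + var(dY))  # both 1.25: variances add
```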

44
New cards

Binomial distribution (Bin(n,p), BINS)

Models the number of successes in a fixed number n of independent trials with binary outcomes and constant success probability p (BINS: Binary, Independent, Number fixed, Same p).

45
New cards

Binomial probability formula

For X ~ Bin(n,p), the probability of exactly k successes is P(X=k)=C(n,k)p^k(1−p)^{n−k}.
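
The formula, total probability, and mean can all be checked directly (n = 5 and p = 0.3 are illustrative parameter values):

```python
from math import comb

n, p = 5, 0.3  # illustrative parameter values

def binom_pmf(k):
    # P(X = k) = C(n, k) p^k (1 − p)^(n − k)
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

print(round(binom_pmf(2), 4))  # 0.3087 = P(exactly 2 successes)

# Sanity checks: probabilities sum to 1, and the mean is np.
total = sum(binom_pmf(k) for k in range(n + 1))
mean = sum(k * binom_pmf(k) for k in range(n + 1))
print(round(total, 6), round(mean, 6))  # 1.0 and 1.5
```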

46
New cards

Binomial mean and standard deviation

For X ~ Bin(n,p): μ=np and σ=√(np(1−p)).

47
New cards

10% condition (binomial with sampling)

When sampling without replacement, a binomial model is often acceptable if the sample size is no more than 10% of the population, making independence reasonable.

48
New cards

Geometric distribution (Geometric(p))

Models the number of trials until the first success (including the success trial) for independent trials with constant success probability p; mean μ=1/p.

49
New cards

Geometric probability formula

For X ~ Geometric(p): P(X=k)=(1−p)^{k−1}p for k=1,2,3,… (the exponent k−1 counts failures).
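
A direct check that the probabilities behave as described (p = 0.25 is illustrative; the sums are truncated, so the results are approximate):

```python
p = 0.25  # illustrative success probability

def geom_pmf(k):
    # P(X = k) = (1 − p)^(k − 1) · p for k = 1, 2, 3, ...
    return (1 - p) ** (k - 1) * p

# Truncated sums: probabilities approach 1, and the mean approaches 1/p = 4.
total = sum(geom_pmf(k) for k in range(1, 200))
mean = sum(k * geom_pmf(k) for k in range(1, 200))
print(round(total, 6), round(mean, 6))  # ≈ 1.0 and ≈ 4.0
```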

50
New cards

Memoryless property (geometric)

For a geometric setting, past failures do not change future success probability; after many failures, the chance of success on the next trial is still p.
