# Physics 212, 2019: Lecture 18


Back to the main Teaching page.

### General notes

A good introduction to probability theory (one of my favorites, though on the more mathematical side) is Introduction to Probability by CM Grinstead and JL Snell.

### Why do we need random numbers?

• Some processes are fundamentally random (quantum mechanics, statistical mechanics, mutations, chemical reactions).
• Some calculations are easier to do with random numbers than with deterministic approaches (e.g., computing the area of a complex object).
• Avatars for randomness: a coin toss, a die, the number of molecules in a given volume of air, the time to the next click of a Geiger counter.

### Introducing concepts of randomness

To define the necessary probabilistic concepts, we need

• To define a set of outcomes that a random variable can take (e.g., heads or tails, six sides of a die, etc.).
• Then we define the probability of a certain outcome ${\displaystyle x}$ as a limit of frequencies after many random draws, or events. That is, if after ${\displaystyle N}$ draws the outcome happened ${\displaystyle n_{x}}$ times, then its frequency is ${\displaystyle f_{x}=n_{x}/N}$, and the probability is ${\displaystyle P(x)=\lim _{N\to \infty }f_{x}=\lim _{N\to \infty }{\frac {n_{x}}{N}}}$.
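The frequency definition above can be illustrated with a quick simulation. This is a minimal sketch (not from the lecture itself) that estimates ${\displaystyle P({\rm heads})}$ for a fair coin and shows the frequency settling toward 0.5 as ${\displaystyle N}$ grows:

```python
import random

random.seed(0)  # fix the seed so the run is reproducible

# Estimate P(heads) by the frequency f = n_heads / N for growing N.
# The law of large numbers says f should approach the true value 0.5.
for N in (100, 10_000, 1_000_000):
    n_heads = sum(random.random() < 0.5 for _ in range(N))
    print(f"N = {N:>9}: f_heads = {n_heads / N:.4f}")
```

Note that the convergence is slow: the typical deviation of the frequency from the probability shrinks only as ${\displaystyle 1/{\sqrt {N}}}$.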

Probabilities satisfy the following properties, which follow from their definition as limits of frequencies:

• nonnegativity: ${\displaystyle P_{i}\geq 0}$
• unit normalization: ${\displaystyle \sum _{i}P_{i}=1}$, where the sum runs over all possible outcomes
• nesting: if ${\displaystyle A\subset B}$ then ${\displaystyle P(A)\leq P(B)}$
• additivity (for non-disjoint events): ${\displaystyle P(A\cup B)=P(A)+P(B)-P(A\cap B)}$
• complementarity: ${\displaystyle P(\mathrm {not} \,A)=1-P(A)}$
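These properties can be verified on a concrete example. The following sketch (my own illustration, using exact fractions to avoid floating-point noise) checks all five for a fair six-sided die and two example events:

```python
from fractions import Fraction

# A fair six-sided die: each face has probability 1/6.
P = {face: Fraction(1, 6) for face in range(1, 7)}

def prob(event):
    """Probability of an event, given as a set of outcomes."""
    return sum(P[x] for x in event)

A = {1, 2}        # the roll is 1 or 2
B = {1, 2, 3, 4}  # the roll is at most 4; note A is a subset of B

assert all(p >= 0 for p in P.values())                  # nonnegativity
assert sum(P.values()) == 1                             # unit normalization
assert prob(A) <= prob(B)                               # nesting
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)   # additivity
assert prob(set(P) - A) == 1 - prob(A)                  # complementarity
```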

#### What if we are studying more than one random variable?

The multivariate (joint) distribution ${\displaystyle P(x,y)}$ gives the probability of both events happening. It contains all of the information about the variables, including

• Marginal distribution: ${\displaystyle P(x)=\sum _{y\in Y}P(x,y)}$
• The conditional distribution, which can then be defined as ${\displaystyle P(y|x)=P(x,y)/P(x)}$, so that the probability of both events is the probability of the first happening, and then the probability of the second happening given that the first one has happened.

The conditional distributions are related through Bayes' theorem, which says: ${\displaystyle P(x,y)=P(x|y)P(y)=P(y|x)P(x)}$, so that ${\displaystyle P(x|y)={\frac {P(y|x)P(x)}{P(y)}}}$.
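To make these definitions concrete, here is a sketch with an arbitrary, made-up joint distribution over two binary variables. It computes the marginals and conditionals exactly as defined above, and checks that Bayes' theorem holds:

```python
# An arbitrary joint distribution P(x, y) over x, y in {0, 1}
# (entries are made up for illustration; they sum to 1).
P_xy = {(0, 0): 0.1, (0, 1): 0.3,
        (1, 0): 0.2, (1, 1): 0.4}

# Marginals: sum the joint over the other variable.
P_x = {x: sum(P_xy[(x, y)] for y in (0, 1)) for x in (0, 1)}
P_y = {y: sum(P_xy[(x, y)] for x in (0, 1)) for y in (0, 1)}

# Conditionals: P(y|x) = P(x, y) / P(x), and symmetrically for P(x|y).
P_y_given_x = {(y, x): P_xy[(x, y)] / P_x[x] for x in (0, 1) for y in (0, 1)}
P_x_given_y = {(x, y): P_xy[(x, y)] / P_y[y] for x in (0, 1) for y in (0, 1)}

# Bayes' theorem: P(x|y) = P(y|x) P(x) / P(y), for every x and y.
for x in (0, 1):
    for y in (0, 1):
        bayes = P_y_given_x[(y, x)] * P_x[x] / P_y[y]
        assert abs(P_x_given_y[(x, y)] - bayes) < 1e-12
```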

We can also now formalize the intuitive concept of dependence among variables. Two random variables are considered to be statistically independent if and only if ${\displaystyle P(x,y)=P(x)P(y)}$, or, equivalently, ${\displaystyle P(x|y)=P(x)}$ or ${\displaystyle P(y|x)=P(y)}$.
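The factorization criterion ${\displaystyle P(x,y)=P(x)P(y)}$ can be checked mechanically. Below is a sketch (with two made-up joint distributions) of a function that tests whether a joint distribution factorizes into its marginals:

```python
# Two made-up joint distributions over x, y in {0, 1}: in the first,
# x and y are independent; in the second, they are not.
P_indep = {(0, 0): 0.12, (0, 1): 0.28, (1, 0): 0.18, (1, 1): 0.42}
P_dep = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

def is_independent(P, tol=1e-12):
    """True if P(x, y) = P(x) P(y) for every pair of outcomes."""
    xs = {x for x, _ in P}
    ys = {y for _, y in P}
    Px = {x: sum(P[(x, y)] for y in ys) for x in xs}
    Py = {y: sum(P[(x, y)] for x in xs) for y in ys}
    return all(abs(P[(x, y)] - Px[x] * Py[y]) <= tol
               for x in xs for y in ys)
```

For `P_indep` the marginals are ${\displaystyle P(x{=}0)=0.4}$ and ${\displaystyle P(y{=}0)=0.3}$, and every joint entry equals the product of its marginals; `P_dep` fails the test already at ${\displaystyle (0,0)}$, where ${\displaystyle 0.10\neq 0.4\times 0.3}$.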

### How easy is it to generate random numbers?

• Do the exercises on this web page to get a better feel for random numbers. Were you successful in generating random numbers without the help of a coin?
• Linear congruential method for generating random numbers. See http://apps.nrbook.com/c/index.html, Chapter 7.1 for details.
• Many standard systems use: multiplier = 7**5 = 16807, modulus = 2**31-1, increment = 0 (the Park–Miller "minimal standard" generator).
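A linear congruential generator iterates ${\displaystyle x_{n+1}=(ax_{n}+c)\,{\bmod {\,}}m}$ and returns ${\displaystyle x_{n}/m}$ as a pseudo-random number in ${\displaystyle (0,1)}$. Here is a minimal sketch using the parameters quoted above (multiplier ${\displaystyle a=7^{5}}$, modulus ${\displaystyle m=2^{31}-1}$, increment ${\displaystyle c=0}$); it is for illustration only, not a production-quality generator:

```python
A = 7**5        # multiplier, 16807
M = 2**31 - 1   # modulus, a Mersenne prime

def lcg(seed, n):
    """Return n pseudo-random numbers in (0, 1) from x -> (A * x) mod M.

    The seed must be in 1..M-1; with c = 0, a seed of 0 would get stuck at 0.
    """
    x = seed
    out = []
    for _ in range(n):
        x = (A * x) % M
        out.append(x / M)
    return out

print(lcg(seed=1, n=5))
```

Because the state is a single integer modulo ${\displaystyle m}$, the sequence is periodic with period at most ${\displaystyle m-1\approx 2\times 10^{9}}$, which is one reason such generators are now considered too weak for serious Monte Carlo work.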