Probability Fact
@ProbFact
Daily tweets on probability from @JohnDCook.
Putting probability to work johndcook.com/blog/applied-p…

If two independent random variables have log-concave PDFs, their sum also has a log-concave PDF.
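A numerical sketch of this fact (a check, not a proof), using two arbitrarily chosen log-concave densities, standard normal and standard logistic: convolve them on a grid and verify that the log of the resulting density is concave.

```python
import math

def normal_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def logistic_pdf(x):
    e = math.exp(-abs(x))            # symmetric form, numerically stable
    return e / (1 + e) ** 2

h = 0.05
xs = [i * h for i in range(-240, 241)]           # x grid on [-12, 12]

def conv(z):
    # density of the sum: integral of normal_pdf(x) * logistic_pdf(z - x) dx
    return h * sum(normal_pdf(x) * logistic_pdf(z - x) for x in xs)

zs = [i * 0.25 for i in range(-16, 17)]          # check log-concavity on [-4, 4]
logs = [math.log(conv(z)) for z in zs]
second_diffs = [logs[i - 1] - 2 * logs[i] + logs[i + 1]
                for i in range(1, len(logs) - 1)]
print(all(d < 0 for d in second_diffs))  # True: negative second differences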
There are a lot more people who know how to move data around than who know what to do with it.
In (ε, δ)-differential privacy, exp(ε) is the probability multiplier and δ is an additive term.
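A sketch of the pure case (δ = 0): the Laplace mechanism with scale b = sensitivity/ε keeps the ratio of output densities for neighboring datasets below exp(ε), which yields P[M(D) ∈ S] ≤ exp(ε)·P[M(D′) ∈ S]. The values of ε and the sensitivity below are arbitrary choices for illustration.

```python
import math

eps, sensitivity = 0.5, 1.0
b = sensitivity / eps                 # Laplace mechanism noise scale

def laplace_pdf(x, mu):
    return math.exp(-abs(x - mu) / b) / (2 * b)

# Neighboring true answers 0 and 1 differ by at most the sensitivity.
ratios = [laplace_pdf(i / 10, 0.0) / laplace_pdf(i / 10, 1.0)
          for i in range(-50, 51)]
print(max(ratios) <= math.exp(eps) * (1 + 1e-12))  # True: ratio bounded by e^ε
```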
If X has a beta distribution with mean μ, then Var(X) < μ(1 - μ).
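A quick check of the bound: for X ~ Beta(a, b), the mean is μ = a/(a+b) and Var(X) = ab/((a+b)²(a+b+1)) = μ(1-μ)/(a+b+1), so the variance is strictly less than μ(1-μ).

```python
# Verify Var(X) < mu*(1 - mu) over a grid of beta parameters.
ok = True
for a in [0.1, 0.5, 1, 2, 10]:
    for b in [0.1, 0.5, 1, 2, 10]:
        mu = a / (a + b)
        var = a * b / ((a + b) ** 2 * (a + b + 1))
        ok = ok and var < mu * (1 - mu)
print(ok)  # True
```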
Jensen's inequality: If f is convex and X is a random variable, f(E(X)) ≤ E(f(X)). en.wikipedia.org/wiki/Jensen%27…
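A Monte Carlo illustration of the inequality, with the arbitrary choices f(x) = exp(x) (convex) and X uniform on [0, 1]:

```python
import math
import random

random.seed(1)
xs = [random.random() for _ in range(100_000)]   # samples of X ~ U(0, 1)
f_of_mean = math.exp(sum(xs) / len(xs))          # f(E X) ≈ e^0.5
mean_of_f = sum(math.exp(x) for x in xs) / len(xs)  # E f(X) ≈ e - 1
print(f_of_mean <= mean_of_f)  # True: f(E X) <= E f(X)
```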
The mean of a Poisson random variable equals its variance. This can be a problem in applications.
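A simulation sketch: draw Poisson(λ) samples with Knuth's multiplication algorithm and compare the sample mean to the sample variance; both estimate λ. (The problem in applications is that real count data often has variance well above its mean, which a Poisson model cannot accommodate.)

```python
import math
import random
import statistics

random.seed(42)

def poisson(lam):
    # Knuth's algorithm: count uniform draws until their product < e^{-lam}
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < L:
            return k
        k += 1

lam = 4.0
xs = [poisson(lam) for _ in range(100_000)]
m = statistics.fmean(xs)
v = statistics.variance(xs)
print(round(m, 2), round(v, 2))  # both close to lam = 4.0
```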
More like a Laplace distribution, commonly used in differential privacy x.com/graphcrimes/st…
What a normal church
Many things are approximately normal in the middle but not in the tails.
'Probability has reference partly to our ignorance, partly to our knowledge.' -- Laplace
Connection between the negative binomial probability distribution and Pascal's triangle johndcook.com/blog/2024/08/2…
'Everybody believes in the [normal approximation], the experimenters because they think it is a mathematical theorem, the mathematicians because they believe it is an experimental fact.' -- G. Lippmann
The Wishart distribution is a generalization of the gamma distribution to positive definite matrices.
Rényi entropy generalizes the more familiar Shannon entropy and has several useful special cases. johndcook.com/blog/2018/11/2…
The Hermite polynomials are orthogonal on (-∞, ∞) with respect to a weight given by a normal distribution pdf.
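A numeric check, using the probabilists' convention: He₀ = 1, He₁ = x, He_{n+1} = x·He_n - n·He_{n-1}, with ∫ He_m(x) He_n(x) φ(x) dx = n! δ_{mn}, where φ is the standard normal pdf.

```python
import math

def he(n, x):
    # probabilists' Hermite polynomial He_n(x) by three-term recurrence
    p_prev, p = 1.0, x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, x * p - k * p_prev
    return p

def inner(m, n, h=0.01, lim=12.0):
    # rectangle-rule integral of He_m * He_n against the normal pdf
    phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    steps = int(2 * lim / h)
    return h * sum(he(m, x) * he(n, x) * phi(x)
                   for x in (-lim + i * h for i in range(steps + 1)))

print(round(inner(2, 3), 6))  # ≈ 0 (orthogonal)
print(round(inner(3, 3), 6))  # ≈ 6 = 3!
```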
Differential entropy and privacy johndcook.com/blog/2023/11/0…
The Laguerre polynomials are orthogonal on [0, ∞) with respect to a weight given by a gamma distribution pdf.
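A numeric check of the simplest case: the ordinary Laguerre polynomials L_n are orthonormal on [0, ∞) with respect to e^{-x}, the pdf of the Gamma(1, 1) (i.e. exponential) distribution.

```python
import math

def laguerre(n, x):
    # L_n(x) via (k+1) L_{k+1} = (2k+1-x) L_k - k L_{k-1}
    p_prev, p = 1.0, 1.0 - x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1 - x) * p - k * p_prev) / (k + 1)
    return p

def inner(m, n, h=0.002, lim=60.0):
    # midpoint-rule integral of L_m * L_n against e^{-x} on [0, lim]
    return h * sum(laguerre(m, x) * laguerre(n, x) * math.exp(-x)
                   for x in ((i + 0.5) * h for i in range(int(lim / h))))

print(round(inner(2, 4), 3))  # ≈ 0 (orthogonal)
print(round(inner(3, 3), 3))  # ≈ 1 (orthonormal)
```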
Approximation and bounds for the Beta function johndcook.com/blog/2023/07/0…
Rapidly mixing random walks on graphs johndcook.com/blog/2016/12/2…