Definition (Random Variable)
A random variable is a variable whose possible values are numerical outcomes of a random phenomenon. There are two main types of random variables: discrete and continuous.
Discrete Random Variables: These are random variables that can take on a countable number of values. For example, the number of heads in 10 coin flips is a discrete random variable.
Continuous Random Variables: These are random variables that can take on any value within a given range (uncountably many values). For example, the time it takes for a computer to solve a problem is a continuous random variable.
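As a quick illustration, here is a minimal Python sketch that draws one value of each type. The 10-flip experiment matches the example above; timing a small computation is only a stand-in for "time to solve a problem":

```python
import random
import time

# Discrete: number of heads in 10 coin flips (possible values 0, 1, ..., 10).
heads = sum(random.randint(0, 1) for _ in range(10))
print("heads in 10 flips:", heads)

# Continuous: elapsed wall-clock time of a small computation, in seconds.
# Any value in a range is possible, so we treat it as continuous.
start = time.perf_counter()
sum(i * i for i in range(100_000))
elapsed = time.perf_counter() - start
print("elapsed seconds:", elapsed)
```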
Formally, a random variable is a measurable function that maps outcomes of a random process to real numbers. This mapping allows us to assign probabilities to different outcomes and analyze them statistically using existing mathematical tools.
Let $\Omega$ be the sample space of a random process, and let $X : \Omega \to \mathbb{R}$ be a random variable. The function $X$ assigns a real number $X(\omega)$ to each outcome $\omega$ in $\Omega$. The probability distribution of a random variable describes how the probabilities are distributed over the possible values of the random variable.
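To make the mapping concrete, the following sketch enumerates a small sample space (three fair coin flips, an example chosen here for illustration) and defines $X$ as the number of heads:

```python
from collections import Counter
from itertools import product

# Sample space Omega for three fair coin flips: all H/T triples.
omega = list(product("HT", repeat=3))

# The random variable X maps each outcome to a real number:
# here X(outcome) = number of heads in the triple.
def X(outcome):
    return outcome.count("H")

# Each of the 8 outcomes is equally likely, so the distribution of X
# follows from counting how many outcomes map to each value.
counts = Counter(X(o) for o in omega)
for value in sorted(counts):
    print(f"P(X = {value}) = {counts[value]}/8")
```

Running this prints $P(X=0)=1/8$, $P(X=1)=3/8$, $P(X=2)=3/8$, $P(X=3)=1/8$: the probability distribution of $X$ read off directly from the mapping.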
Probability Functions
For discrete random variables, we use the Probability Mass Function (PMF):

$$p_X(x) = P(X = x)$$
Properties:
- $p_X(x) \ge 0$ for all $x$
- $\sum_x p_X(x) = 1$
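For instance, the number of heads in 10 fair coin flips has a binomial PMF, $p_X(k) = \binom{10}{k} (1/2)^{10}$. The sketch below (standard library only; the fair-coin setup is our assumption) checks both properties numerically:

```python
from math import comb

# PMF of X = number of heads in 10 fair coin flips: Binomial(10, 0.5).
def pmf(k, n=10, p=0.5):
    return comb(n, k) * p**k * (1 - p)**(n - k)

values = range(11)
# Property 1: p_X(k) >= 0 for every value k.
assert all(pmf(k) >= 0 for k in values)
# Property 2: the probabilities sum to 1 (up to floating-point error).
assert abs(sum(pmf(k) for k in values) - 1.0) < 1e-12
print("P(X = 5) =", pmf(5))  # 0.24609375
```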
For continuous random variables, we use the Probability Density Function (PDF), $f_X$, which gives probabilities as areas under its curve:

$$P(a \le X \le b) = \int_a^b f_X(x)\,dx$$
Properties:
- $f_X(x) \ge 0$ for all $x$
- $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
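As a sanity check, the sketch below uses an exponential density $f_X(x) = \lambda e^{-\lambda x}$ for $x \ge 0$ (an example distribution assumed here, not one from the text) and crude trapezoidal integration to verify both properties and to compute a probability as an area under the curve:

```python
import math

# Exponential PDF with rate lam: f_X(x) = lam * exp(-lam * x) for x >= 0.
def pdf(x, lam=1.0):
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Crude trapezoidal integration over [a, b]; [0, 50] stands in for
# (-inf, inf), since the tail beyond 50 is negligible for lam = 1.
def integrate(f, a, b, n=100_000):
    h = (b - a) / n
    return h * (sum(f(a + i * h) for i in range(1, n)) + (f(a) + f(b)) / 2)

print("total area     ~", integrate(pdf, 0.0, 50.0))  # ~ 1.0
print("P(1 <= X <= 2) ~", integrate(pdf, 1.0, 2.0))   # ~ e^-1 - e^-2
```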
Cumulative Distribution Function (CDF)
The CDF is defined for both discrete and continuous random variables:
For discrete: $F_X(x) = P(X \le x) = \sum_{t \le x} p_X(t)$
For continuous: $F_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(t)\,dt$
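Continuing the same example distributions (binomial for the discrete case, exponential for the continuous case, both our assumptions), here is a minimal sketch of each CDF. Note the exponential integral has the closed form $1 - e^{-\lambda x}$, so no numerical integration is needed:

```python
from math import comb, exp

# Discrete CDF: F_X(x) = sum of PMF values p_X(t) for all t <= x
# (binomial PMF for 10 fair coin flips, as above).
def binom_pmf(k, n=10, p=0.5):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binom_cdf(x, n=10):
    return sum(binom_pmf(k, n) for k in range(0, min(int(x), n) + 1))

# Continuous CDF: F_X(x) = integral of the exponential PDF from 0 to x,
# which evaluates to 1 - exp(-lam * x) for x >= 0.
def expo_cdf(x, lam=1.0):
    return 1.0 - exp(-lam * x) if x >= 0 else 0.0

print("P(X <= 4) for 10 flips:", binom_cdf(4))
print("P(X <= 2) for Exp(1)  :", expo_cdf(2.0))
```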
In summary, random variables are functions that map outcomes of random processes (the sample space) to real numbers, allowing us to analyze and quantify the behavior of random phenomena.
A more rigorous definition is possible by introducing measure theory and probability spaces; optionally, see: Random Variable - stackexchange.
For more details on expectation and variance calculations, see Expectation and Variance.