
Random Variables

by David Spade, PhD

    Learning Materials (2)
    • PDF: Slides Statistics pt1 Random Variables.pdf
    • PDF: Lecture Overview
    Transcript

    00:01 Welcome to Lecture 13, where we’re going to discuss random variables.

    00:05 Let’s introduce ourselves to probability models first.

    00:09 What we want to do is relate probability models to random variables.

    00:13 In order to be able to make predictions about random phenomena, we need to have an idea of what to expect.

    00:20 In order to do that, we need a probability model.

    00:23 Probability models are centered around what we know as random variables.

    00:27 Random variables are numeric values based on the outcome of a random event.

    00:33 For example, the amount of money that you win on a lottery ticket is a random variable because it is a number based on the outcome of a random lottery drawing.

    00:42 We have several types of random variables. In fact, there are two of them.

    00:46 Discrete random variables are random variables for which we can list all the possible outcomes.

    00:51 For example, if our variable is a number that comes up on the roll of a six-sided die, then the possible outcomes are 1, 2, 3, 4, 5, and 6.

    01:01 The number that comes up is a random variable and it is discrete because we can list all of the outcomes.

    01:07 Continuous random variables do not have that property.

    01:11 Continuous random variables are random variables for which the possible outcomes cannot all be listed.

    01:16 For example, consider the amount of time one might wait at the bus stop.

    01:20 It could be 10 minutes, 5 minutes, or something like 4.36694 minutes.

    01:28 We can’t list all the possible outcomes here for this random variable, so this is a continuous random variable.

    01:34 How do we describe random phenomena? We build a probability model and what that means is that for a random variable, we simply collect all the possible values of the random variable and the probability with which each value occurs.

    01:49 For example, suppose the die from earlier comes up one with probability 1/12, 2 with probability 1/6, 3 with probability 1/3, 4 with probability 1/6, 5 with probability 1/12, and 6 with probability 1/6.

    02:05 This is a probability model for the random variable that corresponds to the number that comes up on the die.
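
    As a quick illustration (the lecture itself uses no code), here is a minimal Python sketch of this probability model: the dictionary simply pairs each possible value of the random variable with its probability, and the assertion checks that the probabilities sum to 1.

        from fractions import Fraction

        # Probability model for the weighted die described above: each value of
        # the random variable paired with the probability that it occurs.
        die_model = {
            1: Fraction(1, 12),
            2: Fraction(1, 6),
            3: Fraction(1, 3),
            4: Fraction(1, 6),
            5: Fraction(1, 12),
            6: Fraction(1, 6),
        }

        # A legitimate probability model must have probabilities that sum to exactly 1.
        assert sum(die_model.values()) == 1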

    02:11 What do we expect? Let’s look at expected values which are also known as means or averages.

    02:18 Once a probability model for a random variable has been developed, it’s possible to evaluate certain characteristics of that random variable.

    02:26 A common question is what do we expect to happen in the long run? In probability models, we call this the expected value of the random variable and we denote it E of X.

    02:37 The expected value can be viewed as a measure of center for the distribution of that random variable.

    02:44 How do we find them? For discrete random variables, the expected value is found by multiplying each value of the variable by its probability and then adding up all of these products.

    02:56 Mathematically, the expected value of a random variable X is equal to the sum, over all possible values of X, of each value times the probability that X takes that value.

    03:06 Let’s look at the example from the die roll.

    03:11 Recall the probability model from the die roll and let X represent the random variable corresponding to the value that comes up on a roll of that die.

    03:20 We have the probability that X = 1 is 1/12 and the probability that X = 5 is 1/12.

    03:27 The probability that X is 2, 4, or 6 is 1/6 in each case. The probability that X is 3 is 1/3.

    03:35 Let’s use the formula from the previous slide to find the expected value of X.

    03:39 We have 1 times 1/12 plus 2 times 1/6 plus 3 times 1/3 plus 4 times 1/6 plus 5 times 1/12 plus 6 times 1/6, so we get an expected value of 3.5. That’s weird.

    03:55 We wouldn’t expect to roll a 3.5. We can’t get that.

    04:00 This is just the average value of a roll over the long run.
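
    As a sketch of that calculation (an illustration, not part of the lecture), the helper below applies the formula E(X) = sum of x times P(X = x) to the weighted-die model from above.

        from fractions import Fraction

        def expected_value(model):
            """E(X) = sum over all values x of x * P(X = x)."""
            return sum(x * p for x, p in model.items())

        # The weighted-die model from the earlier example.
        die_model = {1: Fraction(1, 12), 2: Fraction(1, 6), 3: Fraction(1, 3),
                     4: Fraction(1, 6), 5: Fraction(1, 12), 6: Fraction(1, 6)}

        print(expected_value(die_model))  # 7/2, i.e. 3.5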

    04:04 For continuous random variables, finding expected values is a lot more difficult.

    04:09 In this course, we’re gonna restrict our attention to discrete random variables.

    04:13 We also wanna evaluate the spread of the distribution of the random variable.

    04:18 Remember how we measure spread: with the variance and the standard deviation. With data, in order to evaluate the spread, we found the standard deviation.

    04:28 We can do the same thing with the discrete random variable.

    04:30 In order to do this, we need to first find the variance just like we did with data.

    04:35 The notation Var(X) is used to denote the variance of the random variable X. We can find the variance using this formula: we take each value of the random variable minus the expected value.

    04:50 We square those differences and multiply those squared differences by the corresponding probabilities.

    04:54 Then we add all those up.

    04:57 Let’s find the variance of the number that comes up on the roll of the die from the previous example.

    05:04 We’ll have the variance of X as (1 minus 3.5) squared times the probability of a 1, which is 1/12, and we’ll continue through, adding up those squared differences times their probabilities to get a variance of 2.25.

    05:18 The spread of a random variable is also characterized by the standard deviation.

    05:23 This is just the square root of the variance if you recall from what we did with data.

    05:27 Here, the standard deviation of X is the square root of the variance, which is 1.5.
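
    The same sketch extends to the variance and standard deviation; again this is illustrative code rather than anything from the lecture, and it reproduces the numbers above.

        from fractions import Fraction
        from math import sqrt

        def expected_value(model):
            return sum(x * p for x, p in model.items())

        def variance(model):
            """Var(X) = sum over all values x of (x - E(X))^2 * P(X = x)."""
            mean = expected_value(model)
            return sum((x - mean) ** 2 * p for x, p in model.items())

        die_model = {1: Fraction(1, 12), 2: Fraction(1, 6), 3: Fraction(1, 3),
                     4: Fraction(1, 6), 5: Fraction(1, 12), 6: Fraction(1, 6)}

        print(variance(die_model))        # 9/4, i.e. 2.25
        print(sqrt(variance(die_model)))  # 1.5, the standard deviation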

    05:33 Expected values have some nice properties especially for linear functions of random variables.

    05:39 Let’s let C be a constant and let’s let X be a random variable.

    05:44 Then, the expected value of X plus C is equal to the expected value of X plus that constant, C.

    05:52 The expected value of C times X is just C times the expected value of X.

    05:58 Now, let’s let X and Y be two random variables.

    06:02 Then the expected value of X plus Y is equal to the sum of the expectation of each of the individual random variables.

    06:09 In general, if we have 3 constants, A, B, and C, and X and Y are 2 random variables, then we have the two following properties.

    06:17 First the expected value of A times X plus C is equal to A times the expected value of X plus the constant, C.

    06:25 We have the expected value of A times X plus B times Y plus that little constant, C, is A times the expected value of X plus B times the expected value of Y plus that little constant, C.
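
    These properties are easy to check numerically. The short simulation below is an illustration under assumed inputs (two independent fair dice, so E(X) = E(Y) = 3.5): it compares the simulated average of aX + bY + c with the value the property predicts, a times E(X) plus b times E(Y) plus c.

        import random

        random.seed(0)
        a, b, c = 3, -2, 1
        n = 100_000

        # Simulate a*X + b*Y + c for two independent fair dice and compare the
        # sample average with the prediction a*E(X) + b*E(Y) + c.
        samples = [a * random.randint(1, 6) + b * random.randint(1, 6) + c
                   for _ in range(n)]
        simulated = sum(samples) / n
        predicted = a * 3.5 + b * 3.5 + c   # E(X) = E(Y) = 3.5 for a fair die
        print(round(simulated, 2), predicted)  # both values are close to 4.5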

    06:38 Let’s do this again for two dice.

    06:41 Let’s let X be the value that comes up on the die from the previous example, and we’ll let Y be the value that comes up on a different die.

    06:49 Let’s assume that the expected value of the roll on this die is 4.

    06:53 The game is that, if your score is negative, you lose a dollar. If it’s 0 or greater, you win a dollar.

    07:00 The score is calculated by taking three times the roll on the first die minus 2 times the roll on the second die plus 1.

    07:08 What’s the expected value of the score if you roll both dice? Well, we know that the expected value of X is 3.5 and the expected value of Y is 4.

    07:19 The expected value of 3X minus 2Y plus 1 is 3 times the expected value of X minus 2 times the expected value of Y plus 1, so we get 3.5.

    07:30 The expected value of your score is 3.5.
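
    A one-line arithmetic check of that score (a sketch using the expected values stated above, E(X) = 3.5 and E(Y) = 4):

        # Expected value of the score 3X - 2Y + 1.
        e_x, e_y = 3.5, 4
        print(3 * e_x - 2 * e_y + 1)  # 3.5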

    07:34 Now let’s look at some properties of variances especially for linear functions of random variables.

    07:42 Again, let’s let A, B, and C be constants and let’s let X and Y be 2 independent random variables.

    07:47 Independence is key here. Then we have the following properties for variances.

    07:52 The variance of X plus a constant is just the variance of X.

    07:57 Remember what we did with data where if we just add a constant to everything, it doesn’t spread anything out more or less.

    08:04 It just picks up the distribution and moves it. Same thing with random variables.

    08:09 We’re not spreading anything out more. We’re just picking up the distribution and moving it.

    08:13 The variance of A times X is equal to A squared times the variance of X.

    08:19 If X and Y are independent, then the variance of X plus Y is equal to the variance of X minus Y which is equal to the sum of the variances of each of those two variables.

    08:29 Finally, the variance of A times X plus B times Y plus C is equal to A squared times the variance of X plus B squared times the variance of Y.
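
    As with expected values, these variance properties can be checked by simulation. The sketch below (an illustration with two independent fair dice, not part of the lecture) compares the variance of aX + bY + c with a squared times Var(X) plus b squared times Var(Y).

        import random
        from statistics import pvariance

        random.seed(1)
        a, b, c = 3, -2, 1
        n = 200_000

        # Two independent simulated fair dice.
        xs = [random.randint(1, 6) for _ in range(n)]
        ys = [random.randint(1, 6) for _ in range(n)]

        lhs = pvariance([a * x + b * y + c for x, y in zip(xs, ys)])
        rhs = a ** 2 * pvariance(xs) + b ** 2 * pvariance(ys)
        print(round(lhs, 2), round(rhs, 2))  # approximately equal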

    08:39 Let’s find the variance for the Two Dice Game.

    08:43 We assume that the variance of the value that comes up on the second die is 1.98.

    08:48 What’s the variance of our score, 3X minus 2Y plus 1? What is the standard deviation? Here’s the solution: the variance of 3X minus 2Y plus 1 is equal to 3 squared, or 9, times the variance of X plus 2 squared, or 4, times the variance of Y, which gives us 28.17.

    09:10 The standard deviation is just the square root of that, or about 5.31.
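
    The same arithmetic in a short sketch, using Var(X) = 2.25 and Var(Y) = 1.98 from above:

        from math import sqrt

        # Variance and standard deviation of the score 3X - 2Y + 1.
        var_x, var_y = 2.25, 1.98
        var_score = 3 ** 2 * var_x + (-2) ** 2 * var_y
        print(round(var_score, 2))        # 28.17
        print(round(sqrt(var_score), 2))  # about 5.31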

    09:15 What does this show us? Well, it shows that adding more random variables into the situation increases the amount of variation in the values of these functions of the random variables.

    09:26 That’s not unexpected.

    09:28 Each of these random variables has an amount of variability attached to it.

    09:32 We have to account for both of those when we deal with both of those random variables.

    09:37 Let’s look just a little bit at continuous random variables; just a few notes on this.

    09:43 Often, random phenomena are modeled with continuous random variables.

    09:48 We have seen some examples of this already.

    09:51 For instance, with the normal model.

    09:53 This is a continuous distribution that’s used to model continuous random variables.

    09:58 We won't deal with calculating expected values and variances for continuous random variables simply because this involves calculus and, for this course, we don’t assume that.

    10:07 We can do a lot of things wrong when using probability models and random variables.

    10:14 We need to look at the pitfalls that we need to avoid.

    10:16 Models are not always right; in fact, they’re almost never exactly right.

    10:20 Probabilities are based on models, and you should question your probabilities just as you would your data collection: ask whether the model seems reasonable.

    10:29 If the model is wrong, so is everything else.

    10:33 We wanna be careful of variables that are not independent.

    10:37 Expected values always add together. This is not the case for variances.

    10:42 If 2 random variables are not independent, then the variances won’t add.

    10:47 The variables have to be independent in order for the variances to add.

    10:52 Variances of independent random variables add. Standard deviations do not.

    10:58 Variances of independent random variables add even when you’re looking at the difference between the two variables.

    11:04 That’s important. Whether it’s X plus Y or X minus Y, if they’re independent, the variance of that function is the sum of the variances of each of those random variables.
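
    A small numeric illustration of the point about standard deviations (the variances 9 and 16 are hypothetical numbers chosen for the example):

        from math import sqrt

        # Independent X and Y with Var(X) = 9 and Var(Y) = 16.
        var_x, var_y = 9.0, 16.0
        print(sqrt(var_x + var_y))        # 5.0: add the variances, then take the root
        print(sqrt(var_x) + sqrt(var_y))  # 7.0: adding standard deviations overstates it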

    11:13 All right, so what have we done? We described what random variables are.

    11:19 We described probability models.

    11:21 We did a couple of examples with the discrete random variable and found expected values and variances.

    11:26 We looked at the common issues in probability models and random variables.

    11:31 This is the end of Lecture 13 and we’ll see you back here for Lecture 14.


    About the Lecture

    The lecture Random Variables by David Spade, PhD is from the course Statistics Part 1. It contains the following chapters:

    • Random Variables
    • Expected Values
    • Evaluating Spread
    • Properties of Variances

    Included Quiz Questions

    1. A numeric value based on the outcome of a random event is known as a random variable.
    2. A numeric value based on the outcome of a random event is known as a probability model.
    3. A numeric value based on the outcome of a random event is known as an expected value.
    4. A numeric value based on the outcome of a random event is known as a random phenomenon.
    5. A numeric value based on the outcome of a random event is known as a unique variable.

    1. The expected value of X is 0.
    2. The expected value of X is 1.
    3. The expected value of X is -1.
    4. The expected value of X is 3.
    5. The expected value of X is 1/3.

    1. The variance of X is 2.
    2. The variance of X is -1.
    3. The variance of X is 0.
    4. The variance of X is 5.
    5. The variance of X is -2.

    1. The expected total is 0 and the variance is 4.
    2. The expected total is 0 and the variance is 0.
    3. The expected total is 1 and the variance is 4.
    4. The expected total is 0 and the variance is 2.
    5. The expected total is 1 and the variance is 2.

    1. For any two independent random variables X and Y, Var (X+Y) = Var (X) + Var (Y).
    2. For any two random variables X and Y, SD (X+Y) = SD (X) + SD (Y).
    3. For any two independent random variables X and Y, SD (X+Y) = SD (X) + SD (Y).
    4. For any two random variables X and Y, Var (X+Y) = Var (X) + Var (Y).
    5. For any two random variables X and Y, Var (X+Y) = Var (X) * Var (Y).

    Author of lecture Random Variables

     David Spade, PhD



    Customer reviews (1)

    5.0 of 5 stars