4. Variance

Given a random variable X along with its distribution function F, it would be extremely useful if we were able to summarize the essential properties of F by certain suitably defined measures. One such measure would be E[X], the expected value of X. However, although E[X] yields the weighted average of the possible values of X, it does not tell us anything about the variation, or spread, of these values.

For instance, although the random variables W, Y, and Z, having probability mass functions determined by

W = 0          with probability 1

Y = −1         with probability 1/2
Y = +1         with probability 1/2

Z = −100       with probability 1/2
Z = +100       with probability 1/2

all have the same expectation, namely 0, there is a much greater spread in the possible values of Y than in those of W (which is a constant) and in the possible values of Z than in those of Y. Because we expect X to take on values around its mean E[X],

it would appear that a reasonable way of measuring the possible variation of X would be to look at how far apart X would be from its mean, on the average. One possible way to measure this variation would be to consider the quantity E[|X − μ|], where μ = E[X].

However, it turns out to be mathematically inconvenient to deal with this quantity,

so a more tractable quantity is usually considered—namely, the expectation of the square of the difference between X and its mean.

We thus have the following definition.

If X is a random variable with mean μ, then the variance of X, denoted by Var(X), is defined by

Var(X) = E[(X − μ)²]

An alternative formula for Var(X) is derived as follows:

Var(X) = E[(X − μ)²]
       = Σ_x (x − μ)² p(x)
       = Σ_x (x² − 2μx + μ²) p(x)
       = Σ_x x² p(x) − 2μ Σ_x x p(x) + μ² Σ_x p(x)
       = E[X²] − 2μ² + μ²
       = E[X²] − μ²

That is,

Var(X) = E[X²] − (E[X])²

In words, the variance of X is equal to the expected value of X² minus the square of its expected value.
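As a quick numerical check, the two expressions for the variance can be compared on a small probability mass function (a minimal sketch; the values and probabilities below are arbitrary illustrations, not taken from the text):

```python
# Compare the defining formula Var(X) = E[(X - mu)^2]
# with the alternative formula Var(X) = E[X^2] - (E[X])^2
# on an arbitrary finite pmf.
values = [-1, 0, 2, 5]
probs = [0.1, 0.4, 0.3, 0.2]

mu = sum(x * p for x, p in zip(values, probs))                   # E[X]
var_def = sum((x - mu) ** 2 * p for x, p in zip(values, probs))  # E[(X - mu)^2]
ex2 = sum(x * x * p for x, p in zip(values, probs))              # E[X^2]
var_alt = ex2 - mu ** 2                                          # E[X^2] - (E[X])^2

# Both routes give the same variance.
assert abs(var_def - var_alt) < 1e-12
print(mu, var_def, var_alt)
```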

In practice, this formula frequently offers the easiest way to compute Var(X).

Calculate Var(X) if X represents the outcome when a fair die is rolled.

It was shown in Example 4 that E[X] = 7/2. Also,

E[X²] = 1²(1/6) + 2²(1/6) + 3²(1/6) + 4²(1/6) + 5²(1/6) + 6²(1/6) = 91/6

Hence,

Var(X) = 91/6 − (7/2)² = 35/12
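The die computation can be reproduced in exact rational arithmetic (a sketch using Python's standard fractions module):

```python
from fractions import Fraction

# A fair die: outcomes 1..6, each with probability 1/6.
p = Fraction(1, 6)
outcomes = range(1, 7)

mean = sum(x * p for x in outcomes)      # E[X] = 7/2
ex2 = sum(x * x * p for x in outcomes)   # E[X^2] = 91/6
var = ex2 - mean ** 2                    # Var(X) = 91/6 - (7/2)^2 = 35/12

print(mean, ex2, var)  # 7/2 91/6 35/12
```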

A useful identity is that, for any constants a and b,

Var(aX + b) = a²Var(X)

To prove this equality, let μ = E[X] and note from the Corollary that E[aX + b] = aμ + b. Therefore,

Var(aX + b) = E[(aX + b − aμ − b)²]
            = E[a²(X − μ)²]
            = a²E[(X − μ)²]
            = a²Var(X)

(a) Analogous to the mean being the center of gravity of a distribution of mass, the variance represents, in the terminology of mechanics, the moment of inertia.

(b) The square root of Var(X) is called the standard deviation of X, and we denote it by SD(X). That is,

SD(X) = √Var(X)
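For the fair die considered earlier, for example, SD(X) = √(35/12) ≈ 1.71 (a minimal sketch):

```python
import math

# Standard deviation of a fair die roll: SD(X) = sqrt(Var(X)) = sqrt(35/12).
var = 35 / 12
sd = math.sqrt(var)
print(sd)
```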