3. Expected value

One of the most important concepts in probability theory is that of the 

expectation of a random variable. If X is a discrete random variable having

a probability mass function p(x), then the expectation,

or the expected value, of X, denoted by E[X], is defined by

E[X] = Σ_{x: p(x)>0} x·p(x)

In words, the expected value of X is a weighted average of the possible values that X

can take on, each value being weighted by the probability that X assumes it.

For instance, on the one hand, if the probability mass function of X is given by

p(0) = 1/2 = p(1)

then

E[X] = 0(1/2) + 1(1/2) = 1/2

is just the ordinary average of the two possible values, 0 and 1, that X can assume. On the other hand, if

p(0) = 1/3,  p(1) = 2/3

then

E[X] = 0(1/3) + 1(2/3) = 2/3

is a weighted average of the two possible values 0 and 1, where the value 1 is given twice as much weight as the value 0, since p(1) = 2p(0).
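The weighted-average definition can be made concrete with a small Python sketch (the function name `expected_value` and the dict-based pmf representation are illustrative choices, not part of the text):

```python
def expected_value(pmf):
    """Return E[X] for a discrete random variable.

    pmf maps each possible value x to its probability p(x);
    E[X] is the sum of x * p(x) over the support.
    """
    return sum(x * p for x, p in pmf.items())

# p(0) = 1/2 = p(1): E[X] is the ordinary average of 0 and 1.
print(expected_value({0: 0.5, 1: 0.5}))    # 0.5

# p(0) = 1/3, p(1) = 2/3: value 1 gets twice the weight of value 0.
print(expected_value({0: 1/3, 1: 2/3}))    # ≈ 2/3
```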

Now, consider a random variable X that must take on one of the values x₁, x₂, …, xₙ with respective probabilities p(x₁), p(x₂), …, p(xₙ); then

E[X] = Σ_{i=1}^{n} xᵢ·p(xᵢ)

Find E[X], where X is the outcome when we roll a fair die.

Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6, we obtain

E[X] = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 7/2
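The die computation above can be checked with a short Python sketch (the variable names are illustrative, not from the text):

```python
# Fair die: outcomes 1..6, each with probability 1/6.
die_pmf = {face: 1/6 for face in range(1, 7)}

# E[X] = sum of x * p(x) over the support.
e_x = sum(x * p for x, p in die_pmf.items())
print(e_x)   # ≈ 7/2 = 3.5
```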

A school class of 120 students is driven in 3 buses to a symphonic performance.

There are 36 students in one of the buses, 40 in another, and 44 in the third bus.

When the buses arrive, one of the 120 students is randomly chosen. 

Let X denote the number of students on the bus of that randomly chosen student, and find E[X].

Since the randomly chosen student is equally likely to be any of the 120 students, it follows that

P(X = 36) = 36/120,  P(X = 40) = 40/120,  P(X = 44) = 44/120

Hence,

E[X] = 36(36/120) + 40(40/120) + 44(44/120) = 1208/30 ≈ 40.27

Note that E[X] exceeds 40, the plain average of the three bus sizes, because a randomly chosen student is more likely to have arrived on a more crowded bus.
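As a quick numerical sketch of the bus example (again with illustrative variable names), the size-biased expectation can be computed directly:

```python
# Bus sizes from the example; a student is chosen uniformly from all 120,
# and X is the number of students on that student's bus.
sizes = [36, 40, 44]
total = sum(sizes)                        # 120 students in all

# P(X = n) = n / total for each bus size n, so:
e_x = sum(n * (n / total) for n in sizes)
avg_bus = total / len(sizes)              # plain average bus size, 40
print(e_x)   # ≈ 40.27, which exceeds avg_bus
```

The gap between e_x and avg_bus is exactly the size-biasing effect noted above: larger buses carry more of the randomly chosen students.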

Expectation of a function of a random variable

Suppose that we are given a discrete random variable along with its probability

mass function and that we want to compute the expected value of some function of X, say, g(X).

How can we accomplish this? One way is as follows: Since g(X) is itself a discrete random variable,

it has a probability mass function, which can be determined from the probability mass function of X.

Once we have determined the probability mass function of g(X),

we can compute E[g(X)] by using the definition of expected value.

Let X denote a random variable that takes on any of the values −1, 0,

and 1 with respective probabilities

P(X = −1) = 0.2,  P(X = 0) = 0.5,  P(X = 1) = 0.3

Compute E[X²].

Let Y = X². Then the probability mass function of Y is given by

P(Y = 1) = P(X = −1) + P(X = 1) = 0.5

P(Y = 0) = P(X = 0) = 0.5

Hence,

E[X²] = E[Y] = 1(0.5) + 0(0.5) = 0.5

Note that 0.5 = E[X²] ≠ (E[X])² = 0.01, since E[X] = (−1)(0.2) + 0(0.5) + 1(0.3) = 0.1.
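The two-step computation above can be sketched in Python, first by building the probability mass function of Y = X² and then by summing g(x)·p(x) directly (the dict-based representation is an illustrative choice):

```python
# pmf of X from the example: P(X=-1)=0.2, P(X=0)=0.5, P(X=1)=0.3
pmf_x = {-1: 0.2, 0: 0.5, 1: 0.3}

# First way: determine the pmf of Y = X**2, then take E[Y].
pmf_y = {}
for x, p in pmf_x.items():
    pmf_y[x ** 2] = pmf_y.get(x ** 2, 0.0) + p
e_via_y = sum(y * p for y, p in pmf_y.items())

# Second way: sum g(x) * p(x) directly over the pmf of X, with g(x) = x**2.
e_direct = sum((x ** 2) * p for x, p in pmf_x.items())

print(e_via_y, e_direct)   # both ≈ 0.5; note (E[X])**2 = 0.01 is different
```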

Proposition. If X is a discrete random variable that takes on one of the values xᵢ, i ≥ 1, with respective probabilities p(xᵢ), then, for any real-valued function g,

E[g(X)] = Σ_{i} g(xᵢ)·p(xᵢ)

Before proving this proposition, let us check that it is in accord with the results of the last Example. Applying it to that Example yields

E[X²] = (−1)²(0.2) + 0²(0.5) + 1²(0.3) = 0.5

which is in agreement with the result given in the last Example.

The proof of the last Proposition proceeds, as in the preceding verification, by grouping together all the terms in Σ_{i} g(xᵢ)·p(xᵢ) having the same value of g(xᵢ). Specifically, suppose that yⱼ, j ≥ 1, represent the different values of g(xᵢ), i ≥ 1. Then, grouping all the g(xᵢ) having the same value gives

Σ_{i} g(xᵢ)·p(xᵢ) = Σ_{j} Σ_{i: g(xᵢ)=yⱼ} g(xᵢ)·p(xᵢ)
                 = Σ_{j} yⱼ Σ_{i: g(xᵢ)=yⱼ} p(xᵢ)
                 = Σ_{j} yⱼ·P(g(X) = yⱼ)
                 = E[g(X)]

Corollary. If a and b are constants, then

E[aX + b] = aE[X] + b

Proof: By the Proposition with g(x) = ax + b,

E[aX + b] = Σ_{x} (ax + b)·p(x) = a Σ_{x} x·p(x) + b Σ_{x} p(x) = aE[X] + b

 

The expected value of a random variable X, E[X], is also referred to as the mean or

the first moment of X. The quantity E[Xⁿ], n ≥ 1, is called the nth moment of X. Note that

E[Xⁿ] = Σ_{x: p(x)>0} xⁿ·p(x)
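Since the nth moment is just the Proposition applied with g(x) = xⁿ, it can be sketched as a one-line Python helper (the name `nth_moment` is an illustrative choice):

```python
def nth_moment(pmf, n):
    """Return E[X**n]: the sum of x**n * p(x) over the support of X."""
    return sum((x ** n) * p for x, p in pmf.items())

die = {face: 1/6 for face in range(1, 7)}   # fair die pmf
print(nth_moment(die, 1))   # first moment (the mean), ≈ 7/2
print(nth_moment(die, 2))   # second moment, ≈ 91/6
```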