The expectation, or expected value, of a random variable is sometimes thought of as the value that the random variable takes on average.
One consequence of this thinking is that we might assume that the expectation of a function of a random variable equals the function of the expectation of that random variable. In other words, if a random variable takes a certain value (its expectation) on average, we may think that a function of the random variable takes, on average, the value of the function at E[X]. As an example, we may expect E[X^2] to be equal to (E[X])^2.
We know that, in general, E[f(X)] ≠ f(E[X]).
For example, consider a discrete random variable X that takes the values -1 and 1, each with probability 1/2. Here, E[X^2] = (1/2)(-1)^2 + (1/2)(1)^2 = 1, whereas (E[X])^2 = 0^2 = 0.
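The gap between E[X^2] and (E[X])^2 can be checked directly from a probability mass function. Below is a minimal sketch, assuming the two-valued discrete variable described above (X = -1 or 1, each with probability 1/2); the `expectation` helper is a hypothetical name, not from the original text.

```python
from fractions import Fraction

# Assumed example: X takes -1 and 1, each with probability 1/2.
pmf = {-1: Fraction(1, 2), 1: Fraction(1, 2)}

def expectation(f, pmf):
    """E[f(X)] = sum over x of f(x) * P(X = x)."""
    return sum(f(x) * p for x, p in pmf.items())

e_x = expectation(lambda x: x, pmf)        # E[X] = 0
e_x2 = expectation(lambda x: x * x, pmf)   # E[X^2] = 1

print(e_x2, e_x ** 2)  # E[X^2] = 1, but (E[X])^2 = 0
```

Using exact `Fraction` arithmetic avoids floating-point noise, so the inequality E[X^2] ≠ (E[X])^2 is exact rather than a rounding artifact.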
The problem is that this line of thinking hides an important aspect of expectation. When we speak of the expected value of a random variable, we should not think that the random variable takes this value the majority of the time, or that a function of the random variable must take the value of the function at the expectation the majority of the time. In fact, a random variable may never actually take its expected value, as in the discrete random variable example given above.
The expected value is merely the value around which the values taken by the random variable are distributed. This is not, of course, meant to discount the importance of the expected value. Among other things, it plays a central role in results such as the law of large numbers.
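The role of the expectation as the value the sample average settles around can be illustrated with a short simulation. This is a sketch under the assumption that X is -1 or 1 with probability 1/2 each (so E[X] = 0, a value X itself never takes); the `sample_mean` helper is introduced here for illustration.

```python
import random

random.seed(0)

def sample_mean(n):
    """Average of n independent draws of X, where X is -1 or 1
    with probability 1/2 each (so E[X] = 0)."""
    return sum(random.choice([-1, 1]) for _ in range(n)) / n

# By the law of large numbers, the sample mean approaches E[X] = 0
# as n grows, even though no single draw is ever 0.
for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```

Every individual draw is -1 or 1, yet the running average concentrates near 0: the expectation describes where the values are centered, not a value that must be observed.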