
9.03 Centre and spread of continuous random variables

Lesson

The median

The median value of a continuous probability distribution is the value in the middle of the distribution. That is, the probability of obtaining a value below or above this is $0.5$. Graphically, this is the value of $x$ which divides the area under the graph in half.

Median

For a continuous probability distribution, the median is the value of $x$x such that:

$F(x)=\int_a^x f(t)\ dt=0.5$, where $f(x)$ is the probability density function defined on the domain $[a,b]$.
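For densities where this equation is awkward to solve by hand, the median can also be located numerically. The following is a minimal sketch, not part of the original lesson, assuming SciPy is available; the helper name `median_of_pdf` and the uniform example density are illustrative only.

```python
# Illustrative sketch: find the median of a continuous distribution by solving
# F(x) = 0.5, where F(x) is the area under the PDF from the lower bound a to x.
from scipy.integrate import quad
from scipy.optimize import brentq

def median_of_pdf(pdf, a, b):
    """Return the x in [a, b] at which the area under pdf from a to x is 0.5."""
    F = lambda x: quad(pdf, a, x)[0]           # cumulative area from a to x
    return brentq(lambda x: F(x) - 0.5, a, b)  # root of F(x) - 0.5 on [a, b]

# Check with a uniform density on [0, 4]: the median should be 2.
print(median_of_pdf(lambda x: 0.25, 0, 4))     # 2.0
```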

 

Worked example

Example 1

Find the median of the continuous probability distribution defined as $f(x)=\frac{1}{24}(x+3)$ in the domain $[1,5]$ and $0$ elsewhere.

Think: We want to find $x$ such that $\int_1^x f(t)\ dt=0.5$. We can do this by finding the cumulative distribution function $F(x)$ and then solving $F(x)=0.5$.

Do: Integrating $f(x)=\frac{1}{24}(x+3)$, we can find $F\left(x\right)$ over the domain $[1,5]$:

$F(x)=\int_1^x\frac{1}{24}(t+3)\ dt=\frac{1}{24}\left[\frac{t^2}{2}+3t\right]_1^x=\frac{1}{24}\left(\frac{x^2}{2}+3x-\frac{7}{2}\right)$

Solving for $F(x)=\frac{1}{2}$:

$\frac{1}{24}\left(\frac{x^2}{2}+3x-\frac{7}{2}\right)=\frac{1}{2}$

$\frac{x^2}{2}+3x-\frac{7}{2}=12$ (multiply both sides by $24$)

$\frac{x^2}{2}+3x-\frac{31}{2}=0$ (take $12$ from both sides)

$x^2+6x-31=0$ (multiply both sides by $2$ to simplify)

$\therefore x=-3\pm2\sqrt{10}$ (use technology or the quadratic formula to solve)

 

Since $1\le x\le5$, $x=-3+2\sqrt{10}\approx3.32$.

Hence, the median is approximately $3.32$.
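As a quick check (an illustrative sketch assuming SciPy is available, not part of the original working), we can confirm numerically that the area under $f$ from $1$ up to $-3+2\sqrt{10}$ is $0.5$:

```python
# Numerical check of Example 1: the area under f(x) = (x + 3)/24 from 1 up to
# the claimed median -3 + 2*sqrt(10) should be 0.5.
from math import sqrt
from scipy.integrate import quad

f = lambda x: (x + 3) / 24
m = -3 + 2 * sqrt(10)
area, _ = quad(f, 1, m)
print(m, area)   # ≈ 3.3246 and 0.5
```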

 

Expected value for continuous probability distributions

Recall that for discrete random variables, the expected value formula is $\Sigma xP(X=x)$. In other words, we multiply each outcome by its probability, and then add up all the results. The expected value of a random variable gives the mean outcome of the distribution.

The notation used for the expected value of a random variable $X$ is $\mu=E(X)$.

In the case of a continuous random variable, the expected value is defined analogously in terms of an integral. If the random variable $X$ has probability density function $f$ that is positive over the domain $[a,b]$ or $(a,b)$, we define:

$E(X)=\int_{-\infty}^{\infty} xf(x)\ dx=\int_a^b xf(x)\ dx$
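This definition is also easy to evaluate numerically. Below is a hedged sketch assuming SciPy is available; the helper name `expected_value` is our own, and the demonstration density is the one from Example 1.

```python
# Sketch: E(X) as the integral of x*f(x) over the domain [a, b].
from scipy.integrate import quad

def expected_value(pdf, a, b):
    """Numerically approximate E(X) for a random variable with density pdf on [a, b]."""
    return quad(lambda x: x * pdf(x), a, b)[0]

# Check with the density from Example 1, f(x) = (x + 3)/24 on [1, 5]:
print(expected_value(lambda x: (x + 3) / 24, 1, 5))   # ≈ 3.2222 (= 29/9)
```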

Worked example

Example 2

A random variable $X$ has PDF given by $f(x)=\frac{1}{5}-\frac{x}{50}$ over the interval $\left[0,10\right]$ and $0$ elsewhere. What is the mean or expected value of $X$?

Think: By the definition, $E(X)=\int_0^{10} x\left(\frac{1}{5}-\frac{x}{50}\right)dx$.

Do:

$E(X)=\int_0^{10}\left(\frac{x}{5}-\frac{x^2}{50}\right)dx=\left[\frac{x^2}{10}-\frac{x^3}{150}\right]_0^{10}=10-\frac{20}{3}=\frac{10}{3}$

Hence, the mean or expected value of $X$ is $\frac{10}{3}$.
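The same integral can be checked exactly with a computer algebra system. This is an illustrative sketch, not part of the original working, assuming SymPy is available:

```python
# Symbolic check of Example 2 with SymPy: E(X) = integral of x*(1/5 - x/50) over [0, 10].
from sympy import symbols, integrate, Rational

x = symbols('x')
f = Rational(1, 5) - x / 50
print(integrate(x * f, (x, 0, 10)))   # 10/3
```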

Calculating variance ($\sigma^2$) and standard deviation ($\sigma$) for a continuous probability distribution

Recall that variance is the average of the squared differences of each value from the mean. It gives us a measure of how spread out the values of a distribution are.

Because the expected value, $E(X)$, is the mean of the random variable, we can express the formula for variance in terms of $E(X)$. For example, for discrete probability distributions the variance is:

$Var\left(X\right)=E\left(\left(X-\mu\right)^2\right)=\Sigma\left(x-\mu\right)^2p\left(x\right)$

 

As you might expect, for continuous probability distributions the formula for variance becomes:

$Var\left(X\right)=E\left(\left(X-\mu\right)^2\right)=\int_a^b\left(x-\mu\right)^2f(x)\ dx$

 

For ease of calculation, we often use the following form for both discrete and continuous random variables:

$Var\left(X\right)=E(X^2)-\mu^2$ or $E(X^2)-\left[E(X)\right]^2$

Thus, using calculus we would calculate the variance as follows:

$Var\left(X\right)=\int_a^b x^2f\left(x\right)dx-\mu^2$

Remember also that standard deviation is the square root of the variance: $\sigma=\sqrt{Var\left(X\right)}$
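Putting the pieces together, here is a minimal sketch (assuming SciPy is available; the helper name `variance` is our own) of the computational form $Var(X)=E(X^2)-\mu^2$, demonstrated with the density from Example 1:

```python
# Sketch: Var(X) = E(X^2) - mu^2, evaluated numerically for a density on [a, b].
from math import sqrt
from scipy.integrate import quad

def variance(pdf, a, b):
    mu = quad(lambda x: x * pdf(x), a, b)[0]        # E(X)
    ex2 = quad(lambda x: x**2 * pdf(x), a, b)[0]    # E(X^2)
    return ex2 - mu**2

# Check with the density from Example 1, f(x) = (x + 3)/24 on [1, 5]:
var = variance(lambda x: (x + 3) / 24, 1, 5)
print(var, sqrt(var))   # ≈ 1.2840 (variance) and ≈ 1.1331 (standard deviation)
```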

 

Summary

 

Discrete probability distribution:

Expected value: $E\left(X\right)=\Sigma xp(x)$

Variance: $Var\left(X\right)=\sigma^2=E\left(\left(X-\mu\right)^2\right)=\Sigma\left(x-\mu\right)^2p\left(x\right)=\Sigma x^2p(x)-\mu^2$

Continuous probability distribution:

Expected value: $E(X)=\int_a^b xf(x)\ dx$

Variance: $Var\left(X\right)=\sigma^2=\int_a^b\left(x-\mu\right)^2f(x)\ dx=\int_a^b x^2f(x)\ dx-\mu^2$

Worked example

Example 3

Let's revisit our previous worked example with the probability density function $f(x)=\frac{1}{5}-\frac{x}{50}$ over the interval $\left[0,10\right]$ and $0$ elsewhere, and this time calculate the standard deviation of $X$.

Think: We already have $\mu=\frac{10}{3}$ from our previous example, and for ease of calculation we'll first calculate the variance using $Var(X)=\int_a^b x^2f(x)\ dx-\mu^2$.

Do:

$Var(X)=\int_0^{10}x^2\left(\frac{1}{5}-\frac{x}{50}\right)dx-\left(\frac{10}{3}\right)^2=\left[\frac{x^3}{15}-\frac{x^4}{200}\right]_0^{10}-\frac{100}{9}=\frac{200}{3}-50-\frac{100}{9}=\frac{50}{9}$

Now we find the standard deviation by taking the square root of the variance, obtaining $\sigma=\sqrt{\frac{50}{9}}=\frac{5\sqrt{2}}{3}\approx2.357$.
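A numerical cross-check of this result, as an illustrative sketch assuming SciPy is available (not part of the original working):

```python
# Check Example 3: for f(x) = 1/5 - x/50 on [0, 10], Var(X) should be 50/9
# and the standard deviation 5*sqrt(2)/3.
from math import sqrt
from scipy.integrate import quad

f = lambda x: 1/5 - x/50
mu = quad(lambda x: x * f(x), 0, 10)[0]                  # 10/3
var = quad(lambda x: x**2 * f(x), 0, 10)[0] - mu**2      # E(X^2) - mu^2
print(var, sqrt(var))   # ≈ 5.5556 (= 50/9) and ≈ 2.3570
```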

Practice questions

Question 1

Consider the probability density function $p$ where $p\left(x\right)=\frac{1}{20}$ when $25\le x\le45$ and $p\left(x\right)=0$ otherwise.

  1. Use integration to determine the expected value of $p\left(x\right)$.

  2. Use integration to determine the variance of $p\left(x\right)$.

    Round your answer to two decimal places if necessary.

Question 2

Consider the probability density function $p$, where $p\left(x\right)=k\cos\left(\frac{\pi}{2}x\right)$ when $0\le x\le1$ and $p\left(x\right)=0$ otherwise.

  1. Integrate $\int_0^1\cos\left(\frac{\pi}{2}x\right)dx$.

  2. Find $k$.

  3. Determine the expected value, $E\left(X\right)$, using technology or otherwise.

  4. Calculate the standard deviation, using technology or otherwise.

    Round your answer to two decimal places.

 

Change of scale and origin and the effect on mean and variance

When we study a continuous random variable, we examine the mean and variance of a variable measured in particular units of space or time. For example, we might be looking at time in minutes or distance in kilometres. If we wish to convert these units, we end up scaling the data (multiplying by a scale factor); we call this a change of scale.

Alternatively, or in addition to a change of scale, we might shift or translate our data set by adding a particular value to, or subtracting it from, every value in the data set. We call this a change of origin. This might happen when we decide to omit a $10$ mark question from an examination because everyone got it right, deducting $10$ marks from each student's result.

In general, when changing the scale or origin of our data, we transform our original random variable $X$ into a new random variable. Let's call the new random variable $Y$; it is related to $X$ by $Y=aX+b$, where $a$ represents the scale factor for the change of scale and $b$ represents the translation or shift of the data.

 

Mean

Recall that a continuous random variable $X$ with density function $f(x)$ defined on the interval $\left[m,n\right]$, and $0$ elsewhere, has mean:

$E(X)=\int_m^n xf(x)\ dx$

Another useful fact, which we state without proof, is that if $Y=g(X)$ is a random variable that is a function of the random variable $X$, then the mean of $Y$ is given by:

$E(Y)=\int_m^n g(x)f(x)\ dx$

For example, we used this fact to find $E(X^2)$. According to the rule just given, this must be $\int_m^n x^2f(x)\ dx$.

Now, suppose $g(X)$ is a linear function of $X$. That is, $g(X)=aX+b$. Then,

$E\left[g\left(X\right)\right]=E\left[aX+b\right]$

$=\int_m^n\left(ax+b\right)f\left(x\right)dx$ (definition of the expected value)

$=\int_m^n\left(axf\left(x\right)+bf\left(x\right)\right)dx$ (expand the brackets)

$=a\int_m^n xf\left(x\right)dx+b\int_m^n f\left(x\right)dx$ (split the integral)

$=aE\left[X\right]+b$ (since $\int_m^n xf\left(x\right)dx=E\left[X\right]$ and $\int_m^n f\left(x\right)dx=1$)

We conclude that:

$E\left[aX+b\right]=aE\left[X\right]+b$

This is as expected, since the function $aX+b$ is just a re-scaling of $X$ combined with a shift of location.
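This linearity rule is easy to verify numerically. The sketch below is illustrative only, with SciPy assumed available; the density is the one from Example 1 and the values of $a$ and $b$ are arbitrary choices.

```python
# Numerical check of E[aX + b] = a*E[X] + b for the density from Example 1
# and arbitrary constants a and b.
from scipy.integrate import quad

f = lambda x: (x + 3) / 24          # density on [1, 5]
a, b = 2, 4                         # arbitrary scale and shift

E_X = quad(lambda x: x * f(x), 1, 5)[0]
E_Y = quad(lambda x: (a * x + b) * f(x), 1, 5)[0]   # E[g(X)] with g(x) = ax + b
print(E_Y, a * E_X + b)             # both ≈ 10.4444
```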

 

Variance

The variance of a random variable is the expected value of the squared difference of its value from the mean: $Var\left[X\right]=E\left[\left(X-E\left[X\right]\right)^2\right]$

Our aim is to find an expression for the variance of $Y=aX+b$. We could begin by considering the variance of $X+b$, and we will make use of the result obtained above for the mean. By substitution into the formula, we have:

$Var\left[X+b\right]=E\left[\left(X+b-E\left[X+b\right]\right)^2\right]$

$=E\left[\left(X+b-E\left[X\right]-b\right)^2\right]$ (using $E\left[aX+b\right]=aE\left[X\right]+b$)

$=E\left[\left(X-E\left[X\right]\right)^2\right]$

$=Var\left(X\right)$

 

 

Next, we consider the variance of $aX$. According to the formula and again making use of the results for the mean, we have:

$Var\left(aX\right)=E\left[\left(aX-E\left[aX\right]\right)^2\right]$

$=E\left[\left(aX-aE\left[X\right]\right)^2\right]$ (using $E\left[aX+b\right]=aE\left[X\right]+b$)

$=E\left[\left(a\left(X-E\left[X\right]\right)\right)^2\right]$ (factor out the $a$)

$=E\left[a^2\left(X-E\left[X\right]\right)^2\right]$ (apply the square to both factors)

$=a^2E\left[\left(X-E\left[X\right]\right)^2\right]$ (bring the constant $a^2$ outside the expectation)

$=a^2Var\left[X\right]$

 

Putting these together, we can state:

$Var\left[aX+b\right]=a^2Var\left[X\right]$

Since standard deviation is the square root of variance, we have:

$STDEV\left[aX+b\right]=\sqrt{Var\left[aX+b\right]}$

$=\sqrt{a^2Var\left[X\right]}$

$=\left|a\right|\sqrt{Var\left[X\right]}$

$=\left|a\right|\ STDEV\left[X\right]$

Note that the square root of $a^2$ is $\left|a\right|$, since a standard deviation can never be negative.

Using the $\sigma$ notation for standard deviation, we have:

$\sigma_{aX+b}=\left|a\right|\sigma_X$
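A matching numerical check of the variance and standard deviation rules, again as an illustrative sketch (SciPy assumed available) with the density from Example 1 and an arbitrary negative value of $a$ to show why $\left|a\right|$ is needed:

```python
# Numerical check of Var[aX + b] = a^2 * Var[X] and sigma_{aX+b} = |a| * sigma_X.
from math import sqrt
from scipy.integrate import quad

f = lambda x: (x + 3) / 24          # density from Example 1, on [1, 5]
a, b = -3, 7                        # a negative scale shows why |a| is needed

def mean_and_second_moment(g):
    """Return E[g(X)] and E[g(X)^2] for X with density f on [1, 5]."""
    m1 = quad(lambda x: g(x) * f(x), 1, 5)[0]
    m2 = quad(lambda x: g(x)**2 * f(x), 1, 5)[0]
    return m1, m2

mX, mX2 = mean_and_second_moment(lambda x: x)
mY, mY2 = mean_and_second_moment(lambda x: a * x + b)
var_X, var_Y = mX2 - mX**2, mY2 - mY**2
print(var_Y, a**2 * var_X)                   # both ≈ 11.5556
print(sqrt(var_Y), abs(a) * sqrt(var_X))     # both ≈ 3.3993
```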

 

Practice questions

Question 3

A uniform probability density function, $P\left(x\right)$, is positive over the domain $\left[20,50\right]$ and $0$ elsewhere.

  1. State the function defining this distribution.

    $P\left(x\right)=\editable{}$ if $\editable{}\le x\le\editable{}$
    $\editable{}$ for all other values of $x$
  2. Use integration to determine the expected value of the distribution.

  3. Use integration to determine the variance $V\left(X\right)$ of the distribution.

  4. The distribution is transformed to the random variable $Y$ by $Y=2X+4$. Calculate $E\left(Y\right)$, the expected value of $Y$.

  5. Determine the variance $V\left(Y\right)$ of the random variable $Y$ as defined by $Y=2X+4$.

  6. Determine the standard deviation $SD\left(Y\right)$ of $Y$.

    Round your answer to one decimal place.

Question 4

The probability density function of a random variable $X$ and its graph are given below:

$p\left(x\right)=k\sin x$ if $0\le x\le\pi$
$0$ for all other values of $x$

[Graph of $p\left(x\right)=k\sin x$ over $\left[0,\pi\right]$]

  1. Solve for the value of $k$.

  2. Find the derivative of $\sin x-x\cos x$.

  3. Hence determine the expected value of the distribution. Express your answer in exact form in terms of $\pi$.

  4. The distribution is transformed to the random variable $Y$ by $Y=11-2X$. Calculate $E\left(Y\right)$, the expected value of $Y$.

    Express your answer in exact form in terms of $\pi$.

Outcomes

4.4.1.3

calculate the expected value, variance and standard deviation of a continuous random variable in simple cases

4.4.1.4

understand standardised normal variables (z-values, z-scores) and use these to compare samples
