
Expectation of a Random Variable

College: College of Education for Pure Sciences     Department: Department of Mathematics     Stage: 4
Course instructor: Kareema Abdul-Kadhim Mukhrib Al-Khafaji       01/12/2016 18:40:06



The expectation of a random variable is essentially the average value it is expected to take on. It is therefore calculated as the weighted average of the possible outcomes of the random variable, where the weights are the probabilities of those outcomes. As a simple example, consider the (discrete) random variable X (the outcome of some probabilistic experiment) whose sample space is the set {1, 2, 3}, with probability function given by p(1) = 0.3, p(2) = 0.1 and p(3) = 0.6. If we repeated this experiment 100 times, we would expect about 30 occurrences of X = 1, 10 of X = 2 and 60 of X = 3. The average X would then be ((30)(1) + (10)(2) + (60)(3))/100 = 2.3; in other words, (1)(0.3) + (2)(0.1) + (3)(0.6). This reasoning leads to the defining formula

E(X) = Σ x p(x),

where the sum runs over all possible values x of the discrete random variable. The notation E(X) for the expectation of X is standard; other notations are also in use.
For continuous random variables the situation is similar, except that the sum is replaced by an integral: think of dividing the sample space into small intervals [x, x + dx], with p(x) dx the probability that X falls into such an interval, so that E(X) = ∫ x p(x) dx.
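The discrete formula can be checked numerically. A minimal Python sketch, using the example distribution p(1) = 0.3, p(2) = 0.1, p(3) = 0.6 from the text:

```python
# Expected value of a discrete random variable: E(X) = sum of x * p(x)
# over all possible outcomes x.
pmf = {1: 0.3, 2: 0.1, 3: 0.6}  # distribution from the example above

def expectation(pmf):
    """Weighted average of the outcomes, weights given by the pmf."""
    return sum(x * p for x, p in pmf.items())

print(expectation(pmf))  # ≈ 2.3, matching the hand computation above
```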



In other words, the expected value of X is a weighted average of the possible
values that X can take on, each value being weighted by the probability that X
assumes that value. For example, if the probability mass function of X is given by
p(1) = 1/2 = p(2), then E[X] = 1(1/2) + 2(1/2) = 3/2.
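Reading the probability mass function in this example as p(1) = 1/2 = p(2) (the fraction appears garbled in the source), the weighted average can be verified directly:

```python
pmf = {1: 0.5, 2: 0.5}  # assumed reading of the pmf in the example
# Weighted average of the outcomes: 1*(1/2) + 2*(1/2)
expected = sum(x * p for x, p in pmf.items())
print(expected)  # 1.5, i.e. 3/2
```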
The use of the letter E to denote expected value goes back to W. A. Whitworth in 1901,[10] who used a script E. The symbol has since become popular, as for English writers it suggested "Expectation".

We may also define the expected value of a continuous random variable. This is
done as follows. If X is a continuous random variable having probability density
function f(x), then the expected value of X is defined by

E[X] = ∫ x f(x) dx,

where the integral is taken over the whole real line (from −∞ to +∞). This is the formula for the expectation of a continuous random variable.
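The integral can be approximated by exactly the small-interval reasoning described above: sum x f(x) dx over small intervals [x, x + dx]. A sketch in Python, assuming for illustration the exponential density f(x) = e^(−x) on [0, ∞) (a choice of ours, not a density from the text), whose expectation is known to be 1:

```python
import math

def expectation_continuous(f, lo, hi, n=100_000):
    """Approximate E[X] = ∫ x f(x) dx by a midpoint Riemann sum,
    mirroring the 'probability p(x) dx' reasoning in the text."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx   # midpoint of the interval [x, x + dx]
        total += x * f(x) * dx
    return total

f = lambda x: math.exp(-x)  # exponential(1) density, an illustrative choice
print(expectation_continuous(f, 0.0, 50.0))  # ≈ 1.0 (tail beyond 50 is negligible)
```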
Suppose now that we are given a random variable X together with its distribution
(that is, its probability mass function in the discrete case or its probability
density function in the continuous case). Suppose also that we are interested in
calculating, not the expected value of X, but the expected value of some function
of X, say g(X). How do we go about doing this? One way is as follows. Since
g(X) is itself a random variable, it must have a probability distribution, which
should be computable from a knowledge of the distribution of X. Once we have
obtained the distribution of g(X), we can then compute E[g(X)] by the definition
of the expectation.
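This procedure can be made concrete. A minimal Python sketch, reusing the {1, 2, 3} distribution from the earlier example and taking g(x) = x² as an illustrative choice, computes E[g(X)] both ways: by first deriving the distribution of Y = g(X) and applying the definition to Y, and by summing g(x) p(x) directly over the distribution of X:

```python
from collections import defaultdict

pmf_x = {1: 0.3, 2: 0.1, 3: 0.6}  # distribution from the example above
g = lambda x: x * x               # illustrative function of X

# Way 1: derive the distribution of Y = g(X), then apply E[Y] = sum of y * p_Y(y).
pmf_y = defaultdict(float)
for x, p in pmf_x.items():
    pmf_y[g(x)] += p              # outcomes with the same g(x) pool their probability
e_via_distribution = sum(y * p for y, p in pmf_y.items())

# Way 2: sum g(x) * p(x) directly over the distribution of X.
e_direct = sum(g(x) * p for x, p in pmf_x.items())

print(e_via_distribution, e_direct)  # both ≈ 6.1
```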







The material shown above is an introduction to the lecture uploaded by the course instructor, and it may appear incomplete: the instructor sometimes posts only the first part of a lecture as a preview of what you will download later. The e-learning system provides this service to keep you informed about the contents of the file you are about to download.