Now, let us consider a pair of random variables defined on the same probability space and ask how the expected value behaves when we take their sum. Let X and Y be random variables defined on the same probability space, and let Z be X plus Y. What can we say about the expected value of Z? I want to show that the expected value of Z, which is the expected value of X plus Y, is equal to the expected value of X plus the expected value of Y. Let us prove it. To do so, I have to recall the definition of the expected value of a random variable not in terms of its distribution, but in terms of the original definition of a random variable as a function on the probability space. Let Omega consist of all outcomes omega one, and so on, up to omega N. Then the expected value of X is the sum over i from one to N of the probability of omega i times the value that X takes at omega i. In the same way, we can write the expected value of Y: it is again the sum over i from one to N of the probability of omega i times Y of omega i. Now we can write the definition of the expected value of Z: it is the sum over i of the probability of omega i multiplied by the value of Z at the given elementary outcome, which is X of omega i plus Y of omega i. Expanding this product, we can rewrite the whole expression as a sum of two sums: the first sum involves only the X terms and the second sum involves only the Y terms. The first sum is exactly the expected value of X, and the second sum is the expected value of Y. This finishes the proof: the expected value of a sum of two random variables is equal to the sum of their expected values.

Is the same true for variance? Let us discuss it with an example. Let X be an arbitrary random variable. The only property we demand from X is that it is genuinely random, meaning it has non-zero, or better to say positive, variance. Let me denote this variance by V, and I demand that V is positive. Now let us consider another random variable Y, which is equal to negative X. What is the variance of Y? According to the rules we discussed, the variance of Y is the variance of negative X, that is, the variance of negative one times X. We can move this negative one out of the variance, but we have to square it, and negative one squared is just one. So the variance of Y is just the variance of X, which is V.

Now let us consider a new variable Z, which is equal to X plus Y. What can we say about the variance of Z? The variance of Z is the variance of X minus X, and X minus X is the constant zero, so its variance is equal to zero, because the variance of any constant is zero. So the variance of Z does not equal the sum of the variances of X and Y, because the variance of X plus the variance of Y is V plus V, which is 2V, and that is greater than zero. We see that in general the variance of X plus Y is not equal to the variance of X plus the variance of Y: if you take two arbitrary random variables, this equality does not hold. But in the special case of independent variables it does hold. We will discuss independent variables in the next section.
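Written out symbolically, the argument for linearity of expectation sketched above amounts to the following; the shorthand p_i for the probability of omega i is mine, not notation used in the lecture:

```latex
% Linearity of expectation, following the step-by-step argument above.
% Shorthand: p_i denotes P(\omega_i); this abbreviation is not from the lecture.
\begin{align*}
\mathbb{E}[Z]
  &= \sum_{i=1}^{N} p_i \, Z(\omega_i)
   = \sum_{i=1}^{N} p_i \bigl( X(\omega_i) + Y(\omega_i) \bigr) \\
  &= \sum_{i=1}^{N} p_i \, X(\omega_i) + \sum_{i=1}^{N} p_i \, Y(\omega_i)
   = \mathbb{E}[X] + \mathbb{E}[Y].
\end{align*}
```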
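Similarly, the variance counterexample can be summarized in two lines; this is a sketch of the computation described above, assuming the scaling rule Var(aX) = a^2 Var(X) referred to as "the rules we discussed":

```latex
% Counterexample: variance is not additive for arbitrary random variables.
% Take Y = -X, where Var(X) = V > 0.
\begin{align*}
Y &= -X, & \operatorname{Var}(Y) &= (-1)^2 \operatorname{Var}(X) = V, \\
Z &= X + Y = 0, & \operatorname{Var}(Z) &= 0 \;\neq\; 2V = \operatorname{Var}(X) + \operatorname{Var}(Y).
\end{align*}
```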