An important case is when two random variables are independent of each other. As in the discrete case, it means that knowledge of the value of one of these variables gives you no new information about the possible values and the probabilities of the other variable.

We can state this mathematically in different ways. First of all, we can discuss it in terms of cumulative distribution functions. In this case, we say that X and Y are independent if their joint cumulative distribution function can be factorized as a product of two functions: F(x, y) = F_X(x) F_Y(y) for all x and y. This is in agreement with the idea of independence of the events that involve the corresponding random variables. Recall that a cumulative distribution function is defined as the probability of an event: F(x, y) is the probability of the intersection of the two events {X <= x} and {Y <= y}, while F_X(x) and F_Y(y) are the probabilities of these two events separately. So the equality says that the probability of the intersection is equal to the product of the probabilities, which means exactly that these two events are independent of each other. In other words, if we know something about the variable X, it gives us no new information about the variable Y. If this relation holds for all possible values of x and y, then we say that these two random variables are independent.

Another way to define independence is to consider probability density functions. The relation is very similar: the joint probability density function of X and Y has to be equal to the probability density function of X times the probability density function of Y, that is, f(x, y) = f_X(x) f_Y(y). This relation follows from the previous one, because probabilities of events of the form "the value of the random variable lies inside some segment or inside some region" are obtained by integrating the density. So this is an alternative way to define, or to check, independence of two random variables if they have probability density functions.

Now let us discuss covariance and correlation, which are closely related to independence. Just like in the case of discrete random variables, the covariance of two variables X and Y is defined as the expected value of (X minus the expected value of X) times (Y minus the expected value of Y): Cov(X, Y) = E[(X - E X)(Y - E Y)]. Correlation is defined as a fraction whose numerator is the covariance and whose denominator is the square root of the variance of X times the variance of Y: Corr(X, Y) = Cov(X, Y) / sqrt(Var X * Var Y). As with discrete random variables, if X and Y are independent, then the covariance and the correlation equal zero. Again, as in the discrete case, the converse is not true.

Let us now discuss a little bit what it means that the correlation between X and Y equals zero or one. Again, as in the discrete case, covariance appears in the formula that gives the variance of a sum of two random variables: Var(X + Y) = Var X + Var Y + 2 Cov(X, Y). It immediately follows that if two random variables are uncorrelated, meaning that their covariance equals zero, then the variance of the sum equals the sum of the variances. This fact is very important, and we will use it later when we discuss the law of large numbers and the central limit theorem.
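To make the first definition concrete, here is a minimal numerical sketch in Python with NumPy; the particular distributions, the sample size, and the points a and b are illustrative assumptions, not taken from the lecture. It estimates the joint CDF at a point and compares it with the product of the marginal CDFs; for independent X and Y the two numbers should be close.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Independent X and Y (the distributions are illustrative assumptions).
x = rng.normal(size=n)
y = rng.exponential(size=n)

a, b = 0.5, 1.0  # arbitrary points at which to compare the CDFs

joint = np.mean((x <= a) & (y <= b))          # estimate of F(a, b)
product = np.mean(x <= a) * np.mean(y <= b)   # estimate of F_X(a) * F_Y(b)

print(joint, product)  # close to each other when X and Y are independent
```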
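Sample versions of covariance and correlation can be computed directly from the definitions above. The following sketch, again with distributions and sample size chosen only for illustration, estimates both for two independent variables; both estimates should come out close to zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Two independent continuous random variables (illustrative choice).
x = rng.normal(loc=0.0, scale=1.0, size=n)
y = rng.uniform(low=-1.0, high=1.0, size=n)

# Sample covariance: mean of (X - E X)(Y - E Y).
cov = np.mean((x - x.mean()) * (y - y.mean()))

# Correlation: covariance divided by sqrt(Var X * Var Y).
corr = cov / np.sqrt(x.var() * y.var())

print(cov, corr)  # both should be close to 0 for independent X and Y
```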
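A standard counterexample to the converse (zero covariance does not imply independence) is Y = X squared with X symmetric around zero; here is a quick sketch of it, with the uniform distribution chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# X is uniform on [-1, 1]; Y = X**2 is completely determined by X,
# so X and Y are certainly not independent.
x = rng.uniform(-1.0, 1.0, size=n)
y = x ** 2

cov = np.mean((x - x.mean()) * (y - y.mean()))
print(cov)  # close to 0: uncorrelated, yet dependent
```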
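Finally, the fact that the variance of a sum of uncorrelated variables equals the sum of the variances is easy to check by simulation; the distributions below are again just illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Independent (hence uncorrelated) X and Y, chosen only for illustration.
x = rng.exponential(scale=2.0, size=n)
y = rng.normal(loc=1.0, scale=3.0, size=n)

print((x + y).var())       # Var(X + Y)
print(x.var() + y.var())   # Var X + Var Y -- nearly the same number
```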