Next we're going to talk about joint probabilities. A joint probability is the probability that two separate events, from two separate probability distributions, are both true. So we're looking at the probability that A is true and B is true, and our notation allows us to abbreviate this by simply placing both capital letters together with a comma between them. This is read as the joint probability of A and B, or the probability that A is true and B is true. An important point to note here is that the ordering does not matter in joint probabilities. I also want to point out that our standard notation for probability uses capital letters to refer to the entire probability distribution, that is, the entire collection of exclusive and exhaustive statements, and lower case letters to refer to the individual statements, and we can reference an individual probability the same way, okay? So what we've written here says that the joint probabilities are equal regardless of which order we write the terms, or which order we consider the probability distributions, and the same is true of each individual probability within the distribution. So the probability of x1 and y1 occurring together is the same as the probability of y1 and x1 occurring together. This may seem obvious to you, but it turns out to be a very, very useful principle.

Now we're going to talk about the definition of the independence of two probability distributions. If two probability distributions are independent, knowing the outcome of one does not change our belief in the truth of the other. Therefore, knowing the outcome of one does not change the probability of the other. When two probability distributions are independent, you can think of them as being unrelated and unconnected. Say I'm tossing a coin: it has a 50 percent probability of coming up heads. And I'm rolling a die: it has a 1 in 6 probability of coming up as a 3, okay?
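The ordering claim above can be checked directly by counting equally likely outcomes. Here is a minimal sketch in Python using the coin-and-die setup just described; the helper `prob` and the variable names are my own, introduced only for illustration:

```python
from fractions import Fraction
from itertools import product

# The 12 equally likely (coin, die) outcomes from the example above.
outcomes = list(product(["heads", "tails"], range(1, 7)))

def prob(event):
    # Probability of an event: favorable outcomes / total outcomes.
    hits = sum(1 for outcome in outcomes if event(outcome))
    return Fraction(hits, len(outcomes))

# "heads and 3" and "3 and heads" describe the same event,
# so the joint probability is the same either way.
p_heads_and_3 = prob(lambda o: o[0] == "heads" and o[1] == 3)
p_3_and_heads = prob(lambda o: o[1] == 3 and o[0] == "heads")
print(p_heads_and_3, p_3_and_heads)  # 1/12 1/12
```

Both conditions pick out the same single outcome out of twelve, which is exactly why the order we write the terms in makes no difference.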
The joint probability of heads and 3 is equal to the probability of heads, which is 1/2, times the probability of coming up 3 on a six-sided die, which is 1/6, and that equals 1/12. So, for probability problems involving independent distributions, this is the formula to calculate the probability that both events happen at the same time. We have a special name for the product of the two probabilities: it is known as the product distribution, okay? So, when the joint distribution equals the product distribution, the two probability distributions are by definition independent.

We often use Venn diagrams, diagrams that show the intersection and union of sets, to illustrate the intersection of two sets of events and to develop some intuition about what it means to say the probability that both events occur. So, for example, if we have the probability of flipping a coin and it comes up heads, and the probability of tossing a die and it comes up 3, okay, we can represent the overlapping area as the product of the two, okay? You can think of the outer area as the universe of all possible outcomes, okay? And this gives us the probability that both of the events that we have defined as relevant occur at the same time.

Perhaps the Venn diagram is even more useful in capturing the concept of an or probability. What is the probability that our coin comes up heads or the die comes up 3? Now, interestingly, this is written with a plus sign, but in probability, what we're doing is looking at the probability that either one event occurs, or the other occurs, or both occur, okay? Well, if we simply added the two circles together, we'd count the central area twice. So, to avoid that, we subtract the central area from the sum of the two circles. That is, we subtract the joint probability, which as we know is equal to the product probability, 1/12.
So, the probability that either the coin comes up heads or the die comes up 3 would be 1/2 + 1/6 - 1/12, or 6/12 + 2/12 - 1/12 = 7/12.
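Both rules from this section, the product rule for independent distributions and the or-probability subtraction, can be sketched in a few lines of Python; the variable names are my own, and exact fractions are used so the arithmetic matches the lecture:

```python
from fractions import Fraction

p_heads = Fraction(1, 2)  # fair coin
p_three = Fraction(1, 6)  # fair six-sided die

# Independent distributions: the joint probability equals the product.
p_both = p_heads * p_three
print(p_both)  # 1/12

# "Or" probability: add the two circles, then subtract the overlap
# so the central area of the Venn diagram is not counted twice.
p_either = p_heads + p_three - p_both
print(p_either)  # 7/12
```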