Now, we have already discussed a very important model, the Bernoulli scheme. It is a probabilistic model that represents a series of independent trials. Each trial can result either in success or in failure. We will model this sequence as a sequence of coin tosses. However, now this is not a fair coin, but some unbalanced coin for which the probability of heads is different from the probability of tails. To find the probabilities of outcomes in this random experiment, we have to use the independence condition. Now, let us go to formulas. Let us assume that we toss a coin two times. So n, the overall number of tossings, is equal to 2. In this case, we can consider two events: H1, the first tossing gives heads, and H2, the second tossing gives heads. Of course, in the first tossing and in the second tossing, we use the same coin. So the probability of H1 is the same as the probability of H2. We will denote this probability by p. In this experiment, we have four outcomes. We will denote them as usual by HH, HT, TH, and TT. However, in this case, when our coin is not necessarily fair, we cannot just say that the probabilities of all of these outcomes are equal to one-fourth, because that is true only in the case when the probability of heads is the same as the probability of tails. However, we can use the independence assumption to find the probabilities of these outcomes. To do so, let us take into account the fact that H1 and H2 are independent, because the second tossing does not depend on the first tossing, and the first tossing does not depend on the second tossing. Our coin does not have any memory. Now we can use this assumption that H1 and H2 are independent, and we can use it to find the probability of the outcome HH. We see that this outcome means that both of these events occurred. So the probability of this outcome is the same as the probability of the intersection of H1 and H2. Due to the independence assumption, we can say that this probability equals the product of the probabilities. Both of these probabilities are equal to p.
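The two-toss calculation above can be sketched in code: the probability of each outcome is the product of the per-toss probabilities, by independence. This is a minimal sketch, assuming a hypothetical value p = 0.7 for the unbalanced coin.

```python
from itertools import product

p = 0.7  # hypothetical probability of heads for the unbalanced coin

# Probability of each two-toss outcome, using independence:
# P(first and second) = P(first) * P(second)
outcomes = {}
for first, second in product("HT", repeat=2):
    prob_first = p if first == "H" else 1 - p
    prob_second = p if second == "H" else 1 - p
    outcomes[first + second] = prob_first * prob_second

print(outcomes["HH"])  # p squared
```

Note that the four probabilities automatically sum to 1, so this really does define a probability space.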
So we have p times p, or p squared. Now we have the probability of this outcome. Now let us find the probability of some other outcome, for example TT, tail, tail. This outcome is the only outcome that belongs to the following event: not H1 intersected with not H2, because if the event H1 did not occur (so its complement occurred), it means that at the first tossing we have tails, and at the second tossing we also have tails. So we have to find the probability of this event. It is easy to show, and it is a good exercise, that if two events are independent, then their complements are also independent. It means that this probability equals the probability of not H1 times the probability of not H2. Now, if the probability of H1 equals p, then the probability of not H1 equals 1 − p, and here we also have 1 − p. So the probability that we are interested in is (1 − p) squared. In the same way, we can find the probabilities of the outcomes HT and TH. What is the probability of HT? In a similar way, we can show that it is the probability of heads times the probability of tails, so it equals p times (1 − p). The probability of TH is the same. Now we have fully defined the probability space for this experiment: we know all the outcomes and we know their probabilities. We used the independence assumption to do this. Now we can consider the general case, for arbitrary n. In this case, the set of all outcomes is the set of n-element sequences of heads and tails. So how many outcomes do we have in this case? The number of elements in omega, the set of all outcomes, is 2 to the power n. Let us find the probability of an arbitrary outcome. For example, let us consider n = 5 and the outcome head, head, tail, head, tail. In this outcome, we have three heads and two tails. It means that in the formula for its probability, we will have the factor p three times and the factor 1 − p two times. So the probability of this outcome equals p times p times (1 − p) times p times (1 − p).
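The five-toss example can be computed directly by multiplying in p for each head and 1 − p for each tail. A small sketch, again with a hypothetical p:

```python
p = 0.6  # hypothetical probability of heads

outcome = "HHTHT"  # three heads, two tails
prob = 1.0
for toss in outcome:
    # multiply in p for each head and (1 - p) for each tail
    prob *= p if toss == "H" else 1 - p

# prob now equals p**3 * (1 - p)**2, regardless of the order of H and T
```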
Of course, we are using the independence assumption in this case. Moreover, we assume that our coin tossings are not only pairwise independent, but mutually independent, which is quite a natural assumption, because even if we know, for example, the results of the first two tossings, this gives us no information about the result of the third tossing. So we can write this equality due to mutual independence. Now we can simplify this expression and get p to the power 3 times (1 − p) to the power 2. Here, 3 is the number of heads and 2 is the number of tails. Now consider the general case: consider an outcome with m heads and n − m tails. In this case, the probability of this outcome equals p to the power m times (1 − p) to the power n − m. The first factor corresponds to the probability of heads, and the second factor corresponds to the probability of tails. Note that this probability does not depend on the order in which heads and tails occur. We are only interested in the overall number of heads and the number of tails. Thus we have fully defined the Bernoulli scheme: we described the probability space and the probability of every outcome in this probability space. The Bernoulli scheme is a very popular way to model processes in real life.
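The general formula can be sanity-checked by summing p^m (1 − p)^(n − m) over all 2^n outcomes: since the outcomes make up the whole probability space, the total must be 1. A sketch under hypothetical parameters p and n:

```python
from itertools import product

p, n = 0.3, 5  # hypothetical parameters of the Bernoulli scheme

def outcome_prob(seq, p):
    """Probability of a sequence of 'H'/'T': p**m * (1 - p)**(n - m),
    where m is the number of heads in the sequence."""
    m = seq.count("H")
    return p ** m * (1 - p) ** (len(seq) - m)

# Sum over all 2**n outcomes; the result equals 1 up to floating-point error.
total = sum(outcome_prob("".join(s), p) for s in product("HT", repeat=n))
```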