Hello and welcome to the Coursera course on approximation algorithms, part two. You have already taken the course on approximation algorithms part one. Congratulations on making it all the way to the end and being ready to move on. In this part of the course, we will study some more advanced techniques for designing approximation algorithms. In particular, we will use linear programming duality, semi-definite programming, and more on geometric embeddings, and use these to solve a variety of problems: for example, Steiner forest, facility location, maximum cut, and sparsest cut. For all of these problems, we will design approximation algorithms and prove theorems about them. But first, we need to learn a little bit about linear programming duality. This is a concept we have not yet used, that has not yet come up, and yet it is very important in the design of approximation algorithms. Linear programming duality: let's do it by example. Let's start from a particular linear program. Here it is. I borrowed this example from Vazirani's textbook on approximation algorithms. Minimize a function of three variables x1, x2, x3: 7x1 + x2 + 5x3 is the objective. And we have two constraints: x1 - x2 + 3x3 is at least 10, and 5x1 + 2x2 - x3 is at least 6, plus the non-negativity constraints: x1, x2, and x3 cannot be negative. All right. Now, instead of solving this linear program, let's just look at it and think about how we can upper and lower bound the value of the optimal solution. For example, the optimal value is not known, but how can you prove that the optimal value is less than or equal to 54? Well, the answer is: try a particular set of values for x1, x2, x3. Let's try x1 equal to 7, x2 equal to 0, x3 equal to 1. Then x1 - x2 + 3x3 equals 7 - 0 + 3, that's 10. It is greater than or equal to 10, so the first constraint is satisfied. And 5x1 + 2x2 - x3 is 35 plus 0 minus 1, that's 34.
This is greater than or equal to 6, so the second constraint is satisfied. And of course 7, 0, and 1 are all greater than or equal to zero. So (7, 0, 1) is a feasible solution. And what is its value? Its value is 7 times 7, 49, plus 0 plus 5, that's 54. So we have exhibited a particular x1, x2, x3 that has value 54. So the minimum possible value, when you minimize over all feasible x's, must be less than or equal to 54. And so, by exhibiting a feasible solution, we have proved that the optimal value is less than or equal to 54. In other words, we have a minimization problem. How do we prove that OPT is less than something? How do we certify an upper bound on OPT? It suffices to give a solution that is feasible, that satisfies all the constraints. Then the value of that solution is an upper bound on OPT. So that's a technique for proving an upper bound on the linear program. Okay, now, the lower bound. Let's take the same linear program. How do we certify that OPT is at least 10? How do we certify that the minimum over all feasible x's of 7x1 + x2 + 5x3 is at least 10? This looks more challenging. Why? Because even if we look at a few particular feasible x's and they all have value greater than or equal to 10, how do we know that when we minimize over the entire polytope, there isn't somewhere some x1, x2, x3 that is feasible and has value less than 10? So how can we certify that? For that, we have to look at the constraints. So let's look. Let's look at 7x1 + x2 + 5x3 and compare it to x1 - x2 + 3x3, the left hand side of the first constraint. Let's look at the coefficients one by one. 7x1 must be greater than or equal to x1, because 7 is greater than 1 and x1 is non-negative. x2 must be greater than or equal to -x2, because x2 is non-negative. 5x3 must be greater than or equal to 3x3, because x3 is non-negative.
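To make the upper bound argument concrete, here is a small numeric check (my own sketch, not part of the lecture) that verifies the candidate point (7, 0, 1) is feasible and evaluates the objective at it:

```python
import numpy as np

# Primal LP from the lecture: minimize 7*x1 + x2 + 5*x3
# subject to  x1 - x2 + 3*x3 >= 10
#             5*x1 + 2*x2 - x3 >= 6
#             x1, x2, x3 >= 0
c = np.array([7.0, 1.0, 5.0])          # objective coefficients
A = np.array([[1.0, -1.0, 3.0],
              [5.0, 2.0, -1.0]])       # constraint matrix (>= form)
b = np.array([10.0, 6.0])              # right-hand sides

x = np.array([7.0, 0.0, 1.0])          # the candidate solution (7, 0, 1)

# Feasibility: both >= constraints hold, and all coordinates are non-negative
feasible = bool(np.all(A @ x >= b)) and bool(np.all(x >= 0))
value = c @ x                           # objective value at this point
print(feasible, value)                  # True 54.0, so OPT <= 54
```

Any feasible point gives such a certificate; this one happens to show OPT is at most 54.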
So, 7 greater than 1, 1 greater than -1, 5 greater than 3 implies that no matter what feasible x you take, its value 7x1 + x2 + 5x3 will be greater than or equal to x1 - x2 + 3x3. But this is exactly the left hand side of the first constraint, so we know that for every feasible x, it is greater than or equal to 10. So we have just certified that OPT must be at least 10. All the coefficients of the objective function are greater than or equal to the corresponding coefficients of the first constraint; therefore the value of the optimum must be at least the value of the right hand side of the first constraint. We got lucky here, but what if we tried to prove some larger value? We won't be able to use just one particular constraint. Okay, let's try to prove a lower bound of 26. The idea is that you can combine constraints to derive new constraints. Consider the first constraint and the second one. Let's add 2 times the first constraint plus 1 times the second constraint: 2 times x1 - x2 + 3x3, and 1 times 5x1 + 2x2 - x3. If you add 2 times the first constraint plus 1 times the second constraint, what do you get? You get 7x1: 2 plus 5. Now, what's the coefficient of x2? -2 + 2, that's 0. The coefficient of x3? 6 - 1, that's 5. So 7x1 + 5x3 must be greater than or equal to 2 times 10 plus 6, which is 26. By combining the two constraints, we derived a new constraint: 7x1 + 5x3 is at least 26. And now look at 7x1 + x2 + 5x3, the objective function. Each of its coefficients is greater than or equal to the corresponding coefficient of this new constraint. And so the value of the objective function, for every feasible x, must be greater than or equal to 7x1 + 5x3, which we know is at least 26. So we have just proved that OPT is at least 26. So this is a way to get a lower bound for a minimization problem. How do we prove that OPT is greater than something if we have a minimization linear program?
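The arithmetic of combining constraints can be checked mechanically. Here is a short sketch (again my own, not from the lecture) that forms the combination 2 times the first constraint plus 1 times the second, and verifies that the objective dominates it coefficient by coefficient:

```python
import numpy as np

# Same LP data as before (>= constraints)
A = np.array([[1.0, -1.0, 3.0],
              [5.0, 2.0, -1.0]])
b = np.array([10.0, 6.0])
c = np.array([7.0, 1.0, 5.0])

y = np.array([2.0, 1.0])    # multipliers: 2 x first constraint + 1 x second

combined = y @ A            # coefficients of the derived constraint
bound = y @ b               # its right-hand side
print(combined, bound)      # [7. 0. 5.] 26.0

# Every objective coefficient is >= the corresponding combined coefficient,
# so for every feasible x:  c.x >= combined.x >= bound, i.e. OPT >= 26.
dominates = bool(np.all(c >= combined))
print(dominates)            # True
```

The same check works for any non-negative multipliers y: whenever `c >= y @ A` componentwise, `y @ b` is a valid lower bound on OPT.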
We find a non-negative combination of the constraints such that each coefficient is less than or equal to the corresponding coefficient in the objective function. Then we know that the right hand side of this new constraint is a lower bound on OPT. So, these are two ways to get upper bounds and lower bounds on the optimal value, and then we get an interval, a range, for the possible values of OPT. What is the best upper bound we can obtain? For each upper bound, we look at an x that is feasible, an x1, x2, x3 that satisfies all the constraints. Then we take the objective function at that particular x, and it gives us an upper bound. What is the best upper bound we can obtain? Well, an upper bound is better if it is tighter, if it is smaller. And the best possible upper bound is, among all possibilities, the one that minimizes this quantity. What about the lower bound? For the lower bound, we have two constraints. What we will do is take a non-negative combination of the two constraints. We'll multiply the first constraint by some coefficient y1 and the second constraint by some coefficient y2. These coefficients must be non-negative, of course, for the combination to be valid. If each coefficient of the objective function is greater than or equal to the corresponding coefficient of our new constraint, then we'll be good. What does this mean? For example, take the first coefficient in the objective function, 7. What is the coefficient of x1 in the new constraint? Let's see. We multiply the first constraint by y1, so we'll have y1 times 1. The second constraint by y2, so we get y2 times 5. Our coefficient is y1 + 5y2. So we need 7 to be greater than or equal to y1 + 5y2, plus the corresponding conditions for the coefficient of x2 and the coefficient of x3. Then our lower bound is, let's see: the right hand side has 10 times y1 from the first constraint, and 6 times y2 from the second constraint. So 10y1 + 6y2: that's our new lower bound on OPT. What is the best lower bound we can get?
The best lower bound is the tightest one. It's the one such that this value is as large as possible. It's the maximum value of 10 y1 plus 6 y2. So what are we trying to do? We're trying to find y1 and y2 that are known negative, that satisfy these three conditions, maximizes it. What did we do? We wrote a linear program. New program. This is a linear program. The best lower bound we can obtain can be expressed with a linear program. Here it is. Max of 10y1 plus 6y2 such that, y1 plus 5y2 is at most 7 minus y1 plus 2 2y2 is at most 1. Minus 3y1 minus y 2 is at most 5. And y1, y2 are greater than or equal to 0. 3y1 minus y2 is at most 5. This is the dual of the initial linear program we wrote, we started from in the example. So, that's what we've done. We've taken the primal LP, we've looked at the what kind of lower bonds we can get. We wrote this as a linear program and this is the dual linear program. Notice this -3 here should be a +3, this is a typo. Okay, so that's duality by example. Now, each constraint here gives us a coefficient variable in the dual. Constraint one gives us variable one one. Constraint two gives us variable y2. Similarly, each constraint in the dual corresponds to a variable in the primal. The 7 that is here is this objective, is there in the constraint of the dual. The 5 is here and what about these coefficients? These are terms 10 and 6. You find them in the objective of the dual. And if you consider this matrix, the matrix 1 -1, 352- 1. You find this matrix again here, except it's the transpose of the matrix. So with this example, we have already a hint of the general formulation of LP duality. We have seen on an example how you can construct the dual of a linear program. It only remains to make this more general.
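As a sanity check on the whole construction, one can solve both the primal and the dual numerically. The sketch below (my own illustration, assuming SciPy is available; `scipy.optimize.linprog` solves minimization problems with `<=` constraints, so the signs are flipped accordingly) shows that the two optimal values coincide, as strong duality predicts:

```python
import numpy as np
from scipy.optimize import linprog

# Primal: min 7x1 + x2 + 5x3
#   s.t.  x1 - x2 + 3x3 >= 10,  5x1 + 2x2 - x3 >= 6,  x >= 0
c = np.array([7.0, 1.0, 5.0])
A = np.array([[1.0, -1.0, 3.0],
              [5.0, 2.0, -1.0]])
b = np.array([10.0, 6.0])

# linprog expects A_ub @ x <= b_ub, so negate the >= constraints
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 3)

# Dual: max 10y1 + 6y2
#   s.t.  y1 + 5y2 <= 7,  -y1 + 2y2 <= 1,  3y1 - y2 <= 5,  y >= 0
# Note the constraint matrix is A transposed, and the roles of b and c swap.
# linprog minimizes, so negate the objective.
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(0, None)] * 2)

print(primal.fun, -dual.fun)   # both optimal values are 26 (strong duality)
```

Incidentally, this shows the lower bound of 26 derived by hand with multipliers (2, 1) was already optimal: those multipliers are an optimal dual solution.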