Hello, everybody, welcome back to the linear programming unit. Today we're going to talk about one more algorithm for solving linear programs, namely the ellipsoid algorithm. Remember, last time we had the simplex algorithm, which solves linear programs. It works pretty well in most cases, but some of the time it's actually exponential, which is a problem. Today we're going to talk about the ellipsoid algorithm. This again solves linear programs. It's actually polynomial time in all cases, but it turns out that in practice it's often not as good as the simplex method. To begin with, the ellipsoid algorithm solves a particular formulation: the satisfiability version of a linear program. Given a set of constraints, it will just tell you whether or not there is a solution. So, here's how the algorithm works. The first thing you do is take all the constraints and relax them by a tiny, tiny bit; you make them a tiny bit more lenient. If there weren't solutions before, there still won't be solutions. But if there were solutions before, even if beforehand there was only a single point that was a solution, there's now going to be some small positive volume in your set of solutions. The next thing you do is bound the set of solutions with a large ball. You find a ball that either contains all of your solutions, or at least some notable fraction of them, assuming they exist. This isn't too hard to do; you just take the ball really, really big. Now you have a ball, or in general an ellipsoid, that contains all of your solutions. What you do is look at the center of this ellipsoid and ask: is that a solution to your constraints? On the one hand, it might be a solution to your system, in which case you've found a solution, your system is satisfiable, and you're done. On the other hand, it might not be a solution. If it's not a solution, we have a point that lies outside a convex region.
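To make that center test concrete: if the constraints are written as Ax ≤ b, checking the center is just checking each row, and any violated row hands you a separating hyperplane for free. Here's a minimal sketch of that idea; the name `check_center` and the example region are mine, purely for illustration:

```python
import numpy as np

def check_center(A, b, c):
    """Test whether the candidate center c satisfies A @ x <= b.

    Returns (True, None) if c is feasible. Otherwise returns (False, a),
    where a is the normal of a violated row: every feasible x satisfies
    a @ x <= b_i while a @ c > b_i, so a separates c from the region.
    """
    for a_row, b_i in zip(A, b):
        if a_row @ c > b_i:
            return False, a_row  # violated row = separating hyperplane normal
    return True, None

# Example region: x >= 0, y >= 0, x + y <= 1, written as A @ x <= b.
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])

feasible, normal = check_center(A, b, np.array([2.0, 2.0]))  # outside: x + y > 1
```

Here the point (2, 2) violates the constraint x + y ≤ 1, so the returned normal (1, 1) defines a hyperplane with the whole feasible region on one side of it.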
So you can find a separating hyperplane that separates this center from your region. What this means is that your entire convex region is on one side of the hyperplane. So instead of being contained in your ellipsoid, it's actually contained in a half-ellipsoid. However, when you have a half-ellipsoid, you can find a new ellipsoid, with smaller volume than the one you started with, that contains this entire half-ellipsoid. And thus we now have a smaller ellipsoid that also contains your set of solutions. So now what we're going to do is iterate this. We keep finding smaller and smaller ellipsoids that contain our solution set. Eventually, one of two things happens. Either we find that the center of our ellipsoid is actually contained in our solution set, in which case we're done. Or we end up with ellipsoids that are really, really tiny and yet still guaranteed to contain our entire set of solutions. But from step one, when we did the relaxation, we knew that if there were solutions, our set of solutions has some small positive volume. Eventually we'll find ellipsoids that are smaller than that, and therefore too small to contain our solution set if it existed. And then we will know that there must be no solutions. So, what's the runtime of this? Well, you have to figure out how many iterations it takes, which is a little bit of a mess. But the runtime of the ellipsoid algorithm is something like O((m + n²) n⁵ log(nU)), where n is the dimension of the space you're working in, m is the number of constraints you're trying to satisfy, and U is the numerical size of the coefficients. So, things to notice about this. One, it's polynomial. Hooray, we have a polynomial-time algorithm for solving linear programs, and this is pretty great. However, it's a bad polynomial: it runs in something like n⁷ time, and n⁷ is really not that great.
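The half-ellipsoid-to-smaller-ellipsoid step has a closed-form answer: with the ellipsoid written as {x : (x − c)ᵀQ⁻¹(x − c) ≤ 1}, the standard central-cut update moves the center against the cut normal and shrinks Q. Here's a sketch of the whole loop; the names (`ellipsoid_feasibility`, `oracle`) and the stopping tolerance are mine, and the feasibility test is driven only by a callback that reports a violated constraint, not by the constraints directly:

```python
import numpy as np

def ellipsoid_feasibility(oracle, n, radius, min_size, max_iters=10_000):
    """Sketch of the ellipsoid method for feasibility (needs n >= 2).

    oracle(x) returns None if x is feasible, or else the normal a of a
    hyperplane with the entire feasible set in the half a @ (x - c) <= 0.
    Returns a feasible point, or None if the ellipsoid shrinks below
    min_size (measured as sqrt(det Q), proportional to its volume).
    """
    c = np.zeros(n)
    Q = radius**2 * np.eye(n)  # start from a ball of the given radius
    for _ in range(max_iters):
        if np.sqrt(np.linalg.det(Q)) < min_size:
            return None                      # too small to hold a solution
        a = oracle(c)
        if a is None:
            return c                         # center is feasible: done
        # Standard central-cut update: keep the half a @ (x - c) <= 0.
        g = Q @ a / np.sqrt(a @ Q @ a)
        c = c - g / (n + 1)
        Q = (n**2 / (n**2 - 1.0)) * (Q - (2.0 / (n + 1)) * np.outer(g, g))
    return None

# Feasible region: the square 1 <= x <= 2, 1 <= y <= 2, seen only
# through an oracle that reports a violated constraint's normal.
A = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, -1.0], [0.0, 1.0]])
b = np.array([-1.0, 2.0, -1.0, 2.0])

def oracle(x):
    for a_row, b_i in zip(A, b):
        if a_row @ x > b_i:
            return a_row
    return None

point = ellipsoid_feasibility(oracle, n=2, radius=10.0, min_size=1e-6)
```

Each update shrinks the ellipsoid's volume by a constant factor depending on n, which is where the iteration count in the runtime bound comes from.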
And there are lots of circumstances where I'd rather take an exponential algorithm, or at least a mildly exponential one, over an n⁷ algorithm. Finally, we'll note that the runtime actually depends, albeit logarithmically, on the size of the coefficients. This might be a problem if you have really, really complicated coefficients in your linear program: the coefficient size affects how much you can relax your constraints and how big the starting ball needs to be. If you have those sorts of problems, ellipsoid will run very slowly. Whereas if you're running the simplex method, no matter how big your coefficients are, at least the number of algebraic operations you need to perform doesn't depend on the size of the coefficients. Now, there's one final thing to note about the ellipsoid algorithm. We don't really need that much information about our system of equations or inequalities. All we really need is what's called a separation oracle. That is, given a point x, we need to either be able to tell that x satisfies our system, or, if not, find some hyperplane that separates x from our set of solutions. And there are actually some circumstances where you don't have an explicit set of equations defining your system, but you can still produce a separation oracle. In these cases you can use the ellipsoid algorithm to solve linear programs even though you don't have an explicit list of finitely many equations. And this is really useful in some cases. So, in summary, the ellipsoid algorithm is another way to solve linear programs. It has better worst-case performance than simplex; however, in practice it's usually going to be slower, generally not as good. On the plus side, you can run the ellipsoid algorithm with access only to a separation oracle, which is nice, and there are definitely a few contexts where being able to do that is quite useful. In any case, that wraps up our unit on linear programs. I hope you enjoyed it.
Come back next time and Sasha will start talking about complexity theory. And in particular, we'll be talking about various aspects of NP-complete problems. So, I hope you come back for that, and I'll see you then.