
You Must Know Multi-Objective Least Squares


Classical least squares is nice, but it is limited in many respects. Here we talk about how to adapt least squares to multi-objective optimization problems in an elegant way.

In a previous article, I talked about least squares, its amazing simplicity, and its vast range of applications. Still, those applications are limited to a certain type of optimization problem. Nevertheless, you should never underestimate the power and generality of linear algebra.

In many problems we do not want to optimize only one objective function; we want to optimize several of them at once. This is where multi-objective least squares comes in.

Recall what the least-squares equation looked like in the case of fitting data:

$$A x \approx b$$

The objective function to minimize is then:

$$J = \|Ax - b\|^2$$
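As a quick reminder, here is a minimal sketch (not the code from the previous article; the toy data and feature choice are my own) of how this classical problem is solved with numpy:

```python
import numpy as np

# Design matrix A (one row per data point) and targets b.
t = np.linspace(0, 1, 20)
A = np.vander(t, 3)        # quadratic polynomial features [t^2, t, 1]
b = np.sin(2 * t)          # some data to fit

# Least-squares solution of min_x ||A x - b||^2.
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)
```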

So imagine an objective that consists of multiple J's, each of them its own least-squares problem, such as the following:

$$J = \lambda_1 J_1 + \lambda_2 J_2 + \dots + \lambda_k J_k$$

Where we define each term as its own least-squares objective:

$$J_i = \|A_i x - b_i\|^2, \qquad i = 1, \dots, k$$
Or we can write this in shorthand, linear-algebra-friendly notation as a dot product:

$$J = \lambda^T \begin{bmatrix} J_1 \\ J_2 \\ \vdots \\ J_k \end{bmatrix}, \qquad \lambda = \begin{bmatrix} \lambda_1 \\ \lambda_2 \\ \vdots \\ \lambda_k \end{bmatrix}$$

Introducing the weight vector λ is typical when solving multi-objective problems: it defines how much each objective contributes to the overall problem. So now we ask: how does this fit into the least-squares framework? One thing we can do is move each λ_i inside the corresponding J_i, using the fact that λ‖v‖² = ‖√λ v‖², arriving at:

$$\lambda_i J_i = \lambda_i \|A_i x - b_i\|^2 = \left\|\sqrt{\lambda_i}\, A_i x - \sqrt{\lambda_i}\, b_i\right\|^2$$

Do you notice a familiar pattern here compared to classical least squares, just by looking at what terms appear inside the norm? You should, because by substituting the following stacked matrix and vector we arrive at a classical least-squares formulation (which we know how to solve):

$$\tilde{A} = \begin{bmatrix} \sqrt{\lambda_1}\, A_1 \\ \sqrt{\lambda_2}\, A_2 \\ \vdots \\ \sqrt{\lambda_k}\, A_k \end{bmatrix}, \qquad \tilde{b} = \begin{bmatrix} \sqrt{\lambda_1}\, b_1 \\ \sqrt{\lambda_2}\, b_2 \\ \vdots \\ \sqrt{\lambda_k}\, b_k \end{bmatrix}$$
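In numpy terms, forming these stacked quantities is just a weighted vertical stack of the individual blocks. A small sketch (the helper name and argument layout are my own):

```python
import numpy as np

def stack_objectives(As, bs, lams):
    """Stack the weighted blocks sqrt(lam_i) * A_i and sqrt(lam_i) * b_i."""
    A_tilde = np.vstack([np.sqrt(l) * A for A, l in zip(As, lams)])
    b_tilde = np.concatenate([np.sqrt(l) * b for b, l in zip(bs, lams)])
    return A_tilde, b_tilde
```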

Because linear algebra is a beautiful thing, we can now write our multi-objective least squares as follows and everything works out of the box:

$$J = \left\|\tilde{A} x - \tilde{b}\right\|^2, \qquad \hat{x} = \left(\tilde{A}^T \tilde{A}\right)^{-1} \tilde{A}^T \tilde{b}$$

And this is something we already know how to solve based on the previous article! Nevertheless, let us write down the code so that we are concrete about the full formulation:
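The original code embed did not survive here, so the following is a minimal sketch of what it might have looked like: three randomly generated data sets, each defining its own least-squares objective, fitted by a single line through the stacked system. The weights, colors, and data-generation details are my own assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(0)

def make_data(n, slope, intercept, noise):
    """Generate n noisy points around the line slope * t + intercept."""
    t = np.sort(np.random.uniform(0, 10, n))
    y = slope * t + intercept + noise * np.random.randn(n)
    return t, y

# Three data sets, each defining its own objective J_i.
datasets = [make_data(30, 1.0, 0.0, 1.0),
            make_data(30, 0.5, 3.0, 1.0),
            make_data(30, 2.0, -2.0, 1.0)]
colors = ["green", "orange", "red"]

# Weights: the red data set gets by far the highest weight.
lams = [1.0, 1.0, 20.0]

# Build the stacked (weighted) system A_tilde x ~ b_tilde for a line fit.
A_blocks, b_blocks = [], []
for (t, y), lam in zip(datasets, lams):
    A_i = np.column_stack([t, np.ones_like(t)])   # model: y = x1 * t + x2
    A_blocks.append(np.sqrt(lam) * A_i)
    b_blocks.append(np.sqrt(lam) * y)
A_tilde = np.vstack(A_blocks)
b_tilde = np.concatenate(b_blocks)

# Solve the resulting classical least-squares problem.
x_hat, *_ = np.linalg.lstsq(A_tilde, b_tilde, rcond=None)

# Plot the data and the single fitted line (blue).
for (t, y), c in zip(datasets, colors):
    plt.scatter(t, y, color=c, s=15)
t_grid = np.linspace(0, 10, 100)
plt.plot(t_grid, x_hat[0] * t_grid + x_hat[1], color="blue")
plt.show()
```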

The result you get from the code above should look something like the following (modulo random number generators behaving differently across numpy versions):

[Figure: the data points of each objective in their own color, with the fitted blue line passing closest to the red points.]

Notice how the blue line (the prediction) fits the red data points best? This is because we gave the red data points the highest weight through the λ vector. This is very nice: we can now balance between multiple objectives in our optimization problem. But in the end, the real world is full of hard constraints that determine which solutions may even be considered. Luckily, there is a methodology for solving that too, entirely based on good old linear algebra! To be continued…

