Operations Research Asked by cfp on August 19, 2021
Let $y\in\mathbb{R}^m$, $\tau\in\mathbb{R}$ and $X\in\mathbb{R}^{m\times n}$, with $\tau>0$.
I would like to efficiently solve the following problem (Problem 1):
Choose $\alpha,z\in\mathbb{R}^m$, $\beta\in\mathbb{R}^n$ to minimize:
$$(y-\alpha)^\top (y-\alpha) + \tau \beta^\top \beta$$
subject to the constraints that:
$$z=X\beta$$
$$\beta^\top 1_n = 1$$
$$\beta\ge 0$$
$$\forall i,j\in\{1,\dots,m\},\quad z_i\le z_j \rightarrow \alpha_i \le \alpha_j$$
(Here $1_n\in\mathbb{R}^n$ is a vector of ones.)
The final constraint is equivalent to:
$$\forall i,j\in\{1,\dots,m\},\quad (z_j-z_i,\alpha_j-\alpha_i)\in\left\{(c,d)\in\mathbb{R}^2\,\middle|\,c\le 0 \vee d\ge 0\right\},$$
which is clearly non-convex. While the problem can be given a mixed-integer quadratic programming (MIQP) formulation, this is unlikely to be computationally feasible, since it requires a binary variable for each ordered pair $(i,j)$.
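For concreteness, one standard big-M sketch of such an MIQP formulation (the bound $M$ and the margin $\varepsilon>0$ are assumed modeling choices here; $M$ must be large enough to bound the relevant differences) introduces a binary $\delta_{ij}$ per ordered pair:

```latex
% For each ordered pair (i, j): either alpha_i <= alpha_j, or z_i >= z_j + eps.
\begin{align*}
\alpha_i - \alpha_j &\le M\,\delta_{ij},\\
z_i - z_j &\ge \varepsilon - M\,(1 - \delta_{ij}),\\
\delta_{ij} &\in \{0,1\}.
\end{align*}
```

When $\delta_{ij}=0$ the first constraint enforces $\alpha_i\le\alpha_j$; when $\delta_{ij}=1$ the second enforces $z_i\ge z_j+\varepsilon$. Either way, $z_i\le z_j\rightarrow\alpha_i\le\alpha_j$ holds (up to the $\varepsilon$ margin), and the $m(m-1)$ binaries are what make the formulation expensive.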
However, if we knew $z=\hat z$, Problem 1 reduces to the following (Problem 2):
Choose $\alpha\in\mathbb{R}^m$ to minimize:
$$(y-\alpha)^\top (y-\alpha)$$
subject to the constraints that:
$$\forall i,j\in\{1,\dots,m\},\quad \hat z_i\le \hat z_j \rightarrow \alpha_i \le \alpha_j$$
This is the isotonic regression problem, and it may be solved very efficiently by the pool adjacent violators algorithm (PAVA).
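As an illustration, here is a minimal PAVA sketch in Python (the data below are made up, and for simplicity the entries of $\hat z$ are assumed distinct, so ties impose no equality constraints):

```python
import numpy as np

def pava(y):
    """L2-project y onto the set of non-decreasing vectors
    by pooling adjacent violating blocks (minimal sketch)."""
    blocks = []  # each block is [sum, count]; its fitted value is sum / count
    for v in y:
        blocks.append([float(v), 1])
        # Merge while the previous block's mean is >= the current block's mean
        # (compared by cross-multiplication to avoid division).
        while len(blocks) > 1 and (
            blocks[-2][0] * blocks[-1][1] >= blocks[-1][0] * blocks[-2][1]
        ):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    return np.concatenate([np.full(c, s / c) for s, c in blocks])

# Problem 2: isotonic regression of y in the order induced by z_hat.
y = np.array([3.0, 1.0, 2.0, 5.0, 4.0])
z_hat = np.array([0.1, 0.5, 0.3, 0.9, 0.7])
order = np.argsort(z_hat)
alpha = np.empty_like(y)
alpha[order] = pava(y[order])
```

The resulting `alpha` is non-decreasing along the ordering of `z_hat`, which is exactly the constraint set of Problem 2.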
Likewise, if we knew $\alpha=\hat\alpha$, then Problem 1 reduces to the following (Problem 3):
Choose $z\in\mathbb{R}^m$, $\beta\in\mathbb{R}^n$ to minimize:
$$\beta^\top \beta$$
subject to the constraints that:
$$z=X\beta$$
$$\beta^\top 1_n = 1$$
$$\beta\ge 0$$
$$\forall i,j\in\{1,\dots,m\},\quad \hat\alpha_i > \hat\alpha_j \rightarrow z_i > z_j$$
This is a simple quadratic programming problem (at least once the strict inequality on $z$ is replaced by a weak one with a small margin).
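As a sketch, Problem 3 can be set up with `scipy.optimize.minimize` (SLSQP); the data, the margin `eps`, and the feasible starting point below are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: m = 4 observations, n = 3 columns.
X = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [2.0, 1.0, 1.0]])
alpha_hat = np.array([0.0, 1.0, 1.0, 2.0])  # alpha fixed from Problem 2
eps = 1e-3  # small margin replacing the strict inequality

# hat alpha_i > hat alpha_j  =>  z_i >= z_j + eps; since z = X beta,
# this is the linear constraint (X[i] - X[j]) @ beta >= eps.
m = len(alpha_hat)
pairs = [(i, j) for i in range(m) for j in range(m)
         if alpha_hat[i] > alpha_hat[j]]

cons = [{"type": "eq", "fun": lambda b: b.sum() - 1.0}]
for i, j in pairs:
    d = X[i] - X[j]
    cons.append({"type": "ineq", "fun": lambda b, d=d: d @ b - eps})

res = minimize(lambda b: b @ b, x0=np.array([1.0, 0.0, 0.0]),  # feasible start
               jac=lambda b: 2.0 * b, bounds=[(0.0, None)] * X.shape[1],
               constraints=cons, method="SLSQP")
beta, z = res.x, X @ res.x
```

Since the objective is quadratic and all constraints are linear, a dedicated QP solver would be the more natural choice at scale; SLSQP is used here only to keep the sketch dependency-light.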
My question is whether Problem 2 or Problem 3 can be exploited to give a computationally feasible (iterative?) algorithm for Problem 1. I would of course also be interested in any other approach to efficiently solving Problem 1.
Note that the naïve algorithm of alternating between solving Problem 2 and solving Problem 3 cannot possibly converge to a solution of Problem 1, as neither Problem 2 nor Problem 3 depends on $\tau$.
Although it might be possible to prove that alternating between the two problems yields a convergent algorithm, intuitively it seems unlikely to achieve constraint satisfaction with certainty. For guaranteed convergence, this is the kind of problem that would typically be solved using continuous branch-and-bound. If you are a student or academic, you can test this with our Octeract Engine, which is free for non-commercial use.
That being said, a way to exploit the formulations algorithmically would be to warm-start the solution of Problem 1 with a feasible solution to either Problem 2 or Problem 3. This would start the algorithm at a point where a subset of the constraints is already satisfied.
You can experiment with either, but I suspect that the best way to go about it would be to solve Problem 2 first, which would give you a feasible point to the non-convex sub-problem. It should then be much easier to obtain a solution that satisfies the remaining constraints.
Answered by Nikos Kazazakis on August 19, 2021
I'm shooting from the hip here (meaning none of the following ideas are tested), but a few different possibilities for heuristics come to mind.
Answered by prubin on August 19, 2021