Operations Research Asked on August 19, 2021
\begin{align}\min&\quad\sum_{i=1}^N\frac{A_i}{x_i}\\\text{s.t.}&\quad\sum_{i=1}^N x_i \le X\\&\quad x_i \ge 0\end{align}
where each $A_i>0$ $(i\in\{1,\dots,N\})$ is a constant and each $x_i$ $(i\in\{1,\dots,N\})$ is a continuous optimization variable.
If some $A_i$ is negative, the problem is unbounded: the objective can be made arbitrarily small by taking the corresponding $x_i$ arbitrarily close to $0$.
Assuming $A_i \geq 0$, optimality is attained when all $\frac{A_i}{x_i^2}$ are equal (KKT optimality conditions), or equivalently when all $\frac{x_i^2}{A_i}$ are equal, and $\sum_i x_i = X$ (otherwise you could increase some $x_i$ and reduce the objective).
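For completeness, here is a short sketch of where that condition comes from, introducing a Lagrange multiplier $\lambda$ for the budget constraint (the nonnegativity constraints are inactive at the optimum when all $A_i>0$):
\begin{align}\mathcal{L}(x,\lambda)&=\sum_{i=1}^N\frac{A_i}{x_i}+\lambda\left(\sum_{i=1}^N x_i - X\right),\\\frac{\partial\mathcal{L}}{\partial x_i}&=-\frac{A_i}{x_i^2}+\lambda=0\quad\Longrightarrow\quad\frac{A_i}{x_i^2}=\lambda\ \text{for all }i.\end{align}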
Setting $x_i = \sqrt{A_i}\, y$, you can deduce $y = \frac{X}{\sum_j \sqrt{A_j}}$ from the constraint $\sum_i x_i = X$.
Therefore $x_i = \frac{\sqrt{A_i}}{\sum_j \sqrt{A_j}}\,X$.
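As a quick numerical sanity check, here is a minimal sketch (using hypothetical example values $A=(4,1,9)$ and $X=6$, not taken from the question) that computes the closed-form allocation and verifies the budget constraint and the equal-ratio condition:

```python
import numpy as np

# Hypothetical example data, for illustration only
A = np.array([4.0, 1.0, 9.0])   # A_i > 0
X = 6.0                          # budget on sum of x_i

# Closed-form optimum: x_i = sqrt(A_i) / sum_j sqrt(A_j) * X
x = np.sqrt(A) / np.sqrt(A).sum() * X

print("x           =", x)           # optimal allocation
print("sum(x)      =", x.sum())     # should equal X
print("A_i / x_i^2 =", A / x**2)    # should all be equal (KKT condition)
print("objective   =", np.sum(A / x))
```

For these values the allocation is $x=(2,1,3)$, the ratios $A_i/x_i^2$ are all equal to $1$, and the objective equals $(\sum_j\sqrt{A_j})^2/X = 6$, which matches plugging the closed form back into the objective.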
Answered by Gabriel Gouvine on August 19, 2021