SemidefiniteOptimization for operator norms: Stuck at the edge of dual feasibility

Mathematica Asked on September 16, 2020

Can someone give a workaround and/or explanation for why Problems 1 and 2 fail to solve through SemidefiniteOptimization? Problem 3 works. (I'm using 12.1.0 on Mac.) The main difference is that Problems 1 and 2 use a diagonal matrix in the constraint, whereas the matrix in Problem 3's constraint has no zeros. I could solve these without calling SemidefiniteOptimization, but I would prefer a single approach that covers a wide range of cases.

Problem 1

The implementation below fails with Stuck at the edge of dual feasibility.

Find the operator norm of $f(A)=5A$ by solving the following problem:

$$
\min_{A,x} x
$$

Subject to
$$
I \succeq A \succeq -I \\
x I \succeq -5 A
$$

d = 1;
ii = IdentityMatrix[d];
ClearAll[a];
X = 5*ii;
(* Symbolic symmetric d-by-d matrix *)
A = Array[a[Min[#1, #2], Max[#1, #2]] &, {d, d}];
vars = DeleteDuplicates[Flatten[A]];

cons0 = VectorGreaterEqual[{A, -ii}, {"SemidefiniteCone", d}];
cons1 = VectorGreaterEqual[{ii, A}, {"SemidefiniteCone", d}];
cons2 = VectorGreaterEqual[{x ii, -X.A}, {"SemidefiniteCone", d}];
SemidefiniteOptimization[x, cons0 && cons1 && cons2, {x}~Join~vars]
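One workaround sketch that is sometimes suggested for interior-point failures like this (I have not verified it fixes this particular instance): switch to one of the alternative solvers documented under the Method option, and/or loosen the convergence Tolerance.

```
(* Hedged workaround sketch: "DSDP" is a documented alternative to the
   default "CSDP" solver; Tolerance loosens the convergence criterion. *)
SemidefiniteOptimization[x, cons0 && cons1 && cons2, {x}~Join~vars,
 Method -> "DSDP", Tolerance -> 10^-6]
```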

Problem 2

This example in $2$ dimensions gives a different error: The matrix {{0.,1.},{2.,0.}} is not Hermitian or real and symmetric. Where did this matrix come from?

Find the operator norm of $f(A)=\left(\begin{smallmatrix}1&0\\0&2\end{smallmatrix}\right) A$ by solving the following problem:

$$
\min_{A,x} x
$$

Subject to
$$
I \succeq A \succeq -I \\
x I \succeq -\left(\begin{smallmatrix}1&0\\0&2\end{smallmatrix}\right) A
$$

d = 2;
ii = IdentityMatrix[d];
ClearAll[a];
extractVars[mat_] := DeleteDuplicates@Cases[Flatten@mat, _a];
A = Array[a[Min[#1, #2], Max[#1, #2]] &, {d, d}];
vars = extractVars[A];
X = DiagonalMatrix@Range@d;
cons0 = VectorGreaterEqual[{A, -ii}, {"SemidefiniteCone", d}];
cons1 = VectorGreaterEqual[{ii, A}, {"SemidefiniteCone", d}];
cons2 = VectorGreaterEqual[{x ii, -X.A}, {"SemidefiniteCone", d}];
SemidefiniteOptimization[x, cons0 && cons1 && cons2, {x}~Join~vars]
(* Prints SemidefiniteOptimization::herm: The matrix {{0.,1.},{2.,0.}} is not Hermitian or real and symmetric. *)
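A likely explanation (my reading, not confirmed by the documentation): the solver decomposes the affine expression x ii + X.A into one coefficient matrix per variable, and each must be symmetric. The coefficient of a[1, 2] is exactly the reported matrix, because X.A is not symmetric even though A is:

```
(* Coefficient matrix of a[1, 2] in x ii + X.A — the matrix from the
   error message: *)
D[x ii + X.A, a[1, 2]]  (* {{0, 1}, {2, 0}} *)

(* One possible workaround, assuming the intended constraint is on the
   symmetric part of X.A: symmetrize before forming the cone constraint. *)
sym[m_] := (m + Transpose[m])/2;
cons2sym = VectorGreaterEqual[{x ii, -sym[X.A]}, {"SemidefiniteCone", d}];
SemidefiniteOptimization[x, cons0 && cons1 && cons2sym, {x}~Join~vars]
```

Note that symmetrizing changes the constraint unless X.A was meant to enter only through its symmetric part, so treat this as a sketch rather than a drop-in fix.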

Problem 3

For this problem, SemidefiniteOptimization works, even though it seems harder than the previous two.

Find the operator norm of $f(A)=\sum_{i=1}^{d^2} V_i^\top A V_i$ by solving the following problem:

$$
\min_{A,x} x
$$

Subject to
$$
I \succeq A \succeq -I \\
x I \succeq -\sum_{i=1}^{d^2} V_i^\top A V_i
$$

d = 4;
SeedRandom[1];
ii = IdentityMatrix[d];
ClearAll[a];
extractVars[mat_] := DeleteDuplicates@Cases[Flatten@mat, _a];
A = Array[a[Min[#1, #2], Max[#1, #2]] &, {d, d}];
Vs = Table[RandomReal[{-1, 1}, {d, d}], {d^2}];
B = Total[Transpose[#].A.# & /@ Vs];
vars = extractVars[A];
cons0 = VectorGreaterEqual[{A, -ii}, {"SemidefiniteCone", d}];
cons1 = VectorGreaterEqual[{ii, A}, {"SemidefiniteCone", d}];
cons2 = VectorGreaterEqual[{x ii, -B}, {"SemidefiniteCone", d}];
solution = 
  A /. SemidefiniteOptimization[x, cons0 && cons1 && cons2, {x}~Join~vars];
Print["result matches Russo-Dye: ", 
 Norm[solution - IdentityMatrix[d]] < 10^-7] (* True *)
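For what it's worth, a plausible reason Problem 3 avoids Problem 2's herm error is that B inherits symmetry from A: each term Transpose[V].A.V is symmetric when A is, and sums of symmetric matrices stay symmetric, so every coefficient matrix handed to the semidefinite cone is symmetric. A quick check, run after the Problem 3 definitions:

```
(* B should be structurally symmetric, unlike X.A in Problem 2: *)
SymmetricMatrixQ[B]
```

(This does not explain Problem 1, where X = 5*ii keeps X.A symmetric and the failure is the dual-feasibility message instead.)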
