I’m trying to calculate operator norms of linear transformations over spaces of matrices. For instance, to find the norm of $f(A)=XA$, I optimize the following:

$$\max_{\|A\|=1} \|XA\|$$

This looks like a semidefinite program, but I’m having trouble solving it with `SemidefiniteOptimization`. The simplest failing example is finding the operator norm of $f(A)=5A$ in one dimension; it fails with `Stuck at the edge of dual feasibility`. Any suggestions?
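For reference, here is a numerical sanity check (in numpy, under my assumption that the norm above is the spectral norm): since $\|XA\|\le\|X\|\,\|A\|$ with equality at $A=I$, the operator norm of $f(A)=XA$ should equal the largest singular value of $X$, i.e. $5$ for the failing example.

```python
import numpy as np

# Expected answer for the 1-dimensional failing example f(A) = 5 A
# (assumption: operator norm here means the spectral norm).
X = np.array([[5.0]])

# Largest singular value of X = operator norm of A -> X.A
print(np.linalg.norm(X, 2))  # -> 5.0
```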

Constraints:

$$
A \succeq 0 \\
I \succeq A \\
x I \succeq -5 A
$$

Objective:

$$
\min_{A,x} x
$$

```
d = 1;
ii = IdentityMatrix[d];
ClearAll[a];
X = 5*ii;
(* Symbolic symmetric d-by-d matrix *)
A = Array[a[Min[#1, #2], Max[#1, #2]] &, {d, d}];
vars = DeleteDuplicates[Flatten[A]];
cons0 = VectorGreaterEqual[{A, 0}, {"SemidefiniteCone", d}];
cons1 = VectorGreaterEqual[{ii, A}, {"SemidefiniteCone", d}];
cons2 = VectorGreaterEqual[{x ii, -X.A}, {"SemidefiniteCone", d}];
SemidefiniteOptimization[x, cons0 && cons1 && cons2, {x}~Join~vars]
```

**Edit:** it seems I get the correct result if I drop `cons0`. Going to more dimensions still fails: it works when $X$ is a multiple of the identity, but when it’s any other diagonal matrix I get a message about some matrix not being symmetric. For instance, the example below fails in this way.

```
d = 2;
ii = IdentityMatrix[d];
ClearAll[a];
extractVars[mat_] := DeleteDuplicates@Cases[Flatten@mat, _a];
A = Array[a[Min[#1, #2], Max[#1, #2]] &, {d, d}];
vars = extractVars[A];
X = DiagonalMatrix@Range@d;
cons1 = VectorGreaterEqual[{ii, A}, {"SemidefiniteCone", d}];
cons2 = VectorGreaterEqual[{x ii, -X.A}, {"SemidefiniteCone", d}];
SemidefiniteOptimization[x, cons1 && cons2, {x}~Join~vars]
```
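A quick numeric illustration of what I suspect triggers the symmetry message (numpy here, just for illustration): the `"SemidefiniteCone"` constraint presumably expects a symmetric matrix, but when $X$ is diagonal and not a multiple of the identity, the product $X.A$ is generally not symmetric even though $A$ is.

```python
import numpy as np

# X = DiagonalMatrix@Range@d for d = 2, as in the failing example
X = np.diag([1.0, 2.0])

# A symmetric test matrix standing in for the symbolic A
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

XA = X @ A
print(np.allclose(XA, XA.T))  # -> False: X.A is not symmetric
```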