Mathematics Asked by NikoWielopolski on November 26, 2021
Let $A \in \mathbb{R}^{n \times n}$ be a symmetric, indefinite matrix. My general problem is to solve
$$\begin{array}{ll} \underset{Z \in \mathbb{R}^{n \times d}}{\text{minimize}} & \displaystyle\sum_{i \neq j} \left( A_{i,j} - (ZZ^T)_{i,j} \right)^2 \end{array}$$
An approximation of the solution may be achieved by minimizing
$$\| A - Z Z^T \|^2_{\text{Frob}}$$
with, for example, stochastic gradient descent. An observation I made is that, since we are not interested in the elements on the diagonal, we may just as well minimize
$$\sum_{i \neq j} \left( A_{i,j} + B_{i,j} - (ZZ^T)_{i,j} \right)^2$$
where $B$ is any real diagonal matrix. This allows us to approximate the solution further, as we may find a $B$ such that $A+B$ is positive definite; this is a simple consequence of the Gershgorin circle theorem (e.g. take $B_{i,i} > \sum_{j \neq i} |A_{i,j}|$, which makes $A+B$ strictly diagonally dominant). We could then use a singular value decomposition of $A+B$:
$$A+B = UDU^T$$
$$A+B \approx U_d D_d U_d^T$$
where $U_d$ is the $n \times d$ matrix of the first $d$ columns of $U$ and $D_d$ is the $d \times d$ matrix of the first $d$ rows and $d$ columns of $D$ (recall that the SVD of a real matrix is $A = UDV^T$, but when $A$ is symmetric and positive definite, it simplifies to $A = UDU^T$, i.e. it coincides with the eigendecomposition). We could then set the approximate solution as $Z = U_d D_d^{\frac{1}{2}}$. While minimizing the Frobenius norm of $A - ZZ^T$ with SGD gives better solutions for dense $A$ and small $d$, the SVD solution gives better results for sparse matrices $A$ and is faster to compute overall. We may, however, rewrite the optimization task as:
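A minimal NumPy sketch of this construction (the sizes $n=8$, $d=3$ and the random test matrix are illustrative assumptions; `np.linalg.eigh` is used because, for a symmetric positive semidefinite matrix, the eigendecomposition coincides with the SVD):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 3

# Symmetric indefinite test matrix with zero diagonal (illustrative data).
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)

# Gershgorin shift: B_ii = sum_j |A_ij| makes A + B diagonally dominant
# with nonnegative diagonal, hence positive semidefinite.
B = np.diag(np.abs(A).sum(axis=1))
M = A + B

# Eigendecomposition of the symmetric PSD matrix M.
w, U = np.linalg.eigh(M)          # eigenvalues in ascending order
idx = np.argsort(w)[::-1][:d]     # keep the d largest eigenvalues
Z = U[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))  # Z = U_d D_d^{1/2}

# Off-diagonal squared error of the rank-d approximation.
R = M - Z @ Z.T
off_diag_error = np.sum(R**2) - np.sum(np.diag(R)**2)
print(off_diag_error)
```

Adding a small positive constant to the diagonal of $B$ would make $A+B$ strictly positive definite, but semidefiniteness already suffices for the factorization $Z = U_d D_d^{1/2}$ to be real.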
$$\sum_{i \neq j} \left( A_{i,j} + B_{i,j} - (ZZ^T)_{i,j} \right)^2 = \|A+B-ZZ^T\|^2_{\text{Frob}} - \sum_{i} \left( B_{i,i} - (ZZ^T)_{i,i} \right)^2$$
where we assume that $A$ has zeroes on the diagonal, so the subtracted term is exactly the sum of squared diagonal entries of $B - ZZ^T$.
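This decomposition can be checked numerically; in the sketch below the matrix sizes and random inputs are illustrative assumptions, $B$ is an arbitrary real diagonal matrix, and the diagonal correction is computed as $\sum_i (B_{i,i} - (ZZ^T)_{i,i})^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 2

# Symmetric A with zero diagonal, as assumed in the identity.
A = rng.standard_normal((n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)

B = np.diag(rng.standard_normal(n))    # arbitrary real diagonal matrix
Z = rng.standard_normal((n, d))

M = A + B - Z @ Z.T
lhs = np.sum(M**2) - np.sum(np.diag(M)**2)       # sum over i != j
rhs = (np.linalg.norm(M, 'fro')**2
       - np.sum((np.diag(B) - np.sum(Z**2, axis=1))**2))
print(np.isclose(lhs, rhs))   # prints True
```

The check works because $\operatorname{diag}(A+B-ZZ^T) = \operatorname{diag}(B) - \operatorname{diag}(ZZ^T)$ when $A$ has a zero diagonal.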
Does anyone know whether there exists an exact solution to the problem?