Reducible Singular Value Decomposition

July 4, 2018

While reading a review article on the density-matrix renormalization group (DMRG), I encountered a linear algebra exercise based on the last paragraph of page 125.

The paper summarizes the result of the exercise in one sentence: “they are reduced basis transformations to orthonormal subspaces, hence the largest singular value must be less or equal to that of $S$”. However, I found that the proof requires more than a single sentence, so it is worth posting the exercise together with a solution here.


We first briefly introduce the singular value decomposition (SVD) and another matrix decomposition, which we call the reducible SVD, as preliminaries for the exercise.

Let $A \in \mathbb{R}^{n \times n}$ be a real $n$ by $n$ full-rank matrix. The SVD of $A$ is $A = U\Sigma V^\top$, where $U \in \mathbb{R}^{n \times n}$ is an orthogonal matrix whose columns are the left singular vectors, $\Sigma \in \mathbb{R}^{n \times n}$ is a diagonal matrix with $A$'s singular values on the diagonal in decreasing order, and $V \in \mathbb{R}^{n \times n}$ is another orthogonal matrix whose columns are the right singular vectors. It is well known that the SVD is unique up to a complex sign (when the singular vectors are restricted to real matrices, the complex sign is simply $\pm 1$).
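A minimal numerical sketch of these defining properties, using NumPy (the random matrix and size are illustrative, not part of the exercise):

```python
import numpy as np

# Compute the SVD of a random full-rank square matrix and check
# the properties stated above: orthogonal factors, positive singular
# values in decreasing order, and exact reconstruction of A.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # full rank with probability one

U, sigma, Vt = np.linalg.svd(A)  # A = U @ diag(sigma) @ Vt

assert np.allclose(U @ np.diag(sigma) @ Vt, A)  # reconstruction
assert np.allclose(U.T @ U, np.eye(n))          # U orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(n))        # V orthogonal
assert np.all(np.diff(sigma) <= 0)              # decreasing order
assert np.all(sigma > 0)                        # full rank => positive
```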

We now define another decomposition which has the same structure as the SVD,

\[A = P S Q^\top\]

where $P \in \mathbb{R}^{n \times m}$ satisfies $PP^\top = I$ (its rows are orthonormal), $S \in \mathbb{R}^{m \times m}$ is a diagonal matrix with positive diagonal entries in decreasing order, $Q \in \mathbb{R}^{n \times m}$ satisfies $QQ^\top = I$, and $I$ denotes the identity matrix. This differs from the usual setting, where $m$ denotes the numerical rank and is therefore bounded above by $n$; in our exercise we have $m \geq n$. In this post, we call such a decomposition a reducible SVD.

If $m$ is strictly larger than $n$, such a decomposition is not unique, as the following trivial example shows:

\[1 = \begin{bmatrix} \frac{\sqrt{2}}{2} & \frac{\sqrt{2}}{2} \end{bmatrix} \cdot \begin{bmatrix} 1 & \\ & 1 \end{bmatrix} \cdot \begin{bmatrix} \frac{\sqrt{2}}{2} \\ \frac{\sqrt{2}}{2} \end{bmatrix} = \begin{bmatrix} \frac{\sqrt{2}}{2} & \frac{\sqrt{2}}{2} \end{bmatrix} \cdot \begin{bmatrix} 1.5 & \\ & 0.5 \end{bmatrix} \cdot \begin{bmatrix} \frac{\sqrt{2}}{2} \\ \frac{\sqrt{2}}{2} \end{bmatrix}.\]
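The example above is small enough to check by hand, but here is a numerical confirmation that both factorizations are valid reducible SVDs of the $1 \times 1$ matrix $[1]$ even though their diagonal factors differ:

```python
import numpy as np

# Both factorizations from the example: same P and Q (rows orthonormal),
# two different positive decreasing diagonal factors, same product [1].
c = np.sqrt(2) / 2
P = np.array([[c, c]])  # 1 x 2, P @ P.T = I
Q = np.array([[c, c]])  # 1 x 2, Q @ Q.T = I

S1 = np.diag([1.0, 1.0])
S2 = np.diag([1.5, 0.5])

assert np.allclose(P @ P.T, np.eye(1))
assert np.allclose(Q @ Q.T, np.eye(1))
assert np.allclose(P @ S1 @ Q.T, [[1.0]])  # first factorization
assert np.allclose(P @ S2 @ Q.T, [[1.0]])  # second factorization
```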


With these definitions, the exercise can be summarized as follows. Let $P S Q^\top$ be any reducible SVD of $A$. Then the following statement holds:

\[\max_i \sigma_i \leq \max_i s_{i},\]

where $\sigma_i$ denotes the $i$-th singular value of $A$ and $s_{i}$ is the $i$-th diagonal entry of $S$. In other words, the largest singular value is less than or equal to the largest reducible singular value.


Now we solve the exercise. Since both the SVD and the reducible SVD are decompositions of $A$, we have

\[A = U \Sigma V^\top = P S Q^\top \Leftrightarrow \Sigma = U^\top P S Q^\top V = G S H^\top,\]

where $G = U^\top P$ satisfies $GG^\top = I$, and $H = V^\top Q$ satisfies $HH^\top = I$. In terms of $G$, $S$, and $H$, the largest singular value of $A$ can be written as

\[\sigma_1 = \sum_i G_{1,i} s_i H_{1,i},\]

where $G_{1,i}$ denotes the $(1,i)$-th entry of $G$, and $H_{1,i}$ is defined analogously. The conditions $GG^\top = I$ and $HH^\top = I$ lead to

\[\sum_i G_{1,i}^2 = 1 \quad \text{and} \quad \sum_i H_{1,i}^2 = 1.\]
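These intermediate identities can be verified numerically. The sketch below builds a reducible SVD of a random matrix first and recovers $G$ and $H$ from it; the construction via QR factorizations and the sizes $n = 3$, $m = 5$ are illustrative assumptions, not part of the exercise:

```python
import numpy as np

# Build a random reducible SVD A = P S Q^T with m > n, then check
# Sigma = G S H^T, the row-orthonormality of G and H, and the
# expansion sigma_1 = sum_i G_{1,i} s_i H_{1,i}.
rng = np.random.default_rng(2)
n, m = 3, 5

# n x m matrices with orthonormal rows, via QR of random m x n matrices
P = np.linalg.qr(rng.standard_normal((m, n)))[0].T
Q = np.linalg.qr(rng.standard_normal((m, n)))[0].T
s = np.sort(rng.uniform(0.1, 2.0, size=m))[::-1]  # positive, decreasing
A = P @ np.diag(s) @ Q.T

U, sigma, Vt = np.linalg.svd(A)
G = U.T @ P   # G = U^T P
H = Vt @ Q    # H = V^T Q

assert np.allclose(G @ G.T, np.eye(n))                  # rows of G orthonormal
assert np.allclose(H @ H.T, np.eye(n))                  # rows of H orthonormal
assert np.allclose(G @ np.diag(s) @ H.T, np.diag(sigma))  # Sigma = G S H^T
assert np.isclose(np.sum(G[0] ** 2), 1.0)
assert np.isclose(np.sum(H[0] ** 2), 1.0)
assert np.isclose(sigma[0], np.sum(G[0] * s * H[0]))    # sigma_1 expansion
```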

Then we have,

\[\sigma_1^2 = \left( \sum_i G_{1,i} s_i H_{1,i} \right)^2 \leq \left( \sum_i G_{1,i}^2 \right) \cdot \left( \sum_i s_i^2 H_{1,i}^2 \right) \leq \left( \sum_i H_{1,i}^2 \right) \cdot \max_i s_i^2 = s_1^2,\]

where the first inequality is the Cauchy–Schwarz inequality and the second uses $\sum_i G_{1,i}^2 = 1$ together with Hölder's inequality with $p=1$ and $q=\infty$. Finally, since both $\sigma_1$ and $s_1$ are positive, we conclude $\max_i \sigma_i = \sigma_1 \leq s_1 = \max_i s_i$.
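As a sanity check of the conclusion (an illustration, not a substitute for the proof), we can draw many random reducible SVDs with $m > n$ and confirm that the largest singular value of $A$ never exceeds the largest diagonal entry of $S$; the construction and sizes below are illustrative:

```python
import numpy as np

# Repeatedly build A = P S Q^T from random reducible SVDs with m > n
# and verify that sigma_1(A) <= s_1, as proved above.
rng = np.random.default_rng(1)
n, m = 3, 5

def orthonormal_rows(n, m, rng):
    """n x m matrix with orthonormal rows, via QR of a random m x n matrix."""
    Q, _ = np.linalg.qr(rng.standard_normal((m, n)))
    return Q.T  # Q.T @ Q.T.T = I_n

for _ in range(100):
    P = orthonormal_rows(n, m, rng)
    Qm = orthonormal_rows(n, m, rng)
    s = np.sort(rng.uniform(0.1, 2.0, size=m))[::-1]  # positive, decreasing
    A = P @ np.diag(s) @ Qm.T
    sigma_max = np.linalg.svd(A, compute_uv=False)[0]
    assert sigma_max <= s[0] + 1e-12  # largest singular value <= s_1
```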


Yingzhou Li


I'm a faculty member at Fudan University. Follow me on GitHub and Bitbucket. Hopefully, you will find my code useful. You are also welcome to email me or post comments below to discuss these posts.