Two-dimensional singular-value decomposition

In linear algebra, two-dimensional singular-value decomposition (2DSVD) computes the low-rank approximation of a set of matrices such as 2D images or weather maps in a manner almost identical to SVD (singular-value decomposition) which computes the low-rank approximation of a single matrix (or a set of 1D vectors).

SVD

Let the matrix $X = [\mathbf{x}_1, \ldots, \mathbf{x}_n]$ contain the set of 1D vectors, which have been centered. In PCA/SVD, we construct the covariance matrix $F$ and the Gram matrix $G$,

$$F = X X^{\mathsf{T}}, \qquad G = X^{\mathsf{T}} X,$$

and compute their eigenvector matrices $U$ and $V$. Since $U U^{\mathsf{T}} = I$ and $V V^{\mathsf{T}} = I$, we have

$$X = U U^{\mathsf{T}} X V V^{\mathsf{T}} = U (U^{\mathsf{T}} X V) V^{\mathsf{T}} = U \Sigma V^{\mathsf{T}}.$$

If we retain only the $K$ principal eigenvectors in $U$ and $V$, this gives a rank-$K$ approximation of $X$.
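As a concrete sketch of this equivalence (using NumPy; the matrix sizes and the rank $K$ below are illustrative), the truncated SVD of a centered data matrix coincides with the projection onto the top-$K$ eigenvectors of $F = X X^{\mathsf{T}}$:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 20))       # 8-dim features, 20 samples (illustrative sizes)
X = X - X.mean(axis=1, keepdims=True)  # center the data

K = 3
# Rank-K approximation via truncated SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_k = U[:, :K] @ np.diag(s[:K]) @ Vt[:K, :]

# Equivalent route via the eigenvectors of F = X X^T:
# project X onto the subspace spanned by the top-K eigenvectors
_, Uf = np.linalg.eigh(X @ X.T)        # eigh returns eigenvalues in ascending order
Uk = Uf[:, ::-1][:, :K]                # top-K eigenvectors of F
X_k2 = Uk @ (Uk.T @ X)                 # U_K (U_K^T X), the rank-K projection
```

Both routes yield the same rank-$K$ matrix, since the projector onto the top-$K$ left singular subspace is unique when the eigenvalue gap is positive.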

2DSVD

Here we deal with a set of 2D matrices $(X_1, \ldots, X_n)$. Suppose they are centered: $\sum_i X_i = 0$. We construct the row–row and column–column covariance matrices

$$F = \sum_i X_i X_i^{\mathsf{T}} \qquad \text{and} \qquad G = \sum_i X_i^{\mathsf{T}} X_i$$

in exactly the same manner as in SVD, and compute their eigenvector matrices $U$ and $V$. We approximate each $X_i$ as

$$X_i \approx U U^{\mathsf{T}} X_i V V^{\mathsf{T}} = U (U^{\mathsf{T}} X_i V) V^{\mathsf{T}} = U M_i V^{\mathsf{T}}$$

in identical fashion as in SVD. This gives a near-optimal low-rank approximation of $(X_1, \ldots, X_n)$ under the objective function

$$J = \sum_{i=1}^n \left| X_i - U M_i V^{\mathsf{T}} \right|^2.$$
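The construction above can be sketched directly in NumPy (the matrix sizes and the retained dimensions $k_1, k_2$ below are illustrative, not prescribed by the method):

```python
import numpy as np

rng = np.random.default_rng(1)
Xs = [rng.standard_normal((10, 12)) for _ in range(5)]  # a set of 2D matrices
mean = sum(Xs) / len(Xs)
Xs = [X - mean for X in Xs]            # center so that sum_i X_i = 0

# Row-row and column-column covariance matrices
F = sum(X @ X.T for X in Xs)
G = sum(X.T @ X for X in Xs)

k1, k2 = 4, 4                          # number of retained eigenvectors (illustrative)
_, Uf = np.linalg.eigh(F)              # eigh returns eigenvalues in ascending order
_, Vg = np.linalg.eigh(G)
U = Uf[:, ::-1][:, :k1]                # top-k1 eigenvectors of F
V = Vg[:, ::-1][:, :k2]                # top-k2 eigenvectors of G

Ms = [U.T @ X @ V for X in Xs]         # k1 x k2 core matrices M_i
Xs_hat = [U @ M @ V.T for M in Ms]     # low-rank reconstructions U M_i V^T
J = sum(np.linalg.norm(X - Xh, "fro") ** 2 for X, Xh in zip(Xs, Xs_hat))
```

Each $X_i$ is compressed from $10 \times 12$ entries to a $4 \times 4$ core matrix $M_i$, with $U$ and $V$ shared across the whole set.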

Error bounds similar to the Eckart–Young theorem also exist.

2DSVD is mostly used in image compression and representation.

References

  • Chris Ding and Jieping Ye. "Two-dimensional Singular Value Decomposition (2DSVD) for 2D Maps and Images". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp. 32–43, April 2005. http://ranger.uta.edu/~chqding/papers/2dsvdSDM05.pdf
  • Jieping Ye. "Generalized Low Rank Approximations of Matrices". Machine Learning Journal. Vol. 61, pp. 167–191, 2005.