Next: Orthogonal Projections Up: CSL361 Problem set 5: Previous: SVD

Least-squares

  1. Consider the least-squares problem:
    Find the least-squares solution to the $ m \times n$ set of equations $ {\bf Ax} = {\bf b}$, where $ m > n$ and $ {\bf A}$ has full rank $ n$.
    Show that the following constitute a solution:
    1. Find the SVD $ {\bf A} = {\bf U}{\bf\Sigma}{\bf V}^T$
    2. Set $ {\bf b'} = {\bf U}^T{\bf b}$
    3. Find the vector $ {\bf y}$ defined by $ y_i = b'_i/\sigma_i$, where $ \sigma_i$ is the $ i^{th}$ diagonal entry of $ {\bf\Sigma}$
    4. The solution is $ {\bf x} = {\bf Vy}$
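    The four steps above can be sketched in NumPy (a sketch assuming $ {\bf A}$ has full column rank; the function name is illustrative, not from the problem set):

    ```python
    import numpy as np

    def lstsq_svd(A, b):
        """Least-squares solution of Ax = b via the SVD (assumes rank(A) = n)."""
        # 1. SVD: A = U Sigma V^T (thin form: U is m x n, s holds the sigma_i)
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        # 2. b' = U^T b
        bp = U.T @ b
        # 3. y_i = b'_i / sigma_i
        y = bp / s
        # 4. x = V y
        return Vt.T @ y
    ```

    The result agrees with `np.linalg.lstsq`, which uses the same SVD-based approach internally.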
  2. Consider the least-squares problem:
    Find the general least-squares solution to the $ m \times n$ set of equations $ {\bf Ax} = {\bf b}$, where $ m > n$ and $ {\bf A}$ has rank $ r < n$.
    Show that the following constitute a solution:
    1. Find the SVD $ {\bf A} = {\bf U}{\bf\Sigma}{\bf V}^T$
    2. Set $ {\bf b'} = {\bf U}^T{\bf b}$
    3. Find the vector $ {\bf y}$ defined by $ y_i = b'_i/\sigma_i$, for $ i = 1,\ldots,r$, and $ y_i = 0$ otherwise.
    4. The solution $ {\bf x}$ of minimum norm $ \Vert{\bf x}\Vert$ is $ {\bf Vy}$
    5. The general solution is

      $\displaystyle {\bf x} = {\bf Vy} + \lambda_{r+1}{\bf v}_{r+1} + \ldots +
\lambda_{n}{\bf v}_{n}
$

      where $ {\bf v}_{r+1}, \ldots, {\bf v}_{n}$ are the last $ n - r$ columns of $ {\bf V}$.
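    The minimum-norm recipe can be sketched in NumPy (the rank $ r$ is estimated with a tolerance on the singular values; names are illustrative):

    ```python
    import numpy as np

    def lstsq_min_norm(A, b, tol=1e-10):
        """Minimum-norm least-squares solution of Ax = b for rank-deficient A."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        bp = U.T @ b
        # y_i = b'_i / sigma_i for the r non-zero singular values, 0 otherwise
        y = np.zeros_like(bp)
        nonzero = s > tol
        y[nonzero] = bp[nonzero] / s[nonzero]
        return Vt.T @ y
    ```

    For a rank-deficient system this matches the pseudo-inverse solution $ {\bf A}^+{\bf b}$.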
  3. Given a square diagonal matrix $ \Sigma$, its pseudo-inverse is defined as the diagonal matrix $ \Sigma^+$ such that

    $\displaystyle \sigma_{ii}^+ = \left\{
\begin{array}{cc}
0 & \mbox{if $\sigma_{ii} = 0$} \\
\sigma_{ii}^{-1} & \mbox{otherwise}
\end{array}\right.
$

    For an $ m \times n$ matrix $ A$ with $ m \geq n$ and SVD $ A = U \Sigma V^T$, its pseudo-inverse is defined as

    $\displaystyle A^+ = V \Sigma^+ U^T
$

    Show that the least-squares solution to an $ m \times n$ system of equations $ {\bf Ax} = {\bf b}$ of rank $ n$ is given by $ {\bf x} = {\bf A}^+{\bf b}$. In the case of a deficient-rank system, $ {\bf x} = {\bf A}^+{\bf b}$ is the solution that minimizes $ \Vert{\bf x}\Vert$.
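    The construction $ A^+ = V \Sigma^+ U^T$ can be sketched directly (a sketch; zero singular values are detected with a tolerance):

    ```python
    import numpy as np

    def pinv_svd(A, tol=1e-10):
        """Pseudo-inverse A+ = V Sigma+ U^T built from the SVD."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        s_plus = np.zeros_like(s)
        nonzero = s > tol
        s_plus[nonzero] = 1.0 / s[nonzero]     # sigma_ii^+ = 1/sigma_ii, or 0
        return Vt.T @ (s_plus[:, None] * U.T)  # V Sigma+ U^T
    ```

    This agrees with `np.linalg.pinv` on both full-rank and rank-deficient matrices.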
  4. Show that if $ {\bf A}$ is an $ m \times n$ matrix of rank $ n$, then $ {\bf A}^+ = ({\bf A}^T{\bf A})^{-1}{\bf A}^T$ and, in general, a least-squares solution can be obtained by solving the normal equations

    $\displaystyle ({\bf A}^T{\bf A}){\bf x} = {\bf A}^T{\bf b}
$
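    A minimal sketch of the normal-equations route (valid when $ {\bf A}$ has full column rank; note that the condition number of $ {\bf A}^T{\bf A}$ is the square of that of $ {\bf A}$, so the SVD route is preferred for ill-conditioned problems):

    ```python
    import numpy as np

    def lstsq_normal(A, b):
        """Least-squares via the normal equations (A^T A) x = A^T b.

        Assumes A has full column rank, so A^T A is invertible.
        """
        return np.linalg.solve(A.T @ A, A.T @ b)
    ```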

  5. Weighted least-squares: Let $ {\bf C}$ be a positive definite matrix. Then the $ {\bf C}$-norm is defined as $ \Vert{\bf a}\Vert _{\bf C} = ({\bf a}^T{\bf C}{\bf a})^{1/2}$. The weighted least-squares problem is one of minimizing $ \Vert{\bf Ax} - {\bf b}\Vert _{\bf C}$. The most common weighting is when $ {\bf C}$ is diagonal. Show that the weighted least-squares solution can be obtained by solving:

    $\displaystyle ({\bf A}^T{\bf C}{\bf A}){\bf x} = {\bf A}^T{\bf C}{\bf b}
$
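    A sketch of the weighted normal equations in NumPy; for diagonal $ {\bf C}$ the solution coincides with ordinary least squares applied to $ {\bf C}^{1/2}{\bf A}$ and $ {\bf C}^{1/2}{\bf b}$, which the test below checks:

    ```python
    import numpy as np

    def lstsq_weighted(A, C, b):
        """Weighted least-squares: solve (A^T C A) x = A^T C b.

        Assumes C is positive definite and A has full column rank.
        """
        return np.linalg.solve(A.T @ C @ A, A.T @ C @ b)
    ```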

  6. Consider the constrained least squares problem:
    Given $ {\bf A}$ of size $ n \times n$, find $ {\bf x}$ that minimizes $ \Vert{\bf Ax}\Vert$ subject to $ \Vert{\bf x}\Vert = 1$.
    Show that the solution is given by the last column of $ {\bf V}$ where $ {\bf A} = {\bf U}{\bf\Sigma}{\bf V}^T$ is the SVD of $ {\bf A}$.
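    This can be sketched in one line of NumPy; the minimizer is the right singular vector belonging to the smallest singular value, and the attained minimum $ \Vert{\bf Ax}\Vert$ equals that singular value:

    ```python
    import numpy as np

    def min_unit_norm(A):
        """Unit vector x minimizing ||Ax||: the last column of V
        in the SVD A = U Sigma V^T."""
        _, _, Vt = np.linalg.svd(A)
        return Vt[-1]   # last row of V^T == last column of V
    ```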
  7. Consider the following constrained least-squares problem: Given an $ m \times n$ matrix $ {\bf A}$ with $ m \geq n$, find the vector $ {\bf x}$ that minimizes $ \Vert{\bf Ax}\Vert$ subject to $ \Vert{\bf x}\Vert = 1$ and $ {\bf Cx} = {\bf0}$.
    Show that a solution is given as:
    1. If $ {\bf C}$ has fewer rows than columns, then add $ {\bf0}$ rows to $ {\bf C}$ to make it square. Compute the SVD $ {\bf C} = {\bf U}{\bf\Sigma}{\bf V}^T$. Let $ {\bf C}^\perp$ be the matrix obtained from $ {\bf V}$ after deleting the first $ r$ columns where $ r$ is the number of non-zero entries in $ {\bf\Sigma}$.
    2. Find the solution $ {\bf x'}$ to the minimization of $ \Vert{\bf A}{\bf C}^\perp{\bf x'}\Vert$ subject to $ \Vert{\bf x'}\Vert = 1$ (since the columns of $ {\bf C}^\perp$ are orthonormal, $ \Vert{\bf C}^\perp{\bf x'}\Vert = \Vert{\bf x'}\Vert$, so the constraint $ \Vert{\bf x}\Vert = 1$ is preserved).
    3. The solution is obtained as $ {\bf x} = {\bf C}^\perp{\bf x'}$.
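    The three steps above can be sketched as follows (a sketch; the rank of $ {\bf C}$ is estimated with a tolerance, and the inner minimization reuses the last-column-of-$ {\bf V}$ result from the previous problem):

    ```python
    import numpy as np

    def min_unit_norm_given_Cx0(A, C, tol=1e-10):
        """Minimize ||Ax|| subject to ||x|| = 1 and Cx = 0."""
        p, n = C.shape
        if p < n:                        # pad C with zero rows to make it square
            C = np.vstack([C, np.zeros((n - p, n))])
        _, s, Vt = np.linalg.svd(C)
        r = int(np.sum(s > tol))         # rank of C
        C_perp = Vt.T[:, r:]             # orthonormal basis for the null space of C
        # minimize ||A C_perp x'|| subject to ||x'|| = 1
        _, _, Vt2 = np.linalg.svd(A @ C_perp)
        return C_perp @ Vt2[-1]
    ```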
  8. Consider the following constrained least-squares problem: Given an $ m \times n$ matrix $ {\bf A}$ with $ m \geq n$, find the vector $ {\bf x}$ that minimizes $ \Vert{\bf Ax}\Vert$ subject to $ \Vert{\bf x}\Vert = 1$ and $ {\bf x} = {\bf G}\hat{\bf x}$ for some $ \hat{\bf x}$, i.e., $ {\bf x}$ lies in the column space of a given matrix $ {\bf G}$.
    Show that a solution is given as:
    1. Compute the SVD $ {\bf G} = {\bf U}{\bf\Sigma}{\bf V}^T$.
    2. Let $ {\bf U'}$ be the matrix of the first $ r$ columns of $ {\bf U}$, where $ r$ is the number of non-zero singular values of $ {\bf G}$ (the rank of $ {\bf G}$).
    3. Find the solution to minimization of $ \Vert{\bf A}{\bf U'}{\bf x'}\Vert$ subject to $ \Vert{\bf x'}\Vert = 1$.
    4. The solution is obtained as $ {\bf x} = {\bf U'}{\bf x'}$.
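    Assuming, as the steps suggest, that the second constraint restricts $ {\bf x}$ to the column space of $ {\bf G}$, the recipe can be sketched as (names illustrative; the inner step again takes the last column of the $ {\bf V}$ factor):

    ```python
    import numpy as np

    def min_unit_norm_in_range(A, G, tol=1e-10):
        """Minimize ||Ax|| subject to ||x|| = 1 and x in the column space of G."""
        U, s, _ = np.linalg.svd(G)
        r = int(np.sum(s > tol))      # rank of G
        Up = U[:, :r]                 # U': orthonormal basis for the range of G
        # minimize ||A U' x'|| subject to ||x'|| = 1
        _, _, Vt2 = np.linalg.svd(A @ Up)
        return Up @ Vt2[-1]
    ```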

Subhashis Banerjee 2005-10-03