In linear algebra, a QR decomposition (also called a QR factorization) of a matrix is a factorization of the matrix into the product of an orthogonal matrix and a right triangular matrix. QR decomposition is often used to solve the linear least squares problem, and is the basis for a particular eigenvalue algorithm, the QR algorithm.

**Square matrix**

A QR decomposition of a real square matrix $A$ is a decomposition of $A$ as $A = QR$, where $Q$ is an orthogonal matrix (meaning that $Q^T Q = I$) and $R$ is an upper triangular matrix (also called a right triangular matrix). This generalizes to a complex square matrix $A$ and a unitary matrix $Q$. If $A$ is nonsingular, then this factorization is unique if we require that the diagonal elements of $R$ are positive.
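The defining properties above are easy to check numerically. The following is a minimal sketch (my own illustration, not from the original post) using NumPy's `numpy.linalg.qr` on a small square matrix:

```python
import numpy as np

# Sample 2x2 matrix; any nonsingular real matrix works here.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
Q, R = np.linalg.qr(A)

# Q is orthogonal: Q^T Q = I (up to rounding error).
print(np.allclose(Q.T @ Q, np.eye(2)))   # True
# R is upper triangular: entries below the diagonal vanish.
print(np.allclose(R, np.triu(R)))        # True
# The factors reproduce A.
print(np.allclose(Q @ R, A))             # True
```

Note that NumPy does not enforce a positive diagonal of $R$, so its output may differ from the "unique" factorization by signs.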

**Rectangular matrix**

More generally, we can factor a complex $m \times n$ matrix $A$, with $m \geq n$, as the product of an $m \times m$ unitary matrix $Q$ and an $m \times n$ upper triangular matrix $R$. As the bottom $(m-n)$ rows of an $m \times n$ upper triangular matrix consist entirely of zeroes, it is often useful to partition $R$, or both $R$ and $Q$:

$\displaystyle A = QR = Q\begin{pmatrix} R_1 \\ 0 \end{pmatrix} = \begin{pmatrix} Q_1 & Q_2 \end{pmatrix}\begin{pmatrix} R_1 \\ 0 \end{pmatrix} = Q_1 R_1,$

where $R_1$ is an $n \times n$ upper triangular matrix, $0$ is $(m-n) \times n$, $Q_1$ is $m \times n$, $Q_2$ is $m \times (m-n)$, and $Q_1$ and $Q_2$ both have orthogonal columns.

Golub & Van Loan call $Q_1 R_1$ the thin QR factorization of $A$. If $A$ is of full rank $n$ and we require that the diagonal elements of $R_1$ are positive, then $R_1$ and $Q_1$ are unique, but in general $Q_2$ is not. $R_1$ is then equal to the upper triangular factor of the Cholesky decomposition of $A^* A$ ($A^T A$ if $A$ is real).
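The connection to the Cholesky decomposition can be verified directly. Below is a hedged sketch (my own illustration): for a real full-rank tall matrix $A$, the thin QR factor $R_1$, once its diagonal is made positive, equals the upper triangular Cholesky factor of $A^T A$:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])            # 3x2, full rank

Q1, R1 = np.linalg.qr(A)              # "reduced" (thin) QR: Q1 is 3x2, R1 is 2x2
signs = np.sign(np.diag(R1))
R1 = signs[:, None] * R1              # flip rows so that diag(R1) > 0
Q1 = Q1 * signs[None, :]              # compensate in Q1 so Q1 @ R1 is unchanged

L = np.linalg.cholesky(A.T @ A)       # A^T A = L L^T with L lower triangular
print(np.allclose(R1, L.T))           # True: R1 is the Cholesky factor of A^T A
```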

**QL, RQ and LQ decompositions**

Analogously, we can define QL, RQ, and LQ decompositions, with $L$ being a lower triangular matrix.

**How to solve the linear least squares problem**

Having the QR decomposition, that is, $A = QR$, one can solve the least squares problem $Ax = b$, where $A$ is an $m \times n$ matrix with $m \geq n$ and $b \in \mathbb{R}^m$. To be precise, by means of the thin factorization $A = Q_1 R_1$, we see that $Q_1$ is an $m \times n$ matrix and $R_1$ is an $n \times n$ matrix. Therefore, we can assume both $Q$ and $R$ to be $Q_1$ and $R_1$, respectively.

Now the equation $Ax = b$ is equivalent to $Rx = Q^T b$. Thus $x = R^{-1} Q^T b$, which can be computed cheaply by back substitution since $R$ is upper triangular.
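This recipe can be sketched in a few lines of NumPy (my own illustration; the data below is an arbitrary line-fitting example, not from the post):

```python
import numpy as np

# Fit a line c0 + c1*t through three points in the least squares sense.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

Q, R = np.linalg.qr(A)              # thin QR: Q is 3x2, R is 2x2
x = np.linalg.solve(R, Q.T @ b)     # solve R x = Q^T b

# Compare against NumPy's own least squares routine.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))        # True
```

A production code would use a dedicated triangular solver instead of the general `np.linalg.solve`, but the result is the same.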

**Computing the QR decomposition**

There are several methods for actually computing the QR decomposition, such as by means of the Gram–Schmidt process, Householder transformations, or Givens rotations. Each has a number of advantages and disadvantages. In this section, we mainly consider the Gram–Schmidt process. More precisely, consider the Gram–Schmidt process, with the vectors to be considered in the process as the columns of the matrix $A = (a_1 \mid a_2 \mid \cdots \mid a_n)$. Assume that after finishing the Gram–Schmidt process, we obtain the orthonormal vectors $e_1, e_2, \dots, e_n$. Therefore, we put

$\displaystyle Q = (e_1 \mid e_2 \mid \cdots \mid e_n)$

and thus,

$\displaystyle R = Q^T A = \begin{pmatrix} \langle e_1, a_1 \rangle & \langle e_1, a_2 \rangle & \cdots & \langle e_1, a_n \rangle \\ 0 & \langle e_2, a_2 \rangle & \cdots & \langle e_2, a_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \langle e_n, a_n \rangle \end{pmatrix}.$
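The process just described can be sketched as a short function. This is a minimal illustration of classical Gram–Schmidt (my own code, assuming the columns of $A$ are linearly independent); in practice, Householder-based routines are preferred for numerical stability:

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR via classical Gram-Schmidt; columns of A must be independent."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for k in range(n):
        u = A[:, k].copy()
        for j in range(k):
            R[j, k] = Q[:, j] @ A[:, k]   # <e_j, a_k>
            u -= R[j, k] * Q[:, j]        # subtract proj_{u_j} a_k
        R[k, k] = np.linalg.norm(u)
        Q[:, k] = u / R[k, k]             # e_k = u_k / ||u_k||
    return Q, R

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))              # True
print(np.allclose(Q.T @ Q, np.eye(3)))    # True
```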

**Example**

Let us consider the case when $A = (a_1 \mid a_2 \mid a_3)$ is a $3 \times 3$ matrix with linearly independent columns. Then the Gram–Schmidt process starts with

$u_1 = a_1.$

From $u_1$, we obtain $e_1$ from normalization, that is,

$\displaystyle e_1 = \frac{u_1}{\|u_1\|}.$

To obtain $u_2$, we firstly project $a_2$ onto $u_1$, then we subtract this result from $a_2$. More precisely,

$u_2 = a_2 - \mathrm{proj}_{u_1} a_2,$

where

$\displaystyle \mathrm{proj}_{u_1} a_2 = \frac{\langle u_1, a_2 \rangle}{\langle u_1, u_1 \rangle} u_1,$

which yields

$\displaystyle e_2 = \frac{u_2}{\|u_2\|}.$

Thus

$\displaystyle u_3 = a_3 - \mathrm{proj}_{u_1} a_3 - \mathrm{proj}_{u_2} a_3, \quad e_3 = \frac{u_3}{\|u_3\|},$

and

$Q = (e_1 \mid e_2 \mid e_3).$

Finally,

$R = Q^T A.$
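These steps can be walked through numerically. The $3 \times 3$ matrix below is my own illustrative choice (the original post's matrix is not reproduced here):

```python
import numpy as np

A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])
a1, a2, a3 = A.T

proj = lambda u, a: (u @ a) / (u @ u) * u   # proj_u a

u1 = a1
e1 = u1 / np.linalg.norm(u1)
u2 = a2 - proj(u1, a2)                      # subtract the projection onto u1
e2 = u2 / np.linalg.norm(u2)
u3 = a3 - proj(u1, a3) - proj(u2, a3)
e3 = u3 / np.linalg.norm(u3)

Q = np.column_stack([e1, e2, e3])
R = Q.T @ A                                 # upper triangular up to rounding
print(np.allclose(Q @ R, A))                # True
```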

In Maple, you can practice with the `QRDecomposition` command from the `LinearAlgebra` package.

I like your computation with Maple 😀

A maplet would be useful for summarizing your work in a single application.

I’m also writing a paper about numerical methods for eigenvalue problems. QR factorization plays an important role in the QR method.

Comment by Tuan Minh — June 18, 2009 @ 19:09

Thanks Minh. Since I am not working in Applied Mathematics, there may be some mistakes in my posts. Please let me know if you spot any.

Comment by Ngô Quốc Anh — June 18, 2009 @ 19:22