nlcpy.linalg.qr

nlcpy.linalg.qr(a, mode='reduced')

Computes the qr factorization of a matrix.

Factor the matrix a as qr, where q is orthonormal and r is upper-triangular.

Parameters
a : (M, N) array_like

Matrix to be factored.

mode : {‘reduced’, ‘complete’, ‘r’, ‘raw’, ‘full’, ‘economic’}, optional

If K = min(M, N), then (see the shape check after this list)

  • ‘reduced’ : returns q, r with dimensions (M, K), (K, N) (default)

  • ‘complete’ : returns q, r with dimensions (M, M), (M, N)

  • ‘r’ : returns r only with dimensions (K, N)

  • ‘raw’ : returns h, tau with dimensions (N, M), (K,)

  • ‘full’ or ‘f’ : alias of ‘reduced’, deprecated

  • ‘economic’ or ‘e’ : returns h from ‘raw’, deprecated.
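
A minimal shape check derived from the dimensions listed above (the deprecated ‘full’ and ‘economic’ aliases are omitted):

>>> import nlcpy as vp
>>> a = vp.random.randn(5, 3)            # M = 5, N = 3, so K = 3
>>> q, r = vp.linalg.qr(a, mode='reduced')
>>> q.shape, r.shape                     # (M, K), (K, N)
((5, 3), (3, 3))
>>> q, r = vp.linalg.qr(a, mode='complete')
>>> q.shape, r.shape                     # (M, M), (M, N)
((5, 5), (5, 3))
>>> r = vp.linalg.qr(a, mode='r')
>>> r.shape                              # (K, N)
(3, 3)
>>> h, tau = vp.linalg.qr(a, mode='raw')
>>> h.shape, tau.shape                   # (N, M) and (K,)
((3, 5), (3,))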

Returns
q : ndarray, optional

A matrix with orthonormal columns. When mode = ‘complete’ the result is an orthogonal (for real a) or unitary (for complex a) matrix; its determinant may be either +1 or -1 in that case. A quick numerical check appears after the Returns entries below.

r : ndarray, optional

The upper-triangular matrix.

(h, tau) : ndarrays, optional

The array h contains the Householder reflectors that generate q along with r. The tau array contains scaling factors for the reflectors. In the deprecated ‘economic’ mode only h is returned.
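
As a quick numerical check of the ‘complete’ case (a sketch; it assumes nlcpy.eye and the atol argument of nlcpy.testing.assert_allclose behave as their NumPy counterparts):

>>> import nlcpy as vp
>>> a = vp.random.randn(4, 2)
>>> q, r = vp.linalg.qr(a, mode='complete')
>>> # q is square with orthonormal columns, so q.T @ q is the 4x4 identity
>>> vp.testing.assert_allclose(vp.dot(q.T, q), vp.eye(4), atol=1e-12)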

Note

This is an interface to the LAPACK routines dgeqrf, zgeqrf, dorgqr, and zungqr.

For more information on the qr factorization, see for example: https://en.wikipedia.org/wiki/QR_factorization

Note that when the ‘raw’ option is specified, the returned arrays are of type “float64” or “complex128” and the h array is transposed to be FORTRAN compatible.
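
For example, a sketch of the ‘raw’ behaviour (it assumes a float32 input is accepted and promoted as stated above):

>>> import nlcpy as vp
>>> a = vp.random.randn(9, 6).astype('float32')   # M = 9, N = 6
>>> h, tau = vp.linalg.qr(a, mode='raw')
>>> h.dtype                  # promoted to float64
dtype('float64')
>>> h.shape                  # transposed to (N, M) for FORTRAN compatibility
(6, 9)
>>> tau.shape                # one scaling factor per reflector, K = min(M, N)
(6,)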

Examples

>>> import numpy as np
>>> import nlcpy as vp
>>> from nlcpy import testing
>>> a = vp.random.randn(9, 6)
>>> q, r = vp.linalg.qr(a)
>>> vp.testing.assert_allclose(a, vp.dot(q, r))  # a does equal qr
>>> r2 = vp.linalg.qr(a, mode='r')
>>> r3 = vp.linalg.qr(a, mode='economic')
>>> # mode='r' returns the same r as the default mode='reduced'
>>> vp.testing.assert_allclose(r, r2)
>>> # But only triu parts are guaranteed equal when mode='economic'
>>> vp.testing.assert_allclose(r, np.triu(r3[:6,:6], k=0))

Example illustrating a common use of qr: solving least-squares problems

What are the least-squares-best m and y0 in y = mx + y0 for the data {(0,1), (1,0), (1,2), (2,1)}? (Graph the points and you’ll see that the best fit is the horizontal line y = 1, i.e. m = 0 and y0 = 1.) The answer is provided by solving the over-determined matrix equation Ax = b, where:

A = array([[0, 1], [1, 1], [1, 1], [2, 1]])
x = array([[m], [y0]])
b = array([[1], [0], [2], [1]])

If A = qr such that q is orthonormal (which is always possible via Gram-Schmidt), then x = inv(r) * (q.T) * b. (In practice, however, we simply use lstsq().)

>>> A = vp.array([[0, 1], [1, 1], [1, 1], [2, 1]])
>>> A
array([[0, 1],
       [1, 1],
       [1, 1],
       [2, 1]])
>>> b = vp.array([1, 0, 2, 1])
>>> q, r = vp.linalg.qr(A)
>>> p = vp.dot(q.T, b)
>>> vp.dot(vp.linalg.inv(r), p)  # the least-squares solution [m, y0]
array([1.1102230246251565e-16, 1.0000000000000002e+00])
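
For comparison, the same least-squares solution can be computed directly; this sketch assumes nlcpy.linalg.lstsq mirrors numpy.linalg.lstsq (including the rcond argument) and reuses A and b from above:

>>> x, res, rank, sv = vp.linalg.lstsq(A, b, rcond=None)
>>> # x should again be close to [0, 1], i.e. m close to 0 and y0 close to 1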