
Gram-Schmidt Calculator

Our free linear algebra calculator solves Gram-Schmidt orthogonalization problems. Get worked examples, visual aids, and downloadable results.


Formula

u_k = v_k - sum_{j=1}^{k-1} proj(v_k, u_j), where proj(v, u) = (v dot u)/(u dot u) * u

Each vector u_k is obtained by subtracting from v_k its projections onto all previously computed orthogonal vectors u_1, ..., u_{k-1}. The result is perpendicular to all previous vectors. Normalizing gives orthonormal vectors e_k = u_k / ||u_k||.
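The recurrence above can be written as a minimal sketch in pure Python (the function names `dot` and `gram_schmidt` are ours, not part of any library API):

```python
# u_k = v_k - sum of proj(v_k, u_j) over previously computed u_j (classical form).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(vectors):
    """Orthogonalize a list of vectors; returns the orthogonal u_k."""
    ortho = []
    for v in vectors:
        u = list(v)
        for w in ortho:
            coeff = dot(v, w) / dot(w, w)  # (v . u_j) / (u_j . u_j)
            u = [ui - coeff * wi for ui, wi in zip(u, w)]
        ortho.append(u)
    return ortho

u1, u2, u3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
```

Every pair of output vectors has a zero dot product, which is the defining property of the result.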

Worked Examples

Example 1: Orthogonalizing Three 3D Vectors

Problem: Apply Gram-Schmidt to v1 = [1, 1, 0], v2 = [1, 0, 1], v3 = [0, 1, 1].

Solution:
u1 = v1 = [1, 1, 0], ||u1|| = sqrt(2)
e1 = [1/sqrt(2), 1/sqrt(2), 0]

Coefficient for proj(v2, u1): (v2 dot u1)/(u1 dot u1) = (1+0+0)/(1+1+0) = 1/2
u2 = [1, 0, 1] - (1/2)[1, 1, 0] = [1/2, -1/2, 1], ||u2|| = sqrt(3/2)
e2 = [1/sqrt(6), -1/sqrt(6), 2/sqrt(6)]

Coefficients for v3: (v3 dot u1)/(u1 dot u1) = (0+1+0)/2 = 1/2 and (v3 dot u2)/(u2 dot u2) = (0-1/2+1)/(3/2) = 1/3
u3 = [0, 1, 1] - (1/2)[1, 1, 0] - (1/3)[1/2, -1/2, 1] = [-2/3, 2/3, 2/3], ||u3|| = 2/sqrt(3)
e3 = [-1/sqrt(3), 1/sqrt(3), 1/sqrt(3)]

Result: Orthonormal basis: e1 = [0.707, 0.707, 0], e2 = [0.408, -0.408, 0.816], e3 = [-0.577, 0.577, 0.577]
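The arithmetic in Example 1 can be verified numerically; a sketch assuming NumPy is available:

```python
import numpy as np

v1, v2, v3 = map(np.array, ([1., 1., 0.], [1., 0., 1.], [0., 1., 1.]))

# Subtract projections exactly as in the worked solution.
u1 = v1
u2 = v2 - (v2 @ u1) / (u1 @ u1) * u1
u3 = v3 - (v3 @ u1) / (u1 @ u1) * u1 - (v3 @ u2) / (u2 @ u2) * u2

# Normalize to get the orthonormal basis.
e1, e2, e3 = (u / np.linalg.norm(u) for u in (u1, u2, u3))
```

Rounded to three decimals, e1, e2, and e3 match the result stated above.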

Example 2: Detecting Linear Dependence

Problem: Apply Gram-Schmidt to v1 = [1, 2, 3], v2 = [2, 4, 6], v3 = [1, 0, 1].

Solution:
u1 = v1 = [1, 2, 3], ||u1|| = sqrt(14)

Coefficient for proj(v2, u1): (2+8+18)/14 = 28/14 = 2
u2 = [2, 4, 6] - 2[1, 2, 3] = [0, 0, 0]
v2 is linearly dependent on v1 (it is exactly 2 v1), so it contributes no new direction and is skipped.

u3 = v3 - proj(v3, u1) = [1, 0, 1] - ((1+0+3)/14)[1, 2, 3] = [5/7, -4/7, 1/7]

Result: Only 2 independent vectors found (rank = 2). v2 = 2*v1 is dependent.
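A sketch of how dependence detection might be implemented (the helper name `gram_schmidt_rank` and the tolerance are our choices; NumPy assumed):

```python
import numpy as np

def gram_schmidt_rank(vectors, tol=1e-10):
    """Orthogonalize, discarding any vector that reduces to (near) zero."""
    ortho = []
    for v in vectors:
        u = np.asarray(v, dtype=float).copy()
        for w in ortho:
            u -= (np.dot(v, w) / np.dot(w, w)) * w
        if np.linalg.norm(u) > tol:  # nonzero u: a genuinely new direction
            ortho.append(u)
    return ortho

ortho = gram_schmidt_rank([[1, 2, 3], [2, 4, 6], [1, 0, 1]])
rank = len(ortho)  # 2, matching the result above
```

In exact arithmetic a dependent vector reduces to the zero vector; in floating point it only gets close to zero, which is why a tolerance is needed.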

Frequently Asked Questions

What is the Gram-Schmidt process?

The Gram-Schmidt process is an algorithm that takes a set of linearly independent vectors and produces a set of orthogonal (or orthonormal) vectors that span the same subspace. It works by iteratively subtracting the projections of each new vector onto all previously computed orthogonal vectors. The result is a set of mutually perpendicular vectors that are much easier to work with in computations. The process is named after Jørgen Pedersen Gram and Erhard Schmidt, though the concept predates them. It is a cornerstone of numerical linear algebra and is used in QR decomposition, least squares problems, and signal processing.

How does the Gram-Schmidt process work step by step?

The Gram-Schmidt process starts with the first vector unchanged: u1 = v1. For the second vector, subtract its projection onto u1: u2 = v2 - proj(v2, u1), where proj(v2, u1) = (v2 dot u1)/(u1 dot u1) * u1. For the third vector, subtract projections onto both u1 and u2: u3 = v3 - proj(v3, u1) - proj(v3, u2). This continues for each additional vector. After computing each orthogonal vector, normalize it by dividing by its length to get an orthonormal vector. The projection subtraction removes the component parallel to each previous vector, leaving only the perpendicular component.
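The steps above, including the final normalization, can be sketched in pure Python. Note that when projecting onto an already-normalized e_i, the coefficient simplifies to v dot e_i (the function names are ours):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def orthonormalize(vectors):
    es = []
    for v in vectors:
        u = list(v)
        for e in es:
            c = dot(v, e)  # e is unit length, so proj(v, e) = (v . e) e
            u = [ui - c * ei for ui, ei in zip(u, e)]
        norm = math.sqrt(dot(u, u))  # assumes independent inputs (norm > 0)
        es.append([ui / norm for ui in u])
    return es

e1, e2 = orthonormalize([[3, 1], [2, 2]])
```

The output vectors have unit length and zero mutual dot product.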

What is the QR decomposition and how does Gram-Schmidt relate to it?

The QR decomposition factors a matrix A into Q (an orthogonal or unitary matrix) and R (an upper triangular matrix) such that A = QR. The Gram-Schmidt process directly computes the QR decomposition: the orthonormal vectors become the columns of Q, and R is formed from the projection coefficients. Specifically, R_ij = (v_j dot e_i) for i <= j and R_ij = 0 for i > j. QR decomposition is crucial for solving least squares problems, computing eigenvalues (QR algorithm), and providing numerically stable solutions to linear systems. It is one of the most important matrix factorizations in numerical computing.
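A sketch of QR via the relation R_ij = v_j dot e_i described above (NumPy assumed; the function name is ours):

```python
import numpy as np

def qr_gram_schmidt(A):
    """Factor A = QR, with Q's columns from Gram-Schmidt on A's columns."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        u = A[:, j].copy()
        for i in range(j):
            R[i, j] = A[:, j] @ Q[:, i]  # R_ij = v_j . e_i
            u -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(u)      # diagonal entry: ||u_j||
        Q[:, j] = u / R[j, j]
    return Q, R

A = np.column_stack([[1., 1., 0.], [1., 0., 1.], [0., 1., 1.]])  # v1, v2, v3
Q, R = qr_gram_schmidt(A)
```

Q @ R reconstructs A, Q has orthonormal columns, and R is upper triangular, which is exactly the factorization described above.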

What is the modified Gram-Schmidt process and why is it preferred?

The modified Gram-Schmidt (MGS) process rearranges the order of computations to improve numerical stability. In classical Gram-Schmidt, all projections for a vector are computed using the original orthogonal vectors. In MGS, after computing each projection and subtracting it, the vector is immediately updated before computing the next projection. This seemingly minor change significantly reduces the accumulation of rounding errors. Classical Gram-Schmidt can produce vectors that are far from orthogonal in floating-point arithmetic, while MGS maintains much better orthogonality. For practical numerical computation, MGS or Householder reflections are strongly preferred over classical Gram-Schmidt.
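The stability difference shows up on a nearly dependent set (the classic Läuchli example; NumPy assumed, function names ours). The only change between the two loops is whether the projection coefficient uses the original column or the partially reduced vector:

```python
import numpy as np

def cgs(A):  # classical: coefficients from the original column
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        u = A[:, j].astype(float).copy()
        for i in range(j):
            u -= (A[:, j] @ Q[:, i]) * Q[:, i]
        Q[:, j] = u / np.linalg.norm(u)
    return Q

def mgs(A):  # modified: coefficients from the updated vector
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        u = A[:, j].astype(float).copy()
        for i in range(j):
            u -= (u @ Q[:, i]) * Q[:, i]
        Q[:, j] = u / np.linalg.norm(u)
    return Q

eps = 1e-8
A = np.array([[1., 1., 1.],
              [eps, 0., 0.],
              [0., eps, 0.],
              [0., 0., eps]])
loss_cgs = abs(cgs(A)[:, 1] @ cgs(A)[:, 2])  # large: orthogonality lost
loss_mgs = abs(mgs(A)[:, 1] @ mgs(A)[:, 2])  # tiny: orthogonality preserved
```

Here eps**2 underflows below machine precision, so classical Gram-Schmidt produces columns of Q with a dot product near 0.5, while MGS keeps it near machine epsilon.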

What is a projection in the context of Gram-Schmidt?

A projection of vector v onto vector u, written proj(v, u), is the component of v that lies along the direction of u. The formula is proj(v, u) = (v dot u)/(u dot u) * u. Geometrically, it is the shadow of v cast onto the line defined by u. The scalar coefficient (v dot u)/(u dot u) tells you how many copies of u make up this shadow. In Gram-Schmidt, subtracting the projection removes the component of the new vector that is parallel to a previously computed orthogonal vector, leaving only the perpendicular component. This is why the resulting vectors are guaranteed to be orthogonal to each other.
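The formula as a short sketch (NumPy assumed; `proj` is our name for the helper):

```python
import numpy as np

def proj(v, u):
    """Component of v along u: (v . u)/(u . u) * u."""
    return (v @ u) / (u @ u) * u

v = np.array([3., 4.])
u = np.array([2., 0.])   # a vector along the x-axis
shadow = proj(v, u)      # [3, 0]: the shadow of v on the line through u
perp = v - shadow        # [0, 4]: the perpendicular component Gram-Schmidt keeps
```

Note that `perp` has zero dot product with u, which is the property Gram-Schmidt relies on.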

What are the applications of the Gram-Schmidt process?

The Gram-Schmidt process has wide applications across mathematics and engineering. In signal processing, it creates orthogonal filter banks for efficient signal decomposition. In statistics, it orthogonalizes predictor variables in regression to diagnose multicollinearity. In quantum mechanics, it constructs orthonormal basis states from general state vectors. In computer graphics, it re-orthogonalizes rotation matrices that drift due to numerical errors. In communications engineering, it designs orthogonal codes for CDMA systems. In numerical methods, it is the foundation of the Arnoldi and Lanczos algorithms for computing eigenvalues of large sparse matrices.
