Null Space

Null space is a foundational concept in linear algebra that describes the set of input directions that a linear map sends to zero. Given a matrix A, the null space (also called the kernel of the linear transformation represented by A) collects all vectors x for which A x = 0. In more formal terms, if A is an m-by-n matrix, the null space is a subspace of R^n: N(A) = { x ∈ R^n : A x = 0 }. This idea sits at the heart of understanding constraints and degrees of freedom in systems of linear equations, and it plays a key role in many applications, from solving homogeneous systems to analyzing the behavior of linear models.

The null space is intimately linked to the concept of a linear map T: R^n → R^m defined by A. It is the kernel of T, consisting of all input vectors that T maps to the zero vector. Because it is a subspace, it contains the zero vector and is closed under addition and scalar multiplication. Moreover, the null space is orthogonal to the row space of A: any vector in the null space is orthogonal to every row of A, which means it lies in the orthogonal complement of the row space within R^n. This relationship is a geometric way to think about how the rows of A constrain the input directions.
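
As a quick illustration of this orthogonality, the check below is a minimal sketch assuming NumPy is available; it uses the matrix and null-space vector worked out in the Examples section: A sends the vector to zero, and the vector's dot product with each row of A vanishes.

    import numpy as np

    A = np.array([[1., 2., 3.],
                  [4., 5., 6.]])
    x = np.array([1., -2., 1.])    # a null-space vector of A (see Examples below)

    print(A @ x)                   # [0. 0.]  -- x is in N(A)
    print(A[0] @ x, A[1] @ x)      # 0.0 0.0  -- x is orthogonal to every row of A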

Definition and basic properties

  • Formal definition: For a matrix A ∈ R^{m×n}, the null space N(A) = { x ∈ R^n : A x = 0 } is the set of all solutions to the homogeneous system of linear equations A x = 0.
  • Subspace structure: N(A) is a subspace of R^n; it contains the zero vector and is closed under addition and scalar multiplication.
  • Rank-nullity connection: The dimension of N(A), called the nullity of A, satisfies the rank-nullity theorem: nullity(A) + rank(A) = n, where rank(A) is the dimension of the column space (or image) of A.
  • Orthogonality with the row space: N(A) is the orthogonal complement of the row space of A in R^n; this reflects the idea that vectors in the null space are precisely those directions that have no component along any row of A. A brief numerical check of this and of the rank-nullity relation follows this list.
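
The rank-nullity relation can be verified numerically. The sketch below assumes NumPy and SciPy are available; scipy.linalg.null_space returns an orthonormal basis for N(A), so the number of its columns is the nullity.

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1., 2., 3.],
                  [4., 5., 6.]])
    m, n = A.shape                     # m = 2, n = 3

    rank = np.linalg.matrix_rank(A)    # 2
    N = null_space(A)                  # orthonormal basis of N(A), shape (n, nullity)
    nullity = N.shape[1]               # 1

    assert rank + nullity == n         # rank-nullity theorem
    print(np.allclose(A @ N, 0))       # True: every basis vector is annihilated by A,
                                       # i.e. orthogonal to each row of A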

For background on these ideas, see Linear transformation, Kernel (linear algebra), Matrix, Rank (linear algebra), Nullity (linear algebra), and Row space.

Computation and representation

  • Solving Ax = 0: The standard method is to perform Gaussian elimination to reduce A to its reduced row-echelon form (RREF). The solutions are typically expressed in parametric form, x = ∑ t_j v_j, where the v_j form a basis for N(A) and the t_j are free parameters.
  • Basis for the null space: A basis for N(A) can be obtained by solving Ax = 0 and collecting the free-variable solutions into vectors that span the space. The dimension of this basis is the nullity of A.
  • Practical steps: Compute a basis by row-reducing A to RREF, identify pivot and free columns, and read off the free-variable expressions to form the basis vectors. This process is closely tied to the ideas behind Gaussian elimination and Basis (linear algebra); a code sketch of the procedure appears at the end of this section.

See also the relations to Matrix, Rank (linear algebra), and Row space for broader context.
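
The parametric procedure described above can be written out directly. The sketch below is an illustrative implementation only, assuming NumPy is available (the function name null_space_basis is introduced here, not a library routine): it row-reduces A to RREF, identifies pivot and free columns, and builds one basis vector per free column.

    import numpy as np

    def null_space_basis(A, tol=1e-12):
        """Sketch: compute a basis for N(A) by row reduction to RREF."""
        A = np.array(A, dtype=float)
        m, n = A.shape
        R = A.copy()
        pivot_cols = []
        row = 0
        for col in range(n):
            if row >= m:
                break
            # Choose the largest entry at or below `row` as the pivot (partial pivoting).
            pivot = row + np.argmax(np.abs(R[row:, col]))
            if abs(R[pivot, col]) < tol:
                continue                          # no pivot here: a free column
            R[[row, pivot]] = R[[pivot, row]]     # swap the pivot row into place
            R[row] = R[row] / R[row, col]         # scale the pivot to 1
            for r in range(m):                    # eliminate the column everywhere else
                if r != row:
                    R[r] -= R[r, col] * R[row]
            pivot_cols.append(col)
            row += 1
        free_cols = [c for c in range(n) if c not in pivot_cols]
        basis = []
        for free in free_cols:
            v = np.zeros(n)
            v[free] = 1.0                         # set this free variable to 1, others to 0
            for i, pc in enumerate(pivot_cols):
                v[pc] = -R[i, free]               # read pivot variables off the RREF
            basis.append(v)
        return basis

    # Example 1 below: A = [[1, 2, 3], [4, 5, 6]] has nullity 1.
    print(null_space_basis([[1, 2, 3], [4, 5, 6]]))   # [array([ 1., -2.,  1.])]

For numerical work on large or ill-conditioned matrices, an SVD-based routine such as SciPy's scipy.linalg.null_space is generally more robust than explicit row reduction.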

Examples

  • Example 1: A 2×3 matrix A = [[1, 2, 3], [4, 5, 6]].

    • Row-reducing A gives a system with constraints x1 − x3 = 0 and x2 + 2 x3 = 0.
    • Let x3 = t. Then x1 = t and x2 = −2 t, so x = t [1, −2, 1]^T.
    • The null space is N(A) = span{(1, −2, 1)}.
    • This illustrates how a higher-dimensional input space can have nontrivial directions that produce zero output.
  • Example 2: If A is invertible (for example, A = I_n, the identity matrix), then the only solution to Ax = 0 is x = 0, so N(A) = {0}. In this case the null space has dimension 0 and the linear map has full rank.
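
Both examples can be reproduced with a computer algebra system. The snippet below is a sketch assuming SymPy is available: Matrix.rref returns the reduced row-echelon form together with the indices of the pivot columns, and Matrix.nullspace returns a list of basis vectors for N(A) (an empty list when the matrix is invertible).

    from sympy import Matrix, eye

    A = Matrix([[1, 2, 3], [4, 5, 6]])
    print(A.rref())             # (Matrix([[1, 0, -1], [0, 1, 2]]), (0, 1))
    print(A.nullspace())        # [Matrix([[1], [-2], [1]])]  -- matches Example 1

    print(eye(3).nullspace())   # []  -- Example 2: an invertible matrix has N(A) = {0}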

These examples connect to core ideas about linear maps and their kernels, and they illustrate how algebraic constraints translate into geometric directions in input space. For more intuition, see treatments of Linear transformation and Kernel (linear algebra).

Applications and interpretation

  • Solving homogeneous systems: The null space describes all solutions to A x = 0. When solving systems of equations, understanding the null space reveals the degrees of freedom in the solution.
  • Geometry and invariants: The null space encodes directions in input space that do not affect the output, which is crucial in understanding how a system can be moved without changing its behavior.
  • Differential equations and operators: In many differential problems, the null space of a differential operator corresponds to the general solution of the homogeneous equation, providing a structure for building full solutions; a small symbolic example follows this list.
  • Engineering and data analysis: The null space arises in design and analysis tasks where constraints must be satisfied without producing certain outputs, such as in control theory, signal processing, and subspace methods in data analysis.
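
To make the differential-equation point concrete, the sketch below (assuming SymPy is available) solves the homogeneous equation y'' + y = 0; its general solution, spanned by sin and cos, plays the role of the null space of the linear operator L[y] = y'' + y.

    from sympy import Function, Eq, dsolve, symbols

    x = symbols('x')
    y = Function('y')

    # Kernel of the linear operator L[y] = y'' + y: all solutions of L[y] = 0.
    print(dsolve(Eq(y(x).diff(x, 2) + y(x), 0), y(x)))
    # -> Eq(y(x), C1*sin(x) + C2*cos(x)), a two-parameter family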

See also Linear transformation, Kernel (linear algebra), Rank (linear algebra), and Basis (linear algebra) for foundational connections.

See also