Kohn–Sham equations
The Kohn–Sham equations stand as the cornerstone of density functional theory (DFT) in modern computational science. They provide a practical route to solving the electronic structure of atoms, molecules, and solids by transforming a complex interacting many-electron problem into a set of self-consistent single-particle equations. The essential idea is to reproduce the true electron density of an interacting system with a fictitious non-interacting reference system that is easier to solve numerically.
Developed in the mid-1960s by Walter Kohn and Lu Jeu Sham, the approach builds on the foundational Hohenberg–Kohn theorems of 1964, which establish that all ground-state properties of an electronic system are determined by its density. The Kohn–Sham construction introduces a non-interacting framework that shares the same density as the real, interacting system. This makes the problem tractable while retaining the correct physics through an exchange–correlation term that encapsulates all many-body effects beyond the classical electrostatic interaction. In practice, the exact form of this exchange–correlation functional is unknown, so researchers employ a hierarchy of approximations such as the local density approximation (LDA), the generalized gradient approximation (GGA), meta-GGA functionals, and various hybrid functionals. These choices influence the accuracy of computed properties, from bond energies to electronic band structures.
Theory
The Kohn–Sham formulation recasts the problem in terms of a set of orbital equations for fictitious non-interacting electrons moving in an effective potential. In its simplest, spin-unpolarized form, the central equation reads (in the spin-polarized case an analogous equation holds for each spin channel):
[ −(ħ^2/2m) ∇^2 + v_eff(r) ] φ_i(r) = ε_i φ_i(r)
These effective single-particle equations (one for each occupied orbital), with orbital energies ε_i, are solved self-consistently to yield the electron density n(r). The effective potential v_eff(r) = v_ext(r) + v_H(r) + v_xc(r) comprises three main pieces:
- the external potential v_ext(r) produced by nuclei,
- the Hartree potential v_H(r), which accounts for classical electrostatic repulsion between electrons, and
- the exchange–correlation potential v_xc(r), defined as the functional derivative δE_xc[n]/δn(r) of the exchange–correlation energy E_xc[n].
The electron density is reconstructed from the occupied Kohn–Sham orbitals φ_i(r) via n(r) = ∑_i f_i |φ_i(r)|^2, where f_i are occupation numbers. The total energy functional in this framework is E[n] = T_s[n] + ∫ v_ext(r) n(r) dr + E_H[n] + E_xc[n], where T_s[n] is the kinetic energy of the non-interacting reference system and E_H[n] is the classical (Hartree) electrostatic energy.
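As a minimal illustration of this reconstruction step, the sketch below evaluates n(r) = ∑_i f_i |φ_i(r)|^2 for orbitals sampled on a real-space grid; the array shapes and the toy numbers are assumptions of this example, not part of any particular DFT code.

```python
import numpy as np

def density_from_orbitals(orbitals, occupations):
    """Assemble n(r) = sum_i f_i |phi_i(r)|^2 on a real-space grid.

    orbitals    : array (n_orbitals, n_grid), values phi_i(r) at grid points
    occupations : array (n_orbitals,), occupation numbers f_i
    """
    orbital_densities = np.abs(orbitals) ** 2             # |phi_i(r)|^2
    return np.tensordot(occupations, orbital_densities, axes=1)

# Toy usage: two doubly occupied orbitals on a 5-point grid
phi = np.array([[0.1, 0.4, 0.6, 0.4, 0.1],
                [0.3, 0.2, 0.0, -0.2, -0.3]])
f = np.array([2.0, 2.0])
print(density_from_orbitals(phi, f))                      # n(r) on the grid
```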
Solving the Kohn–Sham equations requires a self-consistent field (SCF) cycle:
- start with an initial guess for the density n(r),
- build v_eff(r) from n(r),
- solve the Kohn–Sham equations to obtain orbitals φ_i(r),
- compute a new density n(r) from the orbitals, and
- iterate until convergence of n(r) (and typically the total energy) is achieved.
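A schematic, self-contained version of this cycle is sketched below for a one-dimensional toy system (one doubly occupied orbital in a harmonic well, with a softened Coulomb kernel for the Hartree term and a crude LDA-like exchange potential). The grid, the mixing factor, and the model potentials are illustrative assumptions, not a production DFT algorithm.

```python
import numpy as np

# Minimal 1D Kohn-Sham-style SCF loop (schematic toy, not a production code).
n_grid, length = 201, 10.0
x = np.linspace(-length / 2, length / 2, n_grid)
dx = x[1] - x[0]

v_ext = 0.5 * x**2                                    # external potential
kernel = 1.0 / np.sqrt((x[:, None] - x[None, :])**2 + 1.0)  # soft Coulomb

# Kinetic operator -1/2 d^2/dx^2 via central finite differences
T = (-0.5 / dx**2) * (np.diag(np.ones(n_grid - 1), 1)
                      + np.diag(np.ones(n_grid - 1), -1)
                      - 2.0 * np.eye(n_grid))

occ = np.array([2.0])                                 # occupation numbers f_i
n = np.full(n_grid, occ.sum() / length)               # initial density guess

for it in range(200):
    v_H = (kernel @ n) * dx                           # Hartree potential
    v_xc = -(3.0 * n / np.pi) ** (1.0 / 3.0)          # LDA-like exchange term
    H = T + np.diag(v_ext + v_H + v_xc)               # Kohn-Sham Hamiltonian
    eps, phi = np.linalg.eigh(H)                      # orbitals and energies
    phi /= np.sqrt(dx)                                # grid normalization
    n_new = (occ[:, None] * phi.T[:len(occ)] ** 2).sum(axis=0)
    if np.abs(n_new - n).max() < 1e-8:                # density converged
        break
    n = 0.7 * n + 0.3 * n_new                         # simple linear mixing

print(f"stopped after {it + 1} iterations; lowest KS eigenvalue = {eps[0]:.4f}")
```

The linear density mixing in the last line is the simplest way to stabilize the iteration; real codes use more elaborate schemes for the same purpose.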
This approach reduces the many-electron problem to a tractable set of one-electron equations while still capturing essential quantum effects through E_xc[n].
Computational implementation
In practice, the Kohn–Sham equations are solved within a chosen basis or grid representation. Popular choices include plane-wave basis sets for periodic systems and localized basis sets (e.g., Gaussian or numeric atomic orbitals) for molecules. Pseudopotentials or projector-augmented wave (PAW) methods are frequently employed to remove core electrons from the explicit calculation, focusing computational effort on valence electrons where chemical bonding occurs. The SCF cycle is implemented with a variety of density-mixing schemes and convergence accelerators to handle challenging systems, including metals and large biomolecules.
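As a small, concrete illustration of the plane-wave choice, the sketch below counts the reciprocal-lattice vectors of a cubic cell kept under a kinetic-energy cutoff at the Γ point; the cell size and cutoff are arbitrary values chosen only to show how the basis is defined and how it grows with the cutoff.

```python
import itertools
import numpy as np

def planewave_basis(a, ecut, nmax=10):
    """Reciprocal-lattice vectors G of a cubic cell (side a, atomic units)
    kept when the plane-wave kinetic energy |G|^2 / 2 is below ecut."""
    b = 2.0 * np.pi / a                       # reciprocal-lattice spacing
    kept = []
    for i, j, k in itertools.product(range(-nmax, nmax + 1), repeat=3):
        g = b * np.array([i, j, k])
        if 0.5 * (g @ g) <= ecut:             # kinetic-energy cutoff test
            kept.append(g)
    return np.array(kept)

G = planewave_basis(a=10.0, ecut=5.0)
print(len(G), "plane waves below the cutoff")
```

Real codes repeat this bookkeeping for every k-point and combine the basis with pseudopotential or PAW data sets for the atoms in the cell.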
A key advantage of the Kohn–Sham approach is computational efficiency: the heavy lifting is done by solving a system of single-particle equations rather than handling an explicit many-body wavefunction. This has allowed DFT to become a standard tool in chemistry and materials science, enabling routine predictions of geometries, vibrational frequencies, reaction energetics, and electronic structures for systems ranging from small molecules to extended solids.
Approximations and functionals
Because the exact exchange–correlation functional E_xc[n] is unknown, practitioners rely on a family of approximations with different trade-offs:
- Local density approximation (LDA): uses only the local density, often performing well for close-packed metals but tending to overbind and to underestimate lattice spacings in many systems.
- Generalized gradient approximation (GGA): incorporates density gradients to improve upon LDA for a broad class of systems.
- Meta-GGA functionals: include additional information such as the kinetic-energy density or higher-order density derivatives, offering improved accuracy for diverse materials.
- Hybrid functionals: mix a fraction of exact Hartree–Fock exchange with a DFT exchange–correlation functional, often yielding improved thermochemistry and band gaps for many molecules and solids.
- Nonlocal and dispersion corrections: van der Waals interactions are essential for weakly bound systems, leading to methods such as DFT-D corrections or nonlocal van der Waals functionals.
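As a concrete example of the simplest rung, the LDA exchange energy (the Dirac/Slater expression) depends only on the local density: E_x^LDA = −(3/4)(3/π)^{1/3} ∫ n(r)^{4/3} dr. The sketch below evaluates it on a grid; correlation, normally added from a parametrization of the uniform electron gas, is omitted here, and the toy density is an arbitrary illustration.

```python
import numpy as np

def lda_exchange_energy(n, weights):
    """LDA (Dirac/Slater) exchange energy for a density n(r) sampled on a
    grid with integration weights, in Hartree atomic units:
        E_x = -(3/4) * (3/pi)^(1/3) * integral n(r)^(4/3) dr
    The corresponding potential is v_x(r) = -(3 n(r) / pi)^(1/3)."""
    c_x = -0.75 * (3.0 / np.pi) ** (1.0 / 3.0)
    return c_x * np.sum(weights * n ** (4.0 / 3.0))

# Toy usage: constant density n = 1 in a unit-volume box
n = np.full(8, 1.0)               # density values at 8 sample points
w = np.full(8, 1.0 / 8.0)         # equal weights summing to the box volume
print(lda_exchange_energy(n, w))  # ~ -0.7386 Hartree
```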
These choices influence calculated properties in systematic ways. The continued development of E_xc[n] aims to broaden reliability across chemistry, solid-state physics, and catalysis. Related topics include the formal theory of the exchange–correlation functional, which provides a framework for understanding these approximations, and the ongoing discussion about the best balance between accuracy and cost.
Limitations and debates
Despite its successes, the Kohn–Sham framework has well-known limitations. Some materials with strong electronic correlations—for example, certain transition-metal oxides—can be challenging for standard functionals, sometimes failing to reproduce insulating states or correct magnetic behavior. In such cases, practitioners employ approaches like DFT+U to partially restore correlation effects or turn to more sophisticated many-body techniques such as the GW approximation for improved band structure predictions.
A persistent issue is the band-gap problem: common functionals systematically underestimate fundamental gaps in insulators and semiconductors. This stems from the derivative discontinuity of the exact exchange–correlation functional, which approximate functionals do not capture completely. As a result, researchers compare not only total energies but also excitation energies and optical properties with caution, often turning to time-dependent methods (Time-dependent density functional theory or TDDFT) for excited-state descriptions.
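Written out, the statement behind the band-gap problem is the standard decomposition of the fundamental gap into the Kohn–Sham eigenvalue gap plus the derivative discontinuity Δ_xc; the LaTeX sketch below uses the total energies E(N) of the N-, (N−1)-, and (N+1)-electron systems and the Kohn–Sham eigenvalues ε_i(N) of the N-electron system.

```latex
% Fundamental gap versus Kohn-Sham eigenvalue gap (schematic):
% approximate functionals with a vanishing derivative discontinuity
% therefore tend to underestimate E_gap.
\begin{align}
  E_{\text{gap}} &= I - A
     = \bigl[E(N-1) - E(N)\bigr] - \bigl[E(N) - E(N+1)\bigr] \\
                 &= \underbrace{\varepsilon_{N+1}(N) - \varepsilon_{N}(N)}_{\text{Kohn--Sham eigenvalue gap}}
                    + \underbrace{\Delta_{xc}}_{\text{derivative discontinuity}}
\end{align}
```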
There is also ongoing discussion about the balance between accuracy and computational cost, especially for large systems or high-throughput screening efforts. Hybrid functionals improve accuracy in many cases but at a greater computational expense. For systems where dispersion forces play a dominant role, including van der Waals corrections is essential, and new nonlocal functionals are continually being developed.
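To give a flavor of how pairwise dispersion corrections work, the sketch below adds a damped −C6/R^6 term over atom pairs in the spirit of DFT-D; the combining rules, damping form, and numerical parameters are illustrative placeholders and do not reproduce any specific published parametrization (D2, D3, etc.).

```python
import numpy as np

def pairwise_dispersion(coords, c6, r0, s6=1.0, d=20.0):
    """Schematic DFT-D-style two-body dispersion correction (atomic units):
        E_disp = -s6 * sum_{i<j} C6_ij / R_ij^6 * f_damp(R_ij)
    with a Fermi-type damping f_damp(R) = 1 / (1 + exp(-d (R/R0_ij - 1)))
    that switches the correction off at short range."""
    e = 0.0
    n_atoms = len(coords)
    for i in range(n_atoms):
        for j in range(i + 1, n_atoms):
            r = np.linalg.norm(coords[i] - coords[j])
            c6_ij = np.sqrt(c6[i] * c6[j])        # pair C6, geometric mean
            r0_ij = 0.5 * (r0[i] + r0[j])         # pair radius (placeholder rule)
            f_damp = 1.0 / (1.0 + np.exp(-d * (r / r0_ij - 1.0)))
            e += -s6 * c6_ij / r**6 * f_damp
    return e

# Toy usage: two identical atoms 7 bohr apart with placeholder parameters
coords = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 7.0]])
print(pairwise_dispersion(coords, c6=np.array([40.0, 40.0]),
                          r0=np.array([3.0, 3.0])))
```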
Applications and impact
Kohn–Sham density functional theory has become the workhorse of computational chemistry and solid-state physics. Its scope includes:
- predicting molecular geometries, vibrational spectra, and reaction energetics in chemistry,
- exploring catalytic surfaces, adsorption phenomena, and reaction mechanisms,
- modeling electronic band structures, densities of states, and magnetic properties in materials science,
- investigating defects, surfaces, and interfaces in semiconductors and oxide materials,
- guiding materials design for energy storage, electronics, and photonics.
The method's widespread adoption is reflected in the breadth of software packages and community resources, as well as the extensive literature on functional development and practical benchmarks. The Kohn–Sham framework continues to evolve, integrating advances from many-body theory and machine learning to extend its applicability and reliability.