Coulomb (unit)
The coulomb (symbol: C) is the SI unit of electric charge. It expresses how much charge is moved, stored, or held by physical systems in a wide range of contexts, from the flow of current in a wire to the charge stored on a capacitor's plates. By definition, one coulomb is the amount of charge transported by a constant current of one ampere in one second. In symbols, Q = I·t: charge is the product of current and time.
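As a quick illustration of this relation, the short Python sketch below computes the charge transferred by a constant current. The function name and the example values (a 2 A current flowing for 3 s) are illustrative choices, not from the article.

```python
def charge_from_constant_current(current_amperes: float, time_seconds: float) -> float:
    """Charge in coulombs moved by a constant current: Q = I * t."""
    return current_amperes * time_seconds

# A constant 2 A current flowing for 3 s transfers 6 C of charge.
print(charge_from_constant_current(2.0, 3.0))  # 6.0
```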
This unit is named after Charles-Augustin de Coulomb, a French physicist whose work in electrostatics laid the foundation for understanding how charges interact. His formulation of the inverse-square dependence of the force between charges, now known as Coulomb's law, helped establish the fundamental idea that charge comes in discrete, interacting quantities. The term coulomb honors his contributions to the science of electricity and magnetism; Coulomb's law and related concepts remain central to the study of electrostatics and electromagnetism.
The coulomb is a derived unit in the International System of Units (SI): it is built from the base units for current and time. The modern SI framework, finalized in 2019, defines the ampere by fixing the numerical value of the elementary charge e at exactly 1.602176634×10^-19 coulombs. Since the coulomb equals an ampere times a second, this fixes the coulomb in terms of fundamental constants and ensures consistency across measurements and standards. In practice, this means the coulomb remains the natural unit for describing how much charge is present or moved in a given process, while the underlying current and time standards are anchored to fundamental constants.
History and definition
Historical background
In the 18th and 19th centuries, researchers measured the forces between charges and developed the quantitative description now known as Coulomb's law. This work demonstrated that electric charges exert forces on one another that diminish with the square of the distance between them, a cornerstone of electrostatics and the broader theory of electromagnetism.
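The inverse-square relationship referred to here is Coulomb's law, F = k·q1·q2/r^2, with k ≈ 8.988×10^9 N·m^2/C^2. The sketch below, a minimal illustration with made-up values (two 1 μC charges 1 m apart), evaluates that formula.

```python
COULOMB_CONSTANT = 8.9875517923e9  # k, in N*m^2/C^2

def coulomb_force(q1_coulombs: float, q2_coulombs: float, distance_m: float) -> float:
    """Magnitude of the electrostatic force between two point charges: F = k*q1*q2/r^2."""
    return COULOMB_CONSTANT * abs(q1_coulombs * q2_coulombs) / distance_m**2

# Two 1 microcoulomb charges 1 m apart repel with roughly 9 millinewtons of force.
print(coulomb_force(1e-6, 1e-6, 1.0))  # ~8.99e-3 N
```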
As electrical science matured, practical systems required a standardized way to express how much charge was involved in processes such as electrolysis, electroplating, and signal transmission. The unit that would become the coulomb emerged as a convenient measure of charge, tying together instantaneous current and the duration over which charge moves in a circuit.
Modern definition
With the 2019 redefinition of SI units, the ampere, previously defined through an idealized measurement of the force between parallel current-carrying conductors, was redefined by fixing the value of the elementary charge e. Concretely, 1 ampere equals 1 coulomb per second, and the coulomb remains the product of current and time: 1 C = 1 A·s. This change anchored the coulomb to immutable natural constants and reinforced its role as a practical measure of charge in both laboratory and engineering settings.
Measurement and magnitude
A coulomb represents a substantial amount of charge on macroscopic objects. For perspective, the charge carried by a single electron is about −1.602×10^-19 coulombs, so an enormous number of electrons is involved whenever a current flows or charge resides on a capacitor plate. In practical terms, charges are commonly discussed in fractions of a coulomb, such as microcoulombs (μC) and nanocoulombs (nC), to reflect the scale of everyday devices and experiments.
For a constant current, the amount of charge moved in a process is the product of the current and the duration, Q = I·t; for a time-varying current, it is the integral Q = ∫ I dt. This relationship makes the coulomb central both to the analysis of circuits and to the study of electrochemical reactions, where the amount of substance transformed is linked to the total charge via the Faraday constant: about 96485 C per mole of electrons. This connection provides a bridge between electrical measurements and chemical quantities.
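To make the electrochemical link concrete, the sketch below converts a total charge into moles of electrons via the Faraday constant. The scenario (a 0.5 A current flowing for one hour) is an assumed example, not taken from the article.

```python
FARADAY_CONSTANT = 96485.332  # C per mole of electrons

def moles_of_electrons(charge_coulombs: float) -> float:
    """Moles of electrons corresponding to a total charge: n = Q / F."""
    return charge_coulombs / FARADAY_CONSTANT

# 0.5 A flowing for one hour moves 1800 C, about 0.0187 mol of electrons.
total_charge = 0.5 * 3600  # Q = I * t, in coulombs
print(moles_of_electrons(total_charge))  # ~0.01866
```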
The elementary charge also anchors intuition about magnitudes: approximately 6.241×10^18 elementary charges make up one coulomb. Because charge is quantized, the coulomb serves as a natural and practical unit for macroscopic systems, while the individual electron charge remains the fundamental discrete unit at the microscopic level.
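That figure follows directly from the exact SI value of the elementary charge; a one-line check:

```python
ELEMENTARY_CHARGE = 1.602176634e-19  # coulombs, exact by the 2019 SI definition

# Number of elementary charges that add up to one coulomb: 1 / e.
charges_per_coulomb = 1.0 / ELEMENTARY_CHARGE
print(f"{charges_per_coulomb:.4e}")  # 6.2415e+18
```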
Units, practice, and connections
In practice, the coulomb is encountered across several contexts:
- In electronics and circuits, current is measured in amperes, and the total charge passed over a time interval is given in coulombs.
- In electrochemistry and electroplating, the quantity of material deposited or dissolved is related to the total charge moved, using the Faraday constant.
- In capacitors and energy storage, the stored charge on the plates is quantified in coulombs, with the voltage across the plates reflecting energy relationships tied to capacitance and charge (see the sketch after this list).
- In electromagnetism and field theory, the concept of charge is fundamental to Gauss's law and Coulomb's law, which describe how charges influence electric fields and forces; these ideas underpin modern technology ranging from motors to sensors.
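The capacitor relation mentioned above is Q = C·V: stored charge equals capacitance times voltage. A minimal sketch, with an assumed 100 μF capacitor charged to 5 V:

```python
def capacitor_charge(capacitance_farads: float, voltage_volts: float) -> float:
    """Charge stored on a capacitor: Q = C * V, in coulombs."""
    return capacitance_farads * voltage_volts

# A 100 microfarad capacitor charged to 5 V holds 5e-4 C (500 microcoulombs).
print(capacitor_charge(100e-6, 5.0))  # 0.0005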
Common prefixes extend the coulomb to practical scales:
- microcoulomb (μC) for intermediate charges
- nanocoulomb (nC) for smaller-scale measurements
- picocoulomb (pC) for very small charges in precision experiments
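As a small convenience sketch, the dictionary below maps the prefixes listed above to their scale factors and converts between them; the unit labels and helper name are illustrative.

```python
# Scale factors for the coulomb and its submultiples listed above.
PREFIX_FACTORS = {
    "C": 1.0,     # coulomb
    "uC": 1e-6,   # microcoulomb
    "nC": 1e-9,   # nanocoulomb
    "pC": 1e-12,  # picocoulomb
}

def convert_charge(value: float, from_unit: str, to_unit: str) -> float:
    """Convert a charge between coulomb-based units using the factors above."""
    return value * PREFIX_FACTORS[from_unit] / PREFIX_FACTORS[to_unit]

# 2500 nC expressed in microcoulombs.
print(convert_charge(2500, "nC", "uC"))  # 2.5
```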