The Modern Definition of a Meter
The meter, the fundamental unit of length in the International System of Units (SI), is currently defined as the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second. This definition was adopted in 1983 by the General Conference on Weights and Measures (CGPM) and fixes the speed of light in vacuum at exactly 299,792,458 meters per second.
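To make the arithmetic behind the definition concrete, the short sketch below (in Python, using exact rational arithmetic; the variable names are illustrative and not part of any standard) checks that light travelling at the defined speed for 1/299,792,458 of a second covers exactly one meter.

    from fractions import Fraction

    # Speed of light in vacuum, exact by definition (m/s)
    C = Fraction(299_792_458)

    # Time interval from the SI definition of the meter (s)
    t = Fraction(1, 299_792_458)

    # Distance = speed * time; exact rational arithmetic avoids rounding
    d = C * t
    assert d == 1  # exactly one meter, by construction
    print(f"Light travels {d} m in 1/299,792,458 s")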
Why This Definition Was Chosen
Prior to 1983, the meter was defined by physical artifacts or by specific wavelengths of light, both of which had inherent limits in precision and reproducibility. The current definition ties the meter to a fundamental constant of nature, the speed of light in vacuum (c), which is the same everywhere and at all times. This gives the standard extreme precision, stability, and reproducibility across laboratories worldwide and removes any reliance on physical objects that can degrade or be lost.
Historical Evolution of the Meter
Originally, in 1793, the meter was defined as one ten-millionth of the distance from the North Pole to the Equator along the meridian passing through Paris. In 1889, a physical platinum-iridium bar was established as the international prototype meter. In 1960, the definition was changed to a specified number of wavelengths of a particular emission line of the krypton-86 atom, before the final transition in 1983 to the current speed-of-light definition.
Practical Implications and Applications
This light-based definition allows scientists and engineers to determine the length of a meter anywhere, anytime, with unprecedented accuracy, primarily using precise time and frequency measurements. It is crucial for advanced technologies such as Global Positioning Systems (GPS), laser ranging, and high-precision manufacturing, where exact and consistent length measurements are paramount for functionality and reliability.
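As a simple illustration of how a time measurement becomes a length, the sketch below applies the time-of-flight principle used in laser ranging: a light pulse's measured round-trip time is halved and multiplied by the defined speed of light. The 66.7 ns round-trip time is a hypothetical example value, not data from any real instrument.

    # Speed of light in vacuum (m/s), exact by the SI definition of the meter
    C = 299_792_458

    def one_way_distance(round_trip_seconds: float) -> float:
        """Convert a measured round-trip time of a light pulse into a
        one-way distance: d = c * t / 2."""
        return C * round_trip_seconds / 2

    # Hypothetical measurement: a reflected pulse returns after ~66.7 ns,
    # which corresponds to a target roughly 10 m away.
    t_round_trip = 66.7e-9
    print(f"Estimated distance: {one_way_distance(t_round_trip):.3f} m")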