The Celsius scale is based on the properties of water. It uses two anchor points: 0 degrees at the freezing point of water and 100 degrees at the boiling point, dividing the range between them into 100 equal steps. That straightforward design is why it was originally called “centigrade,” from the Latin for “hundred steps.” Since 2019, Celsius has been formally tied to a fundamental physical constant rather than water alone, but water remains the practical foundation most people recognize.
The Original 1742 Scale Was Upside Down
Swedish astronomer Anders Celsius proposed his temperature scale in 1742, but it looked nothing like the version we use today. He assigned 100 degrees to the freezing point of water and 0 degrees to the boiling point. The scale ran in the opposite direction from what feels intuitive: higher numbers meant colder temperatures.
After Celsius died in 1744, other scientists flipped the scale. Swedish botanist Carolus Linnaeus is generally credited as the first to reverse the values, placing 0 at freezing and 100 at boiling. That reversed version caught on across Europe and became the standard.
Why the Name Changed From Centigrade
For most of its history, the scale was called “centigrade.” In 1948, the 9th General Conference on Weights and Measures, representing 33 nations, officially adopted the name “Celsius” instead. The change avoided confusion with a unit used in some countries to measure angles (also called a “grade” or “gradian”), and it honored the astronomer who started it all.
How Celsius Relates to Kelvin
One degree Celsius is exactly the same size as one kelvin. The only difference is where zero sits. The Kelvin scale starts at absolute zero, the theoretical temperature where all molecular motion stops. That point falls at minus 273.15 degrees Celsius. So converting between the two is simple: add 273.15 to any Celsius temperature to get kelvins, or subtract 273.15 to go the other direction. Water freezes at 273.15 K and boils at 373.15 K.
This relationship matters because the kelvin is the official base unit for temperature in the International System of Units (SI). Celsius is technically defined through the kelvin: a Celsius reading is just the kelvin value minus 273.15. So any change to how the kelvin is defined automatically changes the technical foundation of Celsius, too.
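The two conversions described above can be sketched in a few lines of Python (the helper names here are illustrative, not from any standard library):

```python
def celsius_to_kelvin(c):
    """Convert a Celsius temperature to kelvins: add 273.15."""
    return c + 273.15

def kelvin_to_celsius(k):
    """Convert a kelvin temperature to degrees Celsius: subtract 273.15."""
    return k - 273.15

# The anchor points mentioned above:
print(celsius_to_kelvin(0))      # water freezes: 273.15
print(celsius_to_kelvin(100))    # water boils: 373.15
print(kelvin_to_celsius(0))      # absolute zero: -273.15
```

Because the degree sizes are identical, the conversion is a pure offset with no scaling factor.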
The 2019 Redefinition
Until 2019, the kelvin (and therefore Celsius) was formally defined by a single physical event: the triple point of water, the precise temperature and pressure at which ice, liquid water, and water vapor all coexist. That point sits at 0.01 degrees Celsius. International standards bodies set it as exactly 273.16 kelvins, and every other temperature measurement traced back to it.
In 2019, the international measurement community redefined the kelvin by locking in a fixed value for a fundamental physical constant called the Boltzmann constant, which links temperature to energy at the molecular level. The practical effect for everyday thermometers is zero: water still freezes at 0°C and boils at 100°C. But scientifically, the scale no longer depends on one particular property of one particular substance. It’s anchored to a universal constant of nature, making it more stable and precise at extreme temperatures far from where water exists as a liquid.
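To make "links temperature to energy" concrete: the product of the Boltzmann constant and an absolute temperature gives a characteristic thermal energy per molecule. The constant's value below is the exact figure fixed by the 2019 redefinition; the helper function is just an illustration.

```python
# Exact fixed value of the Boltzmann constant since the 2019
# SI redefinition, in joules per kelvin:
K_B = 1.380649e-23

def thermal_energy(kelvin):
    """Characteristic thermal energy kT at an absolute temperature."""
    return K_B * kelvin

# kT at the old defining point, the triple point of water (273.16 K):
print(thermal_energy(273.16))   # roughly 3.77e-21 joules
```

Defining the kelvin this way means a temperature measurement is, at bottom, an energy measurement, with no reference to water required.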
Why Water, Specifically
Celsius chose water because it’s everywhere, and its phase changes are easy to observe and reproduce. Freezing and boiling happen at consistent, predictable temperatures under standard atmospheric pressure (101,325 pascals, or about the pressure at sea level). That made water-based calibration practical for laboratories around the world long before modern precision instruments existed.
Even the water used for scientific calibration is carefully specified. The international standard is called Vienna Standard Mean Ocean Water (VSMOW), a reference sample with a defined isotopic composition. Water molecules contain slightly different versions of hydrogen and oxygen atoms, and those differences subtly affect physical properties like freezing and boiling points. VSMOW ensures that when scientists in different countries calibrate instruments against “water,” they mean exactly the same thing.
How Thermometers Are Calibrated Today
In practice, precise thermometers aren’t checked against just two points. The International Temperature Scale of 1990 (ITS-90), still the working standard in laboratories, uses 17 fixed points spanning from extremely cold to extremely hot. These are temperatures at which specific pure substances freeze, melt, or reach their triple point under controlled conditions.
A few examples give a sense of the range:
- Triple point of hydrogen: minus 259.35°C
- Triple point of mercury: minus 38.83°C
- Triple point of water: 0.01°C
- Melting point of gallium: 29.76°C
- Freezing point of zinc: 419.53°C
- Freezing point of gold: 1,064.18°C
- Freezing point of copper: 1,084.62°C
Between these fixed points, calibration uses platinum resistance thermometers (for lower and mid-range temperatures) and radiation-based measurements (above the freezing point of silver, 961.78°C). For everyday thermometers, none of this complexity matters. But it’s the infrastructure that makes Celsius readings trustworthy and consistent worldwide.
Celsius vs. Fahrenheit
Fahrenheit, used primarily in the United States, was developed earlier and is based on a different set of reference points. Its degrees are also smaller: a change of one degree Celsius corresponds to a change of 1.8 degrees Fahrenheit. To convert Celsius to Fahrenheit, multiply by 1.8 and add 32. Some common equivalents worth knowing: 0°C is 32°F, 37°C (normal body temperature) is 98.6°F, and 100°C is 212°F.
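The conversion in both directions can be sketched in Python (illustrative helper names, not from any standard library):

```python
def celsius_to_fahrenheit(c):
    """Convert Celsius to Fahrenheit: multiply by 1.8, then add 32."""
    return c * 1.8 + 32

def fahrenheit_to_celsius(f):
    """Invert the formula: subtract 32, then divide by 1.8."""
    return (f - 32) / 1.8

# The common equivalents noted above:
print(celsius_to_fahrenheit(0))              # 32.0
print(round(celsius_to_fahrenheit(37), 1))   # 98.6
print(celsius_to_fahrenheit(100))            # 212.0
```

Unlike the Kelvin relationship, this conversion needs both a scale factor (1.8) and an offset (32), because the two scales differ in both degree size and zero point.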
Celsius is the standard in science and in most countries worldwide because its 0-to-100 framework aligns neatly with the metric system and makes calculations simpler. The scale’s logic is easy to grasp: 0 is when water freezes, 100 is when it boils, and everything else falls relative to those two familiar events.

