|To find||From||Formula|
|Fahrenheit||Celsius||°F = (°C × 1.8) + 32|
|Celsius||Fahrenheit||°C = (°F − 32) ÷ 1.8|
|kelvin||Celsius||K = °C + 273.15|
|Celsius||kelvin||°C = K − 273.15|
For temperature intervals rather than specific temperatures:
1 °C = 1 K
1 °C = 1.8 °F
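These relations can be sketched in Python (the function names are illustrative, not from any standard library). The key distinction is that converting an absolute temperature to Fahrenheit requires the +32 offset, while converting a temperature interval uses only the 1.8 scale factor:

```python
def c_to_f(temp_c):
    """Convert an absolute Celsius temperature to Fahrenheit."""
    return temp_c * 1.8 + 32

def c_to_f_interval(delta_c):
    """Convert a Celsius temperature *interval* to a Fahrenheit interval."""
    return delta_c * 1.8  # no +32 offset: intervals share only the scale factor

def c_to_k(temp_c):
    """Convert an absolute Celsius temperature to kelvins."""
    return temp_c + 273.15  # one Celsius degree equals one kelvin

print(c_to_f(100.0))           # 212.0 (boiling point, absolute temperature)
print(c_to_f_interval(100.0))  # 180.0 (the 0-100 °C span as an interval)
print(c_to_k(0.0))             # 273.15
```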
Celsius (or centigrade) is a temperature scale named after the Swedish astronomer Anders Celsius (1701–1744), who first proposed such a system two years before his death. The term degrees Celsius (symbol: °C) refers to a specific temperature on the Celsius temperature scale. The degree Celsius is also a unit increment of temperature for use in indicating a temperature interval (a difference between two temperatures).
The Celsius scale has been adopted as a standard for regular temperature measurements by most countries of the world and by the entire scientific community. In the United States, however, the Celsius scale is used mainly by scientists and many engineers (especially in high-tech fields), while the Fahrenheit scale is commonly used by the lay public and by people in government, industry, and meteorology.
Until 1954, 0 °C on the Celsius scale was defined as the melting point of ice and 100 °C was the boiling point of water under a pressure of one standard atmosphere; this simplified definition is still commonly taught in schools. However, the unit “degree Celsius” and the Celsius scale are now, by international agreement, defined by two points: absolute zero and the triple point of specially prepared water (Vienna Standard Mean Ocean Water, or VSMOW).
Absolute zero—the temperature at which nothing could be colder and no heat energy remains in a substance—is defined as being precisely 0 K and −273.15 °C. The temperature of the triple point of water is defined as being precisely 273.16 K and 0.01 °C.
This definition fixes the magnitude of both the degree Celsius and the kelvin as precisely 1/273.16 of the difference between absolute zero and the triple point of water. Thus, it sets the magnitude of one degree Celsius and one kelvin as exactly equal. In addition, it establishes the difference between the two scales’ null points as precisely 273.15 degrees (−273.15 °C = 0 K and 0.01 °C = 273.16 K).
For an exact conversion between Fahrenheit and Celsius, the following formulas can be applied. Here, f is the value in Fahrenheit and c the value in Celsius:
f = (c × 9/5) + 32
c = (f − 32) × 5/9
The following pair is also exact, making use of the identity −40 °F = −40 °C. Again, f is the value in Fahrenheit and c the value in Celsius:
f = (c + 40) × 9/5 − 40
c = (f + 40) × 5/9 − 40
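Both pairs of formulas can be checked with a short Python sketch (function names are illustrative). The shifted form folds the +32 offset into a single ±40 shift by exploiting the point where the two scales coincide:

```python
def f_from_c(c):
    """Standard exact form: Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

def c_from_f(f):
    """Standard exact form: Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

def f_from_c_shift(c):
    """Shifted form using the identity -40 °F = -40 °C."""
    return (c + 40) * 9 / 5 - 40

def c_from_f_shift(f):
    """Shifted form, Fahrenheit to Celsius."""
    return (f + 40) * 5 / 9 - 40

print(f_from_c(100))        # 212.0
print(f_from_c_shift(100))  # 212.0 -- both forms agree
print(f_from_c_shift(-40))  # -40.0 -- the scales coincide here
```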
Some key temperatures relating the Celsius scale to other temperature scales are shown in the table below.
|Key temperature||kelvin||Celsius||Fahrenheit|
|Absolute zero (precise, by definition)||0 K||−273.15 °C||−459.67 °F|
|Melting point of ice||273.15 K||0 °C||32 °F|
|Water’s triple point (precise, by definition)||273.16 K||0.01 °C||32.018 °F|
|Water’s boiling point A||373.1339 K||99.9839 °C||211.9710 °F|
A For Vienna Standard Mean Ocean Water at a pressure of one standard atmosphere (101.325 kPa) when calibrated solely per the two-point definition of thermodynamic temperature. Older definitions of the Celsius scale once defined the boiling point of water under one standard atmosphere as being precisely 100 °C. However, the current definition results in a boiling point that is actually 16.1 mK less. For more about the actual boiling point of water, see The melting and boiling points of water below.
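As a rough consistency check, the Celsius and Fahrenheit columns of the table above follow from the kelvin column via the conversion formulas given earlier; a minimal Python sketch (helper names are illustrative):

```python
def k_to_c(k):
    """Kelvin to Celsius: shift by the 273.15-degree offset between null points."""
    return k - 273.15

def c_to_f(c):
    """Celsius to Fahrenheit, standard exact form."""
    return c * 9 / 5 + 32

# Kelvin values from the table of key temperatures
rows = {
    "absolute zero":       0.0,
    "ice melting point":   273.15,
    "water triple point":  273.16,
    "VSMOW boiling point": 373.1339,
}

for name, kelvin in rows.items():
    celsius = k_to_c(kelvin)
    print(f"{name}: {kelvin} K = {celsius:.4f} °C = {c_to_f(celsius):.4f} °F")
```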
In 1742, Anders Celsius created a “backwards” version of the modern Celsius temperature scale, using zero to represent the boiling point of water and 100 to represent the melting point of ice. In his paper Observations of two persistent degrees on a thermometer, he recounted his experiments showing that ice’s melting point was effectively unaffected by pressure. He also determined with remarkable precision how water’s boiling point varied as a function of atmospheric pressure. He proposed that zero on his temperature scale (water’s boiling point) would be calibrated at the mean barometric pressure at mean sea level. This pressure is known as one standard atmosphere. In 1954, Resolution 4 of the 10th CGPM (the General Conference on Weights and Measures) established internationally that one standard atmosphere was a pressure equivalent to 1,013,250 dynes per cm2 (101.325 kPa).
In 1744, coincident with the death of Anders Celsius, the famous botanist Carolus Linnaeus (1707–1778) effectively reversed Celsius’s scale upon receipt of his first thermometer featuring a scale where zero represented the melting point of ice and 100 represented water’s boiling point. His custom-made “Linnaeus thermometer,” for use in his greenhouses, was made by Daniel Ekström, Sweden’s leading maker of scientific instruments at the time. As often happened in this age before modern communications, numerous physicists, scientists, and instrument makers are credited with having independently developed this same scale; among them were Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences (which had an instrument workshop), with whom Linnaeus had been corresponding; Jean-Pierre Christin of Lyons; Daniel Ekström, the Swedish instrument maker; and Mårten Strömer (1707–1770), who had studied astronomy under Anders Celsius.
The first known document reporting temperatures in this modern “forward” Celsius scale is the paper Hortus Upsaliensis dated 16 December 1745 that Linnaeus wrote to his student, Samuel Nauclér. In it, Linnaeus recounted the temperatures inside the orangery at the Botanical Garden of Uppsala University:
For the next 204 years, the scientific and thermometry communities worldwide referred to this scale as the “centigrade scale.” Temperatures on the centigrade scale were often reported simply as “degrees” or, when greater specificity was desired, “degrees centigrade.” The symbol for temperature values on this scale was °C (in several formats over the years). Because the term “centigrade” was also the French-language name for a unit of angular measurement (one-hundredth of a right angle) and had a similar connotation in other languages, the term “centesimal degree” was used when very precise, unambiguous language was required by international standards bodies such as the Bureau international des poids et mesures (BIPM). The 9th CGPM (Conférence générale des poids et mesures) and the CIPM (Comité international des poids et mesures) formally adopted “degree Celsius” (symbol: °C) in 1948. For lay-people worldwide—including school textbooks—the full transition from centigrade to Celsius required nearly two decades after this formal adoption.
The term “degrees Celsius” can be used in two distinct ways: (a) to express temperature measurements, and (b) to express temperature intervals, that is, differences between temperatures or uncertainties in temperature measurements. Examples of the first case: “Gallium melts at 29.7646 °C” or “The temperature outside is 23 degrees Celsius.” Examples of the second case: “This heat exchanger has an output that is hotter by 40 degrees Celsius” or “The standard uncertainty in the measurement of this temperature is ±3 °C.”
Given this dual usage, one must not rely upon the unit name or its symbol to denote that a quantity is a temperature interval; it must be unambiguous through context or explicit statement that the quantity is an interval.
The temperature interval of one degree Celsius is the same as that of one kelvin. For this reason, in science (especially) and engineering, the Celsius and Kelvin scales are often used simultaneously in the same article (for example: “…its measured value was 0.01023 °C with an uncertainty of 70 µK…”). Notwithstanding the official endorsements of Resolution 3 of the 13th CGPM (1967/68) and Resolution 7 of the 9th CGPM (1948), the practice of simultaneously using both “°C” and “K” remains widespread throughout the technical world, as the use of SI prefixed forms such as “µ°C” or “millidegrees Celsius” to express a temperature interval has not been well-adopted.
The effect of defining the Celsius scale at the triple point of VSMOW water (273.16 kelvins and 0.01 °C), and at absolute zero (zero kelvin and −273.15 °C), is that the melting and boiling points of water under a pressure of one standard atmosphere (1013.25 mbar) are no longer the defining points for the Celsius scale. In 1948, when the 9th General Conference on Weights and Measures (CGPM) in Resolution 3 first considered using the triple point of water as a defining point, the triple point was so close to being 0.01 °C greater than water’s known melting point that it was simply defined as precisely 0.01 °C. However, current measurements show that the triple and melting points of VSMOW water are actually very slightly (<0.001 °C) greater than 0.01 °C apart. Thus, the actual melting point of ice is very slightly (less than a thousandth of a degree) below 0 °C. Also, defining water’s triple point at 273.16 K precisely defined the magnitude of each 1 °C increment in terms of the absolute thermodynamic temperature scale (referencing absolute zero).
Now decoupled from the actual boiling point of water, the value “100 °C” is hotter than 0 °C—in absolute terms—by a factor of precisely 373.15/273.15 (approximately 36.61% thermodynamically hotter). When adhering strictly to the two-point definition for calibration, the boiling point of VSMOW water under one standard atmosphere of pressure is actually 373.1339 K (99.9839 °C). When calibrated to ITS-90 (a calibration standard comprising many definition points and commonly used for high-precision instrumentation), the boiling point of VSMOW water is slightly less, about 99.974 °C.
This boiling-point difference of 16.1 millikelvins (thousandths of a degree Celsius) between the Celsius scale’s original definition and the current one (based on absolute zero and the triple point) has little practical meaning in real life, because the boiling point of water is extremely sensitive to variations in barometric pressure. For example, an altitude change of only 28 cm (11 inches) alters this boiling point by one millikelvin.
The “degree Celsius” is the only SI unit that has an uppercase letter in its full unit name in English.
The word “degree” may be abbreviated as “deg.” Accordingly, the degree Celsius may be written out as “degree Celsius” (plural: “degrees Celsius”) or abbreviated as “deg Celsius” or “deg C.”
As with most other unit symbols and all the temperature symbols, a space is placed between the numeric value and the °C symbol; e.g., “23 °C” (not “23°C” or “23° C”). Only the unit symbols for angles are placed immediately after the numeric value without an intervening space; e.g., “a 90° turn”.
Unicode, an industry standard designed to allow text and symbols from all of the world’s writing systems to be consistently represented and manipulated by computers, includes a special “℃” character at code point U+2103 (DEGREE CELSIUS). One types &#x2103; when encoding this special character in a Web page. Its appearance is similar to that obtained by typing its two components (° and C) one after the other: ℃ versus °C. Depending on the operating system, Web browser, and the default font, the “C” in the Unicode character may be narrower and slightly taller than a plain uppercase C; precisely the opposite may be true on other platforms. However, there is usually a discernible difference between the two.
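The distinction can be made concrete with Python’s standard unicodedata module: the single DEGREE CELSIUS character occupies one code point, while the two-component version is DEGREE SIGN followed by an ordinary letter C:

```python
import unicodedata

single = "\u2103"     # the dedicated DEGREE CELSIUS character: one code point
composed = "\u00b0C"  # DEGREE SIGN + plain "C": two code points

print(unicodedata.name(single))       # DEGREE CELSIUS
print(unicodedata.name(composed[0]))  # DEGREE SIGN
print(len(single), len(composed))     # 1 2 -- visually similar, not equal
```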
The name “Celsius” was chosen over “centigrade” for several reasons:
1) All common temperature scales would have their units named after someone closely associated with them; namely, Kelvin, Celsius, Fahrenheit, Réaumur, and Rankine.
2) Notwithstanding the important contribution of Linnaeus, who gave the Celsius scale its modern form, Celsius’s name was the obvious choice because it began with the letter C. Thus, the symbol °C that for centuries had been used in association with the name centigrade could continue to be used and would simultaneously inherit an intuitive association with the new name.
3) The new name eliminated the ambiguity of the term “centigrade,” freeing it to refer exclusively to the French-language name for the unit of angular measurement.
New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license, which can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation.
Note: Some restrictions may apply to use of individual images which are separately licensed.