Unit of measurement for temperature
From Wikipedia, the free encyclopedia
The degree Celsius is the unit of temperature on the Celsius temperature scale[1] (originally known as the centigrade scale outside Sweden),[2] one of two temperature scales used in the International System of Units (SI), the other being the closely related Kelvin scale. The degree Celsius (symbol: °C) can refer to a specific point on the Celsius temperature scale or to a difference or range between two temperatures. It is named after the Swedish astronomer Anders Celsius (1701–1744), who proposed the first version of it in 1742. The unit was called centigrade in several languages (from the Latin centum, which means 100, and gradus, which means steps) for many years. In 1948, the International Committee for Weights and Measures[3] renamed it to honor Celsius and also to remove confusion with the term for one hundredth of a gradian in some languages. Most countries use this scale (the Fahrenheit scale is still used in the United States, some island territories, and Liberia).
| degree Celsius | |
|---|---|
| General information | |
| Unit system | SI |
| Unit of | temperature |
| Symbol | °C |
| Named after | Anders Celsius |
| Conversions | |
| x °C in ... | ... corresponds to ... |
| SI base units | (x + 273.15) K |
| Imperial/US units | (9/5 x + 32) °F |
Throughout the 19th century, the scale was based on 0 °C for the freezing point of water and 100 °C for the boiling point of water at 1 atm pressure. (In Celsius's initial proposal, the values were reversed: the boiling point was 0 degrees and the freezing point was 100 degrees.)
Between 1954 and 2019, the precise definitions of the unit degree Celsius and the Celsius temperature scale used absolute zero and the triple point of water. Since 2007, the Celsius temperature scale has been defined in terms of the kelvin, the SI base unit of thermodynamic temperature (symbol: K). Absolute zero, the lowest temperature, is now defined as being exactly 0 K and −273.15 °C.[4]
In 1742, Swedish astronomer Anders Celsius (1701–1744) created a temperature scale that was the reverse of the scale now known as "Celsius": 0 represented the boiling point of water, while 100 represented the freezing point of water.[5] In his paper Observations of two persistent degrees on a thermometer, he recounted his experiments showing that the melting point of ice is essentially unaffected by pressure. He also determined with remarkable precision how the boiling point of water varied as a function of atmospheric pressure. He proposed that the zero point of his temperature scale, being the boiling point, would be calibrated at the mean barometric pressure at mean sea level. This pressure is known as one standard atmosphere. The BIPM's 10th General Conference on Weights and Measures (CGPM) in 1954 defined one standard atmosphere to equal precisely 1,013,250 dynes per square centimeter (101.325 kPa).[6]
In 1743, the Lyonnais physicist Jean-Pierre Christin, permanent secretary of the Academy of Lyon, inverted the Celsius temperature scale so that 0 represented the freezing point of water and 100 represented the boiling point of water. Some credit Christin for independently inventing the reverse of Celsius's original scale, while others believe Christin merely reversed Celsius's scale.[7][8] On 19 May 1743 he published the design of a mercury thermometer, the "Thermometer of Lyon" built by the craftsman Pierre Casati that used this scale.[9][10][11]
In 1744, coincident with the death of Anders Celsius, the Swedish botanist Carl Linnaeus (1707–1778) reversed Celsius's scale.[12] His custom-made "Linnaeus-thermometer", for use in his greenhouses, was made by Daniel Ekström, Sweden's leading maker of scientific instruments at the time, whose workshop was located in the basement of the Stockholm observatory. As often happened in this age before modern communications, numerous physicists, scientists, and instrument makers are credited with having independently developed this same scale;[13] among them were Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences (which had an instrument workshop) and with whom Linnaeus had been corresponding; Daniel Ekström, the instrument maker; and Mårten Strömer (1707–1770), who had studied astronomy under Anders Celsius.
The first known Swedish document[14] reporting temperatures in this modern "forward" Celsius temperature scale is the paper Hortus Upsaliensis dated 16 December 1745 that Linnaeus wrote to a student of his, Samuel Nauclér. In it, Linnaeus recounted the temperatures inside the orangery at the University of Uppsala Botanical Garden:
... since the caldarium (the hot part of the greenhouse) by the angle of the windows, merely from the rays of the sun, obtains such heat that the thermometer often reaches 30 degrees, although the keen gardener usually takes care not to let it rise to more than 20 to 25 degrees, and in winter not under 15 degrees ...
From the 19th century onward, the scientific and thermometry communities worldwide used the phrase "centigrade scale", and temperatures were often reported simply as "degrees" or, when greater specificity was desired, as "degrees centigrade", with the symbol °C.
In the French language, the term centigrade also means one hundredth of a gradian, when used for angular measurement. The term centesimal degree was later introduced for temperatures[15] but was also problematic, as it means gradian (one hundredth of a right angle) in the French and Spanish languages. The risk of confusion between temperature and angular measurement was eliminated in 1948 when the 9th meeting of the General Conference on Weights and Measures and the Comité International des Poids et Mesures (CIPM) formally adopted "degree Celsius" for temperature.[16][a]
While "Celsius" is commonly used in scientific work, "centigrade" is still used in French and English-speaking countries, especially in informal contexts. The frequency of the usage of "centigrade" has declined over time.[17]
Due to metrication in Australia, after 1 September 1972 weather reports in the country were exclusively given in Celsius.[18] In the United Kingdom, it was not until February 1985 that forecasts by BBC Weather switched from "centigrade" to "Celsius".[19]
All phase transitions are at standard atmosphere. Figures are either by definition, or approximated from empirical measurements.
| | Kelvin | Celsius | Fahrenheit | Rankine |
|---|---|---|---|---|
| Absolute zero[A] | 0 K | −273.15 °C | −459.67 °F | 0 °R |
| Boiling point of liquid nitrogen | 77.4 K | −195.8 °C[20] | −320.4 °F | 139.3 °R |
| Sublimation point of dry ice | 195.1 K | −78 °C | −108.4 °F | 351.2 °R |
| Intersection of Celsius and Fahrenheit scales[A] | 233.15 K | −40 °C | −40 °F | 419.67 °R |
| Melting point of ice[21] | 273.1499 K | −0.0001 °C | 31.9998 °F | 491.6698 °R |
| Common room temperature[B][22] | 293 K | 20 °C | 68 °F | 528 °R |
| Average normal human body temperature[23] | 310.15 K | 37.0 °C | 98.6 °F | 558.27 °R |
| Boiling point of water[b] | 373.1339 K | 99.9839 °C | 211.971 °F | 671.6410 °R |
The "degree Celsius" has been the only SI unit whose full unit name contains an uppercase letter since 1967, when the SI base unit for temperature became the kelvin, replacing the capitalized term degrees Kelvin. The plural form is "degrees Celsius".[24]
The general rule of the International Bureau of Weights and Measures (BIPM) is that the numerical value always precedes the unit, and a space is always used to separate the unit from the number, e.g. "30.2 °C" (not "30.2°C" or "30.2° C").[25] The only exceptions to this rule are for the unit symbols for degree, minute, and second for plane angle (°, ′, and ″, respectively), for which no space is left between the numerical value and the unit symbol.[26] Other languages, and various publishing houses, may follow different typographical rules.
Unicode provides the Celsius symbol at code point U+2103 ℃ DEGREE CELSIUS. However, this is a compatibility character, provided for round-trip compatibility with legacy encodings; it also allows correct rendering in vertically written East Asian text, such as Chinese. The Unicode standard explicitly discourages the use of this character: "In normal use, it is better to represent degrees Celsius '°C' with a sequence of U+00B0 ° DEGREE SIGN + U+0043 C LATIN CAPITAL LETTER C, rather than U+2103 ℃ DEGREE CELSIUS. For searching, treat these two sequences as identical."[27]
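The "treat these two sequences as identical" advice is typically implemented via Unicode compatibility normalization, which maps U+2103 to the preferred two-character sequence. A minimal sketch using Python's standard `unicodedata` module:

```python
import unicodedata

legacy = "\u2103"             # '℃' DEGREE CELSIUS (compatibility character)
preferred = "\u00b0" + "C"    # '°' DEGREE SIGN + 'C' LATIN CAPITAL LETTER C

# The two representations are distinct code point sequences:
print(legacy == preferred)    # False

# NFKC normalization folds the compatibility character into the preferred form,
# so normalized comparison treats the two spellings as identical:
print(unicodedata.normalize("NFKC", legacy) == preferred)  # True
```

Search and matching code that normalizes both sides with NFKC (or NFKD) will therefore find "℃" and "°C" interchangeably.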
The degree Celsius is subject to the same rules as the kelvin with regard to the use of its unit name and symbol. Thus, besides expressing specific temperatures along its scale (e.g. "Gallium melts at 29.7646 °C" and "The temperature outside is 23 degrees Celsius"), the degree Celsius is also suitable for expressing temperature intervals: differences between temperatures or their uncertainties (e.g. "The output of the heat exchanger is hotter by 40 degrees Celsius", and "Our standard uncertainty is ±3 °C").[28] Because of this dual usage, one must not rely upon the unit name or its symbol to denote that a quantity is a temperature interval; it must be unambiguous through context or explicit statement that the quantity is an interval.[c] This is sometimes solved by using the symbol °C (pronounced "degrees Celsius") for a temperature, and C° (pronounced "Celsius degrees") for a temperature interval, although this usage is non-standard.[29] Another way to express the same interval is "40 °C ± 3 K", a form commonly found in the literature.
The Celsius scale is an interval scale, not a ratio scale, and it is a relative rather than an absolute scale. For example, an object at 20 °C does not have twice the energy of the same object at 10 °C, and 0 °C is not the lowest possible Celsius value. Degrees Celsius are therefore useful for interval measurement but lack the properties of ratio measures such as weight or distance.[30]
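The interval-versus-ratio distinction can be made concrete with a small arithmetic sketch (the figures below are our illustration, not from the source): the naive Celsius ratio 20/10 suggests "twice as hot", but on the absolute kelvin scale the difference is far smaller.

```python
t1_c, t2_c = 10.0, 20.0                      # two temperatures in °C

# Converting to the absolute kelvin scale (x °C corresponds to (x + 273.15) K):
t1_k, t2_k = t1_c + 273.15, t2_c + 273.15

# The Celsius ratio is misleading because the zero point is arbitrary:
print(t2_c / t1_c)                           # 2.0

# The thermodynamically meaningful ratio uses absolute temperatures:
print(round(t2_k / t1_k, 4))                 # 1.0353, i.e. only ~3.5% hotter
```

Differences, by contrast, are the same size on both scales: 20 °C − 10 °C is a 10 °C interval, which equals a 10 K interval.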
In science and in engineering, the Celsius and Kelvin scales are often used in combination in close contexts, e.g. "a measured value was 0.01023 °C with an uncertainty of 70 μK". This practice is permissible because the magnitude of the degree Celsius is equal to that of the kelvin. Notwithstanding the official endorsement provided by decision no. 3 of Resolution 3 of the 13th CGPM,[31] which stated "a temperature interval may also be expressed in degrees Celsius", the practice of simultaneously using both °C and K remains widespread throughout the scientific world as the use of SI-prefixed forms of the degree Celsius (such as "μ°C" or "microdegrees Celsius") to express a temperature interval has not been widely adopted.
Conversion between temperature scales:

| | from Celsius | to Celsius |
|---|---|---|
| Fahrenheit | x °C ≘ (x × 9/5 + 32) °F | x °F ≘ ((x − 32) × 5/9) °C |
| Kelvin | x °C ≘ (x + 273.15) K | x K ≘ (x − 273.15) °C |
| Rankine | x °C ≘ ((x + 273.15) × 9/5) °R | x °R ≘ ((x − 491.67) × 5/9) °C |

For temperature intervals rather than specific temperatures, 1 °C = 1 K = 9/5 °F = 9/5 °R.
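The "from Celsius" column above translates directly into code; a minimal sketch (the function names are ours, chosen for clarity):

```python
def celsius_to_fahrenheit(c: float) -> float:
    """x °C corresponds to (x × 9/5 + 32) °F."""
    return c * 9 / 5 + 32

def celsius_to_kelvin(c: float) -> float:
    """x °C corresponds to (x + 273.15) K."""
    return c + 273.15

def celsius_to_rankine(c: float) -> float:
    """x °C corresponds to ((x + 273.15) × 9/5) °R."""
    return (c + 273.15) * 9 / 5

# Absolute zero: −273.15 °C is 0 K, −459.67 °F, 0 °R.
print(celsius_to_kelvin(-273.15))                   # 0.0
print(round(celsius_to_fahrenheit(-273.15), 2))     # -459.67
print(round(celsius_to_rankine(-273.15), 2))        # 0.0

# The Celsius and Fahrenheit scales intersect at −40:
print(celsius_to_fahrenheit(-40))                   # -40.0
```

Note that these functions convert specific temperatures; a temperature *interval* of x °C converts as x K or 9/5 x °F, with no additive offset.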
The melting and boiling points of water are no longer part of the definition of the Celsius temperature scale. In 1948, the definition was changed to use the triple point of water.[32] In 2005, the definition was further refined to use water with precisely defined isotopic composition (VSMOW) for the triple point. In 2019, the definition was changed to use the Boltzmann constant, completely decoupling the definition of the kelvin from the properties of water. Each of these formal definitions left the numerical values of the Celsius temperature scale identical to the prior definition to within the limits of accuracy of the metrology of the time.
When the melting and boiling points of water ceased being part of the definition, they became measured quantities instead. This is also true of the triple point.
In 1948 when the 9th General Conference on Weights and Measures (CGPM) in Resolution 3 first considered using the triple point of water as a defining point, the triple point was so close to being 0.01 °C greater than water's known melting point, it was simply defined as precisely 0.01 °C. However, later measurements showed that the difference between the triple and melting points of VSMOW is actually very slightly (< 0.001 °C) greater than 0.01 °C. Thus, the actual melting point of ice is very slightly (less than a thousandth of a degree) below 0 °C. Also, defining water's triple point at 273.16 K precisely defined the magnitude of each 1 °C increment in terms of the absolute thermodynamic temperature scale (referencing absolute zero). Now decoupled from the actual boiling point of water, the value "100 °C" is hotter than 0 °C – in absolute terms – by a factor of exactly 373.15/273.15 (approximately 36.61% thermodynamically hotter). When adhering strictly to the two-point definition for calibration, the boiling point of VSMOW under one standard atmosphere of pressure was actually 373.1339 K (99.9839 °C). When calibrated to ITS-90 (a calibration standard comprising many definition points and commonly used for high-precision instrumentation), the boiling point of VSMOW was slightly less, about 99.974 °C.[33]
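The "approximately 36.61% thermodynamically hotter" figure is simply the ratio of the two absolute temperatures; a quick arithmetic check:

```python
# 100 °C and 0 °C expressed on the absolute thermodynamic (kelvin) scale:
ratio = 373.15 / 273.15

print(round(ratio, 4))               # 1.3661
print(f"{(ratio - 1) * 100:.2f}%")   # 36.61%  -> "thermodynamically hotter"
```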
This boiling-point difference of 16.1 millikelvins between the Celsius temperature scale's original definition and the previous one (based on absolute zero and the triple point) has little practical meaning in common daily applications because water's boiling point is very sensitive to variations in barometric pressure. For example, an altitude change of only 28 cm (11 in) causes the boiling point to change by one millikelvin.[citation needed]