Newton scale

The Newton scale is a temperature scale devised by Isaac Newton in 1701.[1][2] He called his device a "thermometer", but he did not use the term "temperature", speaking of "degrees of heat" (gradus caloris) instead.

Newton's publication represents the first attempt to introduce an objective way of measuring (what would come to be called) temperature (alongside the Rømer scale published at nearly the same time).

He set as 0 on his scale "the heat of air in winter at which water begins to freeze" (Calor aeris hyberni ubi aqua incipit gelu rigescere), reminiscent of the standard of the modern Celsius scale (i.e. 0 °N = 0 °C), but he did not define a single second reference point. He did give the "heat at which water begins to boil" as 33, but this is not a defining reference. The values for body temperature and for the freezing and boiling points of water suggest a conversion factor between the Newton and the Celsius scale of between about 3.08 (12 °N = 37 °C) and 3.03 (33 °N = 100 °C), but since the objectively verifiable reference points given result in irreconcilable data (especially for high temperatures), no unambiguous conversion between the scales is possible.
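As a rough illustration of this ambiguity, the following sketch (a hypothetical calculation, not part of Newton's text) applies the two candidate conversion factors implied by the reference points above, assuming 0 °N = 0 °C and a purely proportional relation °C = k · °N; the function and constant names are illustrative.

```python
def newton_to_celsius(deg_n: float, factor: float) -> float:
    """Convert degrees Newton to degrees Celsius with a given linear factor."""
    return deg_n * factor

FACTOR_BODY = 37 / 12    # ~3.08, from body temperature (12 °N = 37 °C)
FACTOR_BOIL = 100 / 33   # ~3.03, from boiling water (33 °N = 100 °C)

for deg_n in (0, 12, 33):
    print(f"{deg_n:>3} °N -> "
          f"{newton_to_celsius(deg_n, FACTOR_BODY):6.1f} °C (body-based) | "
          f"{newton_to_celsius(deg_n, FACTOR_BOIL):6.1f} °C (boiling-based)")
```

At 33 °N the two factors give about 101.8 °C and 100.0 °C respectively, which is the kind of irreconcilable result the text describes.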

For higher temperatures, beyond the reach of his linseed-oil thermometer, Newton used a "sufficiently thick piece of iron" that was heated until red-hot and then exposed to the wind; samples of metals and alloys placed on it melted and then solidified again as it cooled.

Newton then determined the "degrees of heat" of these samples from their solidification times, and tied this high-temperature scale to the linseed-oil one by measuring the melting point of tin in both systems.
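A hedged sketch of how such an extrapolation could work: if the iron is assumed to cool exponentially (as in Newton's law of cooling), the time at which a given sample solidifies can be inverted to give the iron's "heat" at that moment. All constants below are illustrative placeholders, not Newton's values.

```python
import math

T_AIR = 0.0      # ambient "heat of air in winter", 0 °N by definition
TAU = 300.0      # assumed cooling time constant of the iron, in seconds
T_START = 200.0  # assumed initial "degrees of heat" of the red-hot iron

def heat_at(t_seconds: float) -> float:
    """Iron's temperature (in °N) after t seconds of exponential cooling:
    T(t) = T_AIR + (T_START - T_AIR) * exp(-t / TAU)."""
    return T_AIR + (T_START - T_AIR) * math.exp(-t_seconds / TAU)

# If a sample is observed to solidify 400 s after cooling began,
# its solidification point on the scale is the iron's heat at that time.
print(f"sample solidification point: {heat_at(400.0):.1f} °N")
```

In this reading, a known fixed point measured both ways (such as the melting point of tin) is what pins the extrapolated high-temperature values to the ordinary thermometer scale.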