Maybe because I was brought up with centigrade it makes more sense to me. The centigrade number is how far you are from water freezing. If it goes up 100% then you are twice as far away. I'm not aware that doubling the Fahrenheit number has a similar easy-to-understand meaning?
> The centigrade number is how far you are from water freezing
The Fahrenheit scale is how far you are from your own body temperature. It was designed so that 100 is the temperature of the human body. (Later adjusted to 98.6 due to inaccuracies.)
0 was designed to be as cold as you can get with ice and salt (also ended up being slightly inaccurate).
> Maybe because I was brought up with centigrade it makes more sense to me.
Yup. People brought up on Fahrenheit think it is superior. For temperature neither argument is objectively better. (In contrast to imperial distance measurement, with its non-powers of 10 and fractions, where there are good arguments against it; with temperature, both scales are ultimately arbitrary.)
> For temperature neither argument is objectively better.
I think Celsius is objectively better in that:
(1) its endpoints (the freezing and boiling points of water) are more natural / less arbitrary / more fundamental than Fahrenheit's (the coldest temperature you can reach with salt and ice, up to average human body temperature). Water is a fundamental substance to all known life; the freezing point of pure water is much more fundamental than the freezing point of a water + NaCl mixture (actually, apparently Fahrenheit used ammonium chloride, not sodium chloride, which is arguably even more arbitrary than sodium chloride would be). If you imagine some extraterrestrial civilisation independently inventing a temperature scale, they'd be more likely to come up with something close to Celsius than something close to Fahrenheit
(2) while both scales contain some error in that the nominal value of their endpoints differs from the real value, the error is greater for Fahrenheit
(3) According to Wikipedia, Fahrenheit didn't have 100 degrees between his two endpoints; he originally had 90, then increased it to 96. Given that base 10 is the standard number base used by humans, 100 divisions is less arbitrary than 90 or 96
(4) nowadays, all other temperature scales are officially defined in terms of Kelvin, and Celsius has a simpler relationship to Kelvin than Fahrenheit does (for Celsius it is purely an additive offset; for Fahrenheit it involves both addition and multiplication; see the conversion sketch after this list)
(5) conforming to the global standard is objectively better than sticking with an alternative which lacks clear-cut advantages
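To make point (4) concrete, here is a minimal sketch of both conversions as defined from Kelvin (the function names are just illustrative; the constants 273.15, 9/5, and 459.67 are the standard definitions):

```python
# Celsius is a pure offset from Kelvin; Fahrenheit needs a scale factor too.

def kelvin_to_celsius(k: float) -> float:
    # Additive offset only: 0 degrees C is defined as exactly 273.15 K.
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    # Multiplication and addition: a kelvin is 9/5 of a Fahrenheit degree,
    # and the zero points differ as well.
    return k * 9 / 5 - 459.67

print(kelvin_to_celsius(273.15))     # 0.0 (freezing point of water)
print(kelvin_to_fahrenheit(273.15))  # ~32.0, up to floating-point rounding
```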
They're natural, sure, but not really natural in a way that directly affects the most common everyday use case for temperature: the weather.
Fahrenheit's 0-100 is for now approximately the range of "normal" weather for the continental US, which is a useful if accidental property for Fahrenheit as a weather temperature system. Celsius's use of water's boiling point as a value for 100 is a nice property from an aesthetic perspective but means that it doesn't use the full 100 degrees when representing any earth-based climate.
> (2) ... (3)
Neither of these is really an advantage if you treat the scales as arbitrary, which is what you're doing for most real-world uses of either scale anyway. Water's boiling point at sea level is totally irrelevant when you're measuring the temperature of the outdoor air, a human body, or a steak.
> (4)
Only matters if you're actually doing work that needs to translate to and from Kelvin, which most people will never do in their entire lives.
> (5)
Standardizing would be helpful for sure, but it's not obvious that it would be worth the headache of making the switch given that the only real advantage to Celsius for most use cases is the mere fact that it's a standard.
> Fahrenheit's 0-100 is for now approximately the range of "normal" weather for the continental US, which is a useful if accidental property for Fahrenheit as a weather temperature system.
Only if you live in the continental US, and even the continental US has parts where that generalisation doesn’t hold true. From a global perspective, there’s nothing special weather-wise about 0 Fahrenheit or 100 Fahrenheit. Where I live, it has literally never been that cold (our record daily minimum is positive even in Celsius), while days of 38 Celsius or above are infrequent but far from exceptional. As a global standard, Fahrenheit has no meteorological advantage over Celsius, and indeed Celsius is the global standard for meteorology, used by >95% of the planet.
> Standardizing would be helpful for sure, but it's not obvious that it would be worth the headache of making the switch given that the only real advantage to Celsius for most use cases is the mere fact that it's a standard.
You are ignoring all the costs incurred by immigrants and international travellers having to learn to juggle two different systems in their heads, the obstacle it poses to international communication (e.g. “40 degree day” means radically different things in American vs Australian English), products having to support both units and then having configuration settings to change them, people being inconvenienced when they can’t work out how to change the units setting, publications forced to include both units to ensure all readers understand, etc. Those costs are ongoing and cumulative over time, whereas the cost of switching is a once-off which most of the world has already paid.
Fahrenheit degrees are about half the size of Celsius degrees, so you get finer increments without decimals or fractions, which, having grown up with it, seems useful. Yeah, I kinda want to know about 32 degrees, but that doesn't seem a huge cognitive load. Knowing sub-zero is fricking cold is a decent benchmark, as is knowing that >100 degrees is fricking hot. Yeah, 212 degrees for boiling is a bit weird, but you don't really need that much, and that's only at standard pressure anyway.
The slope of the scales has no bearing on whether percentages are meaningful here. The problem with both systems when it comes to percentages is that neither has 0 set to a natural zero. This leads to an entirely arbitrary point on the scale where percentage differences for a fixed change in the unit approach 100%, and then, past zero, suddenly start decreasing again.
If anything Fahrenheit should be less insane because at least the artificial 0 is likely to stay much further away in the data they're quantifying so the percentages stay reasonable.
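To make the arbitrary-zero problem concrete, here's a quick sketch (the specific temperatures are just illustrative):

```python
# Percentage change blows up near an arbitrary zero, even though the
# underlying physical change is tiny in absolute terms.

def pct_change(old: float, new: float) -> float:
    return (new - old) / old * 100

# A warm day getting 1 degree warmer: a sensible-looking percentage.
print(pct_change(20.0, 21.0))      # 5.0   (% change in Celsius)

# The same 1-degree warming near Celsius's arbitrary zero: nonsense.
print(pct_change(1.0, 2.0))        # 100.0 ("twice as warm"?)

# Against an absolute zero (Kelvin), both changes look comparable.
print(pct_change(293.15, 294.15))  # ~0.34
print(pct_change(274.15, 275.15))  # ~0.36
```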
The slope of the Fahrenheit scale matches that of the Rankine scale.
I would still say that in the Rankine scale percentage increases make sense, and Fahrenheit changes do not.
The thing that matters isn't the slope, but the zero point; "X% farther from absolute zero" is a useful measurement, "X% farther from an arbitrary zero point" is not. Especially when negative or zero temperatures are involved.
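To illustrate that last point (the numbers are just examples): Rankine has the same degree size as Fahrenheit but puts 0 at absolute zero, so a percentage increase in Rankine reflects a real increase in absolute temperature, while the same-looking percentage computed on the Fahrenheit number doesn't:

```python
# Rankine shares Fahrenheit's slope (degree size) but starts at absolute zero.

def fahrenheit_to_rankine(f: float) -> float:
    return f + 459.67

def rankine_to_fahrenheit(r: float) -> float:
    return r - 459.67

t_f = 40.0                              # a cool day in Fahrenheit
t_r = fahrenheit_to_rankine(t_f)        # 499.67 R

# "10% hotter" on the absolute scale is physically meaningful:
hotter_r = t_r * 1.10                   # 549.637 R
print(rankine_to_fahrenheit(hotter_r))  # ~89.97 F

# "10% hotter" computed directly on the Fahrenheit number is not:
print(t_f * 1.10)                       # 44.0 F, a far smaller real change
```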