Fahrenheit is yet another of those nonsensical, antiquated, arbitrary measurements that has, fortunately, been superseded by the metric/SI system in most of the world (i.e. everywhere bar America). There's no problem between Kelvin and Celsius because there's an exact conversion standard, which @VBS mentioned above, making them compatible for side-by-side use. I feel this is related to the topic:
Scientists: "Kelvin is Absolute, so Celsius is now defined by Kelvin." Historians: "Celsius came first. Kelvin stole its degree increment, so Kelvin is based on Celsius." Mathematicians: "The equation balances either way, so who cares!?"
Most fields of science use Kelvin, not Celsius/Fahrenheit (though Kelvin is based upon the Celsius scale; it just changes the temperature at which the scale begins). They do this so they don't have to deal with those pesky negative numbers. And it's totally wrong to say that one scale is easy to use and another is not. It's all about what you are used to. Here, I'll make up a scale: 0 K = -1°, 273 K = 164°, 546 K = 329°. Teach your children to use this scale and they will find other scales annoying and useless, like almost everyone in the world does with Fahrenheit.
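Any made-up scale like that is just a linear rescaling of Kelvin. A minimal Python sketch (the name kelvin_to_weird is mine, and the slope 165/273 and offset -1 are simply chosen to hit the three example points above):

```python
def kelvin_to_weird(k: float) -> float:
    # Hypothetical scale from the post above: a plain linear map,
    # slope 165/273 per kelvin, shifted so 0 K reads -1 degrees.
    return k * 165 / 273 - 1

for k in (0, 273, 546):
    print(f"{k} K = {kelvin_to_weird(k):g}°")
# 0 K = -1°, 273 K = 164°, 546 K = 329°
```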
As a science student I prefer Kelvin, as it starts at 0 and there's no need for negatives, but Celsius is better in daily life, so that's what I normally use. I don't get Fahrenheit, and the conversion from Celsius to it is a long and weird calculation.
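For the record, the "long and weird calculation" is just one multiply and one add each way (quick Python):

```python
def celsius_to_fahrenheit(c: float) -> float:
    # A degree Fahrenheit is 5/9 the size of a degree Celsius,
    # and water freezes at 32 °F instead of 0 °C.
    return c * 9 / 5 + 32

def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

print(celsius_to_fahrenheit(100))  # 212.0 (water boils)
print(fahrenheit_to_celsius(32))   # 0.0  (water freezes)
```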
When I looked into the three temperature scales, I actually found that Celsius is an "approved" unit that can be used alongside Kelvin within the SI, for expressing temperatures that aren't conveniently expressed in kelvin. (I probably expressed that a bit wrong, but close enough.)
Why does anyone even use Fahrenheit? 99% of countries use Celsius. It's like there's always one as*hole to mess things up for everyone.
By the way, I like to flip the thermostats in hotel rooms I visit between °F and °C. If they're not the type with a dedicated changeover button/switch, I bet they leave the next guest confused or frustrated.
tbh I'm really bad with temperatures. I know that 0 °C and 100 °C are the freezing and boiling points of water, that ~180–220 °C is what I set my oven to, and that the thermostat is about 20–30 °C. But I have no idea what a cold or warm outside temperature would be, or the temperature of anything I eat or drink, or anything like that at all. Kelvin is the way to go, because it is good for science.
I ended up reading a little bit more about this, and it seems Fahrenheit was designed in the early thermometer days so that the whole scale divided into whole numbers (avoiding fractions) across ambient temperatures. 0 was defined as the coldest solution that could be easily made at the time (an ice/water/salt brine). 32 was the freezing point of water. 96 was human body temperature. That puts 64 = 2^6 degrees between freezing and body temperature, so the interval can be marked on a glass tube just by halving it six times, avoiding fractions, and ambient temperatures generally never go negative in most places where humans live.
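Here's a little sketch of that halving trick (my own illustration, not from any historical source): start from the two calibration marks at 32 and 96 and repeatedly bisect.

```python
def bisect_marks(lo: int, hi: int) -> list[int]:
    # Repeatedly halve [lo, hi]; every mark stays a whole number
    # because hi - lo is a power of two (96 - 32 = 64 = 2**6).
    marks = {lo, hi}
    step = hi - lo
    while step > 1:
        step //= 2
        marks.update(range(lo, hi + 1, step))
    return sorted(marks)

print(bisect_marks(32, 96))  # 32, 33, ..., 96 -- all whole degrees
```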
As a layman I disagree. Fahrenheit is freaky. There's nothing wrong with decimals. Besides, every measuring system in the US is just as freaky...
I use Celsius because it's relatively normal. 0 is when water freezes, 100 is when water boils. It's just easier to use than the other systems.
Well, to me it kind of relates to a base-10 system versus some other base. Like, why do we divide time into 24 hours per day or 60 seconds per minute? The ancient Babylonians used a base-60 system, which from a mathematical standpoint makes very convenient divisions because 60 has so many factors (twelve divisors, versus four for 10). If we used something other than a base-10 number system, you could argue that other bases make more "sense": hexadecimal, for instance.
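A quick way to see why 60 is so convenient is to count divisors (a throwaway snippet of my own):

```python
def divisors(n: int) -> list[int]:
    # Every whole number that divides n evenly.
    return [d for d in range(1, n + 1) if n % d == 0]

for base in (10, 12, 16, 60):
    print(base, divisors(base))
# 10 [1, 2, 5, 10]
# 12 [1, 2, 3, 4, 6, 12]
# 16 [1, 2, 4, 8, 16]
# 60 [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
```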