
Well yes, but also the faster it dries your clothes and shuts off.



That's not true though. The additional energy needed to create that heat isn't necessarily offset by the time saved.

You'll notice this in heat pump dryers. They cannot generate the same amount of heat. They take way longer to dry the clothes. But they're way more energy efficient than other forms of dryers.

Edit: I thought of another example: heating your home with hot water running through radiators. It's significantly more energy efficient to reduce the temperature of the water, and this outweighs the additional time it takes to heat up your home. There are various drawbacks and considerations though, e.g. if the house has terrible insulation (a noticeable draft) it won't be beneficial. There are various other things that'll significantly reduce energy usage, even though you'd assume that generating heat is already very efficient.


To some extent drying clothes is all about generating heat (the heat of evaporation). If you're clever about it you might be able to avoid heating the (wet) clothes, the rest of the contents of the dryer, and the outside too much. However, evaporating water requires an incredible amount of energy: even if you just boil water away, most of the energy is spent evaporating the water rather than heating it. So it's not really clear-cut that running a dryer hot is massively inefficient.
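A rough back-of-the-envelope check in Python, using the standard textbook values of ~4.19 kJ/(kg·K) for the specific heat of water and ~2257 kJ/kg for its latent heat of vaporization (the 20℃ starting temperature is my own assumption):

    # Energy to warm water to a boil vs. energy to actually evaporate it.
    c_water = 4.19       # specific heat of water, kJ/(kg*K)
    latent_heat = 2257   # latent heat of vaporization at 100 C, kJ/kg

    heat_to_boil = c_water * (100 - 20)   # ~335 kJ per kg of water
    print(heat_to_boil)                   # 335.2
    print(latent_heat / heat_to_boil)     # ~6.7x more energy to evaporate it

So for every kilogram of water in a load of laundry, the evaporation term dominates regardless of how hot you run the dryer.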

Edit: Also, it's not that using lower-temperature water to convey heat is somehow more efficient; the thing with heat pumps is that they are more efficient at heating things to a lower temperature. If you're burning gas it doesn't really matter either way: you get the energy out that you put in.


> so it's not really too clear-cut that running a dryer hot is massively inefficient.

My heat pump dryer came with energy estimates for various functions and loads. The functions which shorten the time or increase the heat (often related) are specified to use way more energy. To me it's pretty clear, and the manufacturer states it outright.

> Also it's not that using lower-temperature water to convey heat is somehow more efficient. [..] If you're burning gas it doesn't really matter either way, you just get the energy out you put in.

That's what I used to assume as well. It isn't accurate though. If the water that comes back to the heating element is too hot, the boiler won't be as efficient as when the return temperature is lower. Similarly, the additional energy needed to heat the water to e.g. 75+ degrees Celsius is wasteful. You can save around 30% of the energy by reducing the temperature of the water that's used to heat your home (though it might not work for every house due to various considerations). There are loads of other possible measures which also significantly reduce energy usage.

Regarding how to save energy when using a boiler, there's a huge Dutch forum topic with loads of tips: https://gathering.tweakers.net/forum/list_messages/2027810. I assume similar information can be found in other languages, though heating with gas-fired boilers and radiators is really popular in NL (more so than in any other country, I assume).


> energy needed to heat water to 75+℃ is wasteful. You can save around 30% by reducing water temperature to heat your home

That depends. For resistive electric heating it shouldn't make a difference: pretty much all the heat is transported either way.

For non-condensing gas (or wood, etc.): if your heater is going full blast, a lot of the heat goes up your chimney. Lowering the temperature makes a smaller, slower flame whose heat gets absorbed better, so I think you could see a 10-30% difference. Heating the water itself to 20℃, 75℃ or 110℃ shouldn't make much of a difference, as you're not supposed to cool the flue gases too much anyway, or you get condensation, acids, rust... which will likely kill your equipment.

Condensing gas boilers are cool, extracting so much heat that water condenses out of the flue gas, though the gas must be clean enough and the condenser resistant to corrosion. Here, lowering the water temperature can safely lower the flue gas temperature for more heat extraction (even in the optimal power range), and condensing the water vapor produced by burning the gas recovers about 10% extra energy that would otherwise go up the chimney. I'd expect about 15-30% more heat than non-condensing, especially when run at lower temperatures.
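That ~10% figure can be sanity-checked from the standard heating values of methane (assuming natural gas is mostly methane; these are textbook numbers, not boiler specs):

    # CH4 + 2 O2 -> CO2 + 2 H2O
    # The higher heating value (HHV) includes the latent heat recovered by
    # condensing that H2O out of the flue gas; the lower heating value doesn't.
    hhv = 55.5  # MJ/kg, higher heating value of methane
    lhv = 50.0  # MJ/kg, lower heating value of methane

    print((hhv - lhv) / hhv)  # ~0.10: roughly 10% of the energy leaves as vapor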

Heat pumps are quite efficient at moving heat: 1 W of electricity can move 3 W of heat, for a 4 W heating yield. A steeper gradient means more work, so when pumping heat from 20℃ up to 75℃, 1 W may only move 0.5 W of heat for a 1.5 W yield (numbers not accurate). Lowering the water temperature can make a 2x difference, or even more in extreme cases.
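The temperature dependence follows from the Carnot limit on a heat pump's coefficient of performance. A minimal sketch (the example temperatures are assumptions for illustration):

    # Ideal (Carnot) COP for heating: T_hot / (T_hot - T_cold), in kelvin.
    def carnot_cop(t_cold_c, t_hot_c):
        t_cold, t_hot = t_cold_c + 273.15, t_hot_c + 273.15
        return t_hot / (t_hot - t_cold)

    # Pumping from 5 C outdoor air into 35 C vs. 75 C radiator water:
    print(carnot_cop(5, 35))  # ~10.3 ideal
    print(carnot_cop(5, 75))  # ~5.0 ideal, i.e. roughly 2x worse at 75 C

Real hardware lands well below these ideal numbers, but the roughly 2x ratio between the two water temperatures matches the point above.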


I wouldn't be too trusting of the claims of a manufacturer whose main selling point is energy savings...

They might still be true, but keep in mind that it takes about 5 times more energy to evaporate water than to heat it from freezing to 100℃, and that water takes more energy to heat than most other substances. So it's really not clear why using more heat would be (far) less efficient: sure, it would consume heat at a higher rate, but also for less time.


So would microwaves be the way to go?

Edit: too much metal on clothing...



I'm not sure if heating with cooler water is more efficient.

In some places you pay for the joules delivered into your home. You have a flow meter and temperature meters on the input and the output of the radiators, and the price per joule is constant regardless of input and output temperatures.

What saves you money is keeping your interior cooler, because heat loss is proportional to the temperature difference.


> I'm not sure if heating with cooler water is more efficient.

Unfortunately I only have a Dutch link which goes into way more detail: https://gathering.tweakers.net/forum/list_messages/2027810.

Dutch energy companies are required by law to advise their customers on how to save money. The app I use gives exactly this advice (lower the water temperature), plus various other tips.

> Some places you pay for the joules delivered into your home

That's something different from what I said, no? I'm talking about generating the heat in your own home. I'm aware of that solution (district heating) as well; it's efficient because of scale, plus part of the heat is waste heat from some industry.

There are still various ways to save energy even when the joules in and out are measured exactly, e.g. radiator fans.

I know this all seems entirely illogical; the energy required should stay the same. Practically though, there are energy losses that would otherwise occur and can be avoided.

E.g. for the radiator fans, people measured whether they save energy. They do, though the cost of buying them might outweigh the savings. DIY versions are cheap though.


If that is how you are billed, it doesn't matter to you.

Someone is paying to heat that water, and that someone would get a bit more efficiency out of the system if the water temperatures were lower.


Yes, but I think it's only because losses during transfer would be lower.

If you generate the heat inside your insulated house and transfer it to radiators that are also inside your house, the water temperature shouldn't matter.


> The additional energy to create that heat doesn't have to equal the time saved.

Right, it doesn't have to, but it's also possible that more heat makes it take proportionally less time (or close enough, with negligible decrease in efficiency).

Obviously, yes, using a heat pump will use less energy than a resistive heating element. But the question is more about how much and how quickly heat is input (regardless of how it was generated) and how that affects drying times.


Yes, but it's not a simple linear use of energy. For example, it might use 10x the energy to dry twice as fast. That's a gain if you're in a hurry, but not so much if you're relaxing at home, on a tight budget, and/or paying an unusually high price for electricity.


It might, but does it actually? Or does it use 2.05x energy to dry twice as fast, making the energy use difference negligible?

Edit: Consider also that the shorter you run the dryer for, the shorter you are running the (substantial) motor and fan, as well as less time spent heating the shell of the dryer and the air surrounding it.


Depends. For most of the cycle the temperature is limited because water evaporation is absorbing all the input energy. The motor uses the same energy per unit time, so finishing twice as fast actually uses less energy. However, there is a limit to this: eventually (at the end of the cycle) you reach the point where water isn't evaporating fast enough to absorb all the input energy, temperatures go up, and you're heating clothing fibers to no useful purpose, which is wasteful.


Also, is the dryer located in a climate-controlled part of your home? If so, the air that it exhausts will be made up in equal volume by outdoor air pulled into your living space. How much extra energy does that make your heater or AC use?
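A rough sketch of that makeup-air penalty, assuming a vented dryer exhausting about 150 CFM (~0.07 m³/s) and a 20℃ indoor/outdoor difference (both figures are assumptions for illustration):

    # Heat carried out by dryer exhaust, replaced by unconditioned outdoor air.
    flow = 0.070     # m^3/s, assumed ~150 CFM vented-dryer exhaust
    rho_air = 1.2    # kg/m^3, density of air
    cp_air = 1.005   # kJ/(kg*K), specific heat of air
    delta_t = 20     # K, assumed indoor-minus-outdoor temperature

    extra_kw = flow * rho_air * cp_air * delta_t
    print(extra_kw)  # ~1.7 kW of extra load while the dryer runs

That's not huge next to a 5 kW heating element, but it's not negligible either.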


Idk about the rest but the ratio of energy used by the heater compared to the motor is greater than 10:1.

Edit: 20:1 mentioned here https://qr.ae/pGNTie


If we assume for just a moment that the time to dry clothes scales inversely with the heat applied, we can try to run some numbers:

    Total energy usage at full heat: (5000 W + 250 W) * 0.5 h = 2625 Wh
    Total energy usage at half heat: (2500 W + 250 W) * 1 h = 2750 Wh
So full heat comes out using about 4.5% less energy than half heat.
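The same toy model in a few lines of Python, so the assumptions (motor power, heater power, and the inverse drying-time scaling) are easy to tweak; the quarter-heat case is my own extrapolation:

    # Toy dryer model: drying time assumed inversely proportional to heater power.
    # 250 W motor/fan and 5000 W full heater are the figures from above.
    def energy_wh(heater_w, motor_w=250, full_heat_w=5000, full_heat_hours=0.5):
        hours = full_heat_hours * full_heat_w / heater_w
        return (heater_w + motor_w) * hours

    print(energy_wh(5000))  # 2625 Wh at full heat
    print(energy_wh(2500))  # 2750 Wh at half heat
    print(energy_wh(1250))  # 3000 Wh at quarter heat

Under this (strong) linearity assumption, lower heat always loses, but only by the motor's share of the extra runtime.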

So the big question remains: does lower heat dry clothes more efficiently, and if so, by how much?

The Quora answer linked above raises some interesting points, but doesn't seem rigorous.


From an industry PDF I stumbled across, it looked like improvements to moisture sensing were the best bet for saving the most energy. Though I didn't see anything comparing heat settings in that doc, which may be telling.



