So I have this silly idea/long-term project of wanting to run a server on renewables on my farm. And I would like to reuse the heat generated by the server, for example to heat a grow room, or simply my house. How much heat does a server produce, and where would you consider it best applied? Has anyone built such a thing?
A server produces an amount of heat equivalent to its wattage.
A 500W server rack will produce one-third the heat of a 1500W space heater. If your rack draws 100W at idle, then that's how much heat it produces. So if it's cold outside you could spin up Folding@home or some other thing to burn excess CPU cycles.
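To put rough numbers on this, here's a back-of-the-envelope sketch. The room size (3 m x 3 m x 2.5 m) and the 5 °C target rise are assumptions for illustration, and it only accounts for warming the air; walls and furniture soak up far more heat, which is why real rooms take much longer.

```python
# Electrical draw == heat output, so a 500 W rack is one third
# of a 1500 W space heater.
rack_w = 500
heater_w = 1500
fraction = rack_w / heater_w  # ~0.33

# Rough warm-up estimate for the AIR alone in a small room
# (assumed 3 m x 3 m x 2.5 m).
volume_m3 = 3 * 3 * 2.5      # 22.5 m^3
air_density = 1.2            # kg/m^3
cp_air = 1005                # J/(kg*K), specific heat of air
delta_t = 5                  # desired temperature rise in K
energy_j = volume_m3 * air_density * cp_air * delta_t
minutes = energy_j / rack_w / 60
print(f"{fraction:.2f} of a space heater; ~{minutes:.1f} min to warm the air alone")
```

The air alone warms in minutes, but the thermal mass of the room stretches that to an hour or more in practice.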
As long as your server is inside your house, it is offsetting the amount of heat your HVAC system needs to produce - granted, it is also greatly increasing the amount of work your AC needs to do in the summer.
There is a cricket farm in Quebec that heats its enclosures with Bitcoin mining rigs.
Servers are 100% efficient at heating, but heat pumps are around 300% efficient (a COP of about 3, since they move heat rather than generate it). Get the most energy-efficient devices you can, and heat your house with a proper heat pump.
You can even dump the cold air from the heat pump into the server room and pull the hot air back into the heat pump again to gain even better efficiency.
Well, that was my plan when I set things up with an air-to-hot-water heat pump in the same room as my homelab. But the reality is that when it is hot outside, I don't need to run the heat pump (mainly because the solar-thermal water heater is so much more effective). And otherwise there is no need for cooling in that room.
You can always dump the hot water in an outside radiator if you HAVE to cool the server room to keep the temps down. A simple fan duct would probably do fine too. However, with the heat pump you can also heat a little hot tub to use outside in summer.
Sure, but if you’re running the server anyway, it’s basically “free” heat.
I think OP’s point is he’s going to be running the server regardless, so why not recoup the heat.
I’m sure the neighbors just love living next to that farm.
I do this. If you actually want to use or donate the processing power, this is kind of a good thing. However, there are a lot of downsides:
- Computers are generally much lower power than a heater. This makes them very slow to “react” to heating needs. Heating a small room, even with a 500W PC, could take an hour or maybe more.
- Heaters have a thermostat, which computers don't. So on top of being very laggy, they also don't stop heating when the temperature is right. This means they can overshoot and make the room uncomfortably hot.
- You could set up an external thermostat but then you need a load which can be switched on and off.
- I was using folding@home, but the work items take a long time, and switching them on and off will increase the time taken to resolve the work item, which in turn means the system could get annoyed and use someone else’s computer to resolve the work item faster, or worse, blacklist your computer.
- Using your PC to generate heat eats into its lifetime. The fans aren't built to run at max speed all the time, the CPU & GPU can wear out, and the power delivery components will also degrade as time goes on. You have to weigh that lifetime against the usage. This is likely fine if you see the computation as a donation or if you have important stuff to compute, but it's probably not worth it just to waste cycles.
I would love to hear more about your setup and build. How are you using the heat, and what did you have to build to make use of it?
I’m just using it as a space heater for my study, which is also where I work from. While using the computer in Winter I just switch on f@h for both CPU and GPU (AMD 5700x and 6700xt), and this heats up the room. It’s a good 300-400W. I have home assistant telling me the temperature in the room and it bugs me to turn it off if it’s too hot. That’s my “temperature control”. I didn’t build anything, the computer is just under my desk and it heats up my room.
Originally my plan was to have F@H automatically turn on and off based on temperature, but it turns out the power is low enough and the lag is high enough that you switch it on in the morning, and then once the room is up to temperature you can just switch it off and the room will stay warm the rest of the day.
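That automatic on/off plan amounts to a hysteresis thermostat. A minimal sketch, with assumptions flagged: the `desired_state` function and its thresholds are made up for illustration, and the actual pause/unpause mechanism depends on your setup (e.g. `FAHClient --send-pause` / `--send-unpause`, or a Home Assistant switch - check your client's CLI).

```python
# Hysteresis thermostat logic for toggling a compute load such as F@H.
def desired_state(temp_c, currently_folding, low=19.0, high=23.0):
    """Return True if the load should run. The dead band between `low`
    and `high` avoids rapid toggling, which F@H work units dislike."""
    if temp_c < low:
        return True           # too cold: start heating
    if temp_c > high:
        return False          # warm enough: stop
    return currently_folding  # inside the dead band: keep current state

# Simulated morning: the room warms from 17 C to 24 C.
state = False
for temp in [17.0, 18.5, 20.0, 22.0, 23.5, 24.0]:
    state = desired_state(temp, state)
    print(f"{temp:4.1f} C -> {'fold' if state else 'pause'}")
```

The dead band is the important bit: it keeps the load from flapping on and off, which is exactly what long-running work units can't tolerate.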
I can't remember which YouTuber did this, but some guy tried to heat a pool this way with their server rack.
That was LTT: https://youtu.be/wjO6OLmZB9A He admits that it's a dumb idea on the WAN Show later, so take that as you will. While I haven't read the other comments yet, I bet at least a couple have recommended heat pumps. They are the best solution here.
But if you want to have some fun, grab a large pump and water-cool the server, make sure all your water-cooling connections can withstand the pressure you're giving them, and then run the cooling tubes out to your greenhouse.
I run a quite powerful server rack for my business (two servers with 64-core Threadrippers, redundant power supplies, nice SSDs, etc.) and it puts out some heat, but not even as much as my gaming PC. The CPUs are usually sitting near 2% utilization, so it's barely drawing any wattage, and thus barely putting out heat. So the heat output really depends on how much work the server is doing. You can build a 1000-watt server, but if it's only drawing 50 watts, it won't be generating much heat.
That being said, if the alternative is an electric heater, then using a server for something productive like BOINC or Folding at Home is a better use of your electricity, and may produce the heat you’re looking for. And for your use case, an older (and thus cheaper) CPU would probably suit you, so win-win.
As far as heating efficiency goes, servers aren't really that good. They're 100% efficient, since all the electricity they use ends up as heat, but heat pumps can reach 300+% efficiency.
What I do is have my server in the same "room" as my water heater (shared airspace), and use a heat-pump electric water heater.
At least then all the heat my server makes is used for something good, instead of extra AC usage in summer (winter is a wash, as I need the heat in the house anyways)
I wonder how to approach the efficiency question when considering waste heat. Would an older model generating more heat be the better choice? Has anybody started to dig into the complexities of calculating efficiency for circular systems?
There's no real complexity. Computers are first and foremost electric space heaters; essentially all the energy they draw ends up as heat, with only a negligible amount going into the computation itself.
If you would be heating a room with resistive electric heating, a computer drawing the same wattage can do the same job while also, in theory, doing some useful work.
If you are just evaluating heating options, heat pumps use less energy to output the same heat.
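To put numbers on that comparison - the 10 kWh of heat and the COP of 3 are assumed figures for illustration:

```python
# Electricity needed to deliver the same heat with a resistive load
# (a server, COP ~= 1) versus a heat pump (assumed COP ~= 3).
heat_needed_kwh = 10.0
cop_server = 1.0
cop_heat_pump = 3.0

elec_server = heat_needed_kwh / cop_server        # 10.0 kWh
elec_heat_pump = heat_needed_kwh / cop_heat_pump  # ~3.3 kWh
print(f"server: {elec_server:.1f} kWh, heat pump: {elec_heat_pump:.1f} kWh")
```

The server's extra ~6.7 kWh isn't wasted if the computation itself has value; the trade-off is whether the work is worth that premium over a heat pump.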
Unless you can physically separate the heat from the server, this is a bad idea. There's a reason servers don't sit in hot, humid areas.
You could probably mod a water cooler system to connect to a water to water heat exchanger like this: https://www.heata.co/
In general, like others have said, lots of humidity is bad for the hardware; you will need to separate this somehow. But heating a small grow room with an extension of a CPU water cooler might be possible.
Computers are just heaters that do maths as a side effect, so it makes sense. There's even a company monetising the idea for ESG credits.
Two issues I can see are:
- Humidity in a grow room will be bad for a computer
- With modern computers the heat output will in part depend on the workload, so you will need to find a way to make the computer do more work when you want more heat.
I miss the GPU-mining days from long ago. I used to have five old computers with potato GPUs on a time switch doing just this (or Folding@home) as a fun experiment. It kinda worked in the earliest days, but ASICs killed the income offset after a few months.
The old Core 2 Quad Dell with three DVB-T tuners and a few hard drives still dumps a fair bit of heat into the room.
This is actually a really bad idea.
At “best”, your server is a resistive heater. Aka “a space heater”. Your server also spends power on fans to push that heat out of the case; even that power ends up as heat in the room, so at most you match the space heater in the corner, never beat it.
Also? Computers aren’t meant to run all that hot for all that long. Yes, the safe margin for hardware is a lot higher than people would think. But if you want this to make a meaningful difference you are going to be running REAL hot for extended periods of time. Because you don’t need heating when it is warm outside. You need it when it is cold and you are already going to be fighting a low ambient temperature.
The reason this works for larger data centers and specialized installations is that they are designed with this in mind. You generally either have direct water cooling of the racks (plural) or you have “water cooling” of the server room itself. With the water then being recirculated amongst the radiators in the building itself. And… those are quite often borderline “scams” because they don’t actually keep the building all that warm in the winter (as discovered during The Pandemic when the lack of body heat from human beings caused issues for a lot of hybrid office/data centers) and they mean more HVAC costs to keep the building cool during the summer.
Which gets to the other aspect. Are you going to change all your fan and cooling settings on a weekly (or even daily) basis? Because maybe you want to get right up to thermal throttling during the winter because the ambient temperature means that heat will “dissipate” fast. But during the summer or even a warm winter day? You are turning your server room into the kind of inferno that even Tom Cruise has someone else deal with.
Don't get me wrong. Having a chonky and inefficient PC is great for late-night gaming in the winter when you should have gone to sleep hours ago and your zone is already set for the "nobody but the cat is in there" setting. But, even at the datacenter level, it is not a good replacement for HVAC. And, as a lot of us will attest: summer is when you grab the Steam Deck or go downstairs and use the Xbox.
Anecdote:
I have a server running 24/7 in my office, drawing 120 watts on average (measured). The office is 10x10. It alone keeps that room 2-5 degrees warmer than the rest of the house. If I turn it off, the room equalizes with the rest of the house.
For comparison, those little square plug-in space heaters consume 500 to 1500 watts, and you can see how much heat they put out.
1 watt ≈ 3.41 BTU/hr
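Applying that conversion to the 120 W server above (using the more precise 3.412 BTU/hr per watt):

```python
# Converting the server's draw into the BTU/hr figures heaters are rated in.
BTU_PER_HR_PER_WATT = 3.412
server_w = 120
btu_per_hr = server_w * BTU_PER_HR_PER_WATT
print(f"{btu_per_hr:.0f} BTU/hr")  # ~409 BTU/hr
```

For scale, small plug-in space heaters are typically rated around 1700-5100 BTU/hr (500-1500 W), so the server is a modest but real contribution.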
Depending on your use case, why not look to reduce power consumption? I’ve replaced that server with one that draws <20w at idle. That’s negligible.
Sure, the processors can get hot. But is it enough to meaningfully heat a room for your needs? I somehow doubt it.
You would need to re-engineer the heat spreader on your cooler.