
Why not put servers in a refrigerator?

Engineering Asked by The Dreams Wind on May 19, 2021

Recently I was thinking about home improvement, and one idea that is haunting me is that I could use some kind of refrigerator for gaming consoles and laptops.

The obvious benefit is that it maintains a constant low temperature. Also, the door can usually be closed firmly so dust can't get inside, which removes the need to disassemble and clean the hardware. That is a good thing in the long term. Although it makes the airflow less intensive, my understanding is that this should not be a problem, since all the hardware needs is cooled air, not fresh air.

And whenever I get what I consider a good idea, I usually stop for a minute and ask myself "why doesn't this exist yet?". There are definitely large, expensive enterprise server installations that could use the same approach to reduce maintenance effort, yet they don't. So the disadvantages I can think of are that refrigerators are just not designed to cool a constant heat source for long periods of time and would probably break quickly. Another problem that came to mind is that refrigerators tend to generate condensate, which could be harmful to hardware.

Am I correct in my assumptions, or are these issues not that crucial, and do similar solutions in fact exist?

10 Answers

We do. It's just an up-sized (i.e. more powerful) version of a refrigerator known as an air conditioning unit.

Essentially all server rooms and most spaces where PCs are located (speaking for the U.S., at least) are air conditioned. Server rooms almost universally have dedicated HVAC systems and they will indeed be designed to keep the room at a more-or-less constant temperature (and typically humidity, too.)

Other answers mention water blocks, heat sinks, fans, etc., but those are mostly just used to move the heat from the CPUs and other such hot components to the ambient air in the server room. You still have to get rid of the heat from the ambient air and air conditioning is by far the most common solution for that.

Incidentally, this is actually frequently one of the largest costs associated with running servers. It's not uncommon for the HVAC system for a large server room to use more power than the actual servers (or, at least, that was the case back when I was a sysadmin a decade ago. I'd guess it still is.) As a result, there have been experiments with more environmentally-friendly (i.e. energy-efficient) ways to cool server rooms, including even locating them underwater. These have had some success, but, obviously, they're quite a bit more complicated and geographically limiting than just installing air conditioning, so air conditioning is still by far the most common solution to cool server rooms.
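The cooling-versus-IT power split described above is usually expressed through the PUE (Power Usage Effectiveness) metric. A minimal sketch, with purely illustrative numbers:

```python
def pue(it_power_kw, cooling_power_kw, other_overhead_kw=0.0):
    """Power Usage Effectiveness: total facility power / IT power.
    1.0 would be perfect; higher means more overhead (mostly cooling)."""
    total = it_power_kw + cooling_power_kw + other_overhead_kw
    return total / it_power_kw

# If the HVAC draws as much power as the servers themselves, PUE is 2.0:
print(pue(100, 100))      # 2.0
# A more efficient facility might look like:
print(pue(100, 30, 10))   # 1.4
```

The "HVAC uses more power than the servers" scenario corresponds to a PUE above 2, which is why energy-efficient cooling experiments (free-air cooling, underwater placement) attract so much attention.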

Correct answer by reirab on May 19, 2021

refrigerator boxes tend to generate condensate

There are specifications for acceptable relative humidity (RH) for servers. Air conditioning in server rooms will maintain RH within the allowed limits as well as maintaining temperature limits. You can't risk having condensation forming on electrical wiring or circuit boards as it will disrupt operation.

Cooling air without removing water vapour will increase its RH and increase the risk of condensation. So you can't put your equipment in a refrigerator.

I seem to remember from my time in mainframe/server rooms that we aimed for 50% RH. You don't want the air to be too dry either otherwise you get problems with static electricity.
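The RH/condensation point can be made concrete with the Magnus approximation for saturation vapour pressure (a standard formula; the 25 °C / 50% RH figures below are just an example, not from the original answer):

```python
import math

def sat_vapor_pressure_hpa(t_c):
    """Magnus approximation, valid roughly -45..60 degC."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def dew_point_c(t_c, rh_percent):
    """Temperature at which air at (t_c, rh_percent) reaches 100% RH."""
    e = rh_percent / 100.0 * sat_vapor_pressure_hpa(t_c)
    g = math.log(e / 6.112)
    return 243.12 * g / (17.62 - g)

# Room air at 25 degC and 50% RH has a dew point near 14 degC:
print(f"dew point: {dew_point_c(25.0, 50.0):.1f} degC")
# Any fridge surface colder than that will collect condensation.
```

This is why cooling air without removing water vapour raises its RH: the absolute moisture content stays the same while the saturation pressure drops with temperature.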

Answered by Graham Nye on May 19, 2021

A refrigerator is basically a heat pump designed to achieve a temperature close to 0 °C. There are a couple of problems here:

  1. Most refrigerators are far too weak to tackle the heat output of a modern PC. The fridge will not magically keep your PC at 6 °C; instead, the PC will warm up all your groceries.

  2. Condensation. Computers and electronics do not like low temperatures because of moisture building up. Some manufacturers even warn against using your equipment before it has warmed up to room temperature.

  3. There are better ways to remove heat from hot components, like big heatsinks, water blocks and so on. A refrigerator basically relies on air, which is the least efficient transfer medium.

(If you connected the refrigerator coolant pipes directly to hot spots of the device it would be a different thing, but simpler alternatives, like water, exist.)

Answered by Thomas on May 19, 2021

The price-to-effect ratio for heat pumps is just not there for any large scale applications.

Especially since air or water cooling is far less complex, and therefore a lot, lot cheaper.

Dust is actually not a problem: just route the air intake through filters (just like most mid- to high-end PC cases have nowadays). For water-cooled server centers, sturdy outdoor radiators already exist and are easy to maintain.

The gain in performance would be marginal compared to the insane costs of setting this up. Dusting off a server room occasionally is nothing compared to maintaining an enormous heat pump system.

Your concern about consumer fridges not being designed to run constantly is valid (though their limited cooling throughput is the bigger problem); industrial units handle continuous duty fine. Either way, your run-of-the-mill fridge simply won't be able to keep up with a mid-tier PC after more than a few minutes of gaming.

Answered by Hobbamok on May 19, 2021

They've been doing that with computers for 30+ years, back when computers were the size of refrigerators. They used to take a regular air conditioner, build a platform for the computer to stand on with a hole in the center, and duct the air conditioner to the bottom of the platform. Forced air rose up through the computer.

For your needs - can you redirect one of the air conditioning ducts in your room to run across your computer equipment?

Answered by Keith on May 19, 2021

Here's a webpage from a random guy who put a light bulb in his refrigerator to see what would happen, and took careful data to monitor it. Results:

The next experiment was to put a 60-watt incandescent light bulb inside the fridge. ... Over the next 55 minutes, I saw the temperature in the fridge slowly creep up, while the fridge's compressor ran constantly.

So 60 watts was more heat than the fridge's compressor could keep up with. So I opened the fridge again and swapped it for a 40-watt bulb at about minute 86. After that, the temperature dropped to just a few degrees, but the compressor never shut off.

I swapped the 40-watt bulb for a 17-watt appliance bulb. ... With the bulb adding 16.9 watts of heat to the fridge, the compressor duty cycle was 69%.

So if you tried to put a server in a standard refrigerator, and it consumed more than about 40 watts, the refrigerator would run non-stop; and it would not even stay particularly cool with a 60-watt server in there.
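Those two data points let you roughly back out the fridge's cooling capacity, assuming a simple linear model (the compressor removes heat at a fixed rate while running, plus a constant leak through the insulation; both derived numbers are rough estimates, not measurements):

```python
# From the blog's observations:
#   16.9 W bulb -> compressor duty cycle 69%
#   40 W bulb   -> duty cycle ~100% (compressor never shuts off)
# Model: C * duty = bulb_load + B, where C is the compressor's cooling
# capacity (W) and B is the standing heat leak through the insulation (W).
duty1, load1 = 0.69, 16.9
duty2, load2 = 1.00, 40.0

C = (load2 - load1) / (duty2 - duty1)   # cooling capacity while running, W
B = C * duty2 - load2                   # standing heat leak, W
print(f"capacity ~{C:.0f} W, insulation leak ~{B:.0f} W")
```

Under these assumptions the fridge can remove only about 75 W while running flat out, roughly half of which is eaten by its own heat leak, consistent with the 40 W bulb being near the limit.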

Answered by Michael Seifert on May 19, 2021

A typical residential refrigerator has a coefficient of performance in the neighborhood of 3, and a 150 W compressor. This means it can reject heat at a rate of 150 W × 3 = 450 W. This is of course just a ballpark number: some refrigerators are bigger or smaller, and the coefficient of performance will depend on things like the temperature differential between the inside and outside air.

Normally a refrigerator is full of things that don't actively generate heat, like yogurt and dead fish. If you put a server in there that generates more than 450 W of heat (meaning it consumes more than 450 W of electrical power, all of which ends up as heat), then the refrigerator's heat pump won't be able to reject heat as fast as the server produces it. The interior will then warm up until enough heat escapes through the insulation to reach equilibrium, which is almost certainly hot enough to cause the servers inside to fail.
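The capacity and equilibrium argument can be sketched numerically. The insulation conductance `ua_w_per_k` and the 600 W PC below are hypothetical round numbers, not measurements:

```python
def max_heat_rejection_w(compressor_power_w, cop):
    """Heat a refrigerator can pump out: compressor power x COP."""
    return compressor_power_w * cop

def equilibrium_excess_temp_c(server_power_w, capacity_w, ua_w_per_k):
    """Once the server exceeds the fridge's capacity, the surplus heat
    must leak out through the insulation (conductance UA in W/K), so the
    interior settles at roughly ambient + surplus / UA."""
    surplus = server_power_w - capacity_w
    return max(surplus, 0.0) / ua_w_per_k

cap = max_heat_rejection_w(150, 3)   # 450 W, as estimated above
# A hypothetical 600 W gaming PC in a fridge with UA ~ 1 W/K:
print(cap, equilibrium_excess_temp_c(600, cap, 1.0))
# The 150 W surplus would drive the interior far above ambient.
```

The exact equilibrium depends on the real UA value, but the shape of the argument holds: anything above the pump's capacity turns the insulated box into an oven.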

Many laptops use less than 450W. Many servers use more. So depending on what kind of computer you put inside, this might work, or it might lead to inevitable failure of the computers inside. Multiple computers inside will almost certainly not work, so at best it's not a very efficient use of space.

You'll also have a few minor problems to solve, such as how to get the power cord into the server. And the typical settings on a refrigerator are pretty cold. Perhaps not cold enough to be a problem directly, but if you open the door and let some warm, humid air inside, you'll have problems with condensation. So you may need to modify the thermostat to run the fridge at a higher temperature: this will address the condensation problem as well as increase the coefficient of performance.

But the idea is not really that bad of one. The technology used to cool the inside of a refrigerator is essentially the same technology used by air conditioners, and many data centers are thus essentially big refrigerators. You can read more about how modern data centers are cooled: https://journal.uptimeinstitute.com/a-look-at-data-center-cooling-technologies/

Answered by Phil Frost on May 19, 2021

To address the problem from a more abstract, physical point of view:

The problem with servers is that they produce heat. This heat has to be removed somehow. If it isn't, the server will heat up more and more until it breaks. Apart from temperature-dependent performance differences, it doesn't matter much at which temperature the server resides, as long as the heat is carried away somehow.

Removing heat from an object is far easier if it is hotter than its surroundings; the bigger the temperature difference, the easier it is. If you put a cup of boiling water in a room, the water will reach room temperature pretty quickly all by itself. However, it will never become colder than room temperature on its own. This is the second law of thermodynamics (Clausius' statement). To make your cup of water colder than room temperature, you need some sort of heat pump, which consumes extra energy (and produces extra heat). Fridges use heat pumps.

As long as a computer operates above environment temperature, you can cool it by just facilitating sufficient heat exchange with the environment, e.g., by having lots of radiators, pumping air through it, etc. This is how desktop computers and laptops are cooled. For servers, you often have a combination of this and heat pumps (in air-conditioning systems), but these still operate more efficiently if the servers are hotter.
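This can be sketched with Newton's law of cooling. The heat-transfer coefficients and heatsink area below are hypothetical round numbers, chosen only to show why increasing airflow beats lowering the ambient temperature by a few degrees:

```python
def steady_state_temp_c(power_w, h_w_per_m2k, area_m2, ambient_c):
    """Newton's law of cooling: at equilibrium the dissipated power
    equals h * A * (T - T_ambient), so T = T_ambient + P / (h * A)."""
    return ambient_c + power_w / (h_w_per_m2k * area_m2)

# Hypothetical 50 W chip with 0.05 m^2 of effective heatsink area:
print(steady_state_temp_c(50, 10, 0.05, 25))   # natural convection
print(steady_state_temp_c(50, 100, 0.05, 25))  # forced air (fan)
```

Raising the convection coefficient tenfold (a fan) drops the equilibrium temperature far more than a fridge's few degrees of colder ambient air ever could.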

Answered by Wrzlprmft on May 19, 2021

Heat transfer capacity

The main problem is that a refrigerator is not potent enough for a server or a PC.

A refrigerator is a type of heat pump that takes heat energy from a cool environment and transfers it to a hotter environment. On both sides, it uses convection with the natural flow of air (as opposed to forced convection).

If you look at the history of computers, cooling progressed in stages:

  1. No fans at all; some small computers, e.g. the Raspberry Pi, still manage without one. (natural convection)

  2. Then heat sinks. (conduction + natural convection)

  3. Then a fan combined with a heat sink. (conduction + forced convection)

  4. Then water-cooled blocks, replacing air with liquid. (conduction + convection with water)

My point is that a PC produces a lot of heat that needs to be removed, which a refrigerator cannot do with natural convection alone.

Moisture in the air

If you could seal a server into a refrigerator and never open it again, condensation would not be an issue. For condensation to occur, you'd have to open the fridge and let moist air creep into the cooler compartment. Even then, what little condensation formed would mostly occur on the cooler refrigerator surfaces rather than on the heat source, and it can easily be removed (many fridges have drains that carry away excess moisture).

However, keeping the refrigerator sealed is not really an option. The elevated temperature inside would make the air stale and favor the growth of fungi (have you ever left a fridge turned off with the doors closed for a month?).

So, you'd have to maintain a constant flow of air (which would also help the natural convection). But if you do that, you'd start to worry about condensation a bit more, and then you'd need a humidity control device, and that departs from the idea of a typical refrigerator.

Answered by NMech on May 19, 2021

It's called a chiller rack. Fridges are designed to be insulated (to keep heat out), while a chiller rack pumps in coolant (often chilled water from a refrigeration unit, either on the rack itself or elsewhere).

It's not enough to keep pumping in cold air or to keep heat out; when you're cooling heat-generating devices, you need to vent the heat somewhere (outside? into the server room's hot aisles?). You can't really have a heat-generating server in an insulated box. Cooling is largely about airflow, and most sensible solutions use some form of forced convection.

There are also rack-mount air conditioning units that pump air out via something like a dryer hose. Craft Computing has a video of one installed in a rack; however, unlike a refrigerator, it has air vents and fans to move heated air out.

That said, there's another principle of cooling: thermal mass, and rooms have a lot of it. Dumping your heat into the room and cooling the room is, except in the most extreme cases (for example, high-density blade servers), simpler and more maintenance-free than chilling per rack, let alone plumbing in a cooling system per server.
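As a rough sketch of the thermal-mass point (the 500 W server and 50 m³ room below are hypothetical, and the calculation counts only the air, ignoring walls and furniture, which add far more thermal mass and also leak heat away):

```python
def room_heating_rate_c_per_hour(power_w, room_volume_m3,
                                 air_density_kg_m3=1.2,
                                 cp_j_per_kgk=1005):
    """Warming rate of a sealed room's air with no cooling at all:
    dT/dt = P / (m * c_p), converted from K/s to degC/hour."""
    air_mass_kg = air_density_kg_m3 * room_volume_m3
    return power_w / (air_mass_kg * cp_j_per_kgk) * 3600

rate = room_heating_rate_c_per_hour(500, 50)
print(f"~{rate:.0f} degC/hour for the air alone")
```

Real rooms warm far more slowly than this because the walls and contents absorb heat too, which is exactly the buffer that makes room-level cooling practical.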

Answered by Journeyman Geek on May 19, 2021
