Comments:
Funny thing, I got into servers a few months ago and might start hosting a local VPN
There's your global warming. Instead of getting rid of cars, get rid of computers and servers. Lol
They are not cooled by fire, as depicted in the clickbait thumbnail
It's so, so incredibly loud inside a DC. Lots of fun too.
🤝🙏👍
Can be used for personal CPU knowledge too. Pretty good :)
Who usually works in these data centers? As in, job titles?? I worked on the site before but was too scared to talk to the guys inside the data center
Never been to a data center; however, my future may involve going to one someday, and not for a tour
This is why WestJet's whole system went down: the cooling system failed at their data centre. Possibly from having only one data centre
BEAUTIFULLY EXPLAINED
I live in Denmark close to an Apple data center, and the plan is that part of my central heating will come from the data center from 2024.
They could liquid-cool the servers with waterblocks similar to what you would use on your PC's CPU and GPU.
They also make northbridge and southbridge as well as hard drive and SSD waterblocks.
Just chilling out
This is a very well-explained video. The performance of the cooler is important, but in the end, the most important thing is to effectively convect and dissipate the generated heat. It seems that the actual cooling energy consumption can be reduced this way.
I think it is good to optimize the airflow to dissipate heat effectively.
One I've built is a hot aisle/cold aisle design, air/mist evaporation cooled from the second floor and forced down through the roof of the data hall; the hot air is then removed and either mixed back in or expelled.
Just chilling out lol
Thank you so much! Because of this video my college presentation went very, very well, and my teacher also liked the information. Thank you so much!! 🙏❤️
Some data centers use cold aisle and hot aisle. Some use cooling towers, while others use a different form. They are crazy. They have massive generators, and they are normally powered directly from the power source, i.e. hydroelectric dams. One building can generate 1 trillion a year, and one section of the building can generate 500 million to 500 billion. I currently work at one such site. I work on ones that do not use refrigerant due to size; they are designed to be replaced after 10 years...
Thanks for sharing. A quick question, if I may: for DX CRAC units, why is the compressor always installed in the indoor unit?
Isn't water cooling inside the computer more effective than air cooling? In an underwater data center it seems like the most doable option too.
I was told by a DC manager that liquid immersion cooling will replace all of the above on new DCs over the next 10 years. It's the next-gen server cooling system, apparently. No CRACs, CRAHs, AHUs, chillers, raised floors, hot/cold aisle containment, etc. Would be nice to see a vid on that.
Why not just open data centers in the Arctic?
Good video, well done, but mathematically it's probably the easiest calculation (for an A-grade student). :))
The most environmentally unfriendly things at the moment: data centers....
Some use green energy, they claim.
But they still have emergency generators as backup power, which are test-run on a regular basis.
Still a big CO2 contribution, while the data centers also take the green energy away from normal homes.
I installed many cooling units in data centers. They also have Halon fire suppression systems in them. I always worried about setting off the Halon system while working in them. Halon gas eats up the oxygen in the room quickly.
Power
Question: is it possible to harness the heat from the hot air flowing in the ceiling into energy?
Because I have a dumb thought of placing a Stirling engine (which I discovered through a YT recommendation) on top of the ceiling where the hot air flows
If all CPUs changed to ARM, only minor cooling would be needed. x86 is an earth destroyer; please say bye-bye to x86 to make our earth greener!
Why is this recommended to me? I build these things; I know how they work
I would like to ask: is there any difference between placing the CRAC in line with the cold aisle or in line with the hot aisle? Which is more efficient? Can I calculate it?
Yes sir... I'm a technician for PACU units... especially Vertiv 😁😁👍👍
Cool
Hey Paul, I work in the immersion cooling datacenter industry. Drop me a line to see how we can work together.
I have designed and built over 30 data centers worldwide.
Could the removed hot air then be used to turn a turbine, and convert some of the waste heat back into electricity?
We are using DAHU fans for cooling.
Very interesting video!
A good tip for efficiency is to explain to the customers/rack owners that blanking panels and correctly installed equipment are mandatory....
No matter how smartly you build your mechanical cooling system and cold aisles, if the equipment you want to cool is not installed properly, you will always have an issue.
Great that you made a video about data centers; I was waiting for one from you. Good work 👏
I used to work in the 9/11 Memorial as an engineer. Data centers were top alert at all times, and we had more than a few emergencies where the temp climbed from 60 to near 82 in minutes. The Port Authority server room had two constantly running Data Aire units, and you literally had to wear a jacket if you were working inside for any length of time.
As a design engineer for a company focused on data center cooling - can confirm
Even with optional airflow equipment, data center operations folk seem to still have a knack for installing intake on the hot aisle and exhaust on the cold aisle...
Just chilling out
I work at a data center. I'll say this is a good video.
That was useful
A data center video on how the critical load is maintained during a power outage by generators, ATSs, UPSs, PDUs, and static-switch PDUs would be cool. There are so many configurations, though.