Can someone help me understand how cooling data centers "uses" water? The water presumably isn't destroyed, it's just put back hotter than before, right? I don't imagine it comes out as steam like at a nuclear plant, but I suppose it could.
I asked a friend who works on data centers about this, so it's secondhand information, but apparently it's cheaper to use mist cooling, where you essentially spray cold water on hot parts and let it evaporate, instead of the normal closed-loop system you'd see in home water cooling, for example. And while the water isn't destroyed, it means you need a constant supply of clean water, and the evaporated water doesn't necessarily go back to the body of water you got it from.
> The water presumably isn't destroyed it's just put back hotter than before right?
I'd assume the water isn't put back into circulation after passing through a data center, so if anything it might be cooled and then reused in the same data center, best case scenario.
The result is the same: farmers who already have to fight to get enough water in Aragon now have to fight harder and compete against companies like Amazon and other foreign investors for it. Aragon isn't exactly wetlands, so it's hard not to feel the local government is making the wrong choice here.
And even if it was put directly back into circulation, it wouldn't be the same water. No more fish and algae and whatever made it "living water" before, but warm water likely "enriched" with who knows what washed-off chemicals (I don't see much talk about that either).
When water changes phase from liquid to gas it absorbs a lot of energy. This is how sweat cools you. The water comes in, cools things through conduction (think liquid cooling in a computer), and then it's evaporated, which absorbs a large amount of heat. It is then in the air and eventually becomes rain.
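To put rough numbers on that (my own back-of-envelope sketch, not figures from anyone in this thread): water's latent heat of vaporization is about 2.26 MJ/kg, so the evaporation rate for an assumed heat load is just load divided by latent heat.

    # Back-of-envelope: water evaporated to reject a given heat load.
    # Assumes all heat leaves by evaporation (real cooling towers also
    # reject some heat sensibly, so treat this as a rough upper figure).
    LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water
    HEAT_LOAD_W = 100e6             # hypothetical 100 MW data center (assumption)

    kg_per_second = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG
    liters_per_hour = kg_per_second * 3600        # 1 kg of water is about 1 liter
    print(f"{liters_per_hour:,.0f} L evaporated per hour")   # ~159,000 L/h

With those assumptions that's on the order of 160,000 liters evaporated per hour, which is why evaporative designs need a constant feed of fresh water.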
In addition to the other concerns about evaporative cooling, there is one issue with a simple water-cooled loop that just runs river water through a heat exchanger and then dumps it again: you can only raise a river's temperature so much before it becomes a problem for the ecosystem. This issue has forced power plant shutdowns in the past, so it's not that simple.
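For a sense of scale (all numbers assumed, just to illustrate the point above): the temperature rise of the river is the heat load divided by the river's mass flow times the specific heat of water.

    # Temperature rise of a river used for once-through cooling.
    # All inputs are assumptions for illustration, not measured values.
    SPECIFIC_HEAT_J_PER_KG_K = 4186   # specific heat of water
    HEAT_LOAD_W = 100e6               # hypothetical 100 MW of waste heat
    RIVER_FLOW_KG_PER_S = 10_000      # assumed modest river, ~10 m^3/s

    delta_t = HEAT_LOAD_W / (RIVER_FLOW_KG_PER_S * SPECIFIC_HEAT_J_PER_KG_K)
    print(f"River warms by about {delta_t:.1f} degrees C")   # ~2.4 C here

A couple of degrees across the whole flow of a modest river is already ecologically significant, and in a heat wave (low flow, already-warm water) the margin shrinks further.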
As others have stated, evaporative cooling is most of the usage. AC dries the air, which then requires adding moisture back to prevent problems with static electricity; this drives a much smaller portion of the usage.
Note that evaporative cooling also produces significant amounts of brine: like distillation, it concentrates whatever is dissolved in the water, leaving a brine that is difficult to treat and often not useful for other purposes.
With AC, can't you just re-introduce it? The cold side could be an almost closed system, since nothing demands that you remove water in such a location. For occupied spaces lower humidity generally feels better, so moisture is removed, but for a data centre it could be re-introduced?
Rain can deposit the water in the sea or on land beyond a country's borders. In any case, it's always more efficient and easier to collect water from a running river on its way to the sea.
Data centres can be designed to capture the steam / evaporated water and reuse it, but that's expensive too, and they might choose not to do so.
Apparently, it does not have to be this way. According to https://youtu.be/GhIJs4zbH0o?t=895, Stargate is designed with a "closed loop system" that will be filled up just once.
In the video he paints the picture of asking for "a million gallons of water one time" vs. asking for "a million gallons of water per hour"; that cannot possibly be a faithful comparison.
For these newly planned data centers, realistically, what would the "one-time" fill volume be vs. how much they would need per hour or per day? And not some straw-man comparison like the one the guy in the video made.
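For what it's worth, here is a purely hypothetical sketch (every number assumed, none from the video or the thread) comparing a one-time closed-loop fill with the ongoing draw of evaporative cooling at a given heat load:

    # Hypothetical comparison: one-time closed-loop fill vs. continuous
    # evaporative consumption. All numbers are assumptions for illustration.
    GALLON_L = 3.785
    ONE_TIME_FILL_L = 1_000_000 * GALLON_L    # the video's "million gallons, once"

    LATENT_HEAT_J_PER_KG = 2.26e6             # latent heat of vaporization of water
    HEAT_LOAD_W = 100e6                       # assumed 100 MW facility
    evap_l_per_hour = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG * 3600

    hours_to_match = ONE_TIME_FILL_L / evap_l_per_hour
    print(f"Evaporative cooling would go through the one-time fill in ~{hours_to_match:.0f} h")
    # roughly a day's worth of evaporative consumption at this assumed load

So under those assumptions, a one-time fill of that size equals roughly one day of evaporative consumption, which at least gives a scale for the comparison being asked about.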
It's a fairly shallow pop-info piece that feels more like a commercial. The only point relevant to this discussion (assuming it is not all just a lie, which I did not investigate, so it might be): you can have a closed system that does not "use up" water.
> You can have a closed system that does not "use up" water.
But those "a million gallons of water" (or whatever it consumes/uses) have to come from somewhere, and cannot go somewhere else at the same time as it goes to the data center, so from the perspective of the Aragon farmers, isn't the water "used up"?
Similar to other industries, the water isn't lost per se, but it's deferred. It takes a considerable amount of time for water to do the whole loop and end back where it started, so the more you defer, the less you have available for consumption at any given time.
I think the issue is the volume of water needed to cool these systems during peak hours. Even if the water is reclaimed, reserving all that water for the data centers distorts availability for everyone else in the region.
Exactly this. This is a major issue in my region right now, central Texas. Data centers are going up in some of the most drought-stricken areas of Texas, and since it is hot and dry, they want to save money on cooling by misting. This uses precious fresh water that would otherwise have gone to support the overwhelming growth of local cities. So now everyone who lives in the area is paying more for water service because the cities are competing with the data centers for water. Guess who has more lawyers and money?
It's corporate greed and an unwillingness to face the external consequences of demand for compute. Just because misting is cheaper doesn't make it automatically better.
Water is recycled in the cooling system, but part of it is lost through evaporation. That's an excellent question, though, and I have a hard time finding concrete numbers that answer it.
Killing any healthy living things in the water and contaminating it with hard-to-treat molecules of varying toxicity. As others point out, the flow of water, and how long it flows, is part of any ecosystem. There are already living things around that water, so it also depends on how much those living things are valued compared to industrial operations.
You think an LLM provider has a bigger moat than an IDE (say, pre-VS Code, for a better parallel)? MSDN and JetBrains licenses are far more expensive than Cursor or Windsurf.
While I agree that we should procreate, I disagree with the entire premise of this post.
I have always been garbage at the visual arts. Art was my worst class in school every year; I just could not get the pictures in my head down onto paper. I accepted that I just wasn't "creative". This mentality persisted through my school years, and I ended up getting a business degree because it's not "creative". I was miserable. Eventually I saw the light and got into web dev. I believe this is a creative pursuit, and the issue is the medium. Everyone is inherently creative, and the general act of creation is one of the most fulfilling things you can do with your time.
As a web dev AI allows me to be creative in my ideas and prompting, generating pictures that I would otherwise not have the ability to create myself. Before you say I'm taking the job of graphic designers, I assure you none of these uses would have ended in me spending any money or even someone else's time if AI image gen didn't exist. AI lowers the bar for people to pursue the act of creativity in a medium they otherwise struggle in.
Ok people are dunking on this for plenty of good reasons, but dear Lord, do you really think putting buttons at the top of the screen where they're the least reachable is a good idea? Maybe email send is not the best example but moving buttons down towards my thumb is a great move on these screens that won't stop growing.
If I ever find myself in your town I'm going to get free wifi.
But seriously, I've wanted to build something like this for so long but never had the time. Definitely going to do it now. Love the idea of using the camera status to change the light.
I learned to code professionally in Ruby but wrote C# .Net for almost 10 years. I've probably forgotten more about .Net than I ever learned about Ruby at this point so take what I say with a grain of salt.
.Net has tons of configuration and boilerplate, so I can't say it's exactly the same in that sense, but the more meta theme is that just as there is a Rails way to do things, there is a Microsoft way to do things. That's unlike Java, where you rely on lots of third-party packages that, while well maintained, aren't owned by the same company that makes the language, framework, ORM, database, cloud provider, IDE, and so on. Having a solid, well-documented default option that will work for 99% of use cases takes a lot of the decision-making load off your shoulders, and it also means you'll have plenty of solid documentation and examples to follow. I've been in JVM land for the past couple of years and it just can't compare.
I know Java people will come fight with me after this but I just don't think they know any better.
I don't want to fight you, because I don't know .Net well enough to have an opinion.
But I just want to say that I have the same feeling when I develop with Spring Boot. I am extremely productive and seldom have to pull in dependencies outside Spring for 80% of what I make.
Well not as strong as the best steels, but still stronger than many common steels. Even some less special bronze alloys can beat common steels in strength.