Is there any reason the water can’t be safely consumed later? It’s not toxic or nuclear, is it? The cooling water didn’t just up and disappear, did it?

Edit: Links provided in the comments…

Notable comments:

Edit addendum: I’d like to thank everyone who’s participated in this question thread; sorry if I missed any good relevant links in the comments.

To be clear, I still loathe the whole AI datacenter era; it really is heavily wasteful of resources, notably energy. But I wanted to better understand the water usage situation.

  • WxFisch@lemmy.world · 22 hours ago · +104/-1

    It evaporates; that’s how it cools. The water is sprayed over a heat exchanger, turns into what is essentially steam, and new water is pumped in to replace it, so the water is “gone”. It will fall as rain somewhere, but likely not near where it was taken from.
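To put rough numbers on the evaporation described above, here is a back-of-the-envelope sketch. It assumes all the heat leaves as latent heat of vaporization (~2.26 MJ/kg); real cooling towers also reject some heat sensibly, so treat the result as an order-of-magnitude figure, not a facility spec.

```python
# Back-of-the-envelope: water evaporated to reject a given heat load.
# Assumes all heat is carried away by evaporation (latent heat of
# vaporization of water ~2.26 MJ/kg near boiling; ~2.45 MJ/kg at 25 C).

LATENT_HEAT_J_PER_KG = 2.26e6  # J/kg, assumed value

def water_evaporated_kg_per_hour(heat_load_watts: float) -> float:
    """Mass of water evaporated per hour to carry away heat_load_watts."""
    return heat_load_watts * 3600 / LATENT_HEAT_J_PER_KG

# A 1 MW heat load evaporates on the order of 1.6 tonnes of water per hour.
print(round(water_evaporated_kg_per_hour(1e6)))  # ~1593 kg/h
```

Scaled up, a 100 MW campus on this assumption would evaporate well over 100 tonnes of water per hour, which is why the “it just evaporates” answer matters.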

    Closed-loop systems could be used, but they’re more expensive and require more maintenance, so large data centers don’t usually use them unless required to.

        • Whats_your_reasoning@lemmy.world · 2 hours ago · +2

          Some aircraft engines, too. The old single-engine Cessnas I trained on were air-cooled. Though that’s pretty easy when you’re pushing cool, atmospheric air over the engine at 100 knots.

      • BussyCat@lemmy.world · 5 hours ago · +3

        Car cooling systems are stupidly expensive, run at temperatures that would damage computer CPUs, operate outdoors, and have one really nice advantage over computers: at higher heat loads the car also tends to be moving faster, which pushes more air through the radiator and cools it off faster.

        Now imagine you redlined a dozen cars for days on end, in a garage, in the middle of summer. Do you think you might damage some components?

        It is still very possible to use closed-loop cooling on data centers, but any system you build needs to work in summer temperatures, which can be as high as 35-40 °C, without letting the computers exceed 60 °C. An air-cooled system that handles that much heat is going to be very expensive and use a ton of power (and power generation also uses water).
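The sizing problem above can be sketched with the basic heat balance Q = ṁ·cp·ΔT. The air properties and temperatures below are illustrative assumptions (40 °C intake, 60 °C exhaust, sea-level air), not figures from any particular facility:

```python
# Rough airflow needed to remove a heat load with air alone,
# using Q = m_dot * c_p * dT. Assumed constants:
AIR_CP = 1005.0      # J/(kg*K), specific heat of air
AIR_DENSITY = 1.2    # kg/m^3, air near sea level

def airflow_m3_per_s(heat_load_watts: float, delta_t_kelvin: float) -> float:
    """Volumetric airflow required to absorb heat_load_watts with a
    temperature rise of delta_t_kelvin across the racks."""
    mass_flow = heat_load_watts / (AIR_CP * delta_t_kelvin)  # kg/s
    return mass_flow / AIR_DENSITY

# 1 MW with 40 C intake air and a 60 C limit (dT = 20 K):
print(round(airflow_m3_per_s(1e6, 20.0)))  # ~41 m^3/s
```

Roughly 41 m³ of hot air moved every second, per megawatt, with only a 20 K margin to work with; the fans doing that are a big part of why dry cooling costs so much power.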

        • over_clox@lemmy.world (OP) · 2 hours ago (edited) · +1

          While your comparison is broadly right, you also have to understand the difference between cooling data center electronics and cooling a vehicle engine.

          Vehicle engines run best at a higher temperature range than electronics, so a thermostat is installed to deliberately bring the engine up to a suitable range for ideal performance. But the thermostat is not strictly necessary (unless you live near cold polar regions and want heat).

          In more comfortable climates the thermostat can be safely removed and the vehicle will run just fine, only quite a bit cooler.

          So, take the concept of a closed-loop cooling system, remove the thermostat from the equation, and you’ve got a closed loop better suited to keeping electronics cool.

    • over_clox@lemmy.world (OP) · 22 hours ago · +47/-1

      I am still learning. Thank you for your educational comment.

      I loathe AI anyways, I just wanna better understand why I loathe AI…

      • qupada@fedia.io · 21 hours ago · +17

        Further to this, as well as the source of the water often being the local city’s drinking water supply (which, as we’ve found, puts a strain on that supply), evaporative cooling systems concentrate the minerals and contaminants in the water. That means a smaller volume (relative to what is evaporated) of now highly concentrated runoff water also has to be constantly disposed of, likely into the city’s wastewater systems.
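The size of that runoff stream follows from the standard cooling-tower water balance: blowdown = evaporation / (CoC − 1), where CoC (“cycles of concentration”) is how concentrated the circulating water is allowed to get relative to the makeup water. The CoC value below is an illustrative assumption:

```python
# Cooling-tower water balance sketch (drift losses ignored):
#   blowdown = evaporation / (CoC - 1)
#   makeup   = evaporation + blowdown

def tower_water_balance(evaporation_lph: float, coc: float):
    """Return (blowdown, makeup) in the same units as evaporation_lph."""
    blowdown = evaporation_lph / (coc - 1)
    makeup = evaporation_lph + blowdown
    return blowdown, makeup

# 1000 L/h evaporated, running at an assumed 4 cycles of concentration:
blowdown, makeup = tower_water_balance(1000.0, 4.0)
print(round(blowdown), round(makeup))  # ~333 L/h blowdown, ~1333 L/h makeup
```

Note the trade-off: raising CoC shrinks the blowdown stream but makes it more concentrated, which is exactly the disposal problem described above.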

        Radiators for closed-loop systems do also occupy more space (for the same cooling capacity) versus evaporative cooling towers, and are more limited in the range of climates they can be deployed in.

        On balance, though, closed-loop cooling should always be the first choice; where it works for the deployment, it will never be the wrong choice on a long-term, total-cost-of-ownership basis.

    • troybot@piefed.social · 20 hours ago · +23

      Ok so what you’re telling me is power plants generate electricity by burning fossil fuels which power a turbine with steam, then the data center uses all that electricity to produce even more steam?

        • BlackLaZoR@lemmy.world · 10 hours ago · +1

          The funniest thing is that, once trained, the model can run and make furry porn on your local machine. So they don’t even make any money on this.

          • Railcar8095@lemmy.world · 9 hours ago · +2

            I haven’t tried furry porn models in particular, but all the local image generation I’ve tried was really bad. That was on a 3070, though, so nothing really meant for this.

            • BlackLaZoR@lemmy.world · 8 hours ago · +1

              Mainline Stable Diffusion is pretty horrible. Try the SDXL community fine-tunes (if they fit into your VRAM). They’re worse than cutting-edge cloud models, but they punch well above their weight.
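Whether a fine-tune “fits into your VRAM” is mostly parameter count times bytes per parameter, plus headroom for activations. A minimal sketch; the ~2.6B-parameter figure for SDXL’s UNet and the 2 GB headroom are rough assumptions, not exact specs:

```python
# Rough VRAM-fit check: weight memory = params * bytes per param.
# fp16 weights use 2 bytes per parameter.

def model_vram_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate weight memory in GB for a model of the given size."""
    return params_billions * 1e9 * bytes_per_param / 1e9

weights_gb = model_vram_gb(2.6)          # ~5.2 GB of fp16 weights (assumed size)
fits_in_3070 = weights_gb + 2.0 < 8.0    # assumed ~2 GB activation headroom, 8 GB card
print(weights_gb, fits_in_3070)
```

By this rough math an fp16 SDXL-class model squeezes onto an 8 GB card, but with little margin, which matches the “nothing really meant for this” experience above.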

      • UniversalBasicJustice@quokk.au · 10 hours ago (edited) · +5

        I have bad news for you; it’s all steam. EVERYTHING is steam. 🌍🧑‍🚀🔫🧑‍🚀🌚

        Even you and I are just steam in liquid and solid phases.