When you picture the tech industry, you probably think of things that don’t exist in physical space, such as the apps and internet browser on your phone. But the infrastructure required to store all this information – the physical datacentres housed in business parks and on city outskirts – consumes massive amounts of energy. Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually.

This is a hugely environmentally destructive side of the tech industry. While it has played a big role in the push towards net zero, giving us smart meters and efficient solar, it’s critical that we turn the spotlight on its environmental footprint. Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities. It is hardly news that the tech bubble’s self-glorification has obscured the uglier sides of this industry, from its proclivity for tax avoidance to its invasion of privacy and exploitation of our attention spans. The industry’s environmental impact is a key issue, yet the companies that produce such models have stayed remarkably quiet about the amount of energy they consume – probably because they don’t want to spark our concern.

  • alekwithak@lemmy.world · 1 month ago

    I think the real question is what value we are getting out of the resources used. Do we need AI forced into every platform? Personally, I don’t think so. But just to be sure I asked ChatGPT, and here is its answer:

    “Rather than integrating AI into every possible application, a more measured approach might be beneficial. Assessing the actual need and impact of AI in specific use cases can help avoid unnecessary energy consumption. AI should be implemented where it provides significant benefits and improvements, rather than as a default addition to every platform.”

    So even the AI itself knows it is used frivolously.

    • AIhasUse@lemmy.world · 1 month ago

      A large part of creation is trying things and seeing what sticks. Nobody is claiming that every way LLMs are being tried out today will always be here. We are just doing what we think of and seeing what is useful. The useful things will stick around and evolve; other things won’t. Go back to videos from the early 90s, when computers were just getting started, and people talked so much shit about them. Now we all have their descendants in our pockets.

        • AIhasUse@lemmy.world · 1 month ago

          Do you think that people were better off before computers? How so? Do you think there was more or less war? Do you think people died at an older or younger age? Do you think people had more or fewer years of sickness? Do you think more or fewer mothers and children died in childbirth? Do you think there were more or fewer rapes? Do you think there were more or fewer murders? Do you think we knew more or less about the universe beyond our planet? Do you think we knew more or less about the laws of physics? Chemistry? Biology?

          In nearly all measurable ways, the lives of humans have improved since the advent of computers. To pretend otherwise is naive.