• cogman@lemmy.world

    The amount of power AI and crypto require is orders of magnitude greater than what pretty much any regular application needs. The company I work at uses somewhere around 2,000 CPU cores' worth of compute at AWS (and we have ~100 microservices; we are a fairly complex org that way).

    Generally speaking, an 80-core CPU system draws ~200 W. That means my company's entire fleet eats about 5 kW of power when running full bore (it isn't doing that all the time). My company is not a small company.
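
    Rough back-of-envelope in Python, using the numbers above; the 80-core / 200 W figure is my assumption for a typical dual-socket server, not a measured value:

    ```python
    # Back-of-envelope fleet power estimate (numbers from the comment, not measured)
    total_cores = 2000      # approximate cores rented at AWS
    cores_per_box = 80      # assuming a typical dual-socket server
    watts_per_box = 200     # rough full-load draw per box

    boxes = total_cores / cores_per_box      # 25 servers
    fleet_watts = boxes * watts_per_box      # 5000 W
    print(f"~{boxes:.0f} servers, ~{fleet_watts / 1000:.1f} kW at full load")
    ```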

    Compare that to what a single Nvidia A100 eats. Those GPUs draw up to 400 W, and when doing AI/crypto work you run them as hard as possible (meaning you are pulling the full 400 W). That means just 12 or 13 AI or crypto apps eat the same amount of power that my company, with 100 different applications, eats while running full bore. Now imagine the model training behind something like ChatGPT, which can eat pretty much as many GPUs as you can throw at it.
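
    Same style of estimate for the GPU side; the 400 W per A100 is the board's max power, and the 10,000-GPU training cluster at the end is a purely hypothetical number to show how fast this scales:

    ```python
    # How many flat-out A100s match the whole CPU fleet (illustrative only)
    fleet_watts = 5000       # from the fleet estimate above
    a100_watts = 400         # max board power of an A100 run at full tilt

    gpus_equal_to_fleet = fleet_watts / a100_watts     # 12.5 GPUs
    print(f"~{gpus_equal_to_fleet:.0f} A100s draw what the entire fleet does")

    # Hypothetical large training cluster of 10,000 GPUs
    training_watts = 10_000 * a100_watts               # 4,000,000 W = 4 MW
    print(f"10k-GPU cluster: ~{training_watts / 1e6:.0f} MW, "
          f"~{training_watts / fleet_watts:.0f}x the whole company fleet")
    ```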

    To put all of this in perspective: 5 kW is roughly what a mini-split system consumes.

    Frankly, I’m way more concerned about my company’s travel budget in terms of CO2 emissions than I am about our datacenter usage.