I know exactly how much goes into it: five seconds of GPU time, on my own computer. That's why I said it. How many phone charges do you think it would take to fully create a digital drawing on a laptop? It's not going to be much different, IME.
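A rough back-of-envelope check of that comparison. Every wattage, duration, and battery figure below is an assumption chosen for illustration, not a measurement:

```python
# Back-of-envelope: energy for one AI image generation vs. one digital
# drawing session, expressed in phone charges. All figures are assumed.

GPU_POWER_W = 300           # assumed consumer GPU draw under load
GEN_TIME_S = 5              # generation time claimed above
LAPTOP_POWER_W = 50         # assumed laptop draw while drawing
DRAWING_TIME_S = 2 * 3600   # assumed two-hour drawing session
PHONE_BATTERY_WH = 15       # typical smartphone battery capacity

gen_wh = GPU_POWER_W * GEN_TIME_S / 3600
draw_wh = LAPTOP_POWER_W * DRAWING_TIME_S / 3600

print(f"One generation: {gen_wh:.2f} Wh "
      f"({gen_wh / PHONE_BATTERY_WH:.3f} phone charges)")
print(f"One drawing:    {draw_wh:.0f} Wh "
      f"({draw_wh / PHONE_BATTERY_WH:.1f} phone charges)")
```

Under those assumptions a single generation is around 0.4 Wh (a small fraction of one phone charge), while a two-hour laptop drawing session is around 100 Wh (several charges).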
You aren’t taking into account the resources required to train the model. You clearly have very little idea how much goes into it other than running software someone else wrote.
We aren’t training models here; this isn’t the training-ai-with-massive-datacentres@lemmy community.
Of course I’ve taken model training costs into account; was that supposed to be your gotcha? You don’t actually think the amortized energy cost of training still accounts for the bulk of the energy expended per generated image, do you?
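A rough amortization sketch of that point. The training GPU-hours, GPU wattage, and lifetime image count below are all assumptions for illustration, not sourced numbers:

```python
# Rough amortization of training energy over all images a model ever
# generates, compared to the per-image inference cost. All figures assumed.

TRAINING_GPU_HOURS = 150_000        # assumed total GPU-hours to train
GPU_POWER_W = 400                   # assumed datacentre GPU draw
IMAGES_GENERATED = 1_000_000_000    # assumed lifetime images served

training_wh = TRAINING_GPU_HOURS * GPU_POWER_W   # total training energy, Wh
amortized_wh = training_wh / IMAGES_GENERATED    # training share per image

inference_wh = 300 * 5 / 3600  # 5 s on an assumed ~300 W consumer GPU

print(f"Training total:    {training_wh / 1e6:.0f} MWh")
print(f"Amortized / image: {amortized_wh:.3f} Wh")
print(f"Inference / image: {inference_wh:.3f} Wh")
```

With those assumed figures the training share works out to roughly 0.06 Wh per image, well below the ~0.4 Wh of inference energy; the more images a model serves over its lifetime, the smaller the amortized training share gets.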