First Post (but have loved playing along)
Mellow Yellow by Donovan
That’s AI? Dear me, it’s really really sexy.
She wears short shorts
Nopes
1985 by bowling for soup
Exactly…! I was worried for a bit there no one would get it 🥳
Soon as I saw it, the aesthetic reminded me of the song. Then I remembered the line, “her yellow SUV, is now the enemy” and was sure of it!
Yeah the AI grabbed that yellow reference and went wild with it.
The prompt was that entire verse,
She was gonna be an actress, she was gonna be a star, She was gonna shake her ass on the hood of Whitesnake’s car, Her yellow SUV is now the enemy, Looks at her average life and nothin’ has been alright,
What’s also funny about that is I always took it as the modern (2000s) yellow SUV being the enemy because she’s driving her kids all around, but the AI definitely heard 1985 and “yellow SUV” and tried to smash them together.
Coldplay - Yellow
Good guess, but no
Killing the environment one stupid AI image at a time. Great job
Oh no! 5 seconds of GPU time on consumer grade hardware!
It’s not nearly as small as you think it is.
Generating 1,000 images with a powerful AI model, such as Stable Diffusion XL, is responsible for roughly as much carbon dioxide as driving the equivalent of 4.1 miles in an average gasoline-powered car.
Even better when you take into account the scale that these run at:
https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/
Consider the average toaster: roughly 1100W, and toast takes 1-4 min to cook (for the purposes of this we’ll split the difference and say 2 minutes).
With math, toasting 1 slice of bread equates to roughly 0.037kWh of electricity. (kWh = (watts × hours) ÷ 1000)
Now I’m running a 7900XTX (OC) whose peak power draw is 800W (300W less than a toaster), and it legit takes 5-10 secs to generate an image. Realistically I might do a couple of runs (some small, then one big one) and use 30 secs of peak compute time. That equates to 0.0067kWh of electricity usage.
Toasting bread quite literally draws way more electricity than it takes for me to generate one AI image.
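The kWh arithmetic above can be sketched out, using the comment’s own figures (1100W toaster for 2 minutes vs. an 800W GPU at peak for 30 seconds); the `kwh` helper name is just for illustration:

```python
def kwh(watts: float, seconds: float) -> float:
    """kWh = (watts x hours) / 1000, with time given in seconds."""
    return watts * (seconds / 3600) / 1000

toaster = kwh(1100, 2 * 60)  # one slice of toast: ~0.0367 kWh
gpu = kwh(800, 30)           # a few image runs at peak draw: ~0.0067 kWh

print(f"toast: {toaster:.4f} kWh")
print(f"image: {gpu:.4f} kWh")
print(f"ratio: {toaster / gpu:.1f}x")  # toast uses several times more energy
```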
So are you out there hassling people cooking their morning toast for their criminally high power usage?
Also some further context for you: I don’t use Stable Diffusion XL (listed in your article), as the old school 512x512 is more than enough for my needs (as demonstrated in this post^^). Your second article is paywalled (not great to share if people can’t access it), but appears to be about data center use, which as described above is not what I’m doing here.
edit, spelling
I know exactly how much goes into it, 5 seconds of GPU time, on my own computer. That’s why I said it. How many phone charges do you think it would take to fully create a digital drawing on a laptop? It’s not going to be much different IME.
You aren’t taking into account the resources required to train the model. You clearly have very little idea how much goes into it other than running software someone else wrote.
We aren’t training models here, this isn’t the training-ai-with-massive-datacentres@lemmy
Of course I’ve taken into account model training costs — was that supposed to be your gotcha? You don’t actually think the amortized energy cost from training still accounts for the bulk of energy expended per image generated, do you?