Another big argument is the large resource and environmental cost of AI. I’d rather laugh at a shitty Photoshop or MS Paint meme (like this one) than a funny image created in some water-hogging, energy-guzzling server warehouse.
You’re conflating LLMs with other AI models; LLMs are orders of magnitude more energy-demanding than most other AI. It’s easy to see why if you’ve ever looked at self-hosting AI: you need a cluster of top-of-the-line enterprise GPUs to run modern LLMs, while an image generator can run at home on most consumer RTX 3000- or 4000-series Nvidia GPUs. Generating an image draws about as much power as playing a modern video game, and only while it’s actually generating.
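To put rough numbers on that comparison, here's a back-of-envelope sketch. All figures are assumptions for illustration (a ~350 W draw is typical of a consumer RTX 4000-series card under load, and a few seconds per image is a rough ballpark for local diffusion models), not measurements:

```python
# Back-of-envelope: local image generation vs. an hour of gaming.
# All numbers below are rough assumptions, not benchmarks.
GPU_POWER_WATTS = 350      # assumed full-load draw of a consumer GPU
SECONDS_PER_IMAGE = 5      # assumed time to generate one image locally
GAMING_HOURS = 1           # comparison baseline at the same power draw

wh_per_image = GPU_POWER_WATTS * SECONDS_PER_IMAGE / 3600
wh_per_gaming_hour = GPU_POWER_WATTS * GAMING_HOURS
images_per_gaming_hour = wh_per_gaming_hour / wh_per_image

print(f"~{wh_per_image:.2f} Wh per image")
print(f"~{images_per_gaming_hour:.0f} images = one hour of gaming")
# → ~0.49 Wh per image; ~720 images = one hour of gaming
```

Under these assumptions, one hour of gaming costs about as much energy as generating several hundred images on the same card, since the GPU only runs at full tilt for the few seconds each image takes.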