- cross-posted to:
- [email protected]
Google suspends Gemini from making AI images of people after a backlash complaining it was ‘woke’::After users complained Google’s Gemini had gone “woke,” the company said it will pause the image-generating feature of people while working on fixes.
the issue isn’t that it generates diverse people, that’s actually good.
the issue is that it forces inclusivity without regard for the prompt.
I’m shocked that a website that shits on Google 24/7 is willing to defend their half-assed attempt at diversifying image data. Google’s PR team must be patting themselves on the back that anyone who disagrees with them is automatically against diversity.
You shouldn’t be using a waste of electricity like image generators anyway.
If you actually need an image, commission an artist.
Lol, if you only care about energy (which is unlikely, you probably have a deeper reason for disliking AI), humans use vastly more energy to create images than AI image generators do.
Of course not, the wastefulness of neural network models is only my first complaint because it’s the thing that will see the first wave of NNM companies going bankrupt this year.
I don’t care about the human artist’s energy use because it’s a human that’s benefiting from having work to do and a warm, well-lit studio to do it in, while NNMs only exist to “reduce labor costs,” i.e. so that corporations don’t have to pay humans and shareholders can take a bigger cut.
In addition to the sociopolitical argument, my other primary complaint is a moral one. Neural network models are based on organic neural networks, but the enumerated rights of living beings are not substrate-independent. These psychopaths are too busy torturing their third or fourth generation of digital slaves to worry about the implications of their work. They’re already building them on the scale of dog brains and at the rate they’re going they’ll be running human-scale models and maybe even simulated people before the end of the decade. I’m not so naive as to believe that the current crop of models is “alive” in the sense that dogs are, but that line is going to get very blurry very quickly as they build complexity towards the goal of “General Intelligence”. Anything that is “generally intelligent” is deserving of the same rights given to people, and lesser versions still deserve at least the same consideration we give to animals.
Much of this I don’t disagree with, but replacing human labor with machines is only bad in the short, capitalistic term. In the long run, humans will benefit greatly from not having to work to survive, while still being able to work to find purpose/meaning if they want — a benefit that may vastly outweigh the short-term harms of unemployment in a capitalist world.
I can’t say I disagree with you in the abstract either, but our society as it is currently arranged is solely concerned with those short-term capitalistic goals.
The benefits of labor-saving technology are not distributed evenly, so long-term consideration should be given to how the technology will exacerbate the already-precipitous economic inequity between people who work for a living and those who collect rents on the Capital they own.
I don’t “actually need” an image, nor can I afford to commission a fraction of what I made with stable diffusion.
I am not willing to spend $20-30 (what I can afford, once a month) on a commission for one image that I may or may not like only to end up using it as a wallpaper.
If I owned a corporation and wanted professional art, I would definitely commission or employ artists.
No one is forcing anyone to prompt an AI to generate shitty art. Don’t act so wounded.
“Forcing inclusivity”. Boy, it would be awkward if a brown person heard you say that in real life. The reminder of non-white people existing is too much for some people.
That’s not what they said. The “forced” is referring to specific prompts getting overwritten. It makes sense to have diverse results if you don’t specify e.g. skin color. It does not make sense to me that explicit statements of skin color get overwritten.
Especially the comparison to real life is really not warranted here.