Grok didn’t do it, it just enabled people to do it.
Most AI platforms allow sexualized content to varying degrees. Google, Instagram, TikTok, etc. all host CSAM, and always have. The understanding has been that they're not liable as long as they remove it when it's reported. Their detection technology is good enough to handle most of it automatically, but it's never perfect. They keep track of origins and comply with subpoenas, which has gotten tons of people convicted.
Grok's image gen was put behind a paywall, which people claim makes it worse. However, most paying users lose their anonymity and can thus be appropriately dealt with when they request illicit content, even if Grok refuses the request like it usually does.
I think the Grok issue is sensationalized and taken out of context of the realities of what happens online and in law enforcement.
Yeah, NGL, kind of weird that all the noise is around the machine, which can't be held accountable because it is a machine, and not around the people actually making the images.
This is like blaming the camera for recording something illegal.
Maybe they should look at the hand holding the tool if they want accountability.
But I can't even get Grok to make a picture of a three-boobed Sydney Sweeney. Smh
We care more about this than we do about school shootings.
This is a fucking nightmare. Twitter has effectively become a sexual exploitation generator.
I have argued extensively with people on Lemmy about why having AI porn generated of you without your consent is deeply traumatizing and a violation of your rights and privacy. Actual entire threads of people telling me that AI deepfake porn was perfectly fine and that we were being too sensitive for not wanting our male peers and random strangers making AI deepfake porn of us. Wonder where those people are now.
I LOVE how we’re giving this CHILD PORN CREATION TOOL BILLIONS of US Tax Payer Dollars while ALSO Spending US Tax Payer Dollars on PROTECTING JEFFREY EPSTEIN and ALSO Spending US Tax Payer Dollars on ARMED MEN KIDNAPPING CHILDREN TO PLACES WE’RE NOT ALLOWED TO SEE!
Limited liability companies are “limited” in the sense that the personal liability of the company’s members is limited. The CEO can’t automatically be held personally liable for the actions of the company, for example; their underlings could have been responsible and kept the leader in the dark.
However, there’s an interesting legal doctrine called “piercing the corporate veil,” under which corporate leadership can be held personally liable for illegal actions their company took, if you can show that by any reasonable standard they knew or should have known about the illegal activity.
Anyway Elon has been elbow-deep in the inner workings of Xitter for years now, by his own admission, right? Really getting in there to tinker and build new stuff, like Grok and its image generation tools. Seems like he knows an awful lot about how that works. An awful lot.
Even if Grok shut down completely, it wouldn’t mean anything. Pandora’s box is open and AI-generated porn is here to stay. There are soooooooooooooooo many websites that exist just to generate deepfake nudes and AI porn. You take down one, another 100 pop up. It’s a futile game of whack-a-mole.
Even if we passed laws banning this shit, the technology that enables it to be a thing is free, open source, and can very easily be modified to do precisely this. Anybody can run these models locally at any time, and nobody can do a thing about it. Basically what I’m trying to say is that we’re cooked.
I mean, yeah, but it’s one thing if some perv is running it on their own box after reading 5 guides vs. elon musk having a twitter bot that does it for you, without even trying to stop it after knowing it’s doing it. the former is unavoidable, the latter is a choice he’s made for some fucking reason
I agree.
It’s fine if people want to get mad about it, but it’s more effective to just learn to live with it because it’s not going away.
Time to start creating and spreading nudes of all these billionaires’ wives.
Weren’t they already picking out their wives from public nudes?
it was just one guy who was like
for num in range(3000000):
    if num < 23000:
        Grok.draw_sexualized_image(minor=True)
        continue
    Grok.draw_sexualized_image(minor=False)

I think this code could be made more efficient






