I find it interesting that there are loads of people who made campaigning against trans men being in women's spaces, and how it impacts women, a core part of their identity, who have gone completely silent about Grok being used to undress and brutalise women.
in reply to Kevin Beaumont

If you're wondering about xAI's stance on this: aside from Elon posting a crying-with-laughter emoji, Grok's creators raised a further $20bn yesterday from Cisco and others.
reuters.com/business/musks-xai…

in reply to Kevin Beaumont

If you're wondering what consequences X has faced: none. At all. A few months ago, when Grok called itself MechaHitler, the service was shut down entirely the same day and stayed off for days. When this happened to women, Elon laughed.

Grok is still outputting non-consensual deepfake pornography and sexual abuse material at a rate of 1 post per second. Example search:

from:@grok filter:media

Direct link:
x.com/search?q=from%3A%40grok%…

I hope Cisco enjoy directly funding sexual abuse material.

in reply to Kevin Beaumont

It'll only be a matter of time until Meta actually considers this a feature. Absurd to think this is legitimately the worst feature I've ever seen in my life.

Grok is unleashing an apocalypse that will demoralize everyone who keeps using X to post their drawings, and it'll plague everything else posted there too. Which means we're going to say goodbye to the art and photos we used to love.

Anyway, X wasn't actually designed to post portfolios.

in reply to Kevin Beaumont

Wired has a look at videos created on Grok. Includes knives being inserted into vaginas, “very young” people having sex. Around 10% are CSAM and still online. xAI declined to comment.

wired.com/story/grok-is-genera…

in reply to Kevin Beaumont

The Internet Watch Foundation (IWF) says its analysts have discovered "criminal imagery" of girls aged between 11 and 13 which "appears to have been created" using Grok. bbc.co.uk/news/articles/cvg1mz…
in reply to Kevin Beaumont

The Information Commissioner's Office has put out a statement on LinkedIn raising serious concerns about xAI and Grok. linkedin.com/posts/information…
in reply to Kevin Beaumont

Bloomberg reports Twitter has now become the number one app for deepfake undressing of women, by a sizeable margin. bloomberg.com/news/articles/20…
in reply to Kevin Beaumont

The UK PM has asked the communications regulator for options around banning Twitter from operating in the UK, due to their refusal to deal with the non-consensual undressing of women and girls. telegraph.co.uk/business/2026/…
in reply to Kevin Beaumont

This is still happening btw, at roughly one per second. People are doing it to female UK MPs and cabinet members, and getting Grok to CC them with the images.
in reply to Kevin Beaumont

Maybe if someone created nudes of male leaders, with little tiny genitalia, there might be more effort made to stop this.
in reply to Kevin Beaumont

It looks like the threat of X being banned in the UK worked - Grok has been limited to paid users. Take your crying with laughter emojis and stick them up your arse, Elon. bbc.co.uk/news/articles/c99kn5…
in reply to Kevin Beaumont

The UK government says the move by X to limit Grok to paid users is “insulting” and basically monetises abuse, and that they would support a ban of X in the UK if recommended by the regulator. They’ve asked the regulator to provide recommendations within days. bbc.co.uk/news/articles/c99kn5…
in reply to Kevin Beaumont

Ugh, Grok is so gross. I had a Senior Manager at my current job, who is in charge of our AI, tell me Grok was A-OK to use within our company, even though everything about it and X goes against our company's policies.

I now assume he is just another incel tech bro who only cares about himself and hates others because he is a small-minded man.

I plan to report him to HR when I get back to the office on Monday. 😀

in reply to Kevin Beaumont

Some Grok users, mostly men, began to demand to see bruising on the bodies of the women, and for blood to be added to the images. Requests to show women tied up and gagged were instantly granted.

‘Add blood, forced smile’

theguardian.com/news/ng-intera…

in reply to Kevin Beaumont

The more this goes on, the more speechless I am. It makes me want Twitter to disappear completely.
in reply to Kevin Beaumont

I asked Cisco why it is directly funding an AI tool being used to non-consensually undress and brutalise women and children, having invested in xAI this week. They replied: no comment.

I have a list of other cybersecurity providers invested in xAI. I am working my way through those and plan to feature the key staff members involved in a write-up.

in reply to Kevin Beaumont

Before we get to the staff members at cyber companies, the Financial Times has the staff at X and xAI. ft.com/content/ad94db4c-95a0-4…

in reply to Kevin Beaumont

One feature shit enough to change the whole site completely...
in reply to Kevin Beaumont

OFCOM have opened a formal investigation into X.

They may fine X 10% of its global revenue, require all advertisers to withdraw from X, and require UK internet providers to block X. ofcom.org.uk/online-safety/ill…

in reply to Kevin Beaumont

The UK government is to bring a law into force this week which will make it a criminal offence to create non-consensual intimate images, and to introduce a new law making tools for creating non-consensual intimate images illegal. bbc.co.uk/news/articles/cq845g…

in reply to Kevin Beaumont

A new YouGov survey shows that the British public overwhelmingly believe AI companies should not be allowed to generate such imagery. Fully 96% of Britons say that firms should not be allowed to generate ‘undressed’ images of children (only 1% say they should), with 87% saying the same regarding such images of adults (5% think this is ok). yougov.co.uk/technology/articl…
in reply to Kevin Beaumont

California investigates Grok over AI deepfakes bbc.co.uk/news/articles/cpwnql…

in reply to Kevin Beaumont

X has finally climbed down over Grok generating non-consensual undressing images, after many of the major regulators got involved.

I’d strongly recommend countries have robust laws and regulation in place for GenAI being used to brutalise women and minorities, as ultimately the whole episode shows it’s only external pressure that will hold companies accountable, not self-regulation and common sense. X basically tried to monetise incels harassing people.

in reply to GhostOnTheHalfShell

@GhostOnTheHalfShell "in those jurisdictions where it's illegal" 🙄. So their servers in the US will still be hosting CSAM.
in reply to AI6YR Ben

@ai6yr
X must be shut down. It would be really interesting if, you know, around 5 o’clock in the morning when it’s still really dark, Apple stores were to have some wonderful street art added in front of them.

Pedo App Store

in reply to Kevin Beaumont

And don't forget to prosecute them for providing that service for the time that it did. Aiding and abetting the creation and dispersal of CSAM is illegal pretty much everywhere. Take them to court for it. Take the decision-makers in the company to court. Throw the book at them. Hard.
Maybe then they'll learn to at least follow the law. Being decent human beings is sadly beyond them.
in reply to Kevin Beaumont

EU opens formal probe into X over Grok deepfakes.
They will be looking at what controls xAI had.

Spoiler: you could reply to any woman on X and ask Grok to remove her clothes, so not many controls.
ft.com/content/f5ed0160-7098-4…

in reply to Kevin Beaumont

The UK’s Information Commissioner’s Office has opened another investigation into X, citing serious concerns about the company’s behaviour around AI. bloomberg.com/news/articles/20…
in reply to Kevin Beaumont

Reuters reports Grok is still generating non-consensual sexualised images of people - even when told the subjects don’t consent. reuters.com/business/despite-n…

in reply to Kevin Beaumont

UNCyber and Europol are investigating X for production of child sexual abuse material. europol.europa.eu/media-press/…

in reply to Kevin Beaumont

This Washington Post story has sparked a whole bunch of regulatory interest in xAI and X, unsurprisingly: washingtonpost.com/technology/…

Unpaywalled: archive.ph/i9z7w

I imagine X may see some corporate credit card transactions for shredders soon.

in reply to Kevin Beaumont

So it *was* a marketing campaign.

I don't see how this helps at all.

in reply to myrmepropagandist

@futurebird Limiting Grok to paid users (here in the UK) probably won't save Grok in the UK, though. Possession of CSAM is a strict liability offense, as is distributing it. Doesn't matter whether it's a paid account or free, it's child pornography (usually a max 3 year prison sentence, but if aggravated can pull 10 years). Generating it is ALSO a criminal offense.
in reply to Charlie Stross

@cstross @futurebird How is the LLC giving these people a fig leaf for overtly criminal, morally reprehensible actions?
in reply to Kofi Loves Efia

@Seruko @cstross @futurebird It isn't. Call me cynical, but the only reasons X is still standing are:

1) CSAM is not its primary purpose
2) It's large and has lots of money
3) Well-known figures are on it

If it was a small server hosted in the UK it would already have been taken down.

It doesn't provide a fig leaf, it's just that the law has not (yet) caught up with end users using it to generate illegal content.

I'd also note that if CSAM can be located the first action is typically to seize all the computers.

Expect Phase 1 - if X doesn't fix this, it'll be added to the Great British Firewall which currently blocks CSAM and a few proscribed groups. I don't see the government backing down.

Phase 2 - police investigation. Anyone that can be identified, *especially* those stupid enough to try creating unwanted images of MPs, will be prosecuted.

Phase 3 - shine a light on people using VPNs to get around geoblocking or age verification for entirely legitimate content. Great... 🙁

in reply to Pete / Syllopsium

@syllopsium @Seruko @cstross
Is this true for “fictional” CSAM? Or are there some loopholes?

I hate arguing about these definitions. We all know a photo of your kid playing at the beach is fine— a drawing of the same is fine— but I expect these people to try to hide behind such technicalities and distinctions.

Never mind that we are talking about NOT your kid. And you never asked for it.

in reply to myrmepropagandist

@futurebird there's a new British law about deepfakes but it's not in force yet bbc.co.uk/news/articles/cp9jxv…

in reply to myrmepropagandist

@futurebird @syllopsium @Seruko Fictional CSAM is already illegal in the UK. There have been multiple prosecutions. (There's no US first amendment style right to free speech here.)
in reply to Charlie Stross

@cstross

Could US law cover the *distribution* of such material?

Right now we have a popular “top ten” phone app with this garbage. Worse than if it were broadcast on public airwaves or put up on a poster in the public square.

This is a worst case scenario from bad internet debates about porn, gore and obscenity laws come to life.

And I feel like the worst creeps I’ve ever known are whispering “actually it’s called ephebophilia” as if that were a serious argument.

in reply to myrmepropagandist

@futurebird @cstross The simple grim fact of the matter is that the free speech absolutist position that a government able to ban speech will use this power to suppress its political opponents isn't wrong, and that the position that it is a necessary function of the civil power to suppress CSAM, never mind for-profit, mass production CSAM, isn't wrong, either.

It's a basic systems theory thing that if you get this kind of unresolvable deadlock, you're looking at the wrong scale.

in reply to Graydon

@futurebird @cstross Leaving aside the "just what are we doing wrong as a society that there is such a market" and the "computers aren't real" cultural lag, it's a choice; either there are things the civil power is obliged to suppress to permit life, liberty, and the pursuit of happiness to be possible, or that's all supposed to be handled by socially mediated means at individual scales.

Since we permit massive corporate entities, the latter is equivalent to enslaving the population.

in reply to Graydon

@futurebird @cstross It may well all come down to how corporations are defined, which was a specific project that didn't go through the legislature much; it's a lot like someone recreating an aristocracy through something that looks enormously like how charter land (that is, gifts of land in perpetuity to religious foundations since you couldn't give a temporary gift to an eternal god) created private property a thousand years and more ago.

tl;dr get rid of the special status of corporations.

in reply to myrmepropagandist

@cstross

I think it is important that in the past Grok did not generate such content, but Mr. Musk decided on and implemented changes to make this kind of content possible. Implicitly this is either an attempt to challenge the law, or based on a deep belief that the law should not apply to him.

Which reminds me of the Epstein files, and all of those child marriage cults.

Every time we fail to hold the powerful accountable they increase the abuse.

in reply to lemgandi

@lemgandi @cstross

At some point the conservative needs to stop playing coy and come out and just say who is a person and who isn’t to them.

They don’t want to do this because they don’t even agree.

We are being asked “is a not-poor lesbian white lady a person or not?”

And that is the debate.

in reply to myrmepropagandist

@futurebird @lemgandi Yep. We're ALSO being asked to believe amazing bullshit like "pregnant women are not people" (they lose rights to bodily autonomy because magic sky daddy said ectopic foetus is more important than life) at the same time as "corporations are people".

Both these propositions are, I repeat, bullshit on stilts.

in reply to myrmepropagandist

@futurebird @lemgandi @cstross No, no, the uncertainty is part of the attack.

You never know what to do to avoid the abuse, so you oppress yourself.

See also the spotty enforcement of speed limits, etc.

in reply to myrmepropagandist

@lemgandi @cstross

Do I need to say “of course she is” —Do I need to say this isn’t how you treat a person?

I’m thinking of so many people who are gone forever without a whisper. I’m looking at every brash and angry conservative voice who is trying justice shooting a person for not being meek and wondering why they don’t realize they are vulnerable too.

Do we keep the gay republicans? the brown ones? what about the ones who can’t ignore mr. epstein?

in reply to lemgandi

@lemgandi @cstross

I thought this response was about my other post — though they are related.

in reply to myrmepropagandist

@futurebird @syllopsium @Seruko @cstross In the UK since 1995 it’s an unequivocal yes:

(7) “ Pseudo-photograph ” means an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph.
(8) If the impression conveyed by a pseudo-photograph is that the person shown is a child, the pseudo-photograph shall be treated for all purposes of this Act as showing a child and so shall a pseudo-photograph where the predominant impression conveyed is that the person shown is a child notwithstanding that some of the physical characteristics shown are those of an adult.
in reply to Kevin Beaumont

It’s fascinating that payment processors and app stores happily bullied Tumblr over female-presenting nipples and have kicked adult game creators off of Steam, but have been completely silent on CSAM and misogyny generated on X.
