It took Discord so long to make a decision like this.
I use Discord every day, and I know a lot of teenagers use it. I think a decision like this can help keep those teenagers safe on the internet.
When Discord introduced that AI face scanning last year, I thought it was a serious problem for the security of our data, but if, as they say, they delete all of the data from the ID estimation... why not?
People are pretty skeptical that Discord will actually delete it (they've already had a data breach that exposed a lot of undeleted face scans and ID pictures), or that the partner organization that does the validation won't just keep a copy instead.
Also, I don't really think this solves the grooming issue - it stops teens from going into certain channels or getting pictures from non-friends, but an adult who wants to get past that to reach kids probably will. If anything, you'd really want "teen only" servers with verification going the other way. Oddly, I've never seen that proposed.
It’s hard to trust a company when they’ve already demonstrated that they [can’t be trusted](https://arstechnica.com/tech-policy/2026/02/discord-faces-ba...). And I’m sure there will be an argument of “but this will make it more likely that they’ll be careful enough to avoid a repeat”, which will likely be true for a while. Then cost-cutting will offload it to a third party that promises not to keep the data, or a new feature will come along that needs the photos, or a misconfiguration will happen where things that should be deleted aren’t, and we’ll be in the same boat again.