"ICE officials have told us that an apparent biometric match by Mobile Fortify is a ‘definitive’ determination of a person’s status and that an ICE officer may ignore evidence of American citizenship—including a birth certificate—if the app says the person is an alien."
Every person involved in this decision needs to be hit over the head with a copy of Kashmir Hill's "Your Face Belongs to Us" until they come to their senses.
404media.co/ice-and-cbp-agents…
ICE and CBP Agents Are Scanning Peoples’ Faces on the Street To Verify Citizenship
“You don’t got no ID?” a Border Patrol agent in a baseball cap, sunglasses, and neck gaiter asks a kid on a bike. The officer and three others had just stopped the two young men on their bikes during the day in what a video documenting the incident says is Chicago. One of the boys is filming the encounter on his phone. He says in the video he was born here, meaning he would be an American citizen.
When the boy says he doesn’t have ID on him, the Border Patrol officer has an alternative. He calls over to one of the other officers, “can you do facial?” The second officer then approaches the boy, gets him to turn around to face the sun, and points his own phone camera directly at him, hovering it over the boy’s face for a couple seconds. The officer then looks at his phone’s screen and asks for the boy to verify his name. The video stops.
Do you have any more videos of ICE or CBP using facial recognition? Do you work at those agencies or know more about Mobile Fortify? I would love to hear from you. Using a non-work device, you can message me securely on Signal at joseph.404 or send me an email at joseph@404media.co.

Videos on social media show officers from ICE and CBP using facial recognition technology on people in the field. One expert described the practice as “pure dystopian creep.”
Joseph Cox (404 Media)
Poloniousmonk, in reply to Blake Coverett:
You're giving them too much credit. It's pure malice. The computer system is an accountability sink so they can disappear citizens. "Don't blame me, I just did what the computer told me." And considering that 95 out of 100 Americans are employed as organic input devices for a machine master, it'll fly with a lot of people.
Nuclear Oatmeal, in reply to evacide:
This will definitely be effective and will end well.
/s
Hot Dog Water, in reply to evacide:
Do not question the app.
The app is all things.
The app defines us.
The app is our salvation.
All praise the perfection of the app.
Nothing comes from within, the app will fill you and complete you.
Trust the app.
The app is your faith.
Yeah, no. Phones and tech aren't drugs at all. Nope.
davecb 🇨🇦, in reply to evacide:
in reply to evacide • • •Once upon a time, a customer wanted to use facial recognition to scan people traveling through a large airport to see if they were in a long list of wanted criminals.
With about an 80% success rate, they expected that they’d have about 10% false positives, who would have to be interviewed, and about 10% false negatives, who would make it through the airport.
However, that’s assuming we did exactly one comparison of a single traveler against one wanted criminal. That’s not what was happening.
Let’s assume they had 1,000 wanted criminals (they had more). For each single traveler, there would be 1,000 comparisons, each of which had an 80% chance of being right and a 20% chance of being wrong. 20% of 1,000 is 200 failures per person: roughly 100 false positives and 100 false negatives. So each traveler needed to be interviewed to be sure they weren’t the criminals the program claimed they were.
Now feed 1,000 travelers through that system, roughly an hour’s normal traveler volume. You’re doing 1,000,000 comparisons, and generating on the order of 100,000 false positives to chase down every hour. That’s bad.
The fact that you’re doing many more comparisons is the same effect behind the “birthday paradox”, described at en.wikipedia.org/wiki/Birthday…. That’s where there is a fifty-fifty chance that some two people at a party of 23 share a birthday, because you’re implicitly doing 23 × 22 / 2 = 253 pairwise comparisons — not as bad as the airport example, but the same mechanism.
The number of comparisons is the number of travelers per hour, times the number of criminals.
When the system identified someone’s grandmother as a male member of the Baader-Meinhof gang, the experiment was abruptly ended.
Once they found out what was happening, they asked what would happen with only a 2% error rate. The answer: 1,000 × 2% = 20 failures, or about ten false positives per person.
The moral of the story is, if you want facial recognition to work reasonably, you need to arrange to do exactly one comparison, and also have a low error rate.
It’s fine for seeing if my face matches the picture on my passport, but little else.
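The back-of-envelope arithmetic in the comment above can be sketched in a few lines of Python. The numbers (1,000-entry watchlist, 10% and 1% false-positive rates) are the comment’s illustrative figures, not real system specs:

```python
def expected_false_positives(watchlist_size: int, fp_rate: float) -> float:
    """Expected false alarms when one traveler is compared against
    every entry on a watchlist (1-vs-N matching)."""
    return watchlist_size * fp_rate

# An "80% success rate" read as a 10% false-positive rate per comparison:
per_traveler = expected_false_positives(1_000, 0.10)   # 100 false alarms per traveler
per_hour = 1_000 * per_traveler                        # 100,000 in an hour of traffic

# Even a much better 2% error rate (~1% false positives) still flags
# about ten innocent people for every traveler screened:
improved = expected_false_positives(1_000, 0.01)       # 10

# Birthday-paradox analogue: 23 partygoers means 253 pairwise comparisons.
pairs = 23 * 22 // 2                                   # 253
```

The only regime where the numbers stay sane is exactly one comparison per person — 1-vs-1 verification, like matching a face against a passport photo — which is the comment’s conclusion.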
They Might Be Stupid, in reply to evacide:
in reply to evacide • • •waiting for someone to hack their network and ensure a picture of major conservative leaders' faces appear as needing immediate deportation to... anywhere else.
Apropos of nothing in particular, I'm reminded of the CIA manual for sabotage and instructions for insider threats to cause maximum disruption while receiving minimal blame.
We don't need AI. AI needs us.