
Apple Delays iPhone Update That Scans for Child Abuse Images After Mistaking Your Dick Pics for a Child’s

Apple has delayed plans to scan iPhones for collections of child abuse images after backlash from customers and privacy advocates, but mostly because your dick pics kept getting mistaken for a child’s, the tech giant said Friday.

“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them,” a company statement said. “However, the AI simply cannot get around how small your penis is. We tried to tweak the technology by lowering the parameters, but unfortunately that renders the entire system useless, because most underage boys’ penises are bigger than yours.”

The news comes as a blow to child safety advocates, but Apple insists it tried everything to meet the rollout deadline. “Every time you send a picture holding your penis, our AI thinks that a grown man is holding the genitalia of an innocent boy. We thought we might be able to salvage some of the program to at least scan for underage girls, but that wasn’t very effective due to the system mistaking your penis for a girl’s clitoris.”

Members of the tech community shared their feelings online. “I’m very happy to hear that Apple is delaying their CSAM technology rollout. It is half-baked, full of dangerous vulnerabilities, and breaks encryption for everyone,” tweeted Evan Bockwit, a professor in the Department of Industrial Engineering at Tel-Aviv University who specializes in privacy issues, and also has a tiny penis. “Child safety deserves a solution which is more thoughtful and more impactful.”

“We hope that in the future, technology will become powerful enough to overcome these microscopic hurdles,” Apple’s statement concluded. “Or maybe you’ll just have a late growth spurt.”