Apple announced new features Thursday that will scan iPhone and iPad users’ photos to detect and report large collections of child sexual abuse images stored on its cloud servers.
“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” Apple said in a statement.
“This program is ambitious, and protecting children is an important responsibility,” it said. “Our efforts will evolve and expand over time.”
Bravo!
And consider that many of these users will not have 2FA enabled on their accounts because they don't know anything about security. We will see accounts being hacked in the coming years due to this idea.
https://wp.me/p1tOZP-EW