Microsoft will get new accounts it may not want!

Apple announced new features Thursday that will scan iPhone and iPad users’ photos to detect and report large collections of child sexual abuse images stored on its cloud servers.

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” Apple said in a statement.

“This program is ambitious, and protecting children is an important responsibility,” it said. “Our efforts will evolve and expand over time.”

Bravo!