Apple has begun scanning photos uploaded to iCloud for child abuse imagery, reports The Telegraph.
Speaking at the Consumer Electronics Show in Las Vegas, Jane Horvath, Apple’s Chief Privacy Officer, revealed that the company is now automatically screening images backed up to iCloud to look for illegal photos.
A statement on the company’s legal website offers some more details…
Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.
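Apple has not published technical details of this system, but the statement's comparison to spam filters suggests matching each upload's "electronic signature" against a database of signatures of known illegal images. A minimal sketch of that idea might look like the following; the function names and the use of SHA-256 are illustrative assumptions (production systems typically use perceptual hashes such as PhotoDNA, which survive resizing and re-encoding, rather than plain cryptographic hashes):

```python
import hashlib

# Hypothetical database of signatures of known illegal images
# (assumption for illustration: in practice such databases are
# maintained by child-safety organizations, and the signatures
# are perceptual hashes, not SHA-256 digests).
KNOWN_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def signature(image_bytes: bytes) -> str:
    """Compute an electronic signature for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes) -> bool:
    """Return True if the image matches a known signature.

    Per Apple's statement, a match is not acted on automatically:
    each one is validated by individual human review before the
    account is disabled.
    """
    return signature(image_bytes) in KNOWN_SIGNATURES
```

Note that exact-hash matching of this kind only catches byte-identical copies, which is why real deployments rely on perceptual hashing instead.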
As a reminder, if you use Apple's iCloud services, your data is not end-to-end encrypted: Apple holds the encryption keys to your photos, contacts, calendars, bookmarks, mail, notes, voice memos, health data, call history, and files. Even your iMessage conversations are accessible to Apple if iCloud Backup is enabled, because the backup includes a copy of the messages.