APPLE DESIGNS CHILD SEXUAL ABUSE MATERIAL (CSAM) IMAGE DETECTION

The system is structured such that the user does not need to rely on Apple

Image credit: Bamyx Technologies

This week, Apple released additional design and security details about its upcoming feature that detects child sexual abuse material (CSAM) images stored in iCloud Photos. Privacy groups such as the Electronic Frontier Foundation (EFF) argue that flagging CSAM photos effectively narrows the meaning of end-to-end encryption to allow client-side access, which in their view amounts to Apple building a backdoor into its data storage.

Although Apple has explained in detail how its technical implementation will protect privacy and security, the EFF maintains that even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor.

Apple’s newly released document explains that the communication safety feature in Messages is available only to child accounts set up in Family Sharing, and the parent or guardian must opt in. The Messages app runs an on-device machine-learning classifier that triggers a warning if it detects sexually explicit images being sent to or from the account. If the account belongs to a child under 13, Messages can also notify the parent or guardian. Only notifications are sent to parents; the images themselves are never forwarded, Apple added.
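As a rough sketch of that policy, the snippet below encodes the opt-in and age rules in plain Python. The account fields, function name, and classifier flag are illustrative assumptions, not Apple's actual API:

```python
# Illustrative sketch of the Messages notification policy (not Apple's API).
from dataclasses import dataclass


@dataclass
class ChildAccount:
    in_family_sharing: bool   # feature only applies to child accounts in Family Sharing
    parent_opted_in: bool     # the parent or guardian must opt in
    age: int


def handle_flagged_image(account: ChildAccount, classifier_flagged: bool) -> dict:
    """Decide which on-device notifications fire; nothing is sent to Apple."""
    if not (account.in_family_sharing and account.parent_opted_in and classifier_flagged):
        return {"warn_child": False, "notify_parent": False}
    return {
        "warn_child": True,                 # the child always sees the warning
        "notify_parent": account.age < 13,  # parents notified only for under-13 accounts
    }
```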

The company stated that this feature shares no information with Apple: it does not disclose users' conversations, the child's actions, or the notifications sent to parents. It does not compare images against any database, such as a database of CSAM material, and it produces no report for Apple or law enforcement.

The separate CSAM-detection feature, according to Apple, is designed to recognize collections of known CSAM images uploaded to iCloud Photos. It starts by running code on the device that compares each photo being uploaded against the on-device database of known CSAM image hashes.
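As a loose illustration of that on-device comparison, the sketch below stands in an ordinary SHA-256 digest for Apple's perceptual NeuralHash and omits the blinding and private set intersection the real protocol uses; only the shape of the check is shown:

```python
# Toy matching step: hash the outgoing photo and test membership in the
# on-device set of known hashes. A real perceptual hash tolerates small
# image edits; SHA-256 here is purely a stand-in.
import hashlib


def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_database(image_bytes: bytes, known_hashes: set[str]) -> bool:
    return image_hash(image_bytes) in known_hashes
```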

“If and only if that user’s iCloud Photos account exceeds a particular number of matches, called the match threshold, can the iCloud Photos servers decrypt the safety vouchers corresponding to positive matches,” Apple added.
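The match threshold rests on threshold cryptography: the safety vouchers stay undecryptable until enough matches accumulate. Apple's documentation describes a threshold secret-sharing scheme for this; the toy Shamir-style sketch below conveys the idea (a key split into shares, recoverable only once the threshold is met) but is not Apple's actual construction:

```python
# Minimal Shamir-style threshold secret sharing over a prime field.
# One share per matching photo; the key protecting the vouchers can only
# be reconstructed once at least `threshold` shares are available.
import secrets

PRIME = 2**127 - 1  # large Mersenne prime, fine for a demo


def split_secret(secret: int, threshold: int, num_shares: int):
    """Evaluate a random degree-(threshold-1) polynomial with constant term
    `secret` at x = 1..num_shares; each point is one share."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, num_shares + 1):
        y = sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares


def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


key = 0xC0FFEE  # stand-in for the key protecting the safety vouchers
shares = split_secret(key, threshold=3, num_shares=10)
assert reconstruct(shares[:3]) == key   # threshold met: key recovered
assert reconstruct(shares[:2]) != key   # below threshold: recovery fails
                                        # (except with negligible probability)
```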

Once that threshold is exceeded, the matching images are surfaced to a human reviewer; if the reviewer confirms the material, the information is handed over to the National Center for Missing and Exploited Children (NCMEC), which in turn notifies law enforcement.

Apple stated: “The system is structured such that the user does not need to rely on Apple, any other single entity, or any conspiracy within the same sovereign jurisdiction to be assured that the system functions as depicted.”

According to Apple, “the CSAM device database was produced by combining information from two separate child-safety agencies. Any perceptual hashes found in only one participating child-safety organization’s database, or only in databases from multiple agencies in a single sovereign jurisdiction, are discarded and not included in the encrypted CSAM database that Apple ships in the operating system.” This approach, Apple says, satisfies its criterion of source-image correctness.
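The intersection rule reduces to a simple set condition: a perceptual hash is shipped only if it appears in databases from agencies operating in at least two different sovereign jurisdictions. The sketch below, with made-up agency names and data, illustrates that rule:

```python
# Keep only hashes vouched for by agencies in at least two jurisdictions.
def build_device_database(agency_hashes: dict[str, set[str]],
                          agency_jurisdiction: dict[str, str]) -> set[str]:
    included = set()
    for h in set().union(*agency_hashes.values()):
        jurisdictions = {agency_jurisdiction[a]
                         for a, hashes in agency_hashes.items() if h in hashes}
        if len(jurisdictions) >= 2:
            included.add(h)
    return included


# Hypothetical example: only "h2" appears in databases from two jurisdictions.
hashes = {"AgencyA": {"h1", "h2"}, "AgencyB": {"h2", "h3"}}
jurisdictions = {"AgencyA": "US", "AgencyB": "EU"}
assert build_device_database(hashes, jurisdictions) == {"h2"}
```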

In addition, the company explained that the database is never shared or updated over the internet. There is no way to update the database remotely, and no way for Apple to distribute a different CSAM database to targeted users, since every user worldwide receives the same signed operating system. This meets the company’s criteria of database update transparency and database universality.

Apple will also publish a knowledge base article containing the root hash of the encrypted database shipped with each iOS release, so that independent third parties can audit what actually ships on devices. Whether these details will convince critics of the system remains unclear.
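Conceptually, such a third-party audit would amount to recomputing a root hash over the database entries extracted from an iOS image and comparing it with the published value. The Merkle-tree construction below is only an illustration of that idea; Apple's exact database format and hashing scheme are not reproduced here:

```python
# Illustrative Merkle root over database entries; an auditor would compare
# the computed root against the value Apple publishes for that iOS release.
import hashlib


def merkle_root(entries: list[bytes]) -> str:
    if not entries:
        raise ValueError("database has no entries")
    level = [hashlib.sha256(e).digest() for e in entries]
    while len(level) > 1:
        if len(level) % 2:              # duplicate the last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()


# Hypothetical usage:
# assert merkle_root(entries_from_device) == root_hash_from_knowledge_base_article
```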

“While Apple seeks to fight child exploitation and abuse, it has developed an infrastructure that is all too easy to repurpose for further surveillance and censorship,” the EFF responded. “The program will undermine Apple’s claim that it is unable to comply with broader demands.”

