Apple isn’t checking images viewed within the macOS Finder for CSAM content, an investigation into macOS Ventura has determined, with analysis indicating that Visual Lookup isn’t being used by Apple for that particular purpose.
In December, Apple announced it had given up on plans to scan iPhone photos uploaded to iCloud for Child Sexual Abuse Material (CSAM), following considerable backlash from critics. However, rumors apparently lingered alleging that Apple was still performing checks in macOS Ventura 13.1, prompting an investigation from a developer.
According to a January 18 blog post by Howard Oakley of Eclectic Light Co., a claim had started to circulate that Apple was automatically sending "identifiers of images" a user had browsed in Finder, doing so "without that user's consent or awareness."
The abandoned plan for CSAM scanning would have involved an on-device check of images for potential CSAM content using a hashing system. The hash of each image would then be sent off and checked against a list of hashes of known CSAM files.
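To illustrate the general idea behind that kind of hash matching, here is a minimal Python sketch. It uses SHA-256 purely as a stand-in for Apple's NeuralHash, which is a proprietary perceptual hash rather than a cryptographic digest, and the hash database shown is entirely hypothetical.

```python
import hashlib

# Minimal sketch of the matching step. SHA-256 stands in for Apple's
# proprietary NeuralHash (a perceptual hash, not a cryptographic digest).
def image_hash(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical database of hashes of known material (illustrative only).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_material(path: str) -> bool:
    # Under the abandoned plan, only a hash would leave the device,
    # never the image itself.
    return image_hash(path) in KNOWN_HASHES
```

The key point of the design was that the comparison relies on hashes of already-known files, so only a compact identifier, not the photo, would ever be transmitted.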
While scanning images and sending Apple a neural hash describing an image's characteristics could feasibly be used for CSAM scanning, Oakley's testing indicates it isn't being used in that way. Instead, it seems that Apple's Visual Lookup system, which allows macOS and iOS to identify people, objects, and text in an image, could be mistaken for performing this sort of scanning.
No evidence in tests
As part of testing, macOS 13.1 was run within a virtual machine, and the application Mints was used to scan a unified log of activities on the VM instance. On the VM, a collection of images was viewed for one minute in Finder's gallery view, with more than 40,000 log entries captured and saved.
If the system were being used for CSAM analysis, there would be repeated outgoing connections from the "mediaanalysisd" process to an Apple server for each image. Mediaanalysisd is a component used by Visual Lookup that lets Photos and other tools display information about items detected in an image, such as "cat" or the names of objects.
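Readers who want to perform a similar check don't necessarily need Mints; the built-in `log show` command reads the same unified log. The Python sketch below simply wraps that command to count recent mediaanalysisd entries, and is an informal equivalent rather than Oakley's actual methodology.

```python
import subprocess

# Query the macOS unified log for recent mediaanalysisd activity.
# This wraps the built-in 'log show' command; Oakley used the Mints
# utility, which reads the same unified log.
result = subprocess.run(
    ["log", "show", "--last", "1h",
     "--predicate", 'process == "mediaanalysisd"'],
    capture_output=True, text=True, check=True,
)

lines = [line for line in result.stdout.splitlines() if line.strip()]
print(f"{len(lines)} unified log lines mention mediaanalysisd")
```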
The logs instead showed no entries associated with mediaanalysisd at all. A further log extract was found to closely resemble Visual Lookup's behavior in macOS 12.3, indicating the system hasn't materially changed since that release.
Typically, mediaanalysisd doesn't contact Apple's servers until very late in the process, as it first requires neural hashes generated by image analysis. Once the hashes are sent off and a response is received from Apple's servers, the returned data is used to identify elements within the image for the user.
Further trials determined that there were some other attempts to send off data for analysis, but these were for enabling Live Text to function.
In his conclusion, Oakley writes that there is “no evidence that local images on a Mac have identifiers computed and uploaded to Apple’s servers when viewed in Finder windows.”
Images viewed in apps with Visual Lookup support do have neural hashes produced, which can be sent to Apple's servers for analysis. However, Oakley says attempting to harvest those neural hashes for detecting CSAM "would be doomed to failure for many reasons."
Local images opened in a QuickLook preview also undergo normal analysis for Live Text, but "that doesn't generate identifiers that could be uploaded to Apple's servers."
Furthermore, Visual Lookup can be disabled by turning off Siri Suggestions. External mediaanalysisd look-ups could also be blocked using a software firewall configured to block port 443, though "that may well disable other macOS features."
Oakley concludes the article with a warning that “alleging that a user’s actions result in controversial effects requires full demonstration of the full chain of causation. Basing claims on the inference that two events might be connected, without understanding the nature of either, is reckless if not malicious.”
CSAM still an issue
While Apple has abandoned the idea of performing on-device CSAM detection, regulators still believe Apple isn't doing enough about the problem.
In December, Australia's eSafety Commissioner attacked Apple and Google over a "clearly inadequate and inconsistent use of widely available technology to detect child abuse material and grooming."
Rather than directly scanning for existing content, which would be largely ineffective now that Apple offers fully end-to-end encrypted photo storage and backups, Apple appears to be taking a different approach: detecting nudity in photos sent over iMessage.