fbdab103|2 years ago
We are rapidly approaching that point. Apple is/was/will be enabling on-device scanning for someone's definition of naughty. It is not hard to imagine that "naughty" will soon include images of Winnie the Pooh, union formation, abortion, minority group X, what have you. Automatic notification of the authorities to follow.
Edit: To be clear, I am obviously opposed to CSAM, but on-device scanning is a privacy violation. Nobody knows what hashes trigger a flag, and they could be updated at any time without the user being aware.
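On-device matching of the kind described above is, at bottom, a lookup of content digests against a list the user cannot inspect: the device ships only opaque hashes, never the material they were derived from. A minimal sketch of why that is unauditable, using exact cryptographic hashing for simplicity (real systems such as Apple's proposed NeuralHash use perceptual hashes that survive resizing and re-encoding; all names here are illustrative):

```python
import hashlib

# Hypothetical opaque blocklist: the user only ever sees digests,
# never the content they correspond to, so the list cannot be audited
# and could silently be extended to cover anything.
FLAGGED_HASHES = {
    hashlib.sha256(b"example-flagged-content").hexdigest(),
}

def scan_file(data: bytes) -> bool:
    """Return True if the content's digest appears in the opaque list."""
    return hashlib.sha256(data).hexdigest() in FLAGGED_HASHES

print(scan_file(b"example-flagged-content"))  # True: digest matches the list
print(scan_file(b"ordinary holiday photo"))   # False: no match
```

Note that nothing in the matching code reveals *what* is flagged; updating `FLAGGED_HASHES` changes the policy with no visible change to the scanner.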
wilg|2 years ago
Running arbitrary and proprietary code without being able to review it first was always a mistake, but we crossed that bridge over twenty years ago.
flangola7|2 years ago
Every OS and chip manufacturer is working towards "secure core" architectures now. Executed code will run inside OS- and silicon-level sandboxes. Memory spaces will not only be randomized but also encrypted and authenticated through dedicated secure enclaves. Hardened IOMMU modules will negotiate bus communication. System code will be partitioned off and verified through a hardware root of trust.
Malware as we have known it will be extinct in a few years.
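The "hardware root of trust" idea above can be sketched as a chain of digest checks in which the only value trusted a priori is one burned into silicon: each verified boot stage vouches for the digest of the next. A toy illustration (stage names and the manifest layout are invented for the example; real secure boot uses signatures over firmware images, not a bare digest table):

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest used as a stand-in for a firmware measurement."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical firmware stages (stand-ins for bootloader, kernel, OS image).
bootloader = b"bootloader-v1"
kernel = b"kernel-v5"
system = b"system-image"

# Each stage carries the digest of the stage it hands off to; only the
# bootloader's digest is "burned into silicon" and trusted a priori.
manifest = {
    sha256(bootloader): sha256(kernel),
    sha256(kernel): sha256(system),
}
HARDWARE_ROOT = sha256(bootloader)

def verify_boot(chain, root, manifest):
    """Walk the chain, refusing any stage whose digest is not vouched for."""
    expected = root
    for stage in chain:
        if sha256(stage) != expected:
            return False  # tampered or unexpected stage: halt boot
        expected = manifest.get(sha256(stage))  # digest this stage vouches for
    return True

print(verify_boot([bootloader, kernel, system], HARDWARE_ROOT, manifest))       # True
print(verify_boot([bootloader, b"evil-kernel", system], HARDWARE_ROOT, manifest))  # False
```

Swapping in a tampered kernel breaks the chain at that link, which is the property the comment is describing: nothing runs unless something already trusted has vouched for it.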
saagarjha|2 years ago