(no title)
But it's concerning that they seem not to have integrated proper hashing solutions until now:
> We are proud to provide an important update on our continuous work detecting Child Sexual Abuse Material (CSAM) content, announcing today that we have launched additional CSAM hash matching efforts.
> This system allows X to hash and match media content quickly and securely,
The existing hashing tools are perfectly fit for purpose, but if the CSAM isn't already known (and it isn't, because it's either new or AI-generated), then no amount of hash matching will detect it.
Not sure why X developed something new instead of using PhotoDNA, if it all still uses the same hash databases!
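To make the hash-database point concrete, here's a minimal, hypothetical sketch of how hash-list matching works. The database contents and SHA-256 stand-in are my own placeholders; real systems use a perceptual hash like PhotoDNA against industry-maintained hash lists so that re-encoded copies still match. The logic is the same either way: a match is only possible for material that was already hashed and added to the list.

```python
import hashlib

# Hypothetical database of hashes of already-known material. In production
# this would be a shared industry hash list, and the hashes would be
# perceptual (PhotoDNA-style) rather than cryptographic; SHA-256 is used
# here only to keep the sketch dependency-free.
KNOWN_HASHES: set[str] = {
    hashlib.sha256(b"previously reported image bytes").hexdigest(),
}

def media_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash such as PhotoDNA."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    # A hit is only possible if this content (or, with perceptual hashing,
    # something visually near-identical) was hashed and added beforehand.
    return media_hash(data) in KNOWN_HASHES

# Newly produced or AI-generated material was never hashed, so the lookup
# cannot succeed no matter how good the matching pipeline is.
print(matches_known_material(b"brand new, never-before-seen image bytes"))  # False
```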