I just developed and deployed the first real-time protection for lemmy against CSAM!
cross-posted from: https://lemmy.dbzer0.com/post/4500908

> In the past months, there's been an issue in various instances where accounts would start uploading blatant CSAM to popular communities. First of all, this traumatizes anyone who sees it before the admins get to it, including the admins who have to review it in order to take it down. Second of all, even if the content is a link to an external site, lemmy still caches the thumbnail and stores it in the local pict-rs, causing headaches for the admins who have to somehow clear it out. Finally, both image posts and problematic thumbnails are federated to other lemmy instances and likewise stored in their pict-rs, so the same content ends up in their image storage too.
>
> This has caused multiple instances to take radical measures, from defederating liberally, to stopping image uploads, to shutting down entirely.
> Today I'm happy to announce that I've spent multiple days developing a tool you can plug into your instance to stop this at the source: [pictrs-safety](https://github.com/db0/pictrs-safety).
>
> Using a new feature from pict-rs 0.4.3, we can now have pict-rs call an arbitrary endpoint to validate the content of an image before uploading it. pictrs-safety builds that endpoint, using an asynchronous approach to validate such images.
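> To make the flow concrete, here's a minimal sketch of that endpoint idea. This is not the actual pictrs-safety code: it assumes pict-rs POSTs each incoming image to the endpoint and treats any non-2xx response as a rejection, while a GPU worker resolves the verdict out of band. The `/scan` path, timeout value, and queue plumbing below are illustrative assumptions.
>
> ```python
> # Hypothetical simplification of the pictrs-safety endpoint, for illustration.
> import asyncio
> import uuid
>
> from fastapi import FastAPI, Request, Response
>
> app = FastAPI()
> pending: dict[str, bytes] = {}            # images waiting for a scanning worker
> verdicts: dict[str, asyncio.Future] = {}  # futures resolved by workers
>
> @app.post("/scan")
> async def scan(request: Request) -> Response:
>     """Called by pict-rs before an image is admitted to storage."""
>     image_id = str(uuid.uuid4())
>     pending[image_id] = await request.body()
>     verdicts[image_id] = asyncio.get_running_loop().create_future()
>     try:
>         # Wait for an external GPU worker to pick the image up and report back.
>         is_csam = await asyncio.wait_for(verdicts[image_id], timeout=9.0)
>     except asyncio.TimeoutError:
>         is_csam = False  # whether to fail open or closed is a policy choice
>     finally:
>         pending.pop(image_id, None)
>         verdicts.pop(image_id, None)
>     # Any non-2xx status tells pict-rs to reject the image outright.
>     return Response(status_code=403 if is_csam else 200)
> ```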
> I had already developed [fedi-safety](https://github.com/db0/fedi-safety), which could be used to regularly go through your image storage and delete all potential CSAM. I have now extended fedi-safety to plug into pictrs-safety and scan images sent by pict-rs.
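> The worker side of that hookup looks roughly like the following sketch. It assumes a hypothetical polling API (`/api/v1/pop` and `/api/v1/report` are made up for illustration; the real wire protocol may differ): the worker pulls queued images from pictrs-safety, classifies them on its local GPU, and reports the verdict back. You can run several such workers in parallel.
>
> ```python
> # Illustrative worker loop, not the actual fedi-safety code.
> import time
>
> import requests
>
> PICTRS_SAFETY = "http://my-instance.example:8080"  # hypothetical address
>
> def classify(image_bytes: bytes) -> bool:
>     """Placeholder for the GPU-backed CSAM classifier."""
>     raise NotImplementedError
>
> def main() -> None:
>     while True:
>         # Ask pictrs-safety for the next queued image, if any.
>         job = requests.get(f"{PICTRS_SAFETY}/api/v1/pop").json()
>         if not job:
>             time.sleep(1)
>             continue
>         is_csam = classify(requests.get(job["image_url"]).content)
>         # Report back; pictrs-safety resolves the pending upload with this.
>         requests.post(
>             f"{PICTRS_SAFETY}/api/v1/report",
>             json={"id": job["id"], "is_csam": is_csam},
>         )
>
> if __name__ == "__main__":
>     main()
> ```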
> The end effect is that any images uploaded or federated into your instance will be scanned in advance, and if fedi-safety thinks they're potential CSAM, they will not be stored in your image storage at all!
>
> This covers three important vectors for abuse:
>
> * Malicious users cannot upload CSAM for trolling communities, even novel generative-AI CSAM.
> * Users cannot upload CSAM images without ever submitting a post or comment (which would make them invisible to admins). The images will be automatically rejected during upload.
> * Federated images and thumbnails of CSAM will be rejected by your pict-rs.
> Now, that said, this tool is AI-driven and thus not perfect. There will be false positives, especially around lewd images and images which contain children or child-related topics (even if not lewd). This is the bargain we have to accept to prevent the bigger problem above.
>
> By my napkin calculations, the false positive rate is below 1%, but certainly someone's innocent meme will eventually be affected. If this happens, I ask that you just move on, as we currently don't have a way to whitelist specific images. Don't try to resize or modify the images to pass the filter. It won't help you.
>
> ### For lemmy admins
> * pictrs-safety contains a [docker-compose sample](https://github.com/db0/pictrs-safety/blob/main/docker-compose.yml) you can add to your lemmy's docker-compose. You will need to put the .env in the same folder, or adjust the provided variables. (All kudos to [@Penguincoder@beehaw.org](https://beehaw.org/u/Penguincoder) for the docker support.)
> * You need to adjust your pict-rs ENVIRONMENT as well. Check the readme.
> * fedi-safety must run on a system with a GPU. The reason for this is that lemmy provides just a 10-second grace period for each upload before it times out the upload regardless of the results, and a CPU scan will not be fast enough. However, my architecture allows fedi-safety to run in a different place than pictrs-safety; I am currently running it from my desktop. In fact, if you have a lot of images to scan, you can connect multiple scanning workers to pictrs-safety, as in the worker sketch above!
> * For those who don't have access to a GPU, I am working on an NSFW-scanner which will use the [AI-Horde](https://aihorde.net) directly instead and won't require using fedi-safety at all. Stay tuned.
> ### For other fediverse software admins
>
> fedi-safety can already be used to scan your existing image storage for CSAM, so you can also protect yourself and your users, even on mastodon or firefish or whatever.
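> As a rough illustration of what such a storage sweep does (a sketch only, not the actual fedi-safety CLI; the path and classifier call are placeholders):
>
> ```python
> # Walk an image directory and flag suspect files for review or deletion.
> from pathlib import Path
>
> IMAGE_DIR = Path("/srv/pict-rs/files")  # hypothetical storage path
> EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}
>
> def classify(image_bytes: bytes) -> bool:
>     """Placeholder for the GPU-backed classifier."""
>     raise NotImplementedError
>
> for path in IMAGE_DIR.rglob("*"):
>     if path.suffix.lower() in EXTENSIONS and classify(path.read_bytes()):
>         print(f"flagged: {path}")  # delete or quarantine per your policy
> ```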
> I will try to provide real-time scanning for each software in the future as well, and PRs are welcome.
> ### Divisions by zero
>
> This tool is already active on divisions by zero. Its usage should be transparent to you, but do let me know if you notice anything wrong.
> ### Support
>
> If you appreciate the priority work that I've put into this tool, please consider supporting this and future development work on [Liberapay](https://liberapay.com/db0/).
>
> All my work is and will always be FOSS and available for all who need it most.
We were already made aware of it, but thank you, I appreciate you thinking of us and sharing it!