Keep your platform safe from NSFW content, all for free. You can easily detect whether an image is NSFW (not safe for work) just by uploading it to us; support for videos and other content types is coming in the near future. One of the platforms using nsfw.rest is file.coffee!
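To give a feel for what a scan could look like, here is a minimal sketch in Python. The endpoint path, form field name, and response shape below are my assumptions for illustration, not the documented nsfw.rest API:

    import requests

    # Hypothetical endpoint -- check the real docs for the actual URL.
    API_URL = "https://nsfw.rest/scan"

    # Upload an image as multipart form data; "image" is an assumed field name.
    with open("upload.jpg", "rb") as f:
        resp = requests.post(API_URL, files={"image": f})
    resp.raise_for_status()

    # Assumed response shape, e.g. {"nsfw": true, "confidence": 0.97}
    result = resp.json()
    if result.get("nsfw"):
        print(f"Flagged as NSFW (confidence: {result.get('confidence')})")
    else:
        print("Image looks safe")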
@ricky_han1 I'm planning on having some sort of rate limiting in the platform, but I'd rather have people scan a lot of files than not scan any at all (to help them keep their platforms safe and sound). I'm also planning on making the system scale automatically (up to a limit, of course) to avoid it getting overloaded.
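For anyone curious what "some sort of rate limiting" could mean in practice, here is a minimal token-bucket sketch; the capacity and refill rate are made-up numbers for illustration, not nsfw.rest's actual policy:

    import time

    class TokenBucket:
        """Allow bursts up to `capacity`, refilling `rate` tokens per second."""
        def __init__(self, capacity: int, rate: float):
            self.capacity = capacity
            self.rate = rate
            self.tokens = float(capacity)
            self.last = time.monotonic()

        def allow(self) -> bool:
            now = time.monotonic()
            # Refill proportionally to elapsed time, capped at capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False

    # Hypothetical limits: bursts of 10 scans, refilling 2 per second.
    bucket = TokenBucket(capacity=10, rate=2.0)
    if bucket.allow():
        print("scan request accepted")
    else:
        print("rate limited -- retry later")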
@matt_lindsay Ah! That's a tough question. My focus is the detection of NSFW-related content, and CSAM is of course a part of that. Since it's illegal to possess that content, I can't train an AI to recognize it (for now). Maybe combining NSFW detection with some sort of age detection could pinpoint the type of content, though. I'll definitely look into that :)