react-native-nsfw
Provides a function to detect NSFW images on iOS using Core ML.
NSFWDetector is a small (17 kB) Core ML model that scans images for nudity. It was trained using CreateML to distinguish between porn/nudity and appropriate pictures, with the main focus on telling Instagram-style model pictures apart from porn.
This package is a React Native wrapper around NSFWDetector, which I built to learn how to work with Expo's Sweet API. I would highly appreciate contributions for Android and Web (there are already some very good JS libraries). It should even be possible to implement this for the camera.
Expo
Even though this project uses expo-modules-core, it does not work with Expo Go, since it adds native code and is neither part of the Expo ecosystem nor related to it.
Installation in React Native projects / Expo Custom Dev Client
You must ensure that you have installed and configured the expo package before continuing.
This is expected to take about five minutes. The footprint is very small, and the modules benefit from JSI under the hood. If you're already using Expo with a custom dev client (good choice), you can skip this step.
Add the package to your npm dependencies
yarn add react-native-nsfw
Configure for iOS
Run npx pod-install after installing the npm package.
Configure for Android
Not supported yet. A no-op implementation is currently missing, but it will be added soon.
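Until that no-op lands, calls from shared code will fail on Android. Below is a minimal guard sketch in TypeScript; the detectNSFWSafe helper is hypothetical and simply assumes the detectAsync API documented in the next section.
import { Platform } from "react-native";
import * as NSFWDetector from "react-native-nsfw";

// Hypothetical helper: treat images as safe on platforms without native support.
// Assumes detectAsync resolves to { isNSFW, confidence } as documented below.
export async function detectNSFWSafe(uri: string, threshold: number = 0.9) {
  if (Platform.OS !== "ios") {
    return { isNSFW: false, confidence: 0 };
  }
  return NSFWDetector.detectAsync(uri, threshold);
}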
API documentation
import * as NSFWDetector from "react-native-nsfw";
const { isNSFW, confidence } = await NSFWDetector.detectAsync(uri, threshold);
Supports all image types that expo-image-loader can resolve (which is basically everything you need).
| Arguments | Description |
| ------------------- | ------------------------------------------------------------------------------------------------------------------------ |
| uri (string) | URI of the file to classify. Should be on the local file system or a base64 data URI. Remote files are not supported yet. |
| threshold? (number) | Confidence at which an image should be classified as NSFW. Range between 0.0 and 1.0. Optional; defaults to 0.9. |
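A minimal usage sketch in TypeScript; the file URI is a placeholder, and the 0.7 threshold is only an example of a stricter setting than the 0.9 default.
import * as NSFWDetector from "react-native-nsfw";

// Placeholder local file URI; remote files are not supported yet.
const localUri = "file:///path/to/image.jpg";

async function checkImage(): Promise<void> {
  // A threshold of 0.7 flags borderline images earlier than the 0.9 default.
  const { isNSFW, confidence } = await NSFWDetector.detectAsync(localUri, 0.7);
  if (isNSFW) {
    console.log(`Flagged as NSFW with confidence ${confidence.toFixed(2)}`);
  } else {
    console.log(`Looks fine with confidence ${confidence.toFixed(2)}`);
  }
}

checkImage();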
License
MIT