# react-face-liveness-detection

A React component for liveness detection using MediaPipe Face Mesh. Built with TypeScript.
## Installation

```bash
npm install react-face-liveness-detection
```
## Usage

```tsx
import React from "react";
import { LivenessDetection } from "react-face-liveness-detection";

const App = () => {
  const handleVerification = () => {
    console.log("User verified!");
    // Perform actions after successful verification (e.g., redirect, submit data)
  };

  const handleError = (err: any) => {
    console.error("Liveness detection error:", err);
    // Handle errors (e.g., display an error message)
  };

  return (
    <div>
      <LivenessDetection
        onVerified={handleVerification}
        onError={handleError}
        instructions={[
          { text: "Open your mouth wide", type: "mouth", duration: 3 },
          { text: "Blink twice", type: "blink", count: 2 },
          { text: "Smile broadly", type: "smile", duration: 2 },
        ]}
        style={{
          container: {}, // Add any custom styles here
          alert: { backgroundColor: "lightblue", color: "darkblue" },
          canvas: { border: "2px solid green" },
        }}
        thresholds={{
          // Adjust thresholds if needed
          EAR_THRESHOLD: 0.1,
          MAR_THRESHOLD: 0.15,
          SMILE_THRESHOLD: 3.0,
        }}
      />
    </div>
  );
};

export default App;
```
## Props

| Prop | Type | Description | Default |
| --- | --- | --- | --- |
| `instructions` | `Instruction[]` | Array of instruction objects. See "Instruction Object Structure". | See code |
| `onVerified` | `() => void` | Callback function called when the user is verified. | |
| `onError` | `(error: any) => void` | Callback function called when an error occurs. | |
| `cameraWidth` | `number` | Width of the camera stream. | `640` |
| `cameraHeight` | `number` | Height of the camera stream. | `480` |
| `canvasWidth` | `number` | Width of the canvas element. | `400` |
| `canvasHeight` | `number` | Height of the canvas element. | `576` |
| `thresholds` | `Thresholds` | Object containing the EAR, MAR, and SMILE thresholds. | See code |
| `randomizeInstructions` | `boolean` | Whether to randomize the order of instructions. | `true` |
| `style` | `Style` | Object to customize styles. See "Styling". | `{}` |
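If the built-in defaults are acceptable, only the callbacks need to be wired up. A minimal sketch, assuming the component falls back to its default instructions, thresholds, and dimensions when those props are omitted (as the defaults in the table suggest):

```tsx
import React from "react";
import { LivenessDetection } from "react-face-liveness-detection";

// Minimal sketch: only the callbacks are provided; the remaining props are
// assumed to fall back to the defaults listed in the table above.
const MinimalLivenessCheck = () => (
  <LivenessDetection
    onVerified={() => console.log("Liveness check passed")}
    onError={(error: any) => console.error("Liveness check failed:", error)}
  />
);

export default MinimalLivenessCheck;
```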
## Instruction Object Structure

```ts
type Instruction = {
  text: string; // Text to display to the user
  type: "mouth" | "blink" | "smile"; // Type of action
  duration?: number; // Duration in seconds (for 'mouth' and 'smile')
  count?: number; // Number of times to perform the action (for 'blink')
};
```
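For example, a sequence that asks for two blinks followed by a three-second smile could be declared as below. This assumes the package exports the `Instruction` type; if it does not, declare the same shape locally:

```ts
import type { Instruction } from "react-face-liveness-detection"; // assumed export; otherwise copy the type above

// 'blink' steps use `count`; 'mouth' and 'smile' steps use `duration` (seconds).
const customInstructions: Instruction[] = [
  { text: "Blink twice", type: "blink", count: 2 },
  { text: "Smile for three seconds", type: "smile", duration: 3 },
];

export default customInstructions;
```

Pass the array as `instructions={customInstructions}` on `<LivenessDetection />`.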
## Styling

You can customize the styles using the `style` prop:

```ts
type Style = {
  container?: React.CSSProperties; // Styles for the main container
  alert?: React.CSSProperties; // Styles for the alert box
  canvas?: React.CSSProperties; // Styles for the canvas element
  loadingIndicator?: React.CSSProperties; // Styles for the loading indicator
};
```
## Thresholds

```ts
type Thresholds = {
  EAR_THRESHOLD: number;
  MAR_THRESHOLD: number;
  SMILE_THRESHOLD: number;
};
```

You might need to adjust these thresholds based on lighting conditions, camera quality, and individual facial features.
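For example, you could try a slightly different set of thresholds for a dim-lighting setup. The numbers below are starting points to experiment with relative to the values in the Usage example, not recommended settings, and whether raising or lowering a value makes a check stricter depends on the library's internal comparisons:

```tsx
import React from "react";
import { LivenessDetection } from "react-face-liveness-detection";

// Illustrative tuning pass: adjust one threshold at a time and re-test,
// starting from the values shown in the Usage example above.
const DimLightLivenessCheck = () => (
  <LivenessDetection
    onVerified={() => console.log("Verified")}
    onError={(error: any) => console.error("Liveness error:", error)}
    thresholds={{
      EAR_THRESHOLD: 0.08,
      MAR_THRESHOLD: 0.2,
      SMILE_THRESHOLD: 2.5,
    }}
  />
);

export default DimLightLivenessCheck;
```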
## Demo

The simplest way to create a demo is to start a new React project with Create React App (or a similar tool). Because the usage example is written in TypeScript, use the TypeScript template:

```bash
npx create-react-app my-liveness-demo --template typescript
cd my-liveness-demo
npm install react-face-liveness-detection
```

Then replace the contents of `src/App.tsx` with the usage example above and start the demo app:

```bash
npm start
```