@biopassid/face-sdk-react-native
Key Features • Customizations • Quick Start Guide • Prerequisites • Installation • How to use • LicenseKey • FaceConfig • Something else • Changelog • Support
Increase the security of your applications without harming your user experience.
All customizations available:
Enable or disable components:
Change all colors:
Check out our official documentation for more in-depth information on BioPass ID.
| Android | iOS |
---|---|---
Support | SDK 24+ | iOS 15+ |
- A device with a camera
- License key
- Internet connection is required to verify the license
```bash
npm install @biopassid/face-sdk-react-native
```
Change the minimum Android SDK version to 24 (or higher) in your android/app/build.gradle file:

```gradle
minSdkVersion 24
```
Requires iOS 15.0 or higher.

Add the Privacy - Camera Usage Description key (NSCameraUsageDescription) and a usage description to your ios/Info.plist. If editing Info.plist as text, add:

```xml
<key>NSCameraUsageDescription</key>
<string>Your camera usage description</string>
```
Then go into your project's ios folder and run pod install:

```bash
# Go into ios folder
cd ios

# Install dependencies
pod install
```
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>NSPrivacyCollectedDataTypes</key>
  <array>
    <dict>
      <key>NSPrivacyCollectedDataType</key>
      <string>NSPrivacyCollectedDataTypeOtherUserContent</string>
      <key>NSPrivacyCollectedDataTypeLinked</key>
      <false/>
      <key>NSPrivacyCollectedDataTypeTracking</key>
      <false/>
      <key>NSPrivacyCollectedDataTypePurposes</key>
      <array>
        <string>NSPrivacyCollectedDataTypePurposeAppFunctionality</string>
      </array>
    </dict>
    <dict>
      <key>NSPrivacyCollectedDataType</key>
      <string>NSPrivacyCollectedDataTypeDeviceID</string>
      <key>NSPrivacyCollectedDataTypeLinked</key>
      <false/>
      <key>NSPrivacyCollectedDataTypeTracking</key>
      <false/>
      <key>NSPrivacyCollectedDataTypePurposes</key>
      <array>
        <string>NSPrivacyCollectedDataTypePurposeAppFunctionality</string>
      </array>
    </dict>
  </array>
  <key>NSPrivacyTracking</key>
  <false/>
  <key>NSPrivacyAccessedAPITypes</key>
  <array>
    <dict>
      <key>NSPrivacyAccessedAPITypeReasons</key>
      <array>
        <string>CA92.1</string>
      </array>
      <key>NSPrivacyAccessedAPIType</key>
      <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
    </dict>
  </array>
</dict>
</plist>
```
You can use the library in Expo via a development build. You will need the expo-build-properties plugin:
```bash
npx expo install expo-build-properties
```
Add the configuration to your app.json file:
```json
{
  "plugins": [
    [
      "expo-build-properties",
      {
        "android": {
          "extraMavenRepos": [
            {
              "url": "https://packagecloud.io/biopassid/FaceSDKAndroid/maven2"
            },
            {
              "url": "https://packagecloud.io/biopassid/dlibwrapper/maven2"
            }
          ]
        }
      }
    ]
  ]
}
```
Calling @biopassid/face-sdk-react-native in your React Native project is as easy as this:
```tsx
import React from 'react';
import {StyleSheet, View, Button} from 'react-native';
import {useFace} from '@biopassid/face-sdk-react-native';

export default function App() {
  const {takeFace} = useFace();

  async function onPress() {
    await takeFace({
      config: {
        licenseKey: 'your-license-key',
        // If you want to use Liveness, uncomment this line
        // liveness: {enabled: true},
      },
      onFaceCapture: (
        image,
        faceAttributes /* Only available on Liveness */,
      ) => {
        console.log('onFaceCapture:', image.substring(0, 20));
        // Only available on Liveness
        console.log('onFaceCapture:', faceAttributes);
      },
      // Only available on Liveness
      onFaceDetected: faceAttributes => {
        console.log('onFaceDetected:', faceAttributes);
      },
    });
  }

  return (
    <View style={styles.container}>
      <Button onPress={onPress} title="Capture Face" />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'center',
    justifyContent: 'center',
    backgroundColor: '#FFFFFF',
  },
});
```
In this example, we use facial detection in liveness mode to perform an automatic facial capture and send the image to the BioPass ID API, using the Liveness operation from the Multibiometrics plan.
Here, you will need an API key to make requests to the BioPass ID API. To get your API key, contact us through our website, BioPass ID.
```tsx
import React from 'react';
import {StyleSheet, View, Button} from 'react-native';
import {useFace} from '@biopassid/face-sdk-react-native';

export default function App() {
  const {takeFace} = useFace();

  async function onPress() {
    await takeFace({
      config: {
        licenseKey: 'your-license-key',
        liveness: {enabled: true},
      },
      onFaceCapture: async (image, faceAttributes) => {
        // Create headers passing your api key
        const headers = {
          'Content-Type': 'application/json',
          'Ocp-Apim-Subscription-Key': 'your-api-key',
        };

        // Create json body
        const body = JSON.stringify({
          Spoof: {Image: image},
        });

        // Execute request to BioPass ID API
        const response = await fetch(
          'https://api.biopassid.com/multibiometrics/v2/liveness',
          {
            method: 'POST',
            headers,
            body,
          },
        );
        const data = await response.json();

        // Handle API response
        console.log('Response status: ', response.status);
        console.log('Response body: ', data);
      },
      onFaceDetected: faceAttributes => {
        console.log('onFaceDetected:', faceAttributes);
      },
    });
  }

  return (
    <View style={styles.container}>
      <Button onPress={onPress} title="Capture Face" />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'center',
    justifyContent: 'center',
    backgroundColor: '#FFFFFF',
  },
});
```
```tsx
import React, {useEffect, useState} from 'react';
import {
  StyleSheet,
  View,
  ScrollView,
  Text,
  TouchableOpacity,
  Image,
} from 'react-native';
import {
  FaceConfig,
  FaceExtract,
  useFace,
  useRecognition,
} from '@biopassid/face-sdk-react-native';

export default function App() {
  const [imageA, setImageA] = useState<string | null>();
  const [imageB, setImageB] = useState<string | null>();
  const [faceExtractA, setFaceExtractA] = useState<FaceExtract | null>();
  const [faceExtractB, setFaceExtractB] = useState<FaceExtract | null>();
  const {takeFace} = useFace();
  const {closeRecognition, extract, startRecognition, verify} =
    useRecognition();

  const config: FaceConfig = {licenseKey: 'your-license-key'};

  useEffect(() => {
    startRecognition('your-license-key');

    return () => closeRecognition();
  }, [closeRecognition, startRecognition]);

  async function extractA() {
    await takeFace({
      config: config,
      onFaceCapture: async (image, faceAttributes) => {
        setImageA(image);
        const result = await extract(image);
        setFaceExtractA(result);
        console.log('ExtractA:', result);
      },
    });
  }

  async function extractB() {
    await takeFace({
      config: config,
      onFaceCapture: async (image, faceAttributes) => {
        setImageB(image);
        const result = await extract(image);
        setFaceExtractB(result);
        console.log('ExtractB:', result);
      },
    });
  }

  async function verifyTemplates() {
    if (faceExtractA && faceExtractB) {
      const faceVerify = await verify(
        faceExtractA.template,
        faceExtractB.template,
      );
      console.log('Verify:', faceVerify);
    }
  }

  return (
    <ScrollView style={styles.container}>
      <Text style={styles.title}>Face Recognition</Text>
      {imageA && (
        <View style={styles.imageContainer}>
          <Image
            style={styles.image}
            source={{uri: `data:image/png;base64,${imageA}`}}
          />
        </View>
      )}
      <TouchableOpacity style={styles.button} onPress={extractA}>
        <Text style={styles.buttonText}>Extract A</Text>
      </TouchableOpacity>
      {imageB && (
        <View style={styles.imageContainer}>
          <Image
            style={styles.image}
            source={{uri: `data:image/png;base64,${imageB}`}}
          />
        </View>
      )}
      <TouchableOpacity style={styles.button} onPress={extractB}>
        <Text style={styles.buttonText}>Extract B</Text>
      </TouchableOpacity>
      <TouchableOpacity style={styles.button} onPress={verifyTemplates}>
        <Text style={styles.buttonText}>Verify Templates</Text>
      </TouchableOpacity>
    </ScrollView>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    paddingHorizontal: 30,
    backgroundColor: '#FFF',
  },
  title: {
    marginTop: 54,
    marginBottom: 15,
    color: '#000',
    fontSize: 32,
    fontWeight: 'bold',
    textAlign: 'center',
  },
  imageContainer: {
    marginTop: 12,
    alignItems: 'center',
    justifyContent: 'center',
  },
  image: {
    width: 150,
    height: 150,
    resizeMode: 'contain',
    marginBottom: 20,
  },
  button: {
    marginBottom: 15,
    alignItems: 'center',
    backgroundColor: '#D6A262',
    padding: 10,
  },
  buttonText: {
    fontSize: 25,
    fontWeight: 'bold',
    color: '#FFF',
  },
});
```
In this example we use react-native-sqlite-storage to save templates locally. First, add the react-native-sqlite-storage package:
```bash
npm install react-native-sqlite-storage
```
Because we're using TypeScript, we also install @types/react-native-sqlite-storage for the included types. If you stick to JavaScript, you don't need this library.
Define the Template data model:
```tsx
export interface Template {
  id: number;
  image: string;
  template: string;
}
```
Create the DBHelper to handle DB operations:
```tsx
import {
  enablePromise,
  openDatabase,
  SQLiteDatabase,
} from 'react-native-sqlite-storage';

import {Template} from './Template';

const tableName = 'templates';

enablePromise(true);

export async function getDBConnection() {
  return openDatabase({name: 'template.db', location: 'default'});
}

export async function createTable(db: SQLiteDatabase) {
  const query = `CREATE TABLE IF NOT EXISTS ${tableName} (id INTEGER PRIMARY KEY AUTOINCREMENT, image TEXT, template TEXT)`;
  await db.executeSql(query);
}

export async function getTemplates(db: SQLiteDatabase): Promise<Template[]> {
  try {
    const templates: Template[] = [];
    const results = await db.executeSql(`SELECT * FROM ${tableName}`);
    results.forEach(result => {
      for (let index = 0; index < result.rows.length; index++) {
        templates.push(result.rows.item(index));
      }
    });
    return templates;
  } catch (error) {
    console.error(error);
    throw Error('Failed to get templates');
  }
}

export async function saveTemplate(
  db: SQLiteDatabase,
  image: string,
  template: string,
) {
  const insertQuery = `INSERT INTO ${tableName} (image, template) VALUES (?, ?)`;
  return db.executeSql(insertQuery, [image, template]);
}

export async function deleteTemplate(db: SQLiteDatabase, id: number) {
  // Use a bound parameter rather than string interpolation
  const deleteQuery = `DELETE FROM ${tableName} WHERE id = ?`;
  await db.executeSql(deleteQuery, [id]);
}
```
In your App.tsx:
```tsx
import React, {useCallback, useEffect, useState} from 'react';
import {
  StyleSheet,
  View,
  ScrollView,
  Text,
  TouchableOpacity,
  Image,
  Button,
  Alert,
} from 'react-native';
import {
  FaceConfig,
  useFace,
  useRecognition,
} from '@biopassid/face-sdk-react-native';

import {Template} from './Template';
import {
  createTable,
  deleteTemplate,
  getDBConnection,
  getTemplates,
  saveTemplate,
} from './DBHelper';

export default function App() {
  const [templates, setTemplates] = useState<Template[]>([]);
  const {takeFace} = useFace();
  const {closeRecognition, extract, startRecognition, verify} =
    useRecognition();

  const config: FaceConfig = {licenseKey: 'your-license-key'};

  const loadDataCallback = useCallback(async () => {
    try {
      const storedTemplates = await getAll();
      setTemplates(storedTemplates);
    } catch (error) {
      console.error(error);
    }
  }, []);

  useEffect(() => {
    loadDataCallback();
    startRecognition('your-license-key');

    return () => closeRecognition();
  }, [closeRecognition, loadDataCallback, startRecognition]);

  async function getAll() {
    const db = await getDBConnection();
    await createTable(db);
    return await getTemplates(db);
  }

  async function extractTemplate() {
    await takeFace({
      config: config,
      onFaceCapture: async (image, faceAttributes) => {
        try {
          const faceExtract = await extract(image);

          if (faceExtract) {
            const db = await getDBConnection();
            await saveTemplate(db, image, faceExtract.template);
            loadDataCallback();
          }
        } catch (error) {
          console.error(error);
        }
      },
    });
  }

  async function verifyTemplate() {
    await takeFace({
      config: config,
      onFaceCapture: async (image, faceAttributes) => {
        try {
          const faceExtract = await extract(image);

          if (faceExtract) {
            const storedTemplates = await getAll();

            const filteredTemplates: Template[] = [];

            for (const template of storedTemplates) {
              const faceVerify = await verify(
                faceExtract.template,
                template.template,
              );
              if (faceVerify?.isGenuine) {
                filteredTemplates.push(template);
              }
            }

            if (filteredTemplates.length) {
              setTemplates(filteredTemplates);
            } else {
              Alert.alert(
                'Template not found',
                'No template was found compatible with the captured image.',
                [{text: 'OK', onPress: () => {}}],
              );
            }
          }
        } catch (error) {
          console.error(error);
        }
      },
    });
  }

  async function removeTemplate(template: Template) {
    try {
      const db = await getDBConnection();
      await deleteTemplate(db, template.id);
      loadDataCallback();
    } catch (error) {
      console.error(error);
    }
  }

  return (
    <View style={styles.container}>
      <Text style={styles.title}>Face Recognition</Text>
      <ScrollView style={styles.templateList}>
        <View>
          {templates.map(template => (
            <TouchableOpacity
              key={template.id}
              onLongPress={() => removeTemplate(template)}>
              <View style={styles.templateItem}>
                <Image
                  style={styles.image}
                  source={{uri: `data:image/png;base64,${template.image}`}}
                />
                <Text style={styles.templateItemTitle}>{template.id}</Text>
              </View>
            </TouchableOpacity>
          ))}
        </View>
      </ScrollView>
      <View style={styles.buttonContainer}>
        <Button onPress={extractTemplate} title="Add template" />
      </View>
      <View style={styles.buttonContainer}>
        <Button onPress={verifyTemplate} title="Verify Template" />
      </View>
      <View style={styles.buttonContainer}>
        <Button onPress={loadDataCallback} title="Get All" />
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    paddingHorizontal: 30,
    backgroundColor: '#FFF',
  },
  title: {
    marginTop: 54,
    marginBottom: 15,
    color: '#000',
    fontSize: 32,
    fontWeight: 'bold',
    textAlign: 'center',
  },
  buttonContainer: {
    marginBottom: 15,
  },
  templateList: {
    marginBottom: 15,
  },
  templateItem: {
    flex: 1,
    flexDirection: 'row',
    alignItems: 'center',
    marginBottom: 15,
  },
  image: {
    height: 40,
    width: 40,
    borderRadius: 40,
  },
  templateItemTitle: {
    marginLeft: 20,
    color: '#000',
  },
});
```
To use @biopassid/face-sdk-react-native you need a license key. Setting the license key is as simple as setting another attribute:
```tsx
// Face Detection
import { useFace } from '@biopassid/face-sdk-react-native';
const { takeFace } = useFace();
await takeFace({
  config: { licenseKey: 'your-license-key' },
  onFaceCapture: (image, faceAttributes) => {},
});

// Face Recognition
import { useRecognition } from '@biopassid/face-sdk-react-native';
const { startRecognition } = useRecognition();
startRecognition('your-license-key');
```
You can set a custom callback to receive the captured image. Note: FaceAttributes will only be available if liveness mode is enabled. You can write your own callback following this example:
```tsx
import { useFace } from '@biopassid/face-sdk-react-native';

const { takeFace } = useFace();
await takeFace({
  config: { licenseKey: 'your-license-key' },
  onFaceCapture: (image, faceAttributes) => {
    console.log('onFaceCapture:', image.substring(0, 20));
    console.log('onFaceCapture:', faceAttributes);
  },
  onFaceDetected: (faceAttributes) => {
    console.log('onFaceDetected:', faceAttributes);
  },
});
```
Name | Type | Description |
---|---|---|
faceProp | number | Proportion of the area occupied by the face in the image, in percentage |
faceWidth | number | Face width, in pixels |
faceHeight | number | Face height, in pixels |
ied | number | Distance between the left eye and the right eye, in pixels |
bbox | FaceRect | Face bounding box |
rollAngle | number | The Euler angle Z of the head. Indicates the rotation of the face about the axis pointing out of the image. A positive Z Euler angle is a counter-clockwise rotation within the image plane |
pitchAngle | number | The Euler angle X of the head. Indicates the rotation of the face about the horizontal axis of the image. A positive X Euler angle is when the face is turned upward in the image that is being processed |
yawAngle | number | The Euler angle Y of the head. Indicates the rotation of the face about the vertical axis of the image. A positive Y Euler angle is when the face is turned towards the right side of the image that is being processed |
averageLightIntensity | number | The average intensity of the pixels in the image |
leftEyeOpenProbability? // Android Only | number | Probability that the face's left eye is open, in percentage |
rightEyeOpenProbability? // Android Only | number | Probability that the face's right eye is open, in percentage |
smilingProbability? // Android Only | number | Probability that the face is smiling, in percentage |
Name | Type |
---|---|
left | number |
top | number |
right | number |
bottom | number |
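Taken together, the two tables above describe shapes along these lines. This is a hedged sketch reconstructed from the tables, not the SDK's shipped typings; the optional fields are Android-only:

```typescript
// Sketch of the attribute shapes described in the tables above.
// Field names come from the tables; the SDK's actual typings may differ.
interface FaceRect {
  left: number;
  top: number;
  right: number;
  bottom: number;
}

interface FaceAttributes {
  faceProp: number;                 // % of image area occupied by the face
  faceWidth: number;                // pixels
  faceHeight: number;               // pixels
  ied: number;                      // inter-eye distance, pixels
  bbox: FaceRect;
  rollAngle: number;                // Euler angle Z
  pitchAngle: number;               // Euler angle X
  yawAngle: number;                 // Euler angle Y
  averageLightIntensity: number;
  leftEyeOpenProbability?: number;  // Android only
  rightEyeOpenProbability?: number; // Android only
  smilingProbability?: number;      // Android only
}

// Example: area of the face bounding box, in square pixels.
function bboxArea(r: FaceRect): number {
  return (r.right - r.left) * (r.bottom - r.top);
}
```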
The SDK supports two types of facial detection; their descriptions and usage are below:
A simpler facial detection, supporting face centering and proximity. This is the SDK's default detection. See FaceDetectionOptions to see what settings are available for this functionality. Note: For this functionality to work, liveness mode and continuous capture must be disabled. See below how to use:
```tsx
const config: FaceConfig = {
  licenseKey: 'your-license-key',
  liveness: { enabled: false }, // liveness mode must be disabled (default value)
  continuousCapture: { enabled: false }, // continuous capture must be disabled (default value)
  faceDetection: {
    enabled: true,
    autoCapture: true,
    multipleFacesEnabled: false,
    timeToCapture: 3000, // time in milliseconds
    maxFaceDetectionTime: 60000, // time in milliseconds
    scoreThreshold: 0.5,
  },
};
```
More accurate facial detection supporting more features beyond face centering and proximity. Ideal for those who want to have more control over facial detection. See FaceLivenessDetectionOptions to see what settings are available for this functionality. Note: This feature only works with the front camera. See below how to use:
```tsx
const config: FaceConfig = {
  licenseKey: 'your-license-key',
  liveness: {
    enabled: true,
    debug: false,
    timeToCapture: 3000, // time in milliseconds
    maxFaceDetectionTime: 60000, // time in milliseconds
    minFaceProp: 0.1,
    maxFaceProp: 0.4,
    minFaceWidth: 150,
    minFaceHeight: 150,
    ied: 90,
    bboxPad: 20,
    faceDetectionThresh: 0.5,
    rollThresh: 4.0,
    pitchThresh: 4.0,
    yawThresh: 4.0,
    closedEyesThresh: 0.7,
    smilingThresh: 0.7,
    tooDarkThresh: 50,
    tooLightThresh: 170,
    faceCentralizationThresh: 0.05,
  },
};
```
You can use continuous capture to take multiple frames in a single session. You can set the maximum number of frames with maxNumberFrames, and the interval between captures with timeToCapture. Note: Facial detection does not work with continuous capture. Using continuous capture is as simple as setting another attribute:
```tsx
const config: FaceConfig = {
  licenseKey: 'your-license-key',
  liveness: { enabled: false }, // liveness mode must be disabled (default value)
  continuousCapture: {
    enabled: true,
    timeToCapture: 1000, // capture one frame per second; time in milliseconds
    maxNumberFrames: 40,
  },
};
```
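As a sanity check on these numbers: with a timeToCapture of 1000 ms and maxNumberFrames of 40, a continuous-capture session can run for up to 40 seconds. The helper below is ours, not part of the SDK; it just makes the arithmetic explicit:

```typescript
// Hypothetical helper (not part of the SDK): upper bound on the duration
// of a continuous-capture session, in milliseconds.
function maxCaptureDurationMs(
  timeToCapture: number,
  maxNumberFrames: number,
): number {
  return timeToCapture * maxNumberFrames;
}

maxCaptureDurationMs(1000, 40); // 40000 ms, i.e. 40 seconds
```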
Extract is a functionality that extracts a template from a given biometric image. The operation returns a FaceExtract object, which contains the resulting template as a base64 string and a status.
```tsx
import { useRecognition } from '@biopassid/face-sdk-react-native';

const { extract, startRecognition } = useRecognition();
startRecognition('your-license-key');
const faceExtract = await extract(base64Image);
```
Name | Type | Description |
---|---|---|
status | number | The resulting status |
template | string | The resulting template as a base64 string |
Verify is a functionality that compares two biometric templates. The operation returns a FaceVerify object, which contains the resulting score and whether the templates are correspondent.
```tsx
import { useRecognition } from '@biopassid/face-sdk-react-native';

const { startRecognition, verify } = useRecognition();
startRecognition('your-license-key');
const faceVerify = await verify(base64TemplateA, base64TemplateB);
```
Name | Type | Description |
---|---|---|
isGenuine | boolean | Indicates whether the informed biometric templates are correspondent or not |
score | number | Indicates the level of similarity between the two given biometrics. Its value may vary from 0 to 100 |
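Since verify is a 1:1 comparison, a 1:N search can be layered on top of it by comparing a probe template against each stored template and keeping the best genuine match. The sketch below is our own illustration, not SDK code: it abstracts the SDK behind a `verifyFn` callback (which would be the SDK's `verify` in a real app), and `FaceVerify`'s shape follows the table above:

```typescript
// Shape per the FaceVerify table above.
interface FaceVerify {
  isGenuine: boolean;
  score: number; // 0..100
}

// Hypothetical 1:N search built on a verify-like 1:1 comparison.
// In a real app, verifyFn would be the SDK's `verify`.
async function identify(
  probe: string,
  gallery: {id: number; template: string}[],
  verifyFn: (a: string, b: string) => Promise<FaceVerify>,
): Promise<{id: number; score: number} | null> {
  let best: {id: number; score: number} | null = null;
  for (const entry of gallery) {
    const result = await verifyFn(probe, entry.template);
    // Keep only genuine matches, preferring the highest score.
    if (result.isGenuine && (!best || result.score > best.score)) {
      best = {id: entry.id, score: result.score};
    }
  }
  return best;
}
```

Note the comparisons run sequentially; for large galleries you would batch or index them instead.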
You can also use pre-built configurations in your application, so you can automatically start using the services and features that best suit your needs. You can instantiate each one and use its default properties, or change any available config. These are the types supported right now:
Name | Type |
---|---|
licenseKey | string |
resolutionPreset | FaceResolutionPreset |
lensDirection | FaceCameraLensDirection |
imageFormat | FaceImageFormat |
flashEnabled | boolean |
fontFamily | string |
liveness | FaceLivenessDetectionOptions |
continuousCapture | FaceContinuousCaptureOptions |
faceDetection | FaceDetectionOptions |
mask | FaceMaskOptions |
titleText | FaceTextOptions |
loadingText | FaceTextOptions |
helpText | FaceTextOptions |
feedbackText | FaceFeedbackTextOptions |
backButton | FaceButtonOptions |
flashButton | FaceFlashButtonOptions |
switchCameraButton | FaceButtonOptions |
captureButton | FaceButtonOptions |
Default configs:
```tsx
const defaultConfig: FaceConfig = {
  licenseKey: '',
  resolutionPreset: FaceResolutionPreset.VERYHIGH,
  lensDirection: FaceCameraLensDirection.FRONT,
  imageFormat: FaceImageFormat.JPEG,
  flashEnabled: false,
  fontFamily: 'facesdk_opensans_bold',
  liveness: {
    enabled: false,
    debug: false,
    timeToCapture: 3000,
    maxFaceDetectionTime: 60000,
    minFaceProp: 0.1,
    maxFaceProp: 0.4,
    minFaceWidth: 150,
    minFaceHeight: 150,
    ied: 90,
    bboxPad: 20,
    faceDetectionThresh: 0.5,
    rollThresh: 4.0,
    pitchThresh: 4.0,
    yawThresh: 4.0,
    closedEyesThresh: 0.7,
    smilingThresh: 0.7,
    tooDarkThresh: 50,
    tooLightThresh: 170,
    faceCentralizationThresh: 0.05,
  },
  continuousCapture: {
    enabled: false,
    timeToCapture: 1000,
    maxNumberFrames: 40,
  },
  faceDetection: {
    enabled: true,
    autoCapture: true,
    multipleFacesEnabled: false,
    timeToCapture: 3000,
    maxFaceDetectionTime: 40000,
    scoreThreshold: 0.7,
  },
  mask: {
    enabled: true,
    type: FaceMaskFormat.FACE,
    backgroundColor: '#CC000000',
    frameColor: '#FFFFFF',
    frameEnabledColor: '#16AC81',
    frameErrorColor: '#E25353',
  },
  titleText: {
    enabled: true,
    content: 'Capturing Face',
    textColor: '#FFFFFF',
    textSize: 20,
  },
  loadingText: {
    enabled: true,
    content: 'Processing...',
    textColor: '#FFFFFF',
    textSize: 14,
  },
  helpText: {
    enabled: true,
    content: 'Fit your face into the shape below',
    textColor: '#FFFFFF',
    textSize: 14,
  },
  feedbackText: {
    enabled: true,
    messages: {
      noDetection: 'No faces detected',
      multipleFaces: 'Multiple faces detected',
      faceCentered: 'Face centered. Do not move',
      tooClose: 'Turn your face away',
      tooFar: 'Bring your face closer',
      tooLeft: 'Move your face to the right',
      tooRight: 'Move your face to the left',
      tooUp: 'Move your face down',
      tooDown: 'Move your face up',
      invalidIED: 'Invalid inter-eye distance',
      faceAngleMisaligned: 'Misaligned face angle',
      closedEyes: 'Open your eyes',
      smiling: 'Do not smile',
      tooDark: 'Too dark',
      tooLight: 'Too light',
    },
    textColor: '#FFFFFF',
    textSize: 14,
  },
  backButton: {
    enabled: true,
    backgroundColor: '#00000000',
    buttonPadding: 0,
    buttonSize: { width: 56, height: 56 },
    iconOptions: {
      enabled: true,
      iconFile: 'facesdk_ic_close',
      iconColor: '#FFFFFF',
      iconSize: { width: 32, height: 32 },
    },
    labelOptions: {
      enabled: false,
      content: 'Back',
      textColor: '#FFFFFF',
      textSize: 14,
    },
  },
  flashButton: {
    enabled: false,
    backgroundColor: '#FFFFFF',
    buttonPadding: 0,
    buttonSize: { width: 56, height: 56 },
    flashOnIconOptions: {
      enabled: true,
      iconFile: 'facesdk_ic_flash_on',
      iconColor: '#FFCC01',
      iconSize: { width: 32, height: 32 },
    },
    flashOnLabelOptions: {
      enabled: false,
      content: 'Flash On',
      textColor: '#323232',
      textSize: 14,
    },
    flashOffIconOptions: {
      enabled: true,
      iconFile: 'facesdk_ic_flash_off',
      iconColor: '#323232',
      iconSize: { width: 32, height: 32 },
    },
    flashOffLabelOptions: {
      enabled: false,
      content: 'Flash Off',
      textColor: '#323232',
      textSize: 14,
    },
  },
  switchCameraButton: {
    enabled: true,
    backgroundColor: '#FFFFFF',
    buttonPadding: 0,
    buttonSize: { width: 56, height: 56 },
    iconOptions: {
      enabled: true,
      iconFile: 'facesdk_ic_switch_camera',
      iconColor: '#323232',
      iconSize: { width: 32, height: 32 },
    },
    labelOptions: {
      enabled: false,
      content: 'Switch Camera',
      textColor: '#323232',
      textSize: 14,
    },
  },
  captureButton: {
    enabled: true,
    backgroundColor: '#FFFFFF',
    buttonPadding: 0,
    buttonSize: { width: 56, height: 56 },
    iconOptions: {
      enabled: true,
      iconFile: 'facesdk_ic_capture',
      iconColor: '#323232',
      iconSize: { width: 32, height: 32 },
    },
    labelOptions: {
      enabled: false,
      content: 'Capture',
      textColor: '#323232',
      textSize: 14,
    },
  },
};
```
Name | Type |
---|---|
enabled | boolean |
timeToCapture | number // time in milliseconds |
maxNumberFrames | number |
Name | Type | Description |
---|---|---|
enabled | boolean | Activates facial detection |
autoCapture | boolean | Activates automatic capture |
multipleFacesEnabled | boolean | Allows the capture of photos with two or more faces |
timeToCapture | number | Time it takes to perform an automatic capture, in milliseconds |
maxFaceDetectionTime | number | Maximum facial detection attempt time, in milliseconds |
scoreThreshold | number | Minimum trust score for a detection to be considered valid. Must be a number between 0 and 1, which 0.1 would be a lower face detection trust level and 0.9 would be a higher trust level |
Name | Type | Description |
---|---|---|
enabled | boolean | Activates facial detection |
debug | boolean | If activated, a red rectangle will be drawn around the detected faces, in addition, it will be shown in the feedback message which attribute caused an invalid face |
timeToCapture | number | Time it takes to perform an automatic capture, in milliseconds |
maxFaceDetectionTime | number | Maximum facial detection attempt time, in milliseconds |
minFaceProp | number | Minimum limit on the proportion of the area occupied by the face in the image, in percentage |
maxFaceProp | number | Maximum limit on the proportion of the area occupied by the face in the image, in percentage |
minFaceWidth | number | Minimum face width, in pixels |
minFaceHeight | number | Minimum face height, in pixels |
ied | number | Minimum distance between left eye and right eye, in pixels |
bboxPad | number | Padding the face's bounding box to the edges of the image, in pixels |
faceDetectionThresh | number | Minimum trust score for a detection to be considered valid. Must be a number between 0 and 1, which 0.1 would be a lower face detection trust level and 0.9 would be a higher trust level |
rollThresh | number | Maximum threshold for the head's roll (Euler angle Z): the rotation of the face about the axis pointing out of the image |
pitchThresh | number | Maximum threshold for the head's pitch (Euler angle X): the rotation of the face about the horizontal axis of the image |
yawThresh | number | Maximum threshold for the head's yaw (Euler angle Y): the rotation of the face about the vertical axis of the image |
closedEyesThresh // Android Only | number | Minimum probability threshold that the left eye and right eye of the face are closed, in percentage. A value less than 0.7 indicates that the eyes are likely closed |
smilingThresh // Android Only | number | Minimum threshold for the probability that the face is smiling, in percentage. A value of 0.7 or more indicates that a person is likely to be smiling |
tooDarkThresh | number | Minimum threshold for the average intensity of the pixels in the image |
tooLightThresh | number | Maximum threshold for the average intensity of the pixels in the image |
faceCentralizationThresh | number | Threshold to consider the face centered, in percentage |
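To make the angle and lighting thresholds above concrete, here is an illustrative sketch (our own helper, not SDK code) of the kind of check liveness mode performs, using the attribute and option names from the tables:

```typescript
// Illustrative only: approximates how a liveness-style check could combine
// the head-angle and lighting attributes with the thresholds above.
interface AngleAndLight {
  rollAngle: number;
  pitchAngle: number;
  yawAngle: number;
  averageLightIntensity: number;
}

interface AngleAndLightThresholds {
  rollThresh: number;
  pitchThresh: number;
  yawThresh: number;
  tooDarkThresh: number;   // minimum average pixel intensity
  tooLightThresh: number;  // maximum average pixel intensity
}

function faceLooksValid(a: AngleAndLight, t: AngleAndLightThresholds): boolean {
  return (
    Math.abs(a.rollAngle) <= t.rollThresh &&
    Math.abs(a.pitchAngle) <= t.pitchThresh &&
    Math.abs(a.yawAngle) <= t.yawThresh &&
    a.averageLightIntensity >= t.tooDarkThresh &&
    a.averageLightIntensity <= t.tooLightThresh
  );
}
```

With the default config (rollThresh/pitchThresh/yawThresh of 4.0, tooDarkThresh 50, tooLightThresh 170), a face tilted more than 4 degrees on any axis, or captured in very dark or very bright frames, would be rejected.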
Name | Type |
---|---|
enabled | boolean |
type | FaceMaskFormat |
backgroundColor | string |
frameColor | string |
frameEnabledColor | string |
frameErrorColor | string |
Name | Type |
---|---|
enabled | boolean |
messages | FaceFeedbackTextMessages |
textColor | string |
textSize | number |
Name | Type |
---|---|
noDetection | string |
multipleFaces | string |
faceCentered | string |
tooClose | string |
tooFar | string |
tooLeft | string |
tooRight | string |
tooUp | string |
tooDown | string |
invalidIED | string |
faceAngleMisaligned | string |
closedEyes // Android Only | string |
smiling // Android Only | string |
tooDark | string |
tooLight | string |
Name | Type |
---|---|
enabled | boolean |
backgroundColor | string |
buttonPadding | number |
buttonSize | FaceSize |
flashOnIconOptions | FaceIconOptions |
flashOnLabelOptions | FaceTextOptions |
flashOffIconOptions | FaceIconOptions |
flashOffLabelOptions | FaceTextOptions |
Name | Type |
---|---|
enabled | boolean |
backgroundColor | string |
buttonPadding | number |
buttonSize | FaceSize |
iconOptions | FaceIconOptions |
labelOptions | FaceTextOptions |

#### FaceIconOptions

Name | Type
---|---|
enabled | boolean |
iconFile | string |
iconColor | string |
iconSize | FaceSize |

#### FaceTextOptions

Name | Type
---|---|
enabled | boolean |
content | string |
textColor | string |
textSize | number |

Name
---|
FaceCameraLensDirection.FRONT |
FaceCameraLensDirection.BACK |

Name
---|
FaceImageFormat.JPEG |
FaceImageFormat.PNG |
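A sketch using these enums — the `cameraLensDirection` and `imageFormat` key names are assumptions here; check the full FaceConfig table for the exact names in your SDK version:

```typescript
const config: FaceConfig = {
  licenseKey: "your-license-key",
  // Key names assumed; see the FaceConfig table for your SDK version
  cameraLensDirection: FaceCameraLensDirection.FRONT,
  imageFormat: FaceImageFormat.PNG,
};
```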

Name
---|
FaceMaskFormat.FACE |
FaceMaskFormat.SQUARE |
FaceMaskFormat.ELLIPSE |
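A sketch using the mask format enum — the `mask` key name is an assumption based on the mask options table earlier in this section, and the values are illustrative:

```typescript
const config: FaceConfig = {
  licenseKey: "your-license-key",
  // "mask" is an assumed key name for the mask options table above
  mask: {
    enabled: true,
    type: FaceMaskFormat.ELLIPSE,
    frameColor: "#FFFFFF", // illustrative
  },
};
```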

#### FaceResolutionPreset

Name | Resolution
---|---|
FaceResolutionPreset.LOW | 240p (352x288 on iOS, 320x240 on Android) |
FaceResolutionPreset.MEDIUM | 480p (640x480 on iOS, 720x480 on Android) |
FaceResolutionPreset.HIGH | 720p (1280x720) |
FaceResolutionPreset.VERYHIGH | 1080p (1920x1080) |
FaceResolutionPreset.ULTRAHIGH | 2160p (3840x2160) |
FaceResolutionPreset.MAX | The highest resolution available |
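The preset heights lend themselves to a small helper. This is a hypothetical utility, not part of the SDK: it picks the smallest preset whose Android capture height meets a required minimum, using the heights from the table above.

```typescript
// Hypothetical helper (not part of the SDK): pick the smallest
// FaceResolutionPreset whose Android capture height meets a minimum,
// using the heights from the table above.
const presetHeights: Array<[string, number]> = [
  ["LOW", 240],
  ["MEDIUM", 480],
  ["HIGH", 720],
  ["VERYHIGH", 1080],
  ["ULTRAHIGH", 2160],
];

function pickPreset(minHeight: number): string {
  for (const [name, height] of presetHeights) {
    if (height >= minHeight) return name; // first preset tall enough
  }
  return "MAX"; // fall back to the highest available resolution
}
```

For example, `pickPreset(700)` selects `"HIGH"`, while a request no preset can satisfy falls back to `"MAX"`.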
You can use the default font family or set one of your own. To set a custom font, create a `font` folder under the `res` directory in `android/app/src/main/res`. Download the font you want and paste it inside the `font` folder. Font file names may only contain lowercase a-z, 0-9, and underscores. The structure should look something like the example below.
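For example, for a hypothetical font file named `roboto_mono_bold_italic.ttf`:

```
android/app/src/main/res/
└── font/
    └── roboto_mono_bold_italic.ttf
```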
Add the font files to your Xcode project and make sure they are included in your app's target.
Then, add the "Fonts provided by application" key to your app’s Info.plist file. For the key’s value, provide an array of strings containing the relative paths to any added font files.
In the following example, the font file is inside the fonts directory, so you use fonts/roboto_mono_bold_italic.ttf as the string value in the Info.plist file.
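With the font inside a `fonts` directory as described above, the resulting entry (the "Fonts provided by application" key is `UIAppFonts` when editing Info.plist as text) would look like:

```xml
<key>UIAppFonts</key>
<array>
	<string>fonts/roboto_mono_bold_italic.ttf</string>
</array>
```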
Finally, set the font by passing the font file's name when instantiating FaceConfig in your React Native app.
```typescript
const config: FaceConfig = {
  licenseKey: "your-license-key",
  fontFamily: "roboto_mono_bold_italic",
};
```
You can use the default icons or define your own. To set an icon, download the icon you want and paste it inside the `drawable` folder in `android/app/src/main/res`. Icon file names may only contain lowercase a-z, 0-9, and underscores. The structure should look something like the example below.
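For example, for a hypothetical icon file named `ic_baseline_camera.png`:

```
android/app/src/main/res/
└── drawable/
    └── ic_baseline_camera.png
```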
Add the icon files to your Xcode project and make sure they are included in your app's target.
Finally, set the icon by passing the icon file's name when instantiating FaceConfig in your React Native app.
```typescript
const config: FaceConfig = {
  licenseKey: "your-license-key",
  // Changing back button icon
  backButton: { iconOptions: { iconFile: "ic_baseline_camera" } },
  // Changing switch camera button icon
  switchCameraButton: { iconOptions: { iconFile: "ic_baseline_camera" } },
  // Changing capture button icon
  captureButton: { iconOptions: { iconFile: "ic_baseline_camera" } },
  // Changing flash button icon
  flashButton: {
    flashOnIconOptions: { iconFile: "ic_baseline_camera" },
    flashOffIconOptions: { iconFile: "ic_baseline_camera" },
  },
};
```
Do you like the Face SDK and would you like to know about our other products? We have solutions for fingerprint detection and digital signature capture.
```typescript
// Before
const config: FaceConfig = {
  licenseKey: "your-license-key",
  feedbackText: {
    messages: {
      noFaceDetectedMessage: "No faces detected",
      multipleFacesDetectedMessage: "Multiple faces detected",
      detectedFaceIsCenteredMessage: "Face centered. Do not move",
      detectedFaceIsTooCloseMessage: "Turn your face away",
      detectedFaceIsTooFarMessage: "Bring your face closer",
      detectedFaceIsOnTheLeftMessage: "Move your face to the right",
      detectedFaceIsOnTheRightMessage: "Move your face to the left",
      detectedFaceIsTooUpMessage: "Move your face down",
      detectedFaceIsTooDownMessage: "Move your face up",
    },
  },
};

// Now
const config: FaceConfig = {
  licenseKey: "your-license-key",
  feedbackText: {
    messages: {
      noDetection: "No faces detected",
      multipleFaces: "Multiple faces detected",
      faceCentered: "Face centered. Do not move",
      tooClose: "Turn your face away",
      tooFar: "Bring your face closer",
      tooLeft: "Move your face to the right",
      tooRight: "Move your face to the left",
      tooUp: "Move your face down",
      tooDown: "Move your face up",
    },
  },
};
```