- Create an assets folder in the root of your project and add the files from the Materials/Models directory to it.
- Update your `rn-cli.config.js` to bundle files with the `pb` and `txt` extensions:

  ```js
  getAssetExts() {
    return ['pb', 'txt']
  }
  ```
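  For reference, a complete config might look like the sketch below. The `module.exports` wrapper is an assumption about how the rest of your `rn-cli.config.js` is structured, so merge `getAssetExts()` into whatever the file already exports rather than replacing it wholesale.

  ```js
  // rn-cli.config.js -- sketch, assuming a plain CommonJS config module
  module.exports = {
    // Treat TensorFlow model (.pb) and label (.txt) files as assets so
    // they can be loaded with require() later in the tutorial.
    getAssetExts() {
      return ['pb', 'txt'];
    },
  };
  ```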
- Create a new `ImageRecognizer.ts` file and drop it into the root of your project directory:

  ```ts
  import { TfImageRecognition } from 'react-native-tensorflow';

  // Thin wrapper around the TensorFlow recognizer from react-native-tensorflow.
  export default class ImageRecognizer {
    recognizer: TfImageRecognition;

    constructor(options) {
      this.recognizer = new TfImageRecognition(options);
    }

    async recognize(data) {
      return await this.recognizer.recognize(data);
    }
  }
  ```
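  The library's typings are not shown in this tutorial, so the shapes below are a sketch inferred purely from the fields used here (`model`, `labels`, `image`, `inputName`, `outputName`, `name`, `confidence`); treat the interface names and optionality as assumptions. With them in place, `recognize(data: RecognizeRequest): Promise<RecognitionResult[]>` would be a reasonable signature for the wrapper above.

  ```ts
  // Hypothetical shapes inferred from how the recognizer is used in this
  // tutorial; these are not official react-native-tensorflow typings.
  interface RecognizerOptions {
    model: any;   // result of require('../../assets/model.pb')
    labels: any;  // result of require('../../assets/labels.txt')
  }

  interface RecognizeRequest {
    image: string;       // path to the captured image on disk
    inputName?: string;  // input tensor name, e.g. 'Placeholder'
    outputName?: string; // output tensor name, e.g. 'loss'
  }

  interface RecognitionResult {
    name: string;        // predicted label from labels.txt
    confidence: number;  // confidence score returned by the model
  }
  ```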
- Instantiate `ImageRecognizer` when the Welcome Screen mounts so we can process images:

  ```ts
  import ImageRecognizer from '../ImageRecognizer';

  componentDidMount() {
    this.recognizer = new ImageRecognizer({
      model: require('../../assets/model.pb'),
      labels: require('../../assets/labels.txt'),
    });
  }
  ```
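  In a TypeScript class component the `recognizer` field also needs a property declaration. The component name and the empty props/state types in the sketch below are assumptions; only the `recognizer` field and the `componentDidMount` body come from the step above.

  ```ts
  import React from 'react';
  import ImageRecognizer from '../ImageRecognizer';

  // Hypothetical component shell; the camera UI from the rest of the
  // tutorial is omitted here.
  export default class WelcomeScreen extends React.Component<{}, {}> {
    recognizer!: ImageRecognizer; // assigned once the component mounts

    componentDidMount() {
      // Load the bundled model and labels when the screen appears.
      this.recognizer = new ImageRecognizer({
        model: require('../../assets/model.pb'),
        labels: require('../../assets/labels.txt'),
      });
    }

    render() {
      return null; // replace with the camera view used in the tutorial
    }
  }
  ```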
- Update `takePicture()` to classify the captured picture using the supplied model:

  ```ts
  const results = await this.recognizer.recognize({
    image: data.path,          // file path of the captured photo
    inputName: 'Placeholder',  // input tensor name expected by the model
    outputName: 'loss',        // output tensor name to read predictions from
  });

  if (results.length > 0) {
    alert(`Name: ${results[0].name} - Confidence: ${results[0].confidence.toFixed(2)}`);
  }
  ```
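  How the photo is captured depends on the camera component used elsewhere in this tutorial, so the sketch below treats `this.camera.capture()` as a hypothetical stand-in that resolves with an object exposing a `path`; it simply wraps the recognition snippet above with basic error handling.

  ```ts
  // Sketch only: `this.camera.capture()` is a placeholder for whatever
  // capture API the camera component actually provides.
  async takePicture() {
    try {
      const data = await this.camera.capture();
      const results = await this.recognizer.recognize({
        image: data.path,
        inputName: 'Placeholder',
        outputName: 'loss',
      });
      if (results.length > 0) {
        const top = results[0];
        alert(`Name: ${top.name} - Confidence: ${top.confidence.toFixed(2)}`);
      }
    } catch (err) {
      console.warn('Could not classify the captured image', err);
    }
  }
  ```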
- Capturing a picture in the app now runs the image through the TensorFlow model.