Replies: 9 comments 1 reply
-
Hey @AZOPCORP - so I've recently solved how to do this! (I think) I made a test, and for that to work it had to run 100% in Node. This test runs a classification on the logo without accessing a browser. I hope the setup code in this test works for you. I'm sure it could be better wrapped in the actual lib, though. If you do use this lib on a backend, please contribute back any improvements 👍
-
Thanks for your awesome job! Below is a working example with tfjs-node, based on your test:

```js
const tf = require('@tensorflow/tfjs-node')
const load = require('./dist/index').load
const fs = require('fs')
const jpeg = require('jpeg-js')

// Fix for JEST: give tfjs a global fetch implementation
const globalAny = global
globalAny.fetch = require('node-fetch')

const timeoutMS = 10000 // not used below
const NUMBER_OF_CHANNELS = 3

// Decode a JPEG file into raw RGBA pixel data
const readImage = (path) => {
  const buf = fs.readFileSync(path)
  const pixels = jpeg.decode(buf, true)
  return pixels
}

// Drop the alpha channel and flatten the RGBA bytes into an Int32Array
const imageByteArray = (image, numChannels) => {
  const pixels = image.data
  const numPixels = image.width * image.height
  const values = new Int32Array(numPixels * numChannels)
  for (let i = 0; i < numPixels; i++) {
    for (let channel = 0; channel < numChannels; ++channel) {
      values[i * numChannels + channel] = pixels[i * 4 + channel]
    }
  }
  return values
}

// Build the [height, width, channels] int32 tensor the model expects
const imageToInput = (image, numChannels) => {
  const values = imageByteArray(image, numChannels)
  const outShape = [image.height, image.width, numChannels]
  const input = tf.tensor3d(values, outShape, 'int32')
  return input
}

;(async () => {
  const model = await load('file://./model/') // model moved to the root of the folder
  const logo = readImage('./_art/nsfwjs_logo.jpg')
  const input = imageToInput(logo, NUMBER_OF_CHANNELS)
  console.time('predict')
  const predictions = await model.classify(input)
  console.timeEnd('predict')
  console.log(predictions)
})()
```
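Side note: if your @tensorflow/tfjs-node version is recent enough, `tf.node.decodeImage` can replace the jpeg-js plumbing entirely. A shorter sketch under that assumption (same model layout and logo path as above):

```js
const tf = require('@tensorflow/tfjs-node')
const fs = require('fs')
const load = require('./dist/index').load

;(async () => {
  const model = await load('file://./model/')
  // decodeImage handles JPEG/PNG/BMP/GIF and returns an int32 tensor by default
  const input = tf.node.decodeImage(fs.readFileSync('./_art/nsfwjs_logo.jpg'), 3)
  const predictions = await model.classify(input)
  console.log(predictions)
})()
```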
-
Very cool! Maybe we should put an example in the demo folder?
-
Something like a way to pass some options to configure the lib to use tfjs-node or tfjs-node-gpu would be much appreciated, I think. BTW, I will make a backend demo as soon as I have the time.
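Until such an option exists, one rough way to sketch the switch is deciding at require time which backend package to pull in (the environment variable name here is just an example, not anything the lib reads):

```js
// Hypothetical flag: requiring the GPU build registers the CUDA-backed backend,
// the plain build registers the CPU one. Everything downstream uses `tf` the same way.
const tf = process.env.NSFWJS_USE_GPU === '1'
  ? require('@tensorflow/tfjs-node-gpu')
  : require('@tensorflow/tfjs-node')

module.exports = tf
```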
-
@AZOPCORP, I am getting this error: Request for file://.model/model.json failed due to error: TypeError: Only HTTP(S) protocols are supported.
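That error usually means the file:// URL never reached a filesystem handler: node-fetch (patched in as global.fetch above) only speaks HTTP(S). Requiring @tensorflow/tfjs-node before calling load should register the file:// handler, and note that the path in your error is missing a slash after the scheme. A sketch of the shape that works in the example above:

```js
// Require tfjs-node first so tf can resolve file:// model URLs itself
const tf = require('@tensorflow/tfjs-node')
const load = require('./dist/index').load

;(async () => {
  // 'file://./model/' (with './'), not 'file://.model/'
  const model = await load('file://./model/')
  console.log('model loaded', !!model)
})()
```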
-
Closing because the question moved to https://github.com/infinitered/nsfwjs/wiki/FAQ:-NSFW-JS
-
For information, if you are still wondering how to run it on Node.js, this guy's code works: https://github.com/mishazawa/nums When you do `const nsfwjs = require('nsfw/dist')`, it works because he basically forked the code to run on Node. It would be nice, and not too much work, to have this published to npmjs.org. Similarly, you can use:

```js
const toxicity = require('@tensorflow-models/toxicity')
const sentences = ['I love C++']

toxicity.load(0.9).then(model =>
  model.classify(sentences).then(predictions => ...)
)
```

What I would like to have, from a developer's point of view, is to be able to use it out of the box the same way:

```js
const image = ...

nsfwjs.load().then(model =>        // instead of nsfwjs.load('file://..../model/')
  model.classify(image).then(predictions => ...)
)
```
-
I wish he had contributed back with a Pull Request. @mycaule - would you be willing to take a shot? If not, this is something I could get around to at some point.
-
OK, I will try to do it this week; the workflow between NPM and your build might be something I can't test, though. For the fetching done on these lines (Lines 21 to 22 in 0497856), it might save you some S3 costs:

```js
const mobilenet = require('@tensorflow-models/mobilenet')

// Downloads the model from TF Hub in the background:
// https://tfhub.dev/google/imagenet/mobilenet_v1_100_224/classification/1/model.json?tfjs-format=file
mobilenet.load().then(model => model.classify(...).then(predictions => ...))
```

See #224
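In the meantime, pointing load() at another host is already just a parameter change, assuming it treats the argument as a base URL the same way it does the file:// path above. A sketch with a placeholder host:

```js
const tf = require('@tensorflow/tfjs-node')
const load = require('./dist/index').load

// Placeholder host: substitute wherever the model.json actually lives
load('https://example.com/nsfw_model/')
  .then(model => console.log('model ready', !!model))
  .catch(err => console.error(err))
```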
-
Could it be possible to port the lib for backend use via tfjs-node and node-canvas?
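For the node-canvas half of the question, here is a minimal sketch assuming the `canvas` npm package and the local model layout used in the comments above; it does the same alpha-stripping as the jpeg-js example, just sourcing pixels from a canvas:

```js
const tf = require('@tensorflow/tfjs-node')
const load = require('./dist/index').load
const { createCanvas, loadImage } = require('canvas')

// Drop the alpha channel from canvas RGBA data and build an int32 tensor
const canvasToInput = (imageData, numChannels = 3) => {
  const { data, width, height } = imageData
  const values = new Int32Array(width * height * numChannels)
  for (let i = 0; i < width * height; i++) {
    for (let c = 0; c < numChannels; c++) {
      values[i * numChannels + c] = data[i * 4 + c]
    }
  }
  return tf.tensor3d(values, [height, width, numChannels], 'int32')
}

;(async () => {
  const img = await loadImage('./_art/nsfwjs_logo.jpg')
  const canvas = createCanvas(img.width, img.height)
  const ctx = canvas.getContext('2d')
  ctx.drawImage(img, 0, 0)
  const input = canvasToInput(ctx.getImageData(0, 0, img.width, img.height))

  const model = await load('file://./model/') // assumes the same local model layout as above
  const predictions = await model.classify(input)
  console.log(predictions)
})()
```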