Add test_forward_ssd_mobilenet_v1 to tflite/test_forward #3350
Conversation
All checks have passed
@FrozenGene @kevinthesun @yongwww Could you review?
@@ -567,6 +567,24 @@ def test_forward_inception_v4_net():
    tvm.testing.assert_allclose(np.squeeze(tvm_output[0]), np.squeeze(tflite_output[0]),
                                rtol=1e-5, atol=1e-5)

#######################################################################
# SSD Mobilenet
# ---------
Better to be:
# SSD Mobilenet
# -------------
Keep the underline the same length as the title.
fixed
    """Test the SSD Mobilenet V1 TF Lite model."""
    # SSD MobilenetV1
    tflite_model_file = tf_testing.get_workload_official(
        "https://neo-ai-dlr-test-artifacts.s3-us-west-2.amazonaws.com/tflite-models/ssd_mobilenet_v1_coco_2018_01_28_nopp.tgz",
Move this to our dmlc/web-data: https://github.com/dmlc/web-data
Or download the model directly from the TF Lite community if possible.
TF provides tflite files only for quantized models, but TVM does not support quantized models yet. For non-quantized models they provide a `tflite_graph.pb` file which is compatible with the `tflite_convert` util and can be converted to a tflite file. For now I can put my converted tflite file in dmlc/web-data. I can also investigate whether the Jenkins worker has the `tflite_convert` util or the Python module `tensorflow.lite.TFLiteConverter` available. In that case I can download the tgz model file from the TensorFlow zoo, unpack it, and convert `tflite_graph.pb` to a tflite file.
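As a rough illustration of that conversion step, the `tflite_convert` invocation could be assembled like this. This is a sketch only: the input tensor name `normalized_input_image_tensor` and the 1x300x300x3 shape are assumptions typical of `export_tflite_ssd_graph.py` output, not confirmed in this thread; only the output array names appear later in this conversation.

```python
# Build the tflite_convert command line for the non-quantized SSD graph.
# The input array name and shape below are illustrative assumptions; the
# output arrays are the ones named in this PR discussion.
def build_tflite_convert_cmd(graph_pb, out_file):
    return [
        "tflite_convert",
        "--graph_def_file=%s" % graph_pb,
        "--output_file=%s" % out_file,
        # Hypothetical input tensor name/shape for the exported SSD graph
        "--input_arrays=normalized_input_image_tensor",
        "--input_shapes=1,300,300,3",
        # Stop before the custom TFLite_Detection_PostProcess op
        "--output_arrays=raw_outputs/class_predictions,raw_outputs/box_encodings",
    ]

cmd = build_tflite_convert_cmd("tflite_graph.pb", "model_nopp.tflite")
print(" ".join(cmd))
```

The resulting command would then be run on a machine with TensorFlow (and its `tflite_convert` CLI) installed.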
Actually, the official non-quantized tar.gz SSD model files do not have `tflite_graph.pb`. They only have an inference pb file and a checkpoint. `tflite_graph.pb` can be created from the checkpoint using `object_detection/export_tflite_ssd_graph.py` from the tensorflow/models repo. We do not have this repo and util on the Jenkins worker. I opened a PR to add the tflite model, packed into the tar.gz `ssd_mobilenet_v1_coco_2018_01_28_nopp.tgz`, to dmlc/web-data: dmlc/web-data#186
Force-pushed from cc0aefe to 210bb6d
    """Test the SSD Mobilenet V1 TF Lite model."""
    # SSD MobilenetV1
    tflite_model_file = tf_testing.get_workload_official(
        "https://raw.githubusercontent.com/apivovarov/web-data/dev/tensorflow/models/object_detection/ssd_mobilenet_v1_coco_2018_01_28_nopp.tgz",
Will change it to dmlc once dmlc/web-data#186 is merged.
updated
Force-pushed from 2076e54 to 6193ccc
@FrozenGene CI build/test passed. Can you merge it?
I don't have merge privilege. Ping @tqchen @srkreddy1238 to help handle it.
@FrozenGene @yongwww Can you approve the PR?
LGTM
Thanks!
This test uses the official non-quantized model `ssd_mobilenet_v1_coco` downloaded from https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md

`ssd_mobilenet_v1_coco_2018_01_28.tar.gz` contains a checkpoint. To generate the tflite file I exported the checkpoint to a `tflite_graph.pb` file using `object_detection/export_tflite_ssd_graph.py` from the `tensorflow/models` repo. Then I converted the `tflite_graph.pb` file to `TFLITE` format with the `tflite_convert` util. Because the TFLite frontend does not support custom operators, I did not include the custom operator `TFLite_Detection_PostProcess`, which is the last operator in the graph. I did that by specifying `--output_arrays` as `raw_outputs/class_predictions,raw_outputs/box_encodings`, which are the inputs to the `TFLite_Detection_PostProcess` op. Users can do post-processing outside of TVM.
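As a very rough sketch of what such post-processing outside of TVM might look like (all names and the scale factors here are illustrative assumptions based on common SSD defaults, not the actual linked post-processing code):

```python
import numpy as np

# Illustrative SSD post-processing: decode box_encodings against anchors
# and turn class_predictions logits into scores. The (10, 10, 5, 5) scale
# factors are the common SSD defaults and are an assumption here.
def decode_boxes(box_encodings, anchors, scales=(10.0, 10.0, 5.0, 5.0)):
    # Encodings are (ty, tx, th, tw); anchors are (ycenter, xcenter, h, w)
    ty, tx, th, tw = np.moveaxis(box_encodings, -1, 0)
    ya, xa, ha, wa = np.moveaxis(anchors, -1, 0)
    ycenter = ty / scales[0] * ha + ya
    xcenter = tx / scales[1] * wa + xa
    h = np.exp(th / scales[2]) * ha
    w = np.exp(tw / scales[3]) * wa
    # Return boxes as (ymin, xmin, ymax, xmax)
    return np.stack([ycenter - h / 2.0, xcenter - w / 2.0,
                     ycenter + h / 2.0, xcenter + w / 2.0], axis=-1)

def class_scores(class_predictions):
    # Sigmoid over the raw class logits
    return 1.0 / (1.0 + np.exp(-class_predictions))
```

A real pipeline would additionally apply a score threshold and non-maximum suppression over the decoded boxes, which `TFLite_Detection_PostProcess` normally does inside the graph.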
The post-processing code is here.
Graph visualization.
I uploaded the resulting `model_nopp.tflite` file to an S3 bucket belonging to the neo-ai organization.