
Adding metadata to TFLite models #5784

Closed
chainyo opened this issue Nov 25, 2021 · 28 comments
Labels
question (Further information is requested), Stale (stale and scheduled for closing soon)

Comments

@chainyo

chainyo commented Nov 25, 2021

Search before asking

Question

I have looked through many issues and the @zldrobit repo for TFLite export and inference.

However, I'm still lacking answers on how to add metadata to my TFLite model. I know how to use the model in a Python script or even with tf.js, but I need to use my converted model in MLKit (a solution by Google), and that platform requires metadata, especially output metadata, to run.

I also tried to add metadata to my model manually with the tflite_support package. Unfortunately, the default object detection template doesn't work because the YOLOv5 output is a single tensor, not the 4 tensors the template expects. I tried to customize the metadata added via this package, but nothing seems to work.

I'm still looking for a solution to add metadata to my custom TFLite model.

I also looked at this issue's answer: #5030 (comment). @zldrobit, any suggestions?

@chainyo added the question (Further information is requested) label Nov 25, 2021
@glenn-jocher
Member

@chainyo I'm not sure, I'm not a TF specialist myself, but on the surface this appears to be a TFLite issue rather than a YOLOv5 issue. Perhaps raise this on the TF repo or the TF forums, or even stackoverflow.

Naturally this would be nice to have here too, as TFLite inference with detect.py lacks class names currently, so from our side at least adding the class names as model metadata would be useful.

@chainyo
Author

chainyo commented Nov 25, 2021

This is the current custom script I use to add metadata.
Here is what works and what doesn't:

  • Model info
  • Model inputs
  • Model outputs

The outputs don't match the metadata because YOLOv5 has 1 output tensor while the metadata template expects 4 tensors:

ValueError: The number of output tensors (1) should match the number of output tensor metadata (4)

The goal is to understand how to make the metadata fit the YOLOv5 output.

One variation on the output tensor groups line in the script below:

- subgraph.outputTensorGroups = [group]
+ subgraph.outputTensorGroups = [[group]]
import os 

from tflite_support import flatbuffers
from tflite_support import metadata as _metadata
from tflite_support import metadata_schema_py_generated as _metadata_fb


model_file = "[path_to_model].tflite"


# Creates model info.
model_meta = _metadata_fb.ModelMetadataT()
model_meta.name = "Model Name"
model_meta.description = ("Model Description")
model_meta.version = "v1"
model_meta.author = "Model Author"


# Creates input info.
input_meta = _metadata_fb.TensorMetadataT()
input_meta.name = "image"
input_meta.description = ("Input Description")
input_meta.content = _metadata_fb.ContentT()
input_meta.content.contentProperties = _metadata_fb.ImagePropertiesT()
input_meta.content.contentProperties.colorSpace = (
    _metadata_fb.ColorSpaceType.RGB)
input_meta.content.contentPropertiesType = (
    _metadata_fb.ContentProperties.ImageProperties)
input_normalization = _metadata_fb.ProcessUnitT()
input_normalization.optionsType = (
    _metadata_fb.ProcessUnitOptions.NormalizationOptions)
input_normalization.options = _metadata_fb.NormalizationOptionsT()
input_normalization.options.mean = [127.5]
input_normalization.options.std = [127.5]
input_meta.processUnits = [input_normalization]
input_stats = _metadata_fb.StatsT()
input_stats.max = [255]
input_stats.min = [0]
input_meta.stats = input_stats


# Creates output info.
output_location_meta = _metadata_fb.TensorMetadataT()
output_location_meta.name = "location"
output_location_meta.description = "The locations of the detected boxes."
output_location_meta.content = _metadata_fb.ContentT()
output_location_meta.content.contentPropertiesType = (
    _metadata_fb.ContentProperties.BoundingBoxProperties)
output_location_meta.content.contentProperties = (
    _metadata_fb.BoundingBoxPropertiesT())
output_location_meta.content.contentProperties.index = [1, 0, 3, 2]
output_location_meta.content.contentProperties.type = (
    _metadata_fb.BoundingBoxType.BOUNDARIES)
output_location_meta.content.contentProperties.coordinateType = (
    _metadata_fb.CoordinateType.RATIO)
output_location_meta.content.range = _metadata_fb.ValueRangeT()
output_location_meta.content.range.min = 2
output_location_meta.content.range.max = 2

output_class_meta = _metadata_fb.TensorMetadataT()
output_class_meta.name = "category"
output_class_meta.description = "The categories of the detected boxes."
output_class_meta.content = _metadata_fb.ContentT()
output_class_meta.content.contentPropertiesType = (
    _metadata_fb.ContentProperties.FeatureProperties)
output_class_meta.content.contentProperties = (
    _metadata_fb.FeaturePropertiesT())
output_class_meta.content.range = _metadata_fb.ValueRangeT()
output_class_meta.content.range.min = 2
output_class_meta.content.range.max = 2
label_file = _metadata_fb.AssociatedFileT()
label_file.name = os.path.basename("label.txt")
label_file.description = "Label of objects that this model can recognize."
label_file.type = _metadata_fb.AssociatedFileType.TENSOR_VALUE_LABELS
output_class_meta.associatedFiles = [label_file]

output_score_meta = _metadata_fb.TensorMetadataT()
output_score_meta.name = "score"
output_score_meta.description = "The scores of the detected boxes."
output_score_meta.content = _metadata_fb.ContentT()
output_score_meta.content.contentPropertiesType = (
    _metadata_fb.ContentProperties.FeatureProperties)
output_score_meta.content.contentProperties = (
    _metadata_fb.FeaturePropertiesT())
output_score_meta.content.range = _metadata_fb.ValueRangeT()
output_score_meta.content.range.min = 2
output_score_meta.content.range.max = 2

output_number_meta = _metadata_fb.TensorMetadataT()
output_number_meta.name = "number of detections"
output_number_meta.description = "The number of the detected boxes."
output_number_meta.content = _metadata_fb.ContentT()
output_number_meta.content.contentPropertiesType = (
    _metadata_fb.ContentProperties.FeatureProperties)
output_number_meta.content.contentProperties = (
    _metadata_fb.FeaturePropertiesT())


# Creates subgraph info.
group = _metadata_fb.TensorGroupT()
group.name = "detection result"
group.tensorNames = [
    output_location_meta.name, output_class_meta.name,
    output_score_meta.name
]
subgraph = _metadata_fb.SubGraphMetadataT()
subgraph.inputTensorMetadata = [input_meta]
subgraph.outputTensorMetadata = [
    output_location_meta, output_class_meta, output_score_meta,
    output_number_meta
]
subgraph.outputTensorGroups = [group]
model_meta.subgraphMetadata = [subgraph]

b = flatbuffers.Builder(0)
b.Finish(
    model_meta.Pack(b),
    _metadata.MetadataPopulator.METADATA_FILE_IDENTIFIER)
metadata_buf = b.Output()

populator = _metadata.MetadataPopulator.with_model_file(model_file)
populator.load_metadata_buffer(metadata_buf)
populator.load_associated_files(["[path_to_labels].txt"])
populator.populate()
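
For reference, here is a minimal sketch of a single-output variant that keeps the tensor counts matched: one TensorMetadataT for the model's single output tensor, reusing input_meta, model_meta, and the populate steps from the script above. The tensor name, description, and shape (e.g. [1, 25200, num_classes + 5] for a 640x640 input) are assumptions about the exported model, not values confirmed in this thread.

# Sketch: describe the single raw YOLOv5 output tensor instead of the 4 SSD-style
# tensors, so the metadata entry count matches the model's real output count.
output_meta = _metadata_fb.TensorMetadataT()
output_meta.name = "output"  # assumed tensor name; check the real name in Netron
output_meta.description = (
    "Raw YOLOv5 predictions of shape [1, N, num_classes + 5]: "
    "box xywh, objectness score, per-class scores.")
output_meta.content = _metadata_fb.ContentT()
output_meta.content.contentPropertiesType = (
    _metadata_fb.ContentProperties.FeatureProperties)
output_meta.content.contentProperties = _metadata_fb.FeaturePropertiesT()

subgraph = _metadata_fb.SubGraphMetadataT()
subgraph.inputTensorMetadata = [input_meta]
subgraph.outputTensorMetadata = [output_meta]  # 1 metadata entry for 1 output tensor
model_meta.subgraphMetadata = [subgraph]
# The flatbuffers build and MetadataPopulator steps stay the same as above.

This avoids the ValueError, but it does not make the model conform to the 4-tensor output signature that MLKit expects, as discussed below.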

@chainyo I'm not sure, I'm not a TF specialist myself, but on the surface this appears to be a TFLite issue rather than a YOLOv5 issue. Perhaps raise this on the TF repo or the TF forums, or even stackoverflow.

Yes, I know, but I was hoping for some hints from @zldrobit because he has worked a lot on TFLite conversions 🤗

@chainyo
Author

chainyo commented Nov 25, 2021

By the way, it could be a really nice addition to the TFLite conversion. And if I find a way to do it, I will open a PR 😄

@glenn-jocher
Member

@chainyo the 4-output format is a special TF format that they apply to older SSD models. We talked to Google (Sachin Joglekar, https://github.com/srjoglekar246) about applying it to YOLOv5, but the talks stagnated. You might want to contact him and see if he'd be interested in pointing you in the right direction or providing more official support for YOLOv5.

References:
https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/running_on_mobile_tf2.md#step-1-export-tflite-inference-graph
https://github.com/tensorflow/models/blob/master/research/object_detection/export_tflite_graph_lib_tf2.py
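
For anyone comparing the two formats, a quick way to check which output signature a converted .tflite actually exposes is to list its output tensors with the TFLite interpreter. A small sketch, assuming TensorFlow is installed; the model path is a placeholder:

import tensorflow as tf

# List the model's output tensors: an SSD-style export shows the 4 post-processed
# outputs (boxes, classes, scores, count), while a plain YOLOv5 export shows 1.
interpreter = tf.lite.Interpreter(model_path="yolov5s-fp16.tflite")  # placeholder path
interpreter.allocate_tensors()
for detail in interpreter.get_output_details():
    print(detail["name"], detail["shape"], detail["dtype"])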

@chainyo
Author

chainyo commented Nov 25, 2021

Thanks for the links and the info! 🤝

@zldrobit
Contributor

@chainyo Adding tf.image.non_max_suppression_padded is the most viable way to support the GPU delegate, but it still has dynamic-sized tensors (#2095 (comment)), which cannot be run by the GPU delegate. Actually, if you only care about usage on CPU and would like to integrate an NMS op, you could replace agnostic_nms() with https://github.com/zldrobit/yolov5/blob/c9702e62d9e8fbcd225dda86626abba9f57c3f68/models/tf.py#L391-L420 in models/tf.py and the agnostic_nms_layer class with https://github.com/zldrobit/yolov5/blob/c9702e62d9e8fbcd225dda86626abba9f57c3f68/models/tf.py#L376-L388. Or just use the https://github.com/zldrobit/yolov5/tree/tflite-nms-padded branch with tf.image.non_max_suppression_padded. Be careful: the output tensor order changes after TFLite conversion, so the Android Java code needs to be updated as well. I think this might be a possible way to use metadata in TFLite models.
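
As a toy illustration of why tf.image.non_max_suppression_padded helps here: it returns a fixed-size index tensor plus a valid-detection count, which keeps tensor shapes static. A small sketch with made-up boxes; the 0.45/0.25 thresholds are just example values:

import tensorflow as tf

# Made-up boxes and scores, only to show the op's fixed-size outputs.
boxes = tf.constant([[0.0, 0.0, 1.0, 1.0],
                     [0.0, 0.0, 0.9, 0.9],
                     [0.5, 0.5, 1.0, 1.0]])
scores = tf.constant([0.9, 0.8, 0.7])

# Returns indices padded to max_output_size plus the number of valid detections,
# unlike tf.image.non_max_suppression, whose output length is data-dependent.
selected, num_valid = tf.image.non_max_suppression_padded(
    boxes, scores, max_output_size=10,
    iou_threshold=0.45, score_threshold=0.25,
    pad_to_max_output_size=True)
print(selected.shape, int(num_valid))  # (10,) and the count of kept boxes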

@chainyo
Author

chainyo commented Nov 25, 2021

Actually, if you only care about usage on CPU

Since it will run on an Android mobile device, GPU support is indeed not needed.

I will take a look and try your hints, thanks. I'll keep the thread up to date. I will also try to contact Sachin Joglekar as Glenn suggested.

@chainyo
Author

chainyo commented Nov 25, 2021

I think this might be a possible way to use metadata in TFLite models.

That works really well with the detect.py file, and netron.app recognizes that there are 4 output tensors, but MLKit doesn't know how to use them. I think it's not implemented yet...

[Netron screenshot of the exported model's 4 output tensors]

It seems that MLKit only works with standard models from 3 years ago...

@chainyo
Author

chainyo commented Nov 26, 2021

@glenn-jocher It is probably because I used the YOLOv5 export.py script.

I will try to use the TF converter to see if it changes anything 🤗

EDIT: With the TF converter it produces a model with 1 output tensor, which won't be recognized by MLKit anyway. So I don't think it can help... sadly 😞

@chainyo
Author

chainyo commented Dec 3, 2021

After some discussion with Google employees: in order for custom models to fit what tflite_support needs for adding metadata, the custom model's outputs need to match the expected outputs, with the same I/O signature.

They use fused operators, as described here in the documentation.
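
As a side note, a hedged way to double-check what metadata a populated .tflite actually carries (and compare its declared I/O description against what is required) is the tflite_support metadata displayer; the model path below is a placeholder:

from tflite_support import metadata as _metadata

# Dump the packed metadata as JSON plus any associated files (e.g. the label file)
# to verify what the populated model actually declares.
displayer = _metadata.MetadataDisplayer.with_model_file("[path_to_model].tflite")
print(displayer.get_metadata_json())
print(displayer.get_packed_associated_file_list())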

@dcboy

dcboy commented Dec 13, 2021

@chainyo did you have any success converting the TFLite model for ML Kit?

@zldrobit
Contributor

@chainyo @dcboy NMS support for TFLite models has been added in #5938. You could refer to the Colab notebook as an example, though detection with NMS-enabled TFLite models is not yet supported on the master branch.

@chainyo
Author

chainyo commented Dec 14, 2021

@dcboy @zldrobit Yes, I used the NMS support to export the model, and MLKit still doesn't work with it because it doesn't fit the output requirements. MLKit is made for SSD models, not custom models, for the moment.

@github-actions
Contributor

github-actions bot commented Jan 14, 2022

👋 Hello, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.


Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!

Thank you for your contributions to YOLOv5 🚀 and Vision AI ⭐!

@github-actions bot added the Stale (stale and scheduled for closing soon) label Jan 14, 2022
@PadalaKavya

populator = _metadata.MetadataPopulator.with_model_file(model_file)
populator.load_metadata_buffer(metadata_buf)
populator.load_associated_files(["[path_to_labels].txt"])
populator.populate()

The above code is giving me an error:
error: unpack_from requires a buffer of at least 4 bytes
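
One hedged debugging step, assuming the error comes from the flatbuffer parser rather than from the metadata objects themselves: confirm that model_file points to a valid, non-empty .tflite before populating. The path is a placeholder:

import os
import tensorflow as tf

# Sanity-check the model file: an empty or truncated .tflite is one possible cause
# of flatbuffer "unpack_from requires a buffer of at least 4 bytes" errors.
model_file = "[path_to_model].tflite"  # placeholder path
print("file size (bytes):", os.path.getsize(model_file))
interpreter = tf.lite.Interpreter(model_path=model_file)  # raises if the flatbuffer is invalid
interpreter.allocate_tensors()
print("model parses OK; input shapes:", [d["shape"] for d in interpreter.get_input_details()])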

@matinmoezzi

Hi @chainyo
Did you find any solution to your problem? I have the same problem using the YOLOv5 model in TFLite.

@chainyo
Author

chainyo commented Jun 22, 2022

@matinmoezzi Not really... I either trained some Google models (EfficientDet) instead, or kept YOLOv5 without MLKit.

@matinmoezzi

@chainyo Thank you

@KingWu

KingWu commented Jun 25, 2022

Facing the same issue.
For YOLOv5, is there no way to export a TFLite model that is supported by ML Kit?

@glenn-jocher
Member

@PadalaKavya can you please submit a PR to attach metadata to TFLite models and then read the same data at inference time? We have this in place for many existing formats like ONNX, but not for TFLite yet.

@glenn-jocher
Member

@KingWu I'm not very familiar with ML Kit, so you should probably raise an issue directly there requesting support for YOLOv5 TFLite models

@chainyo
Author

chainyo commented Jun 26, 2022

@KingWu I'm not very familiar with ML Kit, so you should probably raise an issue directly there requesting support for YOLOv5 TFLite models

The last time I talked to the Google employee managing the TFLite converter, she told me that MLKit wasn't meant to be compatible with anything other than Google models like EfficientDet. So I think you have to do the TFLite implementation yourself if you want to use the YOLOv5 model.

@yangqinjiang

That's my problem too: after converting PT to TFLite and adding metadata, the output tensors look like this:
[screenshot of the converted model's output tensors]
But the TFLite model from Google has output tensors like this:
[screenshot of the Google model's output tensors]
I want to use the Task Vision library (tensorflow-lite-task-vision:0.4.0) with my TFLite file (PT to TFLite, metadata added) to implement the functionality, but it doesn't work.

Below is my previous post:
#10251

@ShahriarAlom

Adding metadata to an EfficientDet or YOLO model will not work, because ML Kit is not compatible with that as of now. It could be in the future! Although I can recreate your models and deploy them in your Android app; contact me! WhatsApp: +8801770293055

@glenn-jocher
Member

@ShahriarAlom adding metadata to the EfficientDet or YOLO model for compatibility with ML Kit is currently not supported. ML Kit has its own compatibility requirements and it may not work with models that are not specifically designed for it. You can contact the mentioned phone number for assistance in recreating and deploying the model in your Android app.

@gokkuu100

Crazy, I am having similar issues right now in 2024.

@pderrenger
Member

Currently, YOLOv5 models don't directly support ML Kit due to output format differences. You might consider using alternative models like EfficientDet for ML Kit compatibility.

@ultralytics ultralytics deleted a comment from pderrenger Nov 3, 2024