Is it possible to infer 513 size by Quantization-aware trained deeplab? #5
It might depend on the training configuration or the dataset. Could you provide your implementation details?
Dear @tantara, I have the same question. I'm trying to run your application with my own deeplab model; it differs from yours only in input size (1, 513, 513, 3).
But I get this error.
I think the problem is in the input parameters of tflite_convert or export_model.py, or maybe you changed the model's architecture. Could you please help me? deeplabv3_mnv2_pascal_train_aug_2018_01_29.tar.gz - original model
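For reference, a conversion command along these lines is what is usually meant here. This is only a sketch: the file names, tensor names (`MobilenetV2/MobilenetV2/input`, `ArgMax`), and quantization statistics are assumptions and must be replaced with the values from your own exported graph.

```shell
# Hypothetical tflite_convert invocation for a quantization-aware-trained
# DeepLab with a 513x513 input. All names and stats below are assumptions.
tflite_convert \
  --graph_def_file=frozen_inference_graph.pb \
  --output_file=deeplabv3_513_quant.tflite \
  --inference_type=QUANTIZED_UINT8 \
  --inference_input_type=QUANTIZED_UINT8 \
  --input_arrays=MobilenetV2/MobilenetV2/input \
  --output_arrays=ArgMax \
  --input_shapes=1,513,513,3 \
  --mean_values=128 \
  --std_dev_values=127
```

If `--inference_type` and `--inference_input_type` do not match the way the graph was trained and exported, the converter typically fails with a type-mismatch error like the one discussed below.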
@PotekhinRoman I haven't used different inference types, as follows.
Let's try using the same type for both.
@tantara I set both inference types to the same value as you said above, but I am still unable to convert to tflite; it gives the following message: Traceback (most recent call last): Please respond, thanks!
@PotekhinRoman Did you find a solution for the error?
@tantara, @Roopesh-Nallakshyam, so far I have not been able to solve this problem; I plan to return to searching for a solution in a month.
I trained deeplab (MobileNetV2) with the quantization-aware training method.
Then I exported a quantized pb file. (Cropped by here)
Then I converted the pb file to a tflite file.
So I have my quantized tflite file.
Up to crop size 321, the segmentation result is good (the green mask is overlaid).
But above crop size 321, segmentation is not good (the green mask is not overlaid).
Could you tell me why this problem happens?
Sorry for my bad English, and thank you.
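One thing worth ruling out (an assumption on my part, based on how DeepLab defines crop sizes): valid crop sizes satisfy crop_size = k * output_stride + 1, so that feature maps align after downsampling. Both 321 (20 * 16 + 1) and 513 (32 * 16 + 1) satisfy this for output stride 16, so the size itself should be legal; a quick sanity check:

```python
# Sanity check (assumption): DeepLab crop sizes are of the form
# k * output_stride + 1, so the feature maps align after downsampling.
def is_valid_crop_size(crop_size, output_stride=16):
    """Return True if (crop_size - 1) is a multiple of output_stride."""
    return (crop_size - 1) % output_stride == 0

for size in (321, 513, 512):
    print(size, is_valid_crop_size(size))
```

If 513 passes this check but inference still fails, the mismatch is more likely in the exported graph's fixed input shape or the converter flags than in the crop size itself.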