
Is it possible to get Tensorflow lite model? #14

Open
RayChen1017 opened this issue Nov 17, 2023 · 2 comments

@RayChen1017

Hi, I would like to ask whether you could provide a TensorFlow Lite private-detector model with a dtype of float32.
I've attempted to convert the existing saved_model.pb to .tflite for use on Android phones, but the Android platform does not support float16. I eventually found a way to obtain a float32 model: retraining with the dtype set to float32 instead of float16. However, this method doesn't seem orthodox, so I was wondering if you could provide a .tflite file with a dtype of float32 instead. Thank you.
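For context, the conversion path I tried looks roughly like this. This is a minimal self-contained sketch with a stand-in Keras model, since the project's actual SavedModel isn't reproduced here; by default the TFLite converter keeps float32, and the FP16 variant only appears when float16 optimization is explicitly requested:

```python
import numpy as np
import tensorflow as tf

# Stand-in model; in practice you would load the project's SavedModel with
# tf.lite.TFLiteConverter.from_saved_model("saved_model") instead.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# No optimizations or supported_types set: the converter emits float32.
# (An FP16 model would come from setting
#   converter.optimizations = [tf.lite.Optimize.DEFAULT]
#   converter.target_spec.supported_types = [tf.float16]
# so simply omitting those keeps everything float32.)
tflite_model = converter.convert()

# Sanity check: inspect the input dtype with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]["dtype"])  # <class 'numpy.float32'>
```

The same check on the FP16-converted model is what reveals the dtype mismatch on Android.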

@Steeeephen
Contributor

Heyhey, I can retrain with FP32 and add the SavedModel, if that works?

@RayChen1017
Author

RayChen1017 commented Dec 1, 2023

Hey! Thank you for your response.
Retraining with FP32 and adding the SavedModel should work.

I'm curious about how you retrained the model with FP32.
My approach involved simply changing the sections highlighted in red from FP16 to FP32. Is this method similar to yours? Additionally, are there any potential concerns I should be aware of with my approach?

[Snippet: screenshot of the training configuration with the FP16 sections highlighted in red]
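In case the screenshot doesn't load: the change amounts to something like the following, assuming the highlighted sections set a Keras dtype policy (an assumption on my part; the real training script may configure dtypes differently):

```python
import tensorflow as tf
from tensorflow.keras import mixed_precision

# Hypothetical: if training originally used
#   mixed_precision.set_global_policy("mixed_float16")
# then switching to plain FP32 training is just:
mixed_precision.set_global_policy("float32")

# Any layers built after this point use float32 weights and compute.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])
print(model.layers[-1].dtype)  # float32
```

A model trained this way converts to a float32 .tflite with no extra converter flags.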
