Transfer learning with Inception and MobileNet was used as an out-of-the-box solution that allowed rapid experimentation in limited time. After a series of trials, MobileNet was chosen for its better inference time and because it meets the <3 MB model-size requirement with low latency (compressed to TFLite, real-time inference <50 ms on a Mali-T880 MP12). More data would still be needed for better accuracy.
Launching
git clone https://github.com/ZackPashkin/tensorflow_glasses_classifier_plus_tflite
cd tensorflow_glasses_classifier_plus_tflite
# Put your data in tf_files/dataset
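# retrain.py expects one subdirectory per class; for this repo the layout
# would look like the following (class names assumed from the labels used
# elsewhere in this README):
#   tf_files/dataset/with_glasses/000001.jpg ...
#   tf_files/dataset/without_glasses/000004.jpg ...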
# MobileNet is available in widths 0.25, 0.5, 0.75 and 1.0
# Image size can be 128, 160, 192 or 224 pixels.
IMAGE_SIZE=224
ARCHITECTURE="mobilenet_0.25_${IMAGE_SIZE}"
python -m scripts.retrain \
--bottleneck_dir=tf_files/bottlenecks \
--how_many_training_steps=5000 \
--model_dir=tf_files/models/ \
--summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
--output_graph=tf_files/retrained_graph.pb \
--output_labels=tf_files/retrained_labels.txt \
--architecture="${ARCHITECTURE}" \
--image_dir=tf_files/dataset
# Make a prediction
# Change the image path to a file from your own dataset
python -m scripts.label_image \
--graph=tf_files/retrained_graph.pb \
--image=tf_files/dataset/without_glasses/000004.jpg \
--labels=tf_files/retrained_labels.txt
# Compress the model by converting it to the TFLite format
# IMAGE_SIZE=224 (already set above; set it again if running in a new shell)
tflite_convert \
--graph_def_file=tf_files/retrained_graph.pb \
--output_file=tf_files/optimized_graph025_224.lite \
--input_format=TENSORFLOW_GRAPHDEF \
--output_format=TFLITE \
--input_shape=1,224,224,3 \
--input_array=input \
--output_array=final_result \
--inference_type=FLOAT \
--input_data_type=FLOAT
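To sanity-check the converted model, here is a minimal sketch (not part of the repo) that runs the .lite file with the TFLite Python interpreter and times one inference. It assumes TensorFlow 1.13+, where tf.lite.Interpreter is available, and the mean/std-of-128 MobileNet preprocessing that scripts/label_image.py uses by default; desktop CPU timing is only a rough proxy for the on-device Mali-T880 numbers quoted above.

# verify_tflite.py -- hedged sketch, not part of the repo
import time

import numpy as np
import tensorflow as tf
from PIL import Image

# Load the converted model and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path="tf_files/optimized_graph025_224.lite")
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]["index"]
output_index = interpreter.get_output_details()[0]["index"]

# Preprocess one image: resize to 224x224 and normalize with mean/std of 128,
# matching the defaults in scripts/label_image.py (assumption).
img = Image.open("tf_files/dataset/without_glasses/000004.jpg").resize((224, 224))
x = (np.asarray(img, dtype=np.float32) - 128.0) / 128.0
x = np.expand_dims(x, axis=0)  # shape (1, 224, 224, 3)

# Run and time a single inference on the desktop CPU.
interpreter.set_tensor(input_index, x)
start = time.time()
interpreter.invoke()
print("inference took %.1f ms" % ((time.time() - start) * 1000))

# One probability per line of retrained_labels.txt.
print(interpreter.get_tensor(output_index)[0])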
Run in Google Colab
MobileNet 0.25 with a 224-pixel image size was used to produce a 1.9 MB TFLite model, which was then imported into the demo app.
Dataset: see with_glasses.zip and without_glasses.zip; images from the CelebA dataset were used.
Demo app: apk
Possible improvements: