DUP is a virtual dressing room application for Android 6.0 and above.
It allows users to see how an outfit will look on them using image processing and a body pose estimation model.
Users can add their own outfits from the gallery.
The selected outfit is processed according to a sensitivity rate given by the user.
The user then selects a category in which to store the outfit, and the outfit is saved in the database.
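The storage step could be sketched with a minimal SQLite table. The schema below is an illustrative assumption (DUP's actual table layout is not documented here); it stores the processed outfit image as a BLOB alongside its category:

```python
import sqlite3

# Hypothetical schema for storing processed outfits -- an illustrative
# sketch, not DUP's actual database layout.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE outfit (
        id       INTEGER PRIMARY KEY AUTOINCREMENT,
        category TEXT NOT NULL CHECK (category IN
                 ('Top', 'Long Wears', 'Trousers', 'Shorts and Skirts')),
        image    BLOB NOT NULL   -- processed PNG with alpha channel
    )
""")
conn.execute("INSERT INTO outfit (category, image) VALUES (?, ?)",
             ("Top", b"\x89PNG..."))
row = conn.execute("SELECT category FROM outfit WHERE id = 1").fetchone()
print(row[0])  # -> Top
```

On Android this role would typically be filled by a `SQLiteOpenHelper` subclass; the CHECK constraint mirrors the four categories listed below.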
The image processing steps used to extract the outfit are given below:
- Add alpha channel to image
- Boolean Masking (Binary Threshold)
- Noise Removal (Gaussian Blur)
- Generate mask to make background transparent
- Apply generated mask
- Find largest contour to remove unnecessary area
- Crop largest contour
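The steps above can be sketched in a few lines. This is a simplified stand-in, not DUP's actual OpenCV code: NumPy replaces OpenCV, a 3x3 box blur stands in for the Gaussian blur, a connected-component search stands in for contour finding, and the sensitivity rate is assumed to act as a 0-255 threshold:

```python
import numpy as np

def extract_outfit(gray, sensitivity):
    """Sketch of the extraction pipeline using NumPy in place of OpenCV.
    `gray` is an HxW uint8 image on a bright background; `sensitivity`
    is the user-given rate, assumed here to be a 0-255 cutoff."""
    # Boolean masking (binary threshold): pixels darker than the
    # background count as outfit.
    mask = gray < sensitivity

    # Noise removal: a 3x3 box blur stands in for the Gaussian blur.
    padded = np.pad(mask.astype(float), 1)
    smooth = sum(padded[i:i + mask.shape[0], j:j + mask.shape[1]]
                 for i in range(3) for j in range(3)) / 9.0
    mask = smooth > 0.5

    # Largest connected region (stand-in for the largest contour),
    # found with a simple flood fill.
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue
        current += 1
        labels[sy, sx] = current
        stack = [(sy, sx)]
        while stack:
            y, x = stack.pop()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    stack.append((ny, nx))
    sizes = np.bincount(labels.ravel())[1:]
    largest = labels == (1 + int(np.argmax(sizes)))

    # Crop the bounding box of the largest region, then add an alpha
    # channel so the background becomes transparent.
    ys, xs = np.nonzero(largest)
    box = (slice(ys.min(), ys.max() + 1), slice(xs.min(), xs.max() + 1))
    alpha = (largest[box] * 255).astype(np.uint8)
    return np.dstack([gray[box], alpha])
```

In the app these steps map onto OpenCV calls such as `threshold`, `GaussianBlur`, and `findContours`; the sketch only shows the shape of the pipeline.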
DUP uses a TensorFlow Lite model to estimate certain points on the user's body during the camera preview.
Using these estimated points, the outfit is placed on the screen by calculating its size and position.
The model estimates 14 points on the user's body:
Top, Neck, Left Shoulder, Left Elbow, Left Wrist, Right Shoulder, Right Elbow, Right Wrist, Left Hip, Left Knee, Left Ankle, Right Hip, Right Knee, Right Ankle.
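The 14 points can be addressed by index once the model's output order is known. The ordering and the HxWx14 heatmap layout below are assumptions for illustration; the real layout depends on the specific TensorFlow Lite model:

```python
import numpy as np

# Hypothetical keypoint ordering -- the actual index order depends on
# the TensorFlow Lite model's output layout.
KEYPOINTS = [
    "Top", "Neck",
    "Left Shoulder", "Left Elbow", "Left Wrist",
    "Right Shoulder", "Right Elbow", "Right Wrist",
    "Left Hip", "Left Knee", "Left Ankle",
    "Right Hip", "Right Knee", "Right Ankle",
]
INDEX = {name: i for i, name in enumerate(KEYPOINTS)}

def keypoint(heatmaps, name):
    """Pick the (x, y) with the highest score from the model's heatmap
    for one body part (assumes an HxWx14 heatmap output)."""
    hm = heatmaps[:, :, INDEX[name]]
    y, x = np.unravel_index(np.argmax(hm), hm.shape)
    return int(x), int(y)
```

Each camera frame yields one set of heatmaps, so the estimated points can be refreshed on every preview frame.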
There are 4 outfit categories:
"Top", "Long Wears", "Trousers", "Shorts and Skirts"
According to its category, the outfit's size and position are calculated using:
- Top --> Left Shoulder, Right Shoulder, Left Hip
- Long Wears --> Left Shoulder, Right Shoulder, Left Knee
- Trousers --> Left Hip, Right Hip, Left Ankle
- Shorts and Skirts --> Left Hip, Right Hip, Left Knee
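The mapping above can be sketched as a size/position calculation: the two side anchors give the width and horizontal centre, and the bottom anchor gives the height. The `pad` widening factor is an assumption, not DUP's actual value:

```python
# Reference points per category, from the mapping above.
ANCHORS = {
    "Top":               ("Left Shoulder", "Right Shoulder", "Left Hip"),
    "Long Wears":        ("Left Shoulder", "Right Shoulder", "Left Knee"),
    "Trousers":          ("Left Hip", "Right Hip", "Left Ankle"),
    "Shorts and Skirts": ("Left Hip", "Right Hip", "Left Knee"),
}

def outfit_rect(category, points, pad=1.2):
    """Sketch of the size/position calculation. `points` maps keypoint
    names to (x, y) screen coordinates; `pad` widens the rectangle so
    the outfit overlaps the body edges (the factor is an assumption)."""
    left, right, bottom = (points[name] for name in ANCHORS[category])
    width = abs(right[0] - left[0]) * pad
    height = abs(bottom[1] - left[1]) * pad   # upper anchor down to bottom anchor
    x = (left[0] + right[0]) / 2 - width / 2  # centred between the side anchors
    y = min(left[1], right[1])                # aligned with the upper anchors
    return x, y, width, height
```

For example, shoulders at (100, 200) and (200, 200) with the left hip at (110, 400) would place a "Top" in a rectangle roughly 120 wide and 240 tall, centred between the shoulders.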
The technologies used are given below:
- Android Studio 3.2.1
- TensorFlow-Lite
- OpenCV 4.1
- SQLite
Planned future features:
- Users will be allowed to create outfit combinations
- Users will be allowed to take a screenshot during the preview with a button
- Semantic segmentation to extract outfit in a more efficient way
- 3D modeling for a more realistic result