@SwatiModi Thanks for such nice work. I have a question about training the model. I am working on a similar sort of problem, using the U2-Net segmentation model, but it is really slow when I integrate it into my web-app code to run in the browser. Can you suggest a way to improve the speed, or provide your Python training and testing scripts so I can use them for my problem? Thanks
Hi @NaeemKhan333, you can refer to the model architecture here; I used MobileNetV1 as the encoder backbone with a custom decoder. A few things you could try while building the model:
- reducing the width of the network, both by reducing the input size (I used 224x224; 160x160 or 192x192 should also work well) and by reducing the number of filters in each layer (if you are using Keras, you can adjust this directly via the alpha parameter ref)
- reducing the depth of the network, i.e., using as few layers as possible
- in the custom decoder I used a constant number of filters throughout, which also helps optimize inference time (see the sketch below)
Also, when deploying the model on the web, WASM can give a real speed boost - ref
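To make these suggestions concrete, here is a minimal Keras sketch of such an encoder-decoder. The decoder depth, filter count, and the `build_segmentation_model` helper are my own illustrative assumptions, not the exact architecture from this repo:

```python
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import MobileNet

def build_segmentation_model(input_size=160, alpha=0.5, num_filters=64, num_classes=1):
    # MobileNetV1 encoder, thinned via alpha and fed a reduced input size
    encoder = MobileNet(input_shape=(input_size, input_size, 3),
                        alpha=alpha, include_top=False, weights='imagenet')
    x = encoder.output  # 32x-downsampled feature map
    # Shallow custom decoder with a constant number of filters throughout
    for _ in range(5):  # five 2x upsampling steps recover the 32x stride
        x = layers.UpSampling2D()(x)
        x = layers.Conv2D(num_filters, 3, padding='same', activation='relu')(x)
    mask = layers.Conv2D(num_classes, 1, activation='sigmoid')(x)  # per-pixel mask
    return Model(encoder.input, mask)

model = build_segmentation_model()
model.summary()  # compare parameter counts as you vary alpha and input_size
```

Dropping alpha further (e.g., to 0.25) cuts the per-layer filter count proportionally, which usually trades a little accuracy for a large latency win in the browser.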
@SwatiModi thank you for sharing this repo. I am also trying to build this kind of project. In my project, I need to segment 8 classes, so I have to train on my own dataset. Can you share your training code?