
Has anyone checked the inference time exactly? #25

Open
msson opened this issue Jun 12, 2018 · 1 comment

Comments

msson commented Jun 12, 2018

When I run test_enet.py, I see only about 0.5–1 fps at 1024x768 resolution. I suspect there must be a better way to measure the exact inference time. Please advise. Thank you.

kwotsin (Owner) commented Jul 23, 2018

This depends heavily on the hardware you are using, and it seems to be a duplicate of #19. Have you checked whether the bottleneck comes from the data-feeding part rather than from model inference itself?
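One way to check this is to time the two stages separately. The sketch below is a minimal, hypothetical illustration (the function names `load_batch` and `run_model` are stand-ins, not the repo's actual API): warm up first so one-time costs such as graph construction don't skew the numbers, then average wall-clock time over several runs for each stage on its own.

```python
import time

def measure(fn, n_warmup=3, n_runs=10):
    """Average the wall-clock time of fn() over n_runs calls, after warm-up."""
    for _ in range(n_warmup):       # warm-up: graph building, caches, etc.
        fn()
    start = time.perf_counter()
    for _ in range(n_runs):
        fn()
    return (time.perf_counter() - start) / n_runs

# Hypothetical stand-ins for the two pipeline stages being compared:
def load_batch():                   # data feeding (disk I/O, preprocessing)
    time.sleep(0.01)

def run_model():                    # model inference only
    time.sleep(0.001)

t_data = measure(load_batch)
t_model = measure(run_model)
print(f"data feeding : {t_data * 1000:.1f} ms/batch")
print(f"inference    : {t_model * 1000:.1f} ms/batch (~{1.0 / t_model:.0f} fps)")
```

If `t_data` dominates, the low fps reflects the input pipeline rather than the model, and prefetching or parallel preprocessing is the place to look.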

@kwotsin kwotsin added this to the awaiting response milestone Jul 23, 2018