efficientnet #946
Conversation
sufeidechabei commented Sep 19, 2019
- Performance can't match the original paper (I don't have enough GPUs to train on ImageNet).
- It doesn't support hybridization yet because MXNet doesn't have 'same' padding (see the sketch below).
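For reference, a minimal sketch of how TF-style 'same' padding can be computed when the spatial input size is fixed; the helper name is an assumption for illustration, not code from this PR:

```python
import math

def same_padding(in_size, kernel, stride, dilation=1):
    """Total TF-style 'same' padding for one spatial dimension,
    split into (before, after); TF puts the extra pixel at the end."""
    out_size = math.ceil(in_size / stride)
    pad = max((out_size - 1) * stride + (kernel - 1) * dilation + 1 - in_size, 0)
    return pad // 2, pad - pad // 2
```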
Please add a unit test for the model forward pass.
How do I add it? Can you give me an example? @zhanghang1989
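A minimal forward test could look like the sketch below; the factory name `get_efficientnet`, the import path, and the b0 input/output shapes are assumptions, not this PR's actual API:

```python
import mxnet as mx
from efficientnet import get_efficientnet  # hypothetical import path

def test_efficientnet_forward():
    net = get_efficientnet('efficientnet-b0')  # assumed factory signature
    net.initialize()
    x = mx.nd.random.uniform(shape=(2, 3, 224, 224))  # b0 input resolution
    y = net(x)
    assert y.shape == (2, 1000)  # 1000 ImageNet classes
```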
@sufeidechabei You can use the symbol.Pooling op to emulate 'same' padding, as it has a param for it.
It only supports Symbol instances.
@sufeidechabei You can wrap it in a HybridBlock, or you can use HybridLambda.
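For illustration, a sketch of the HybridLambda route; it assumes an MXNet version whose Pooling op accepts pooling_convention='same', and the kernel/stride values are placeholders:

```python
from mxnet.gluon import nn

# Wrap the Pooling op in a HybridBlock via HybridLambda so the
# network can still be hybridized.
same_avg_pool = nn.HybridLambda(
    lambda F, x: F.Pooling(x, kernel=(3, 3), stride=(1, 1),
                           pool_type='avg', pooling_convention='same'))
```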
I used a HybridBlock to wrap it as you said @chinakook
As @chinakook said, please try to remove that usage.
Link to model unit tests:
@sufeidechabei I think you could add an argument to the init function to fix the input shape.
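A sketch of that suggestion: take the input size in `__init__`, precompute the 'same' padding once (using the `same_padding` helper sketched earlier, assuming a square input), and apply it with F.pad so the block stays hybridizable. The class and attribute names are assumptions:

```python
from mxnet.gluon import nn

class SamePadConv(nn.HybridBlock):
    """Hypothetical conv block with TF-style 'same' padding for a
    fixed, square input size passed at construction time."""
    def __init__(self, channels, kernel, stride, input_size, **kwargs):
        super(SamePadConv, self).__init__(**kwargs)
        self._before, self._after = same_padding(input_size, kernel, stride)
        self.conv = nn.Conv2D(channels, kernel, stride)

    def hybrid_forward(self, F, x):
        # F.pad takes pad widths for all four NCHW dims; only H/W are padded.
        x = F.pad(x, mode='constant', constant_value=0,
                  pad_width=(0, 0, 0, 0, self._before, self._after,
                             self._before, self._after))
        return self.conv(x)
```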
```python
input_filters=int(options['i']),
output_filters=int(options['o']),
expand_ratio=int(options['e']),
id_skip=('noskip' not in block_string),
```
`id_skip` is not used?
Okay, I will fix that.
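For context, `id_skip` in the EfficientNet block args controls whether the block's identity skip connection is applied; a sketch of how the block's forward would consume it (attribute names are assumptions):

```python
def hybrid_forward(self, F, x):
    out = self._conv_layers(x)
    # The identity skip is only valid when shapes match: stride 1 and
    # equal input/output filters.
    if self._id_skip and self._stride == 1 \
            and self._input_filters == self._output_filters:
        out = out + x
    return out
```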
```python
_add_conv(
    self._se_reduce,
    num_squeezed_channels,
    active=False,
```
I believe there should be an activation function in the SE module.
I followed the PyTorch EfficientNet, and it doesn't have an activation function in the SE block.
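For reference, the squeeze-and-excitation design in the original paper does place an activation between the reduce and expand convolutions; a Gluon sketch of that variant, with assumed names (the original SE paper uses ReLU, while EfficientNet uses swish):

```python
from mxnet.gluon import nn

class SEModule(nn.HybridBlock):
    """Sketch of an SE block with the activation under discussion."""
    def __init__(self, channels, squeezed_channels, **kwargs):
        super(SEModule, self).__init__(**kwargs)
        self.reduce = nn.Conv2D(squeezed_channels, kernel_size=1)
        self.expand = nn.Conv2D(channels, kernel_size=1)

    def hybrid_forward(self, F, x):
        w = F.contrib.AdaptiveAvgPooling2D(x, output_size=1)  # squeeze
        w = F.relu(self.reduce(w))    # the activation in question
        w = F.sigmoid(self.expand(w))
        return F.broadcast_mul(x, w)  # excite: channel-wise rescaling
```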