🐛 Bug

I get an error.

To Reproduce

Steps to reproduce the behavior:

```python
import torch
from torch.quantization import get_default_qconfig
from torch.quantization.quantize_fx import prepare_fx

# init module
class MyModule(torch.nn.Module):
    def __init__(self):
        super(MyModule, self).__init__()
        ...

    def forward(self, x):
        for i in range(x.size(1)):
            x += 1
        return

torch_model = MyModule().eval()

# fx
s_qconfig_dict = {'': get_default_qconfig("fbgemm")}
prepare_fx(torch_model, s_qconfig_dict)
```

```
Traceback (most recent call last):
  File "mini_code.py", line 22, in <module>
    prepare_fx(torch_model, s_qconfig_dict)
  File "/opt/conda/lib/python3.8/site-packages/torch/quantization/quantize_fx.py", line 392, in prepare_fx
    return _prepare_fx(model, qconfig_dict, prepare_custom_config_dict)
  File "/opt/conda/lib/python3.8/site-packages/torch/quantization/quantize_fx.py", line 174, in _prepare_fx
    graph_module = GraphModule(model, tracer.trace(model))
  File "/opt/conda/lib/python3.8/site-packages/torch/fx/symbolic_trace.py", line 571, in trace
    self.create_node('output', 'output', (self.create_arg(fn(*args)),), {},
  File "mini_code.py", line 14, in forward
    for i in range(x.size(1)):
TypeError: 'Proxy' object cannot be interpreted as an integer
```
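The failure comes from FX symbolic tracing, which `prepare_fx` runs under the hood: during tracing, `x` is a `Proxy` rather than a real tensor, so `x.size(1)` is also a `Proxy`, and Python's `range()` cannot consume it. Data-dependent control flow like this loop is not symbolically traceable. Below is a minimal sketch (not from the original report) of one common workaround: factor the dynamic loop into a free function and register it with `torch.fx.wrap` so the tracer records a single call instead of stepping into the loop. The helper name `add_per_column` is hypothetical, and this assumes the quantization tracer in your PyTorch version honors `torch.fx.wrap`.

```python
import torch
import torch.fx
from torch.quantization import get_default_qconfig
from torch.quantization.quantize_fx import prepare_fx


def add_per_column(x):
    # Hypothetical helper: runs eagerly at call time; the tracer never
    # sees the loop, so x.size(1) is a real int here.
    for _ in range(x.size(1)):
        x = x + 1
    return x


# Register the helper as a leaf so symbolic tracing does not descend into it.
# Must be called at module top level.
torch.fx.wrap("add_per_column")


class MyModule(torch.nn.Module):
    def forward(self, x):
        return add_per_column(x)


torch_model = MyModule().eval()
s_qconfig_dict = {'': get_default_qconfig("fbgemm")}
prepared = prepare_fx(torch_model, s_qconfig_dict)
```

The trade-off is that ops inside the wrapped function are invisible to FX and therefore will not be quantized. If the dynamic loop lives in a submodule of a larger model, another option (hedged, depending on the PyTorch version) is to skip tracing that submodule via `prepare_custom_config_dict` with the `non_traceable_module_class` or `non_traceable_module_name` keys, accepting that the skipped submodule stays unquantized.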
How do you solve it? I ran into the same error. Thanks!
Got it, thank you!
I got the same error. How do you solve it?
@xiaopengaia How did you solve it? I ran into the same error. Thanks!