After several rounds of PTQ and QAT on the QuantizationSimModel, I used the export API to export the ONNX model and encoding files.
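For reference, the export call looked roughly like this (the model, the evaluation callback, the dummy input shape, and the paths are placeholders for my actual setup, not the exact code I ran):

import torch
from aimet_torch.quantsim import QuantizationSimModel

# Placeholder dummy input standing in for my real input to the network with the LSTM encoder
dummy_input = torch.randn(1, 50, 80)
sim = QuantizationSimModel(model, dummy_input=dummy_input)

# ... PTQ / QAT on sim.model ...
sim.compute_encodings(forward_pass_callback=evaluate, forward_pass_callback_args=None)

# Export the ONNX model and the .encodings file
sim.export(path='./export', filename_prefix='my_model', dummy_input=dummy_input)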
I then got an error from the following code in _aimet_torch/base/quantsim.py:
for name in op_names:
    if name in end_op_names:
        output_tensors.extend(op_to_io_tensor_map[name].outputs)
    else:
        intermediate_tensors.extend(op_to_io_tensor_map[name].outputs)
which raised:
    intermediate_tensors.extend(op_to_io_tensor_map[name].outputs)
AttributeError: 'list' object has no attribute 'outputs'
This happened because the intermediate output of the LSTM module is a list. I printed the entries to check and got:
print(name)
if isinstance(op_to_io_tensor_map[name], list):
    print(op_to_io_tensor_map[name])
rnn_sequence_block.encoder
rnn_sequence_block.encoder#1
rnn_sequence_block.encoder#2
rnn_sequence_block.encoder#3
rnn_sequence_block.encoder#2-1
rnn_sequence_block.encoder#3-1
rnn_sequence_block.encoder#4
rnn_sequence_block.encoder#4-1
rnn_sequence_block.encoder#0-1
rnn_sequence_block.encoder#6
rnn_sequence_block.encoder#7-1
rnn_sequence_block.encoder#root_node
[<aimet_torch.defs.OpToIOTensors object at 0x7f0e50c58250>, <aimet_torch.defs.OpToIOTensors object at 0x7f0e50c2b7c0>]
    intermediate_tensors.extend(op_to_io_tensor_map[name].outputs)
AttributeError: 'list' object has no attribute 'outputs'
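So for the LSTM op the map value is a list of OpToIOTensors instead of a single object. A minimal stand-alone illustration of the mismatch (the namedtuple below is only a simplified stand-in for aimet_torch.defs.OpToIOTensors, and the op/tensor names are made up):

from collections import namedtuple

# Simplified stand-in for aimet_torch.defs.OpToIOTensors
OpToIOTensors = namedtuple('OpToIOTensors', ['inputs', 'outputs'])

op_to_io_tensor_map = {
    # ordinary op: a single OpToIOTensors entry
    'some_conv': OpToIOTensors(inputs=['x'], outputs=['conv_out']),
    # LSTM op: a list of OpToIOTensors entries
    'rnn_sequence_block.encoder#root_node': [
        OpToIOTensors(inputs=['x'], outputs=['lstm_out_0']),
        OpToIOTensors(inputs=['h0', 'c0'], outputs=['lstm_out_1']),
    ],
}

print(op_to_io_tensor_map['some_conv'].outputs)  # works: ['conv_out']
print(op_to_io_tensor_map['rnn_sequence_block.encoder#root_node'].outputs)  # raises AttributeError: 'list' object has no attribute 'outputs'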
Thus I tried to work around the bug:
for name in op_names:
    if name in end_op_names:
        output_tensors.extend(op_to_io_tensor_map[name].outputs)
    else:
        if isinstance(op_to_io_tensor_map[name], list):
            print(f"Name list {name}")
            print(op_to_io_tensor_map[name])
            for named_tensor in op_to_io_tensor_map[name]:
                intermediate_tensors.extend(named_tensor.outputs)
        else:
            print(f"Name {name}")
            print(op_to_io_tensor_map[name])
            intermediate_tensors.extend(op_to_io_tensor_map[name].outputs)
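An equivalent, slightly tidier way to write the same workaround would be to normalize every map entry to a list first (the helper name below is mine, not from AIMET):

def _as_list(entry):
    # op_to_io_tensor_map values are either a single OpToIOTensors or a list of them
    return entry if isinstance(entry, list) else [entry]

for name in op_names:
    outputs = [t for io_tensors in _as_list(op_to_io_tensor_map[name]) for t in io_tensors.outputs]
    if name in end_op_names:
        output_tensors.extend(outputs)
    else:
        intermediate_tensors.extend(outputs)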
With this workaround in place, the export still failed, this time with:
Param tensor {weight_ih_l0, weight_hh_l0, bias_ih_l0, bias_hh_l0} not found in the valid param set.
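For context, those are exactly the flat parameter names that a single-layer torch.nn.LSTM registers, so the export path seems to be looking them up against a param set that does not contain them. A quick check (sizes are arbitrary):

import torch.nn as nn

lstm = nn.LSTM(input_size=16, hidden_size=32, num_layers=1)
for param_name, _ in lstm.named_parameters():
    print(param_name)
# weight_ih_l0
# weight_hh_l0
# bias_ih_l0
# bias_hh_l0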
Hope this can be fixed soon!
BRs.
This is related to the issue "Bug report when using QuantizationSimModel to produce the quantization model for LSTM module".