temporary: the only valid use of a module is looking up an attribute but found = prim::SetAttr[name="output"](%self, %x.1) #1242

Open
MHGL opened this issue Jul 7, 2021 · 1 comment
Labels: bug (Unexpected behaviour that should be corrected), PyTorch (not traced)

Comments


MHGL commented Jul 7, 2021

🐞Describe the bug

torch_model -> torch.jit.script -> coreml
I got this error while trying to set an attribute (self.output = x) in forward.

Trace

Traceback (most recent call last):
  File "mini_code.py", line 21, in <module>
    model = ct.convert(
  File "/home/liyang/.local/lib/python3.8/site-packages/coremltools/converters/_converters_entry.py", line 175, in convert
    mlmodel = mil_convert(
  File "/home/liyang/.local/lib/python3.8/site-packages/coremltools/converters/mil/converter.py", line 128, in mil_convert
    proto = mil_convert_to_proto(model, convert_from, convert_to,
  File "/home/liyang/.local/lib/python3.8/site-packages/coremltools/converters/mil/converter.py", line 171, in mil_convert_to_proto
    prog = frontend_converter(model, **kwargs)
  File "/home/liyang/.local/lib/python3.8/site-packages/coremltools/converters/mil/converter.py", line 85, in __call__
    return load(*args, **kwargs)
  File "/home/liyang/.local/lib/python3.8/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 70, in load
    converter = TorchConverter(torchscript, inputs, outputs, cut_at_symbols)
  File "/home/liyang/.local/lib/python3.8/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 145, in __init__
    raw_graph, params_dict = self._expand_and_optimize_ir(self.torchscript)
  File "/home/liyang/.local/lib/python3.8/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 262, in _expand_and_optimize_ir
    graph, params = _torch._C._jit_pass_lower_graph(
RuntimeError: 
temporary: the only valid use of a module is looking up an attribute but found  = prim::SetAttr[name="output"](%self, %x.1)
:

To Reproduce

import torch
import coremltools as ct

# init torch module
class MyModule(torch.nn.Module):
    def __init__(self):
        super(MyModule, self).__init__()
        self.output: torch.Tensor = torch.empty(1)

    def forward(self, x):
        self.output = x
        return

torch_model = MyModule()

# script
script_model = torch.jit.script(torch_model)

# Convert to Core ML using the Unified Conversion API
model = ct.convert(
    script_model,
    inputs=[ct.ImageType(name="input", shape=(1, 3, 224, 224))],
)
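
For reference (not part of the original report), you can confirm the offending node by printing the TorchScript IR of the scripted model; the attribute assignment in forward is lowered to a prim::SetAttr node, which is exactly what the converter complains about:

# Inspect the TorchScript graph of forward (uses the script_model defined above).
# The line `self.output = x` shows up as a prim::SetAttr node, e.g.:
#    = prim::SetAttr[name="output"](%self, %x.1)
print(script_model.graph)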

System environment (please complete the following information):

  • coremltools version (e.g., 3.0b5): 4.1
  • OS (e.g., MacOS, Linux): Ubuntu 20.04 LTS
  • How you install python (anaconda, virtualenv, system): miniconda
  • python version (e.g. 3.7): 3.8.5
  • any other relevant information:
    • pytorch version: 1.9.0
    • gpu: GeForce GTX 1650
    • driver version: 460.80
    • CUDA version: 11.2
@MHGL MHGL added the bug Unexpected behaviour that should be corrected (type) label Jul 7, 2021
@TobyRoseman TobyRoseman added the triaged Reviewed and examined, release has been assigned if applicable (status) label Jul 8, 2021
@TobyRoseman TobyRoseman added PyTorch (not traced) and removed triaged Reviewed and examined, release has been assigned if applicable (status) labels Oct 25, 2022
TobyRoseman (Collaborator) commented

With coremltools 6.0 and torch 1.12.1, this error is fixed.

However, we then get a new error because the forward method does not return any value. If we update forward to return x, we get another error: RuntimeError: PyTorch convert function for op 'setattr' not implemented.

That is caused by this line in forward: self.output = x. Given that an MLModel doesn't have state, I don't think this is something we can support.
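
A possible workaround (a sketch, not an official recommendation from this thread): avoid assigning to a module attribute inside forward and return the tensor instead, so no prim::SetAttr node is emitted. The module name and the relu below are illustrative stand-ins for the real computation:

import torch
import coremltools as ct

class MyStatelessModule(torch.nn.Module):
    def forward(self, x):
        # Return the result instead of storing it on the module;
        # Core ML models are stateless, so attribute writes in forward
        # cannot be converted.
        return torch.relu(x)

script_model = torch.jit.script(MyStatelessModule())

# TensorType is used here for simplicity; ImageType should also work.
model = ct.convert(
    script_model,
    inputs=[ct.TensorType(name="input", shape=(1, 3, 224, 224))],
)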
