Clip convert error: Dynamic value of min/max is not implemented #191
Comments
The full ONNX model can be downloaded from here:
@ongiaf Have you had any success decoding FuXi (I've been fine-tuning this model for a long time)? I recommend paying attention to this solution.
Thanks, it's excellent work.
@ongiaf Did you manage to run FuXi with the current weights for the fine-tuning process? (I am currently thinking about how to finish training the model on a 1-hour grid, and have considered freezing all layers except the U-Transformer.)
Thank you for posting your changes about Clip. Could you also suggest how to fix
@juanqiu1 In order for this to work with FuXi, you will need to change the `LayerNormalization` converter:

```python
@add_converter(operation_type='LayerNormalization', version=17)
def _(node: OnnxNode, graph: OnnxGraph) -> OperationConverterResult:
    node_attributes = node.attributes
    axis = node_attributes.get('axis', AXIS_DEFAULT_VALUE)
    epsilon = node_attributes.get('epsilon', EPSILON_DEFAULT_VALUE)
    if all(value_name in graph.initializers for value_name in node.input_values[1:]):
        input_value_info = graph.value_info[node.input_values[0]]
        input_shape = get_shape_from_value_info(input_value_info)
        torch_module = nn.LayerNorm(
            normalized_shape=(1536),  # input_shape[axis:] (this block!)
            eps=epsilon,
            elementwise_affine=True,
        )
```
@dsuhoi Thank you for the hint; there are a couple of other easy fixes as well (typing, etc.).
@juanqiu1 Yes, I managed to start the training process. I used an Nvidia A100 (40GB).
Hi,
I have an ONNX model. Here is one of the nodes in the ONNX graph:
When I tried to convert it to a torch model, it caused a KeyError.
It may be caused by
onnx2torch/onnx2torch/node_converters/clip.py
Line 60 in a8b0603
After adding conditions, the conversion works.
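The underlying issue is that ONNX `Clip` (opset 11+) takes `min`/`max` as optional *inputs*, which may be graph initializers (constants) or outputs of other nodes (dynamic). A converter that only looks bounds up in `graph.initializers` raises a `KeyError` when a bound is missing or dynamic. One possible fallback (a sketch, not the actual onnx2torch code; the `DynamicClip` module name is hypothetical) is to clamp at runtime using the bounds as extra tensor inputs:

```python
import torch
from torch import nn


class DynamicClip(nn.Module):
    """Hypothetical fallback for ONNX Clip whose min/max are not initializers:
    the bounds arrive as runtime tensor inputs instead of baked-in constants."""

    def forward(self, x, min_val=None, max_val=None):
        # Apply each bound only if it was actually provided (both are
        # optional in the ONNX Clip spec).
        if min_val is not None:
            x = torch.maximum(x, min_val)
        if max_val is not None:
            x = torch.minimum(x, max_val)
        return x


clip = DynamicClip()
x = torch.tensor([-2.0, 0.5, 3.0])
out = clip(x, torch.tensor(-1.0), torch.tensor(1.0))
```

For constant bounds the converter can still emit a plain `torch.clamp`-based module; the dynamic path is only needed when the bounds come from other nodes.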