
[one-optimize] YAMNet Optimization Not Working #13790

Open
qsunki opened this issue Aug 27, 2024 · 12 comments
@qsunki
Contributor

qsunki commented Aug 27, 2024

What?

When I compile the YAMNet TFLite model to a circle format and run one-optimize, the model is compiled, but the optimization process fails. As a result, I get yamnet.circle but not yamnet.opt.circle. Below is the log file from one-optimize:

yamnet.opt.circle.log
I found the following message in the log:

circle2circle: terminate called after throwing an instance of 'std::invalid_argument'
circle2circle:   what():  loco::must_cast() failed to cast: PN4luci11CircleConstE

How to reproduce

Compiler: Built from release/1.28.0
Model and Config: yamnet.zip

I obtained this model from https://huggingface.co/thelou1s/yamnet/blob/main/lite-model_yamnet_tflite_1.tflite

  • yamnet.cfg
[onecc]
one-import-tflite=True
one-optimize=True

[one-import-tflite]
model_format=tflite
input_path=yamnet.tflite
output_path=yamnet.circle
input_arrays=input
input_shapes=1
output_arrays=predictions
converter_version=v2

[one-optimize]
input_path=yamnet.circle
output_path=yamnet.opt.circle
verbose=1
  • To reproduce the issue:
    Unzip yamnet.zip.
    Run the command: onecc -C yamnet.cfg
@seanshpark
Contributor

seanshpark commented Aug 27, 2024

From yamnet.circle,

  • looks like this Pad's paddings input is not constant
  • name: yamnet_frames/tf_op_layer_Pad_2/Pad_2;StatefulPartitionedCall/yamnet_frames/tf_op_layer_Pad_2/Pad_2
  • it failed here:
loco::NodeShape infer_pad(const luci::CirclePad *node)
{
  // TODO support non-const case
  auto paddings = loco::must_cast<luci::CircleConst *>(node->paddings());
  return use_paddings(node, paddings);
}

We have to work on the // TODO support non-const case part.

@seanshpark
Contributor

How to find this?

  • use VS Code's debugger with F5 (or gdb, but it's a bit harder to drive with commands...)
  • program: debug/compiler/circle2circle/circle2circle
  • arguments: yamnet.circle yamnet.opt.circle

Run the debugger and look at the call stack.
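
For the gdb route, roughly the same call stack can be obtained like this (assuming circle2circle was built with debug symbols); catch throw stops at the point where the std::invalid_argument is thrown, and bt prints the call stack:

$ gdb --args debug/compiler/circle2circle/circle2circle yamnet.circle yamnet.opt.circle
(gdb) catch throw
(gdb) run
(gdb) bt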

@seanshpark
Contributor

@zetwhite , @shs-park , any good ideas for this case?

@icodo98
Contributor

icodo98 commented Aug 28, 2024

In my opinion, the output shape after non-const paddings should be dynamic.
There are two cases for non-constant paddings:

case  input        paddings                output
1     [1,2,3,4]    [4,2] (known shape)     [?,?,?,?]
2     [1,2,3,4]    [?,2] (unknown shape)   [?,?,?,?]

For case 1, which is the YAMNet case, we know the shape of the paddings tensor but its values are determined by the input, so after the Pad layer the output shape is unknown and only the rank is preserved.
For case 2, even the shape of paddings is unknown. Since the shape of paddings should be [rank of input, 2], we have two choices:

  • throw an error
  • treat the output shape as unknown with only the rank preserved -> an error would then be thrown at runtime

In both cases the paddings shape should be [n, 2], so in the shape inference phase of the Pad operation it is sufficient to check whether the paddings tensor satisfies rank == 2 and dim(1).value == 2.
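
A minimal sketch of how infer_pad could handle this (just a sketch, not a tested patch; it assumes loco::shape_get() can be used on the input here and that a default-constructed loco::Dimension means "unknown"):

loco::NodeShape infer_pad(const luci::CirclePad *node)
{
  auto paddings = dynamic_cast<luci::CircleConst *>(node->paddings());
  if (paddings == nullptr)
  {
    // Non-const paddings: values are only known at runtime, so keep the
    // input rank and mark every dimension as unknown.
    // (A check that the paddings tensor has rank == 2 and dim(1) == 2 could
    //  also be added here.)
    auto input_shape = loco::shape_get(node->input()).as<loco::TensorShape>();

    loco::TensorShape output_shape;
    output_shape.rank(input_shape.rank());
    for (uint32_t axis = 0; axis < output_shape.rank(); ++axis)
      output_shape.dim(axis) = loco::Dimension(); // unknown dimension

    return loco::NodeShape{output_shape};
  }
  // Const paddings: existing path, unchanged.
  return use_paddings(node, paddings);
}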

@zetwhite
Contributor

IIUC, the error comes from must_cast: it is trying to cast a CirclePack into a CircleConst.
So it is not only about handling dynamic shapes; we also have to support CirclePack as a CirclePad's paddings input.

I will look into this and leave a comment if there is an easy way to do it.

@seanshpark
Contributor

seanshpark commented Aug 28, 2024

The graph:

As you can see, the first input node of the CirclePack is a CircleShape, which would produce a constant output.
And with those constant outputs, all the other Ops in between could produce constant outputs as well.
That is, if we implemented proper constant folding for these Ops, shape inference for Pad would work.

But as this model's input shape is dynamic, we can't apply that procedure ... -_-;

@seanshpark
Contributor

seanshpark commented Aug 28, 2024

About YAMNet ... I was curious why the input is 1D.

It is audio ...

  • input name is waveform

So "dynamic input" here means that the number of audio samples can be anything ... :)

@icodo98
Contributor

icodo98 commented Aug 28, 2024

For case 1, which is the YAMNet case, we know the shape of the paddings tensor but its values are determined by the input, so after the Pad layer the output shape is unknown and only the rank is preserved.

In TensorFlow, this case is supported.
I made a simple TFLite model with this code:
simple.zip

For a paddings tensor with dynamic data, the model's output shape is unknown and only the rank is known.
How about following this behavior as the reference?

@shs-park
Contributor

How about following this behavior as the reference?

I think it is good to follow. 😅

@kyeong8139
Contributor

Issue

There is a problem with Range's shape inference when trying to optimize yamnet.tflite.
If the parameters (start, limit, or delta) are dynamic values, I expect the output to have a dynamic shape (represented as ?).
However, the actual inferred output dimension is 1.

  • yamnet.tflite (screenshot)

  • yamnet.opt.circle (screenshot)

Reason

loco::NodeShape infer_range(const luci::CircleRange *node)
{
  loco::TensorShape output_shape;
  output_shape.rank(1);
  auto start_node = dynamic_cast<luci::CircleConst *>(node->start());
  auto limit_node = dynamic_cast<luci::CircleConst *>(node->limit());
  auto delta_node = dynamic_cast<luci::CircleConst *>(node->delta());
  if (start_node == nullptr || limit_node == nullptr || delta_node == nullptr)
  {
    return use_own(node);
  }

The issue occurs because the dynamic_cast to CircleConst fails and returns nullptr, so shape inference falls back to use_own(node).
Later, when dim(axis).known() is false, a dimension size of 1 is used, which is why the output shows 1 instead of ?.
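
One possible direction, sketched below (only a sketch; it assumes a default-constructed loco::Dimension is treated as unknown, and the constant branch is reduced to FLOAT32 with std::ceil from <cmath> for brevity):

loco::NodeShape infer_range(const luci::CircleRange *node)
{
  loco::TensorShape output_shape;
  output_shape.rank(1);

  auto start_node = dynamic_cast<luci::CircleConst *>(node->start());
  auto limit_node = dynamic_cast<luci::CircleConst *>(node->limit());
  auto delta_node = dynamic_cast<luci::CircleConst *>(node->delta());
  if (start_node == nullptr || limit_node == nullptr || delta_node == nullptr)
  {
    // Non-const start/limit/delta: the element count cannot be computed at
    // compile time, so keep rank 1 and leave the dimension unknown instead
    // of falling back to a fixed 1.
    output_shape.dim(0) = loco::Dimension(); // unknown dimension
    return loco::NodeShape{output_shape};
  }

  // Const case, same idea as today: element count is ceil((limit - start) / delta).
  double start = start_node->at<loco::DataType::FLOAT32>(0);
  double limit = limit_node->at<loco::DataType::FLOAT32>(0);
  double delta = delta_node->at<loco::DataType::FLOAT32>(0);
  output_shape.dim(0) = static_cast<uint32_t>(std::ceil((limit - start) / delta));
  return loco::NodeShape{output_shape};
}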

I would like to take on this issue to support the optimization of yamnet.tflite.

@shs-park
Contributor

@kyeong8139,
I think it is good to add the Range operation to the Others table in this issue #13697

@pcs1265
Member

pcs1265 commented Sep 27, 2024

As we decided that the Reshape operation's shape inference (sinf) can output a shape in which all dimensions are unknown (e.g., ? x ? x ? x ?), the Conv2D operation that follows it should also support dynamic shapes.

Currently, Conv2D and DepthwiseConv2D do not support unknown dimensions except for the batch size.

So, I am working on adding that support now.
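
For illustration, the spatial part of that could look roughly like the sketch below (compute_output_dim is a made-up helper name for this sketch; the real code also has to deal with explicit padding values, data layout, etc.):

// Output H (or W) dimension of Conv2D / DepthwiseConv2D for one spatial axis.
loco::Dimension compute_output_dim(const loco::Dimension &input, uint32_t filter,
                                   uint32_t stride, uint32_t dilation, bool same_padding)
{
  if (not input.known())
    return loco::Dimension(); // unknown input dim -> unknown output dim

  uint32_t effective_filter = dilation * (filter - 1) + 1;
  if (same_padding)
    return loco::Dimension((input.value() + stride - 1) / stride); // ceil(input / stride)
  // VALID padding
  return loco::Dimension((input.value() - effective_filter) / stride + 1);
}

The batch and channel dimensions can be passed through the same way: copied when known, left unknown otherwise.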
