
Initial setup for LiteRT (TensorFlow Lite) model tests. #59

Merged
3 commits merged into iree-org:main on Jan 15, 2025

Conversation

ScottTodd
Member

Progress on #5.

This contains two simple test cases for demonstration purposes, one of which is currently failing due to a regression: iree-org/iree#19402.

The test suite follows the same structure as the onnx_models test suite in this repository. Opportunities for cleanup and refactoring will become clearer as this grows; for example, we could share the compile_mlir_with_iree helper function between both test suites.
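A shared helper could be quite small. A minimal sketch of what the shared function might look like (the function name comes from the existing suite, but the exact flags and signature here are assumptions, not the current implementation):

```python
from pathlib import Path


def compile_mlir_with_iree(
    mlir_path: Path, vmfb_path: Path, backend: str = "llvm-cpu"
) -> list[str]:
    # Build the iree-compile invocation; callers run it via subprocess.
    return [
        "iree-compile",
        str(mlir_path),
        f"--iree-hal-target-backends={backend}",
        "-o",
        str(vmfb_path),
    ]


cmd = compile_mlir_with_iree(Path("model.mlir"), Path("model_cpu.vmfb"))
# import subprocess; subprocess.run(cmd, check=True)  # when iree-compile is on PATH
```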

Comment on lines +32 to +37
# TODO(#5): test iree-run-module success and numerics
# * On Linux...
# * Determine interface via ai-edge-litert / tflite-runtime
# * Produce test inputs, save to .bin for IREE
# * Produce golden test outputs, save to .bin for IREE
# * Run with inputs and expected outputs
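The "save to .bin" steps could lean on numpy's raw serialization, since iree-run-module can read raw buffers from files. A minimal sketch of the round-trip (the shape and filenames are illustrative; real inputs and golden outputs would come from running the model through the ai-edge-litert / tflite-runtime interpreter):

```python
from pathlib import Path

import numpy as np


def save_bin(array: np.ndarray, path: Path) -> None:
    # Write the raw buffer; iree-run-module can then consume it, e.g.
    # --input=1x224x224x3xf32=@input_0.bin
    array.tofile(path)


# Illustrative random input for a 224x224 MobileNet-style model.
rng = np.random.default_rng(seed=0)
test_input = rng.random((1, 224, 224, 3)).astype(np.float32)
save_bin(test_input, Path("input_0.bin"))

# Reading it back requires supplying the dtype and shape, since .bin is raw bytes.
round_trip = np.fromfile("input_0.bin", dtype=np.float32).reshape(1, 224, 224, 3)
```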
Member Author


# See https://llvm.org/LICENSE.txt for license information.
# SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception

# https://www.kaggle.com/models/tensorflow/mobilenet-v1/
Member Author


Starting with just this one model test in this initial PR. We can port over the previous IREE tests at https://github.com/iree-org/iree/tree/main/integrations/tensorflow/test/python/iree_tfl_tests and then also import popular models from https://www.kaggle.com/models/?orderby=downloadCount&framework=tfLite.

Collaborator

@zjgarvey zjgarvey left a comment


Looks good to me. Thanks for writing good documentation.



def download_from_kagglehub(kaggle_model_name: str) -> Path:
model_dir = Path(kagglehub.model_download(kaggle_model_name))
Collaborator


Where does this get downloaded to? Does it go to the kagglehub cache?

Member Author


Yeah. See in the README:

### Kaggle

Models are downloaded using https://github.com/Kaggle/kagglehub.

By default, kagglehub caches downloads at `~/.cache/kagglehub/models/`. This
can be overridden by setting the `KAGGLEHUB_CACHE` environment variable. See the
[`kagglehub/config.py` source](https://github.com/Kaggle/kagglehub/blob/main/src/kagglehub/config.py)
for other configuration options.

For example:

PS C:\Users\Scott\.cache\kagglehub> tree /f /a
C:.
\---models
    \---tensorflow
        \---mobilenet-v1
            \---tfLite
                +---0-25-224
                |   |   1.complete
                |   |
                |   \---1
                |           1.mlir
                |           1.mlirbc
                |           1.tflite
                |           1_cpu.vmfb
                |
                \---0-25-224-quantized
                    |   1.complete
                    |
                    \---1
                            1.mlirbc
                            1.tflite
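The layout above maps directly onto Kaggle model handles. A minimal sketch of resolving where a handle would land in the cache (the directory scheme here is inferred from the tree above, not taken from kagglehub's documented API):

```python
import os
from pathlib import Path

DEFAULT_CACHE = Path.home() / ".cache" / "kagglehub"


def cached_model_dir(handle: str, version: int = 1) -> Path:
    # handle format: "<owner>/<model>/<framework>/<variation>",
    # e.g. "tensorflow/mobilenet-v1/tfLite/0-25-224".
    cache_root = Path(os.environ.get("KAGGLEHUB_CACHE", DEFAULT_CACHE))
    return cache_root / "models" / Path(handle) / str(version)


model_dir = cached_model_dir("tensorflow/mobilenet-v1/tfLite/0-25-224")
```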

This is a first pass to get something working. I don't want the .mlir/.mlirbc/.vmfb files to sit in the same user/system-wide cache directory. There is also a TODO below:

# TODO(scotttodd): TEST_SUITE_ROOT/artifacts dir like in onnx_models/

The structure there is (condensed):

PS D:\dev\projects\iree-test-suites\onnx_models\artifacts> tree /f /a
D:.
\---model_zoo
    \---validated
        \---vision
            +---body_analysis
            |       age_googlenet.onnx
            |       age_googlenet_version17.mlir
            |       age_googlenet_version17.onnx
            |       age_googlenet_version17_cpu.vmfb
            |       age_googlenet_version17_input_0.bin
            |       age_googlenet_version17_output_0.bin
            |       emotion-ferplus-8.onnx
            |       emotion-ferplus-8_version17.mlir
            |       emotion-ferplus-8_version17.onnx
            |       emotion-ferplus-8_version17_cpu.vmfb
            |       emotion-ferplus-8_version17_input_0.bin
            |       emotion-ferplus-8_version17_output_0.bin
            |
            +---classification
            |       bvlcalexnet-12.onnx
            |       bvlcalexnet-12_version17.mlir
            |       bvlcalexnet-12_version17.onnx
            |       bvlcalexnet-12_version17_cpu.vmfb
            |       bvlcalexnet-12_version17_input_0.bin
            |       bvlcalexnet-12_version17_output_0.bin
            |       caffenet-12.onnx
            |       caffenet-12_version17.mlir
            |       caffenet-12_version17.onnx
            |       caffenet-12_version17_cpu.vmfb
            |       caffenet-12_version17_input_0.bin
            |       caffenet-12_version17_output_0.bin
            |
            \---super_resolution
                    super-resolution-10.onnx
                    super-resolution-10_version17.mlir
                    super-resolution-10_version17.onnx
                    super-resolution-10_version17_cpu.vmfb
                    super-resolution-10_version17_input_0.bin
                    super-resolution-10_version17_output_0.bin

I think I'd like a model similar to huggingface's cache: a concrete cache in the user's home directory (or wherever they point the environment variable), with symlinks from those files into a directory associated with the test suite. That way:

  • other users/tools/etc. on the system can benefit from that common cache without seeing test artifacts mixed in
  • the test suite artifacts dir can be inspected and worked within (as long as symlinks are followed)
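That linking step could be a small helper. A minimal sketch (the names are hypothetical; on Windows, creating symlinks may require developer mode, so a junction or copy fallback might be needed):

```python
import tempfile
from pathlib import Path


def link_into_artifacts(cached_file: Path, artifacts_dir: Path) -> Path:
    # Expose a file from the shared cache inside the test suite's artifacts
    # tree without duplicating it on disk.
    artifacts_dir.mkdir(parents=True, exist_ok=True)
    link = artifacts_dir / cached_file.name
    link.unlink(missing_ok=True)  # replace any stale link from a prior run
    link.symlink_to(cached_file.resolve())
    return link


# Demo with throwaway directories standing in for the cache and artifacts dirs.
cache = Path(tempfile.mkdtemp())
(cache / "1.tflite").write_bytes(b"model")
link = link_into_artifacts(cache / "1.tflite", Path(tempfile.mkdtemp()) / "mobilenet")
```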

@ScottTodd
Member Author

I'm going to want to refactor this and share code with the ONNX model test suite. I'll merge the PR as-is after syncing and running the test workflow, then refactor in a follow-up.

@ScottTodd ScottTodd merged commit 1919511 into iree-org:main Jan 15, 2025
3 checks passed
@ScottTodd ScottTodd deleted the litert-test-setup branch January 15, 2025 21:00