chore: Add triton_distributed_rs wheel install to container build #135
…inding install when using --mount-workspace, add placeholder test to sanity check binding install
runtime/rust/python-wheel/python/tests/test_bindings_install.py
Moved the `triton_distributed_rs` install outside of the venv so it can be more easily tested in the current CI setup. Passes locally too.
…nference-server/triton_distributed into rmccormick/test_python_bindings
LGTM - could move to virtual env - could do in separate MR
Need to reconcile: need to make both the test and the vllm example work.
…nstall, additionally install built wheel to system interpreter for now
…nference-server/triton_distributed into rmccormick/test_python_bindings
…ributed into rmccormick/test_python_bindings
```shell
./container/build.sh
./container/run.sh -it   # or --mount-workspace for local dev/debug
$ python3 -c "import triton_distributed_rs"   # OK
$ source /opt/triton/venv/bin/activate
(venv) $ python3 -c "import triton_distributed_rs"   # OK
```

In future we may shift to using the virtual environment everywhere and remove the extra system-interpreter install. Also note that …
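Since the wheel ends up visible both to the system interpreter and inside `/opt/triton/venv`, it can help to confirm which copy a given interpreter actually resolves. A hedged helper sketch (`where_installed` is not part of this PR):

```python
import importlib.util

def where_installed(module):
    # Returns the filesystem path the module resolves from, or None
    # if it cannot be found. Running this before and after activating
    # the venv shows which site-packages wins for a given interpreter.
    spec = importlib.util.find_spec(module)
    return spec.origin if spec else None
```

For example, `python3 -c 'print(where_installed("triton_distributed_rs"))'` run with and without the venv activated would reveal whether the import above came from the system install or the venv install.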
```diff
 COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
 RUN mkdir /opt/triton && \
     uv venv /opt/triton/venv --python 3.12 && \
     source /opt/triton/venv/bin/activate && \
     cd runtime/rust/python-wheel && \
-    maturin develop
+    uv build && \
+    uv pip install dist/triton_distributed_rs*cp312*.whl
```
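The install line globs on `cp312` to pick out the wheel built for the venv's Python 3.12. Wheel filenames follow PEP 427 (`{dist}-{version}-{python tag}-{abi tag}-{platform tag}.whl`), so the tag can also be checked programmatically; `wheel_python_tag` and the example filename below are illustrative, not artifacts from this PR:

```python
def wheel_python_tag(filename):
    # Per PEP 427 the python tag is the third-from-last dash-separated
    # field of the wheel filename (before the abi and platform tags).
    stem = filename[: -len(".whl")]
    return stem.split("-")[-3]

# Hypothetical filename with the shape this Dockerfile glob matches:
tag = wheel_python_tag(
    "triton_distributed_rs-0.1.0-cp312-cp312-manylinux_2_34_x86_64.whl"
)
# tag == "cp312"
```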
I think if you do `maturin build` instead of `maturin develop`, then a virtual environment is not required.
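The distinction the comment relies on, sketched as CLI usage (paths assume maturin's default output directory):

```shell
# maturin develop builds AND installs into the currently active
# environment, so it expects a virtualenv (or conda env) to be active.
maturin develop

# maturin build only produces a wheel (under target/wheels/ by default),
# which can then be installed into any interpreter afterwards:
maturin build -r
pip install target/wheels/triton_distributed_rs-*.whl
```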
I did have that originally (f81c240), but the issue with having everything installed in the system environment (not the virtual environment) is that there were unresolved issues with trying to `pip install vllm` for the `examples/python_rs/llm/vllm` example in the system environment. See #134.

So for now, to keep both the old pure-python examples/tests and the new mixed python/rust examples working, I went with this approach.

I'd prefer to move towards all-or-nothing in terms of virtualenv usage, but I think we'd need to figure out the issues with the vllm install before we could remove the virtualenv.

I decided to move forward with the test/pipeline improvements before going back to debug the vllm install issue, because I figured we'd decide in the near future to install vllm>=0.7 in the container when doing `./build.sh --framework vllm` anyway - but we may need to avoid breaking the pure-python vllm disagg example (which patches the vllm install) when doing so.
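Related to the all-or-nothing venv question above: if examples need to behave differently inside and outside the venv during the transition, PEP 405 gives a standard way to detect which environment is active. A hedged sketch (`in_virtualenv` is not part of this PR):

```python
import sys

def in_virtualenv():
    # Per PEP 405, inside a venv sys.prefix points at the venv while
    # sys.base_prefix still points at the base Python installation.
    return sys.prefix != sys.base_prefix
```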
Changes:
- Fix the `triton_distributed_rs` python binding install when using `--mount-workspace` by using `maturin build` instead of `maturin develop` - see refactor: Modify rust vllm example to use container #132 (comment)
- Works with `./container/run.sh -it`, but fails with the addition of `--mount-workspace`
- Use `-r` (release build) in `maturin build -r`

Closes #148