
How do we install this to use with RIFE V2? #128

Open

FurkanGozukara opened this issue Mar 3, 2025 · 1 comment
Comments


FurkanGozukara commented Mar 3, 2025

I want to use RIFE v2 with code like the one below.

How do I install it?

import vsmlrt
output = vsmlrt.inference(rgbs, "path/to/onnx", backend=vsmlrt.Backend.TRT(fp16=True))
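
For context, rgbs in the snippet above is a VapourSynth clip in RGBS format, i.e. 32-bit float RGB. A minimal sketch of preparing such a clip, assuming VapourSynth and the ffms2 source plugin are available (the input path and color matrix are placeholders):

    import vapoursynth as vs

    core = vs.core

    # Load the source video (assumes the ffms2 source plugin is installed;
    # any VapourSynth source filter would work here)
    clip = core.ffms2.Source("input.mkv")

    # vsmlrt.inference expects an RGBS (32-bit float RGB) clip;
    # matrix_in_s must match the actual source colorimetry
    rgbs = core.resize.Bicubic(clip, format=vs.RGBS, matrix_in_s="709")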

I have downloaded rife_v4.25_heavy.7z

How do I install vsmlrt? The readme has no install instructions.

I need to install it into a Python venv, or is there another way to use it with a Python venv?

Currently I am using the following approach and it works.

I am using this repo: https://github.com/hzwer/Practical-RIFE/

with this code:

    cmd = f'"{sys.executable}" "Practical-RIFE/inference_video.py" --model="{model_dir}" --multi={multiplier_val} --video="{last_video_path}" --output="{improved_video}"'

and these model files: 4.25.lite - 2024.10.20
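
For reference, one way a command string like the one above can be run from Python is with subprocess in list form, which avoids shell quoting issues (a sketch only; model_dir, multiplier_val, last_video_path and improved_video are assumed to be defined as above):

    import subprocess
    import sys

    cmd = [
        sys.executable, "Practical-RIFE/inference_video.py",
        f"--model={model_dir}",
        f"--multi={multiplier_val}",
        f"--video={last_video_path}",
        f"--output={improved_video}",
    ]
    # Run the interpolation and raise if it exits with an error
    subprocess.run(cmd, check=True)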


I need the same kind of usage.

WolframRhodium (Contributor) commented Mar 4, 2025

You don't need to use ONNX in this repo if you are using the official Practical-RIFE repository.
