Hi Team,
Starting with 0.10.0, TorchServe introduced open_inference_grpc.proto so that its gRPC APIs can follow the KServe Open Inference Protocol v2. However, I am wondering why the package name used in this proto differs from the one used in KServe. A different package name forces PyTorch and non-PyTorch models to use different proto definitions even though both follow the open inference protocol. Would it be possible to put open_inference_grpc.proto in the same package as the one defined in KServe's grpc_predict_v2.proto?
Thank you.
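To illustrate the mismatch: in proto3, the `package` declaration determines the fully qualified service and message names that generated client stubs bind to, so two files with identical message shapes but different packages produce incompatible stubs. The sketch below shows the kind of divergence being described; KServe's file declares `package inference;`, while the TorchServe package name shown is an assumption for illustration and should be checked against the actual open_inference_grpc.proto.

```protobuf
// grpc_predict_v2.proto (KServe)
syntax = "proto3";
package inference;
// Generated stubs address the service as inference.GRPCInferenceService.

// open_inference_grpc.proto (TorchServe) -- separate file; the package
// name below is an illustrative assumption, not copied from the repo.
// syntax = "proto3";
// package org.pytorch.serve.grpc.openinference;
// Even with identical messages, stubs generated from this file target
// org.pytorch.serve.grpc.openinference.* and cannot be reused for KServe.
```

If both files used `package inference;`, a single set of generated stubs could talk to either backend.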