OpenVINO Execution Provider #1096
bopeng1234 started this conversation in Support for Targets (OS / EPs / Hardware)
Replies: 2 comments
-
Yes, onnxruntime-genai doesn't currently support the OpenVINO EP, but we can add basic support for it (where onnxruntime uses it). I think GenAI doing CPU scoring is no problem for OpenVINO, since it is CPU based.
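For reference, onnxruntime-genai selects execution providers through the `provider_options` list in the model's `genai_config.json`. A CUDA entry looks roughly like the sketch below; an `openvino` key in the same spot is hypothetical and would only work once the support discussed here is actually added:

```json
{
  "model": {
    "decoder": {
      "session_options": {
        "provider_options": [
          { "cuda": {} }
        ]
      }
    }
  }
}
```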
-
Thanks. Is there a branch in progress for adding the OpenVINO EP to onnxruntime-genai?
-
Hi there,
I am wondering whether onnxruntime-genai supports OpenVINO as an execution provider. The examples only cover CPU/CUDA/DML; there is no OpenVINO sample.
To my understanding, the OpenVINO EP can currently only run single-shot inference (for example, traditional image classification), but not text generation, which needs multi-round inference and KV caching. Is that correct?
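As background, single-shot inference with plain onnxruntime already works with a provider preference list: the session takes the first provider it can use and falls back to CPU. This sketch assumes onnxruntime is installed (and `OpenVINOExecutionProvider` only appears if the wheel was built with OpenVINO support); the try/except fallback is just so the sketch runs anywhere:

```python
# Sketch: pick execution providers from a preference list, falling
# back to the always-available CPU EP.
try:
    import onnxruntime as ort
    available = ort.get_available_providers()
except ImportError:
    # Illustrative fallback for environments without onnxruntime.
    available = ["CPUExecutionProvider"]

preferred = ["OpenVINOExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"]
chosen = [p for p in preferred if p in available]
print(chosen)
```

A session would then be created with `ort.InferenceSession("model.onnx", providers=chosen)` (`model.onnx` is a placeholder path). The multi-round generation loop with KV caching is what onnxruntime-genai layers on top of such a session.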
Best Regards!