Describe the issue
We are building the pipeline below to show how to run Whisper on Intel CPU/GPU/NPU through ORT + OVEP:
https://github.com/luke-lin-vmc/whisper-ovep-python-static
When running the pipeline through OVEP, we always get the warning below, no matter whether the device is CPU, GPU, or NPU:
Warning: Specified provider 'OpenVINOExecutionProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
Re-installing onnxruntime-openvino works around the issue:
pip uninstall -y onnxruntime-openvino
pip install onnxruntime-openvino~=1.23.0
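For reference, whether OVEP is registered at all can be checked from Python before any session is created. This is only a minimal diagnostic sketch, not part of the linked repo:
import onnxruntime as ort
# Print the providers registered by the installed onnxruntime build.
print("onnxruntime version:", ort.__version__)
print("Available providers:", ort.get_available_providers())
# Before the re-install this only lists AzureExecutionProvider and
# CPUExecutionProvider; after re-installing onnxruntime-openvino it is
# expected to also list OpenVINOExecutionProvider.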
To reproduce
Please refer to https://github.com/luke-lin-vmc/whisper-ovep-python-static; all the required files are there.
- To create a clean environment, run the commands below to uninstall all installed Python packages
pip freeze > un.txt
pip uninstall -y -r un.txt
- Run the following commands to export the models
pip install -r requirements.txt
python export-onnx.py --model base
- Run the pipeline on CPU
python whisper_onnx.py --model_type base --device CPU how_are_you_doing_today.wav
The warning below appears, and the pipeline actually does not use OpenVINO's CPU but falls back to the default CPUExecutionProvider
Warning: Specified provider 'OpenVINOExecutionProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
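For context, the session is requested with OpenVINOExecutionProvider and the selected device. The sketch below is only an approximation of what whisper_onnx.py does; the model path and the exact provider options in the repo may differ:
import onnxruntime as ort
# Approximation only; "whisper_encoder.onnx" is a placeholder path.
providers = [("OpenVINOExecutionProvider", {"device_type": "CPU"})]
session = ort.InferenceSession("whisper_encoder.onnx", providers=providers)
# When OVEP is not registered, ORT emits the warning above and silently
# falls back, so get_providers() only reports CPUExecutionProvider.
print(session.get_providers())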
- Reinstall onnxruntime-openvino to work around the issue
pip uninstall -y onnxruntime-openvino
pip install onnxruntime-openvino
- Run the pipeline on CPU again; no warning is shown
python whisper_onnx.py --model_type base --device CPU how_are_you_doing_today.wav
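The same provider check as in the description can be repeated to confirm the re-install took effect; the list is now expected to include OpenVINOExecutionProvider:
python -c "import onnxruntime as ort; print(ort.get_available_providers())"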
[Log]
Log file: https://github.com/luke-lin-vmc/whisper-ovep-python-static/blob/main/log_full.txt
Line #379: First run of the pipeline through OVEP
Line #386: The warning appears
Line #411 and Line #421: Apply the workaround by re-installing onnxruntime-openvino
Line #440: Run the pipeline again; no warning is shown
Urgency
Medium
Platform
Windows
OS Version
Windows 11 24H2
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.23.2
ONNX Runtime API
Python
Architecture
X64
Execution Provider
OpenVINO
Execution Provider Library Version
5.8
Model File
Please refer to https://github.com/luke-lin-vmc/whisper-ovep-python-static
Steps to export the model are in the [To reproduce] section
Is this a quantized model?
No