Add fp32 resnet50 model for vitisai ep demo #116
Json288 wants to merge 2 commits into microsoft:main
Conversation
Signed-off-by: Song <jamesong@amd.com>
url: "webnn/efficientnet-lite4/resolve/main/onnx/model_fp16.onnx",
path: "./demos/image-classification/models/webnn/efficientnet-lite4/onnx",
},
{
I was testing out the changes in this branch, and it looks like I can't fetch the full suite of models (after the seventh one, the npm command exits and doesn't download the remaining ones). Are you seeing anything similar? Example output below.
PS C:\src\webnn-developer-preview-amd> npm run fetch-models
> webnn-developer-preview@1.0.0 fetch-models
> node fetch_models.js
[1/40] Downloading https://huggingface.co/xenova/resnet-50/resolve/main/onnx/model_fp16.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\xenova\resnet-50\onnx\model_fp16.onnx
-> downloading [========================================] 100% 0.0s
[1/40] Downloaded https://huggingface.co/xenova/resnet-50/resolve/main/onnx/model_fp16.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\xenova\resnet-50\onnx\model_fp16.onnx
[2/40] Downloading https://huggingface.co/webnn/mobilenet-v2/resolve/main/onnx/model_fp16.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\webnn\mobilenet-v2\onnx\model_fp16.onnx
-> downloading [========================================] 100% 0.0s
[2/40] Downloaded https://huggingface.co/webnn/mobilenet-v2/resolve/main/onnx/model_fp16.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\webnn\mobilenet-v2\onnx\model_fp16.onnx
[3/40] Downloading https://huggingface.co/webnn/efficientnet-lite4/resolve/main/onnx/model_fp16.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\webnn\efficientnet-lite4\onnx\model_fp16.onnx
-> downloading [========================================] 100% 0.0s
[3/40] Downloaded https://huggingface.co/webnn/efficientnet-lite4/resolve/main/onnx/model_fp16.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\webnn\efficientnet-lite4\onnx\model_fp16.onnx
[4/40] Downloading https://huggingface.co/amd/resnet50/resolve/main/webnn/onnx/model.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\amd\resnet50\onnx\model.onnx
-> downloading [========================================] 100% 0.0s
[4/40] Downloaded https://huggingface.co/amd/resnet50/resolve/main/webnn/onnx/model.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\amd\resnet50\onnx\model.onnx
[5/40] Downloading https://huggingface.co/amd/MobileNetV2/resolve/main/webnn/onnx/model.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\amd\MobileNetV2\onnx\model.onnx
-> downloading [========================================] 100% 0.0s
[5/40] Downloaded https://huggingface.co/amd/MobileNetV2/resolve/main/webnn/onnx/model.onnx to C:\src\webnn-developer-preview-amd\demos\image-classification\models\amd\MobileNetV2\onnx\model.onnx
[6/40] Downloading https://huggingface.co/microsoft/sd-turbo-webnn/resolve/main/text_encoder/model_layernorm.onnx to C:\src\webnn-developer-preview-amd\demos\sd-turbo\models\text_encoder\model_layernorm.onnx
-> downloading [========================================] 100% 0.0s
[6/40] Downloaded https://huggingface.co/microsoft/sd-turbo-webnn/resolve/main/text_encoder/model_layernorm.onnx to C:\src\webnn-developer-preview-amd\demos\sd-turbo\models\text_encoder\model_layernorm.onnx
[7/40] Downloading https://huggingface.co/microsoft/sd-turbo-webnn/resolve/main/unet/model_layernorm.onnx to C:\src\webnn-developer-preview-amd\demos\sd-turbo\models\unet\model_layernorm.onnx
PS C:\src\webnn-developer-preview-amd>
No, I did not see a similar issue with fetch_models.js. Based on the log you provided, the added AMD models were downloaded without error. Could you check whether the AMD models exist in the corresponding folders? And could you run the fetch again to see if the problem persists?
The new models exist, but I think the problem is that the rest of the models aren't getting downloaded for whatever reason. I'd expect to see ~40 models downloaded. So, I'm not sure if something regressed here, as I don't see this behavior on the main branch of the parent repo.
C:\src\webnn-developer-preview-amd>dir /s /b *.onnx
C:\src\webnn-developer-preview-amd\demos\image-classification\models\amd\MobileNetV2\onnx\model.onnx
C:\src\webnn-developer-preview-amd\demos\image-classification\models\amd\resnet50\onnx\model.onnx
C:\src\webnn-developer-preview-amd\demos\image-classification\models\webnn\efficientnet-lite4\onnx\model_fp16.onnx
C:\src\webnn-developer-preview-amd\demos\image-classification\models\webnn\mobilenet-v2\onnx\model_fp16.onnx
C:\src\webnn-developer-preview-amd\demos\image-classification\models\xenova\resnet-50\onnx\model_fp16.onnx
C:\src\webnn-developer-preview-amd\demos\sd-turbo\models\text_encoder\model_layernorm.onnx
C:\src\webnn-developer-preview-amd\demos\sd-turbo\models\unet\model_layernorm.onnx
Well, not sure what changed in the previous few hours (I restarted and changed networks in the interim, so maybe something there got things unstuck), but at the moment, running the script seems to make it past the seventh model (it's downloading number 13 as I write this). We can consider this resolved.
I jinxed it. It bailed out after the 14th model finished downloading. Trying again. 😢
Hm, there's some degree of flakiness that I'm observing. At any rate, will treat this as a separate issue. Thanks for confirming behavior on your side.
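For what it's worth, flaky aborts like this are usually handled with a retry wrapper around each download. A minimal sketch, assuming nothing about the script's internals (`withRetry` and its parameters are hypothetical names, not the repo's actual fetch_models.js code):

```javascript
// Hypothetical retry helper: wraps one download attempt with retries and
// exponential backoff, so a transient network drop doesn't silently end
// the whole run. `download` is any async function that throws on failure.
async function withRetry(download, retries = 3, baseDelayMs = 500) {
    for (let attempt = 0; ; attempt++) {
        try {
            return await download();
        } catch (err) {
            if (attempt >= retries) throw err; // out of retries: surface the error
            console.warn(`Retrying after error: ${err.message} (${retries - attempt} retries left)`);
            // Back off 500 ms, 1 s, 2 s, ... before the next attempt.
            await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt));
        }
    }
}
```

Usage would be something like `await withRetry(() => downloadFile(url, path))` around each entry in the download loop.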
[26/40] Downloading https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.encoder-fp16.onnx to C:\src\webnn-developer-preview-amd\demos\segment-anything\models\sam_vit_b_01ec64.encoder-fp16.onnx
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.encoder-fp16.onnx (2 retries left)
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.encoder-fp16.onnx (1 retries left)
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.encoder-fp16.onnx after multiple attempts Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.encoder-fp16.onnx Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
[27/40] Downloading https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.decoder-orig-img-size-dynamic-fp16.onnx to C:\src\webnn-developer-preview-amd\demos\segment-anything\models\sam_vit_b_01ec64.decoder-orig-img-size-dynamic-fp16.onnx
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.decoder-orig-img-size-dynamic-fp16.onnx (2 retries left)
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.decoder-orig-img-size-dynamic-fp16.onnx (1 retries left)
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.decoder-orig-img-size-dynamic-fp16.onnx after multiple attempts Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b_01ec64.decoder-orig-img-size-dynamic-fp16.onnx Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
[28/40] Downloading https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-encoder-int8.onnx to C:\src\webnn-developer-preview-amd\demos\segment-anything\models\sam_vit_b-encoder-int8.onnx
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-encoder-int8.onnx (2 retries left)
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-encoder-int8.onnx (1 retries left)
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-encoder-int8.onnx after multiple attempts Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-encoder-int8.onnx Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
[29/40] Downloading https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-decoder-orig-img-size-dynamic-int8.onnx to C:\src\webnn-developer-preview-amd\demos\segment-anything\models\sam_vit_b-decoder-orig-img-size-dynamic-int8.onnx
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-decoder-orig-img-size-dynamic-int8.onnx (2 retries left)
Retrying download for https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-decoder-orig-img-size-dynamic-int8.onnx (1 retries left)
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-decoder-orig-img-size-dynamic-int8.onnx after multiple attempts Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
Failed to download https://huggingface.co/webnn/segment-anything-model-webnn/resolve/main/sam_vit_b-decoder-orig-img-size-dynamic-int8.onnx Error: HTTP error! Status: 404 Not Found
at downloadFile (file:///C:/src/webnn-developer-preview-amd/fetch_models.js:206:19)
at process.processTicksAndRejections (node:internal/process/task_queues:103:5)
at async file:///C:/src/webnn-developer-preview-amd/fetch_models.js:226:13
[30/40] Downloading https://huggingface.co/webnn/whisper-base-webnn/resolve/main/whisper_base_decoder_static_kvcache_128_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx to C:\src\webnn-developer-preview-amd\demos\whisper-base\models\whisper_base_decoder_static_kvcache_128_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx
-> downloading [========================================] 100% 0.0s
[30/40] Downloaded https://huggingface.co/webnn/whisper-base-webnn/resolve/main/whisper_base_decoder_static_kvcache_128_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx to C:\src\webnn-developer-preview-amd\demos\whisper-base\models\whisper_base_decoder_static_kvcache_128_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx
[31/40] Downloading https://huggingface.co/webnn/whisper-base-webnn/resolve/main/whisper_base_decoder_static_non_kvcache_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx to C:\src\webnn-developer-preview-amd\demos\whisper-base\models\whisper_base_decoder_static_non_kvcache_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx
-> downloading [========================================] 100% 0.0s
[31/40] Downloaded https://huggingface.co/webnn/whisper-base-webnn/resolve/main/whisper_base_decoder_static_non_kvcache_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx to C:\src\webnn-developer-preview-amd\demos\whisper-base\models\whisper_base_decoder_static_non_kvcache_lm_fp16_layernorm_gelu_4dmask_iobinding.onnx
[32/40] Downloading https://huggingface.co/webnn/whisper-base-webnn/resolve/main/whisper_base_encoder_lm_fp16_layernorm_gelu.onnx to C:\src\webnn-developer-preview-amd\demos\whisper-base\models\whisper_base_encoder_lm_fp16_layernorm_gelu.onnx
-> downloading [========================================] 100% 0.0s
[32/40] Downloaded https://huggingface.co/webnn/whisper-base-webnn/resolve/main/whisper_base_encoder_lm_fp16_layernorm_gelu.onnx to C:\src\webnn-developer-preview-amd\demos\whisper-base\models\whisper_base_encoder_lm_fp16_layernorm_gelu.onnx
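Incidentally, the 404 failures earlier in this log were retried even though a 404 will never succeed on a second attempt. A small sketch (illustrative only; this is not the script's actual downloadFile logic) of classifying failures so the loop can fail fast on 4xx and reserve retries for transient errors:

```javascript
// Hypothetical status classifier: 4xx responses (like the 404s above)
// are permanent and not worth retrying; 5xx and network-level failures
// are transient and may succeed on retry.
function classifyStatus(status) {
    if (status >= 200 && status < 300) return "ok";
    if (status >= 400 && status < 500) return "permanent"; // e.g. 404 Not Found
    return "transient"; // 5xx and anything else: worth retrying
}
```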
@@ -0,0 +1,2026 @@
{
"architectures": [
"ResNetForImageClassification"
I was trying the sample locally; I did get back an inference, but I also saw some log output indicating attempts at local file-system access... is this expected?
[60792:33108:0407/135409.409:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [FATAL: onnxruntime, vaiml_mlopslib_partition_rule.cpp:1317 vaiml_mlopslib_partition_rule.cpp] Failed to open "C:\\temp\\adityar\\vaip\\.cache\\ef572deb32890331e9d986c1826163b6\\aie_unsupported_original_ops.json"
Including the logs from about://gpu here as well. It seems that the computation in my local testing may be happening on the CPU?
[53892:58412:0407/140321.915:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, stat.cpp:193 stat.cpp] [Vitis AI EP] No. of Operators :
[53892:58412:0407/140321.915:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, stat.cpp:204 stat.cpp] CPU 124
[53892:58412:0407/140321.915:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, stat.cpp:213 stat.cpp]
The failed file access caused the model to fall back to CPU. But that on-disk file access should not happen. Could you share the steps to reproduce this?
Sure. As prerequisites, you'll need the 1.8.59 AMD NPU EP installed, along with the 2.0 Preview2 Windows App SDK build (you can run the installer from here). I used Google Chrome Canary as my test browser.
I cloned your fork locally, pulled down the models locally, and ran the site via npm run dev.
Once the server was running, I launched Google Chrome Canary with the following incantation:
PS C:\Users\adityar\AppData\Local\Google\Chrome SxS\Application> .\chrome.exe --no-first-run --no-default-browser-check --enable-features="WebMachineLearningNeuralNetwork,WebNNOnnxRuntime" --webnn-ort-ignore-ep-blocklist --webnn-ort-logging-level=VERBOSE --webnn-ort-library-path-for-testing="C:\Program Files\WindowsApps\Microsoft.WindowsAppRuntime.2-preview2_2.0.0.2_x64__8wekyb3d8bbwe" --webnn-ort-ep-device="VitisAIExecutionProvider,0x1022,0x17F0" --allow-third-party-modules about:blank --webnn-ort-ep-library-path-for-testing=VitisAIExecutionProvider?"C:\Program Files\WindowsApps\MicrosoftCorporationII.WinML.AMD.NPU.EP.1.8_1.8.59.0_x64__8wekyb3d8bbwe\ExecutionProvider\onnxruntime_vitisai_ep.dll"
(More details on some of these arguments are available here; if you have any questions about any of them, let me know.)
I then navigated to the local web site (http://127.0.0.1:8080/), loaded the Image Classification sample, and tried running the FP32 model. I was inspecting the verbose ORT logs via the chrome://gpu page.
If I should be using a newer AMD NPU EP, please let me know (I know there's been ongoing work to eliminate the file accesses during model compilation; maybe I am on a build that doesn't have the latest fixes).
My test EP seems to be older than the EP version shown in your log, but I also tried a newer version and saw no issue there either. Could you test with a newer EP? I will try this specific version in the meantime.
Sorry for the delay on my end. Hm, you are right: moving the EP binaries to a different directory seems to have gotten me further (I will need to investigate why that is the case).
My command line:
PS C:\Users\adityar\AppData\Local\Google\Chrome SxS\Application> .\chrome.exe --no-first-run --no-default-browser-check --enable-features="WebMachineLearningNeuralNetwork,WebNNOnnxRuntime" --webnn-ort-ignore-ep-blocklist --webnn-ort-logging-level=VERBOSE --webnn-ort-library-path-for-testing="C:\Program Files\WindowsApps\Microsoft.WindowsAppRuntime.2-preview2_2.0.0.2_x64__8wekyb3d8bbwe" --webnn-ort-ep-device="VitisAIExecutionProvider,0x1022,0x17F0" --allow-third-party-modules --webnn-ort-ep-library-path-for-testing=VitisAIExecutionProvider?"C:\Program Files\AMD-EP\onnxruntime_vitisai_ep.dll"
Contents of that AMD-EP directory:
PS C:\Program Files\AMD-EP> dir
Directory: C:\Program Files\AMD-EP
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 3/26/2026 10:49 PM 10102944 aiecompiler_client.dll
-a--- 2/24/2026 3:41 PM 1897555 AMD_WINML_0217_Drop_for_MSFT_Third_Party_Notices.pdf
-a--- 3/26/2026 10:49 PM 151787168 dyn_dispatch_core.dll
-a--- 3/26/2026 10:49 PM 285744 onnxruntime_providers_vitisai.dll
-a--- 3/26/2026 10:49 PM 1414816 onnxruntime_vitis_ai_custom_ops.dll
-a--- 3/26/2026 10:49 PM 147870368 onnxruntime_vitisai_ep.dll
-a--- 10/23/2025 11:21 AM 16180139 third-party.zip
-a--- 3/26/2026 10:49 PM 322732232 vaiml.dll
-a--- 3/26/2026 10:49 PM 593096 zlib.dll
I got further, but there's a new error now: something related to driver detection, possibly? Is this a problem with the driver on my device?
[17116:17492:0420/091155.319:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:1369 vitisai_compile_model.cpp] Vitis AI EP Load ONNX Model Success
[17116:17492:0420/091155.319:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:1370 vitisai_compile_model.cpp] Graph Input Node Name/Shape (1)
[17116:17492:0420/091155.319:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:1374 vitisai_compile_model.cpp] input_1 : [1x3x224x224]
[17116:17492:0420/091155.320:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:1380 vitisai_compile_model.cpp] Graph Output Node Name/Shape (1)
[17116:17492:0420/091155.320:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:1384 vitisai_compile_model.cpp] logits_0 : [1x1000]
[17116:17492:0420/091155.387:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:455 vitisai_compile_model.cpp] File base signature :
[17116:17492:0420/091155.387:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:456 vitisai_compile_model.cpp] Algorithm-A: based on topologically ordered signature : ef572deb32890331e9d986c1826163b6
[17116:17492:0420/091155.387:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:458 vitisai_compile_model.cpp] Algorithm-B: based on graph inputs/outputs signature : 1f0eaf1ff73323fcca7c147314662faa
[17116:17492:0420/091155.387:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:460 vitisai_compile_model.cpp] Algorithm-B: node count: 122
[17116:17492:0420/091155.387:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vitisai_compile_model.cpp:436 vitisai_compile_model.cpp] Can not find signature in meptable , use in memory signature ef572deb32890331e9d986c1826163b6
[17116:17492:0420/091155.390:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, cache_dir.cpp:70 cache_dir.cpp] skip update cache dir: in-mem mode
[17116:17492:0420/091155.421:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vaiml_config.hpp:349 vaiml_config.hpp] VAIP commit: 09c5d5065a9a44c02518ac86a22065b7af38ad71
[17116:17492:0420/091155.423:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, vaip_driver_info.cpp:193 vaip_driver_info.cpp] Driver detection through setup api: 314
[17116:11764:0420/091212.629:ERROR:ui\gl\egl_util.cc:92] : EGL Driver message (Error) eglCreateContext: Requested version is not supported
[17116:11764:0420/091244.788:ERROR:ui\gl\egl_util.cc:92] : EGL Driver message (Error) eglCreateContext: Requested version is not supported
[17116:17492:0420/091318.593:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [WARNING: onnxruntime, node-arg-producer-map.cpp:138 node-arg-producer-map.cpp] NodeArgIndex graph ID (GraphId(staging=false, index=2)) does not match producer map graph ID (GraphId(staging=false, index=0))
[17116:17492:0420/091318.593:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [WARNING: onnxruntime, node-arg-producer-map.cpp:138 node-arg-producer-map.cpp] NodeArgIndex graph ID (GraphId(staging=false, index=2)) does not match producer map graph ID (GraphId(staging=false, index=0))
[17116:11764:0420/091322.213:ERROR:ui\gl\egl_util.cc:92] : EGL Driver message (Error) eglCreateContext: Requested version is not supported
Those EGL driver messages are unrelated to the compilation process and shouldn't cause any trouble. Similarly, the file-access attempts shouldn't cause errors either, but we can try to remove those if needed. Did the compilation go through successfully for you, with outputs shown on the page? The process might take 5-10 min.
The process might take 5-10 min.
Ah, I might have given up too early. Let me try again.
Have you tried enabling any of the other samples, by chance? If these models are taking 5-10 minutes to compile, then the other heavier-weight ones like the GenAI ones might not be usable.
No, I haven't. Good point though.
@Json288 please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.
log("[Transformer.js] env.allowRemoteModels: " + transformers.env.allowRemoteModels);
log("[Transformer.js] env.allowLocalModels: " + transformers.env.allowLocalModels);

const FP16_MODEL_PATHS = {
One thing I am still wondering about: I tried using just the production EPs via the following browser command:
PS C:\Users\adityar\AppData\Local\Google\Chrome SxS\Application> .\chrome.exe --no-first-run --no-default-browser-check --enable-features="WebMachineLearningNeuralNetwork,WebNNOnnxRuntime" --webnn-ort-ignore-ep-blocklist --webnn-ort-logging-level=VERBOSE --webnn-ort-library-path-for-testing="C:\Program Files\WindowsApps\Microsoft.WindowsAppRuntime.2-preview2_2.0.0.2_x64__8wekyb3d8bbwe" --webnn-ort-ep-device="VitisAIExecutionProvider,0x1022,0x17F0"
Both the GPU and NPU EPs failed to register in this configuration because of missing DLL dependencies, and then the GPU process crashed shortly thereafter.
[12432:33028:0420/092011.118:ERROR:services\webnn\ort\environment.cc:591] : [WebNN] Failed to call ort_api->RegisterExecutionProviderLibrary( env.get(), ep_name.c_str(), package_info->library_path.value().c_str()): [WebNN] ORT status error code: 1 error message: Error loading "C:\Program Files\WindowsApps\MicrosoftCorporationII.WinML.AMD.GPU.EP.1.8_1.8.55.0_x64__8wekyb3d8bbwe\ExecutionProvider\onnxruntime_providers_migraphx.dll" which depends on "migraphx_c.dll" which is missing. (Error 1114: "A dynamic link library (DLL) initialization routine failed.")
[12432:33028:0420/092011.124:ERROR:services\webnn\ort\environment.cc:89] : [ORT] [INFO: onnxruntime, utils.cc:552 onnxruntime::LoadPluginOrProviderBridge] Loading EP library: 0000133C02659FC0 as a plugin
[12432:33028:0420/092011.132:ERROR:services\webnn\ort\environment.cc:591] : [WebNN] Failed to call ort_api->RegisterExecutionProviderLibrary( env.get(), ep_name.c_str(), package_info->library_path.value().c_str()): [WebNN] ORT status error code: 1 error message: Error loading "C:\Program Files\WindowsApps\MicrosoftCorporationII.WinML.AMD.NPU.EP.1.8_1.8.59.0_x64__8wekyb3d8bbwe\ExecutionProvider\onnxruntime_providers_vitisai.dll" which depends on "onnxruntime_providers_shared.dll" which is missing. (Error 126: "The specified module could not be found.")
[12432:33028:0420/092011.132:ERROR:services\webnn\ort\logging.cc:146] : [WebNN] [INFO] Registered OrtEpDevice #0: {ep_name: CPUExecutionProvider, ep_vendor: Microsoft, ep_metadata: {version: 1.24.4}, ep_options: {}}, OrtHardwareDevice: {type: CPU, vendor: AMD, vendor_id: 0x1022, device_id: 0x7, device_metadata: {Description: AMD Ryzen AI 9 365 w/ Radeon 880M }}
[12432:33028:0420/092011.132:ERROR:services\webnn\ort\logging.cc:146] : [WebNN] [INFO] Registered OrtEpDevice #1: {ep_name: DmlExecutionProvider, ep_vendor: Microsoft, ep_metadata: {version: 1.24.4}, ep_options: {device_id: 0}}, OrtHardwareDevice: {type: GPU, vendor: Advanced Micro Devices, Inc., vendor_id: 0x1002, device_id: 0x150e, device_metadata: {Description: AMD Radeon(TM) 880M Graphics, LUID: 202817, DxgiAdapterNumber: 0, DxgiHighPerformanceIndex: 0, DxgiVideoMemory: 4096 MB, Discrete: 0}}
[12432:33028:0420/092011.132:ERROR:services\webnn\ort\environment.cc:501] : [WebNN] No EP device can be selected due to no matching device for user-specified webnn-ort-ep-device: VitisAIExecutionProvider,0x1022,0x17F0. Please check the registered EP devices in the logs by setting webnn-ort-logging-level to VERBOSE or INFO.
[12432:33028:0420/092011.132:ERROR:services\webnn\ort\environment.cc:501] : [WebNN] No EP device can be selected due to no matching device for user-specified webnn-ort-ep-device: VitisAIExecutionProvider,0x1022,0x17F0. Please check the registered EP devices in the logs by setting webnn-ort-logging-level to VERBOSE or INFO.
[12432:33028:0420/092011.132:ERROR:services\webnn\ort\environment.cc:501] : [WebNN] No EP device can be selected due to no matching device for user-specified webnn-ort-ep-device: VitisAIExecutionProvider,0x1022,0x17F0. Please check the registered EP devices in the logs by setting webnn-ort-logging-level to VERBOSE or INFO.
GpuProcessHost: The GPU process crashed! Exit code: STATUS_BREAKPOINT.
The reason I am interested in this configuration is that it's closer to the production environment (i.e., using the released EPs). I'll dig around more on my end but wanted to check if this looks familiar to you or if you spot something wrong with this configuration.
I noticed that you removed this arg: --webnn-ort-ep-library-path-for-testing=VitisAIExecutionProvider?"C:\Program Files\AMD-EP\onnxruntime_vitisai_ep.dll". Why is that?
Yeah, good question: it goes back to this remark:
The reason I am interested in this configuration is that it's closer to the production environment
Basically, I'm trying to understand and diagnose why the official versions of the EP don't work from the MSIX, since that is how AMD customers would use the feature.
For the error caused by loading the EP DLL from the MSIX installation path, we have a PR waiting to be merged that fixes it.
As for this one, I think it should try to load onnxruntime_vitisai_ep.dll instead of onnxruntime_providers_vitisai.dll. Is this expected?
I think it should try to load onnxruntime_vitisai_ep.dll instead of onnxruntime_providers_vitisai.dll. Is this expected?
Yeah, I noticed this as well: something is causing the EP routing to get confused and pick the wrong EP version... I'm not sure what's going on here; trying to debug ORT to get more details.

Why is this change being made?
To verify and demonstrate the VitisAI EP's support for WebNN with FP32 image-classification models: ResNet50 & MobileNetV2.
What changed?
Added

- `demos/image-classification/models/amd/resnet50/` — `config.json`, `preprocessor_config.json` (local assets for AMD FP32 ResNet-50).
- `demos/image-classification/models/amd/MobileNetV2/` — `config.json`, `preprocessor_config.json` (local assets for AMD FP32 MobileNetV2).

Modified

- `demos/image-classification/index.js` — FP32 AMD paths (`amd/resnet50`, `amd/MobileNetV2`), WebNN Hub layout handling (`webnn/` JSON + `webnn/onnx` weights on remote vs. flat `onnx/` locally), `env.fetch` rewrites for Hub `config.json`/`preprocessor_config.json`, `options.subfolder` for remote/local, and an AMD `pixel_values` → `input` patch for the classifier pipeline.
- `demos/image-classification/index.html` — UI wiring for FP32 / model selection as needed for the new paths.
- `demos/image-classification/static/main.css` — Styles for any new/updated controls.
- `fetch_models.js` — Extra download entries for AMD ResNet-50 / MobileNetV2 from the Hub `webnn/onnx` paths into the local `onnx/` mirror layout.

How was the change tested?

- Local `./models` flows; WebNN NPU + FP32 + MobileNetV2 / ResNet50.
- `node fetch_models` against the Hub when validating AMD `webnn/` asset URLs.
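The `env.fetch` rewrite for Hub `config.json`/`preprocessor_config.json` could look roughly like this (a sketch under assumptions; the regex, path constant, and function name are illustrative, not the actual index.js code):

```javascript
// Hypothetical URL rewrite for a Transformers.js-style fetch hook:
// redirect Hub requests for the AMD models' config.json /
// preprocessor_config.json to local copies under ./models.
const LOCAL_MODEL_ROOT = "./demos/image-classification/models";

function rewriteModelUrl(url) {
    const match = url.match(
        /huggingface\.co\/(amd\/[^/]+)\/resolve\/main\/(config\.json|preprocessor_config\.json)$/
    );
    if (match) {
        const [, modelId, file] = match; // e.g. "amd/resnet50", "config.json"
        return `${LOCAL_MODEL_ROOT}/${modelId}/${file}`;
    }
    return url; // anything else passes through untouched
}
```

A hook like this would be installed by wrapping the library's fetch entry point so every request runs through `rewriteModelUrl` first.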