Replies: 3 comments
- @bli1348 Could you share the GGUF model used in your case? Thank you!
- b7349 now works like b7229, but I got the following error when running ministral3-14b-instruct-2512-ud-q4_k_xl from unsloth:
  srv params_from_: Chat format: peg-native
- I tried b7351 and it passed. Here is the cmd:
- I was running the b7229 Windows SYCL backend fine, but after updating to b7311 or b7313 I get the following error when running llama-server:
  Failed to load libsycl-native-bfloat16.spv
  Exception caught at file:D:\a\llama.cpp\llama.cpp\ggml\src\ggml-sycl\ggml-sycl.cpp, line:2997
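  For reference, a typical SYCL-backend llama-server launch on Windows looks roughly like the sketch below. This is illustrative only, not the reporter's actual command: the model path, context size, port, and -ngl value are placeholders.

  ```bat
  :: Illustrative only -- placeholder model path, port, and layer count.
  :: ZES_ENABLE_SYSMAN=1 is the setting suggested in the llama.cpp SYCL docs.
  set ZES_ENABLE_SYSMAN=1
  llama-server.exe -m models\model-q4_k_m.gguf -ngl 99 -c 4096 --host 127.0.0.1 --port 8080
  ```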