@oscampo Did the download finish successfully? Could you run this in the console, to see if the model file is where it's expected?
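
(The exact snippet referred to above isn't captured in this thread. As a rough sketch, a check along these lines would confirm whether the model file is present; the filename is taken from the reply below, and the working-directory location is only an assumption.)

import os

model_path = 'Llama-3.2-3B-Instruct-Q4_K_M.gguf'  # assumed filename and location; adjust to wherever the app stores downloads
print('exists:', os.path.exists(model_path))
if os.path.exists(model_path):
    print('size (bytes):', os.path.getsize(model_path))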

Hmm, not sure why this doesn't work for you. Can you load the model like this?

import llama_cpp

llm = llama_cpp.Llama('Llama-3.2-3B-Instruct-Q4_K_M.gguf')
llm

Okay, thanks! It looks like the llama_cpp library can't be loaded, not quite sure why right now... Does this work for anyone else?
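
(Not from the original thread, just an illustration: a quick check like this would show whether the import itself is failing and surface the underlying error.)

try:
    import llama_cpp
    print('llama_cpp loaded, version:', llama_cpp.__version__)
except Exception as exc:
    # a failure here usually points at a missing or incompatible native libllama build
    print('llama_cpp failed to load:', exc)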

Okay, now that I've installed it directly from TestFlight, I can reproduce the issue. Looking into it.

This should be fixed in build 7, which is on TestFlight now. Could you try that?