
Importing Ollama models is broken #179

Open
dadmobile opened this issue Oct 31, 2024 · 5 comments
Labels: bug (Something isn't working)

Comments

@dadmobile (Member)

Reported by discord user Rich:

I had imported a downloaded model via Ollama. When I clicked on "Import Local Model" there were models available to import. After importing, I selected one of them, and when I tried to run it I got the error in the second image. I deleted it and tried importing again; same error.

[Screenshot: error message]
@dadmobile (Member, Author)

Confirmed that this doesn't work for me either, but in my case the error message included the path to my model:

[Screenshot: error message including the model path]

@dadmobile (Member, Author)

The problem is that there's a colon in the directory name (it should be a dash).
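For illustration, here's a minimal sketch of the sanitization (the helper name is hypothetical, not the actual transformerlab-api code):

```python
# Hypothetical helper: replace the colon in an Ollama tag (e.g. "llama3.2:1b")
# with a dash before using it as a directory name, so the on-disk path matches
# what the rest of the code expects.
def safe_model_dir_name(ollama_tag: str) -> str:
    return ollama_tag.replace(":", "-")

print(safe_model_dir_name("llama3.2:1b"))  # -> "llama3.2-1b"
```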

@dadmobile dadmobile added the bug Something isn't working label Oct 31, 2024
@dadmobile dadmobile self-assigned this Oct 31, 2024
@dadmobile dadmobile changed the title Issue importing Llama 3.2 from ollama Importing Ollama models is broken Nov 4, 2024
@dadmobile (Member, Author)

dadmobile commented Nov 4, 2024

Fixed the first part. There seem to be two remaining issues currently breaking this:

  1. We changed the model ID to something prettier than the ugly SHA, but there is code somewhere that requires the GGUF model ID and the actual filename to be the same (see the sketch at the end of this comment for one way to keep them aligned). This error manifests as:
    [Screenshot: error message]

  2. If you get past issue number 1, I am finding that all of my Ollama GGUF models are throwing the following in the API:

[Screenshot: transformerlab-api terminal output]

asyncio.exceptions.CancelledError: Cancelled by cancel scope 31dc562d0

But you can still export these models to llamafile and they work! We probably need to try running these through llama.cpp directly and see what the underlying issue is.
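For issue 1, here is a minimal sketch of one possible approach (hypothetical helper, not the actual transformerlab-api code): expose the imported GGUF under a filename that matches the model ID, so the code that assumes ID == filename keeps working.

```python
from pathlib import Path

def link_gguf_as_model_id(blob_path: Path, models_dir: Path, model_id: str) -> Path:
    """Expose an Ollama GGUF blob as <models_dir>/<model_id>/<model_id>.gguf.
    A symlink keeps the model ID and on-disk filename identical without
    copying a multi-gigabyte blob."""
    target_dir = models_dir / model_id
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / f"{model_id}.gguf"
    if not target.exists():
        target.symlink_to(blob_path)
    return target
```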

@delfireinoso
delfireinoso commented Nov 12, 2024

When I select Import Models, the list shows all versions under a single name, but in my Ollama installation there is no Qwen2.5, only Qwen2.5:7b and other weights (which I keep to test them in different situations), so it is unable to find them.
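For context, a quick sketch of how the full name:tag pairs could be enumerated from Ollama's manifest tree (this assumes Ollama's default on-disk layout under ~/.ollama/models/manifests; the helper name is hypothetical):

```python
from pathlib import Path

def list_ollama_tags(manifest_root: Path = Path.home() / ".ollama" / "models" / "manifests") -> list[str]:
    """Walk the manifest tree (.../<registry>/<namespace>/<model>/<tag>) and
    return full "model:tag" strings, e.g. "qwen2.5:7b"."""
    tags = []
    for manifest in manifest_root.rglob("*"):
        if manifest.is_file():
            tags.append(f"{manifest.parent.name}:{manifest.name}")
    return sorted(tags)

print(list_ollama_tags())  # e.g. ['llama3.2:1b', 'qwen2.5:7b']
```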

@dadmobile (Member, Author)

Issue number 2 that I mentioned above is fixed by d5bdb9c.

The only remaining issue is the misaligned model names.

I may split delfireinoso's comment out into a separate issue. Let me try to reproduce it first.
