Issues: ollama/ollama

Issues list

#4283 Ollama v0.1.34 timeout issue on Codellama 34B [bug] (opened May 9, 2024 by humza-sami)
#4282 Can Ollama support the Huawei Ascend NPU? [feature request] (opened May 9, 2024 by lonngxiang)
#4281 Get entropy [feature request] (opened May 9, 2024 by antonbugaets)
#4280 Error: pull model manifest: file does not exist [bug] (opened May 9, 2024 by taozhiyuai)
#4279 Ollama reports an error when running the AI model using the GPU [bug] (opened May 9, 2024 by xiaomo0925)
#4276 bge-m3 [model request] (opened May 9, 2024 by Mimicvat)
#4274 Update command for the Linux version [feature request] (opened May 9, 2024 by Maplerxyz)
#4273 API usage [bug] (opened May 9, 2024 by w1757876747)
#4271 Partial pruning does not work [bug] (opened May 9, 2024 by jmorganca)
#4270 Windows Ollama 0.1.34 cannot use the GPU with an NVIDIA RTX 4060 [bug] (opened May 9, 2024 by zhafree)
#4262 403 using zrok (opened May 8, 2024 by quantumalchemy)
#4260 Error: could not connect to ollama app, is it running? [amd, bug, windows] (opened May 8, 2024 by starMagic)
#4259 Stop loading the model when I shut down my computer [bug] (opened May 8, 2024 by chaserstrong)
#4257 Support for InternVL-Chat-V1.5 [model request] (opened May 8, 2024 by wwjCMP)
#4254 How does the Ollama model reside on the GPU? [feature request, needs more info] (opened May 8, 2024 by lonngxiang)
#4253 A repeatable hang issue on Linux with dual Radeon GPUs [amd, bug] (opened May 8, 2024 by eliranwong)
#4252 Announcing the Ollama Chinese community chat group [feature request] (opened May 8, 2024 by zsq2010)
#4251 Ollama using minimal GPU on Windows [needs more info] (opened May 8, 2024 by Freffles)