Hello,

I am trying to install lsp-ai on Ubuntu 22.04 and I am getting the following error:
```
Compiling atoi v2.0.0
error: Unrecognized option: 'diagnostic-width'
error: could not compile `atoi` (lib)
warning: build failed, waiting for other jobs to finish...
error: failed to compile `lsp-ai v0.4.0`, intermediate artifacts can be found at `/tmp/cargo-installbmlwJ1`.
To reuse those artifacts with a future compilation, set the environment variable CARGO_TARGET_DIR to that path.
```

Toolchain versions:

```
(lsp) Ubuntu@0136-ict-prxmx50056:/lsp-ai$ rustc -V
rustc 1.80.0 (051478957 2024-07-21)
(lsp) Ubuntu@0136-ict-prxmx50056:/lsp-ai$ cargo --version
cargo 1.75.0
```
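For reference, the `rustc 1.80.0` / `cargo 1.75.0` mismatch visible above is suspicious: a matched toolchain reports the same version for both, and an "Unrecognized option" error usually means cargo is driving a different rustc than expected (for example, a system install from apt shadowing a rustup install, or vice versa). A small sketch to check this (assumes nothing beyond POSIX shell and `awk`; the rustup commands in the comment are only relevant if you installed via rustup):

```shell
# Sketch: detect a cargo/rustc version mismatch.
# Both versions should normally be identical; a mismatch suggests
# two toolchains on PATH (e.g. apt-installed rustc + rustup cargo).
rustc_ver=$(command -v rustc >/dev/null 2>&1 && rustc --version | awk '{print $2}' || echo "missing")
cargo_ver=$(command -v cargo >/dev/null 2>&1 && cargo --version | awk '{print $2}' || echo "missing")

echo "rustc: $rustc_ver"
echo "cargo: $cargo_ver"

if [ "$rustc_ver" != "$cargo_ver" ]; then
    # Inspect which binaries are actually being picked up:
    #   which rustc; which cargo
    # If you use rustup, re-syncing the stable toolchain usually fixes it:
    #   rustup update stable && rustup default stable
    echo "version mismatch: cargo and rustc come from different installs"
fi
```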
Also, it is not clear from the documentation what to do after lsp-ai is installed. How do you integrate it with VS Code? After going through existing issues, I figured out that the lsp-ai VS Code extension needs to be installed. Once that extension is installed, how do I configure it to use the lsp-ai server and Ollama? Where do I put the Ollama configuration that you pasted in the wiki?

Could you please clarify the points above, as they are not covered by the documentation? Thanks.
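For anyone else stuck at the same step, here is my current (unverified) understanding: the VS Code extension forwards a JSON server configuration to lsp-ai, and the Ollama block from the wiki goes inside your user or workspace `settings.json`. This is only a sketch — the setting name `lsp-ai.serverConfiguration`, the section names, and the model name `llama3` are all assumptions on my part, so please confirm the exact keys against the wiki:

```json
{
  "lsp-ai.serverConfiguration": {
    "memory": { "file_store": {} },
    "models": {
      "model1": {
        "type": "ollama",
        "model": "llama3"
      }
    },
    "completion": {
      "model": "model1",
      "parameters": {
        "max_context": 2000
      }
    }
  }
}
```

If this is roughly right, it would be great to have it spelled out in the README so the wiki's Ollama example has an obvious home.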