nvim-copilot

What is this

This project implements a very simple copilot-like experience in a terminal-based editor (neovim) using only local LLMs. We are exploring two things here:

  • Can we create copilot-like experiences using only local LLMs?
  • How easily can we add LLM prompts to a terminal-based editor like neovim?

Here's an example of our simple copilot in action, using llama3 running in Ollama.


How do you use this

This is distributed as a standard neovim plugin. After installing, highlight some text in the buffer and type <leader>ai to ask the LLM a question about the highlighted text.

Installation

Installing with Lazy

If you're using lazy.nvim, add 'docker/labs-nvim-copilot' to your setup:

require('lazy').setup(
  {
    {
      'docker/labs-nvim-copilot',
      lazy = false,
      dependencies = {
        'Olical/aniseed',
        'nvim-lua/plenary.nvim',
        'hrsh7th/nvim-cmp',
      },
      config = function(plugin, opts)
        -- `attach` takes a function that sets up buffer-local keymaps
        -- when Docker AI attaches to a buffer; see the sketch below
        -- for one way to define `bufKeymap`
        require("dockerai").setup({ attach = bufKeymap })
      end,
    },
    {
      'hrsh7th/nvim-cmp',
      dependencies = {
        'hrsh7th/cmp-buffer',
        'hrsh7th/cmp-nvim-lsp',
      },
    },
  }
)
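
The config above passes a function named bufKeymap to attach, but it isn't defined here. The following is a minimal sketch of what it could look like, assuming the attach callback receives the buffer number; the require('dockerai').question() call is a hypothetical entry point, not the plugin's documented API, so check the plugin source for the real one.

-- A sketch of a buffer-local keymap function to pass as `attach`.
-- Assumption: the callback receives the number of the attached buffer.
local function bufKeymap(bufnr)
  -- Map <leader>ai in visual mode so a highlighted selection can be
  -- sent to the LLM, matching the usage described above.
  vim.keymap.set('v', '<leader>ai', function()
    require('dockerai').question()  -- hypothetical function name
  end, { buffer = bufnr, desc = 'Ask Docker AI about the selection' })
end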

Using Ollama

If you have Ollama installed and running, Docker AI will use it. Docker AI will not start Ollama; if you want to use it, you'll have to start it separately, as shown below.
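
For example, from a shell (the model tag here is llama3, the one shown in the demo above):

# start the Ollama server if it isn't already running as a service
ollama serve

# pull the model before first use
ollama pull llama3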

Commands

  • :DockerDebug - download internal representations of the project context, for debugging

Building

# docker:command=build

make
