LLMBuffer

A Neovim plugin that provides a floating window interface for interacting with LLMs such as Claude, GPT-4, and local models served through Ollama.

Demo video: llm-buffer-demo-sm.mp4

Warning

This project is for personal use and in active development. Use at your own risk!

The code may also be questionable, as this is my first time dabbling in Lua and plugin development 😅

Features

  • Floating window interface with markdown support
  • Visual selection or line selection as prompt
  • Streaming responses from different LLM providers
  • Local model support using Ollama
  • Configurable window dimensions and keybindings

Installation

Using lazy.nvim:

return {
    "saibayadon/llm-buffer.nvim",
    dependencies = { "nvim-lua/plenary.nvim" },
    name = "llm-buffer.nvim",
    config = function()
      require("llm-buffer").setup()
    end,
}

Configuration

Ensure you have an Anthropic API key set as an environment variable.

export ANTHROPIC_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Then pass your options to the setup function:

require("llm-buffer").setup({
    window_width = 0.8, -- Width as percentage of editor width
    window_height = 0.8, -- Height as percentage of editor height
    anthropic_api_key = os.getenv("ANTHROPIC_API_KEY"),
    openai_api_key = os.getenv("OPENAI_API_KEY"),
    ollama_api_host = "http://localhost:11434",
    provider = "anthropic", -- "anthropic" or "openai" or "ollama"
    model = "claude-3-5-sonnet-20241022", -- Claude model to use
    mappings = {
      send_prompt = "<C-l>",
      close_window = "q",
      toggle_window = "<leader>llm"
    }
})
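
If you prefer to run everything locally, the same setup call can point at Ollama instead. A minimal sketch, assuming Ollama is running on its default port; the model name "llama3.2" is only an example and should be replaced with a model you have already pulled:

require("llm-buffer").setup({
    provider = "ollama",
    model = "llama3.2", -- example only: use any model available locally (e.g. pulled with `ollama pull`)
    ollama_api_host = "http://localhost:11434",
})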

Default Keybindings

  • <leader>llm Toggle the LLM buffer window.
  • <C-l> Send the current line or visual selection as the prompt.
  • q Close the buffer window.
  • <Esc> Cancel the ongoing request (closing the buffer does this as well).

You can also use the :LLMBuffer command to toggle the window.
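
If you would rather define the mapping yourself, a minimal sketch that calls the command directly (the <leader>lb binding here is just an example, not a plugin default):

vim.keymap.set("n", "<leader>lb", "<cmd>LLMBuffer<CR>", { desc = "Toggle LLM buffer" })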

Swapping providers on the fly

To swap providers on the fly, you can use the update_options function:

require("llm-buffer").update_options({
  provider = "openai",
  model = "gpt-4o-mini",
})

You can bind this to a keymap like <leader>llo to quickly switch between providers.
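
A minimal sketch of such a binding, reusing the update_options call shown above (the <leader>llo mapping and the model name are just examples):

vim.keymap.set("n", "<leader>llo", function()
  -- Switch the active provider/model at runtime via update_options
  require("llm-buffer").update_options({
    provider = "openai",
    model = "gpt-4o-mini",
  })
  vim.notify("llm-buffer: switched to openai")
end, { desc = "Switch llm-buffer provider to OpenAI" })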

License

MIT
