
Expand AI Code Completion beyond Copilot and Supermaven #18490

Open
1 task done
zerocorebeta opened this issue Sep 29, 2024 · 2 comments
Labels
ai (Improvement related to Assistant, Copilot, or other AI features) · enhancement [core label] · inline completion (Umbrella label for Copilot, Supermaven, etc. completions)

Comments

@zerocorebeta

Check for existing issues

  • Completed

Describe the feature

After going through: https://zed.dev/docs/completions

Zed currently supports inline completions only via external LLM APIs, namely GitHub Copilot and Supermaven, which is restrictive. Many users, for privacy or performance reasons, may prefer alternatives such as Gemini Flash or local models served via Ollama.

There are several advanced LLMs that support the Fill-in-the-Middle (FIM) objective, such as CodeGemma. Additionally, platforms like Continue.dev already allow code completion via local models, with StarCoder via Ollama as the default.

Expanding LLM support to include more flexible, local, or privacy-focused options would greatly enhance Zed's appeal and utility for a wider range of developers.
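To make the request concrete, here is a minimal sketch of what an FIM completion request to a local Ollama server could look like. This is an illustration, not Zed code: the field names follow Ollama's `/api/generate` API, which accepts a `suffix` field for FIM-capable models, and the model name and sampling options are assumptions — verify against your Ollama version.

```python
import json

# Default Ollama endpoint (assumption: a local Ollama server on the
# standard port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_fim_request(prefix: str, suffix: str,
                      model: str = "codellama:code") -> dict:
    """Build the JSON body for a single non-streaming FIM completion.

    `prefix` is the code before the cursor, `suffix` the code after it;
    the model fills in the middle.
    """
    return {
        "model": model,
        "prompt": prefix,
        "suffix": suffix,
        "stream": False,
        "options": {"temperature": 0.1, "num_predict": 64},
    }

payload = build_fim_request("def add(a, b):\n    return ",
                            "\n\nprint(add(1, 2))")
print(json.dumps(payload, indent=2))
# An editor integration would POST this body to OLLAMA_URL and read the
# completion text from the "response" field of the reply.
```

Because everything runs locally, no source code ever leaves the machine — the privacy property this request is about.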

If applicable, add mockups / screenshots to help present your vision of the feature

No response

@zerocorebeta added the admin read (Pending admin review), enhancement [core label], and triage (Maintainer needs to classify the issue) labels on Sep 29, 2024
@JosephTLyons added the ai (Improvement related to Assistant, Copilot, or other AI features) and inline completion (Umbrella label for Copilot, Supermaven, etc. completions) labels and removed the triage and admin read labels on Oct 12, 2024
@ggerganov

We have recently extended the llama.cpp server with a specialized /infill endpoint that enables FIM requests with large contexts to run efficiently in local environments. A simple example of using this endpoint with Qwen2.5-Coder in Neovim is demonstrated here: ggerganov/llama.cpp#9787

I believe it could be an interesting option to explore in the scope of this issue. Feel free to ping me if you have any questions.
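As a rough illustration of what a client for this endpoint looks like, the sketch below POSTs a FIM request to a running llama.cpp server. The `input_prefix`/`input_suffix` field names match the llama.cpp server documentation at the time of writing; the host, port, and sampling parameters are assumptions, so check the server README for your version.

```python
import json
import urllib.request

# Assumption: `llama-server` is running locally on the default port with a
# FIM-capable model such as Qwen2.5-Coder.
SERVER_URL = "http://localhost:8080/infill"

def build_infill_body(prefix: str, suffix: str, n_predict: int = 64) -> dict:
    """Build the JSON body for a llama.cpp /infill request."""
    return {
        "input_prefix": prefix,   # code before the cursor
        "input_suffix": suffix,   # code after the cursor
        "n_predict": n_predict,
        "temperature": 0.1,
    }

def infill(prefix: str, suffix: str) -> str:
    """Send the request and return the completion text from the reply."""
    data = json.dumps(build_infill_body(prefix, suffix)).encode()
    req = urllib.request.Request(
        SERVER_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]

# Example (requires a local server):
#   print(infill("def fib(n):\n    ", "\n\nprint(fib(10))"))
body = build_infill_body("def fib(n):\n    ", "\n")
print(json.dumps(body, indent=2))
```

An editor integration would call something like `infill()` on every completion trigger, passing the text before and after the cursor.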

@20manas

20manas commented Oct 23, 2024

I think Zed AI should also provide its own code completion functionality.

4 participants