Granite Code Assistant
simplifies the setup of the Continue extension to integrate IBM's Granite code models as your code assistant in Visual Studio Code, using Ollama as the runtime environment.
By leveraging Granite code models and open-source components such as Ollama and Continue, you can write, generate, explain, or document code with full control over your data, ensuring it stays private and secure on your machine.
This project features an intuitive UI, designed to simplify the installation and management of Ollama and Granite Code models. The first time the extension starts, a setup wizard is automatically launched to guide you through the installation process.
You can later open the setup wizard anytime from the command palette by executing the "Granite: Setup Granite Code as code assistant" command.
- OS: macOS, Linux, or Windows
- Disk Space: Minimum 30 GB
- Latest Version of Visual Studio Code
Open Visual Studio Code, navigate to the Extensions tab in the left sidebar, search for "vscode-granite," and click "Install."
The Continue.dev extension is automatically added as a dependency if it is not already installed. If you installed Granite Code Assistant manually, you may need to install the Continue extension separately.
Once the extension is running, the setup wizard will prompt you to install Ollama.
The following Ollama installation options are available:
- Install with Homebrew: available if Homebrew is detected on your machine (macOS/Linux).
- Install with Script: available on Linux.
- Install Manually: Supported on all platforms. If you choose this option, you will be redirected to the official Ollama download page to complete the installation.
Once Ollama is installed, the page will refresh automatically. Depending on the security settings of your platform, you may need to start Ollama manually the first time.
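If you want to verify the installation yourself, you can check for the Ollama CLI from a terminal. This is a minimal sketch of a check the setup wizard performs for you:

```shell
# Check whether the Ollama CLI is on your PATH before continuing with the wizard.
if command -v ollama >/dev/null 2>&1; then
  echo "ollama found: $(ollama --version 2>/dev/null)"
else
  echo "ollama not found: install it, then restart the setup wizard"
fi
```

If the CLI is installed but the server is not running, `ollama serve` starts it from a terminal; the Ollama desktop app starts the server automatically.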
Select the Granite model(s) you wish to install and follow the on-screen instructions to complete the setup.
After the models are pulled into Ollama, Continue will be configured automatically to use them, and the Continue chat view will open, allowing you to interact with the models via the UI or tab completion.
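Under the hood, Continue reads its model list from a JSON configuration file (typically `~/.continue/config.json`). The sketch below shows roughly what a Granite entry may look like after setup; the exact fields and titles the wizard writes may differ:

```json
{
  "models": [
    {
      "title": "Granite Code 8b",
      "provider": "ollama",
      "model": "granite-code:8b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Granite Code 8b",
    "provider": "ollama",
    "model": "granite-code:8b"
  }
}
```

You normally never need to edit this file by hand, but it is useful to know where the configuration lives if you want to tweak or remove a model later.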
The Granite Code models are optimized for enterprise software development workflows, performing well across various coding tasks (e.g., code generation, fixing, and explanation). They are versatile "all-around" code models.
Granite Code comes in various sizes to fit your workstation's resources. Generally, larger models yield better results but require more disk space, memory, and processing power.
Recommendation: use the 8B model for both chat and tab code completion.
For more details, refer to Granite Code Models.
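If you prefer to manage models outside the wizard, the standard Ollama CLI can pull and list them directly. A sketch, using the 8B size recommended above:

```shell
# Pull a Granite code model manually and list what is installed locally.
MODEL="granite-code:8b"
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL" || true   # no-op if the model is already present
  ollama list                    # show locally installed models
fi
echo "requested model: $MODEL"
```

Models pulled this way are picked up by the extension the same as models installed through the wizard, since both end up in Ollama's local model store.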
Many corporations have privacy regulations that prohibit sending internal code or data to third-party services. Running LLMs locally lets you stay within these restrictions and ensures no sensitive information is sent to a remote service. Ollama is one of the simplest and most popular open-source solutions for running LLMs locally.
Continue is the leading open-source AI code assistant. You can connect any models and contexts to build custom autocomplete and chat experiences inside VS Code and JetBrains.
- Easily understand code sections
- Tab to autocomplete code suggestions
- Refactor functions while coding
- Ask questions about your codebase
- Quickly use documentation as context
For more details, refer to continue.dev.
This project is licensed under Apache 2.0. See LICENSE for more information.
With your approval, the vscode-granite extension collects anonymous usage data and sends it to Red Hat servers to help improve our products and services. Read our privacy statement to learn more. This extension respects the redhat.telemetry.enabled setting, which you can learn more about at Red Hat Telemetry.