A user-friendly web interface for interacting with Ollama language models. This client allows you to easily chat with various Ollama models, adjust parameters, and manage conversations.
- 🚀 Easy model selection from available Ollama models
- 💬 Interactive chat interface with markdown support
- 🎛️ Adjustable temperature and top_p parameters
- 🌓 Dark mode toggle for comfortable viewing
- 📝 System prompt input for context setting
- 🔄 Real-time streaming of model responses
Before you begin, ensure you have met the following requirements:
- Node.js (v14 or later)
- npm (usually comes with Node.js)
- Ollama installed and running on your local machine (default port: 11434)
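Before installing the client, you can confirm that Ollama is up and see which models are available by querying its HTTP API directly:

```bash
# Lists the models installed on the local Ollama instance
curl http://localhost:11434/api/tags
```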
1. Clone the repository:

   ```bash
   git clone https://github.com/hassanshabbirahmed/ollama-web-client.git
   cd ollama-web-client
   ```

2. Install the dependencies:

   ```bash
   npm install
   ```

3. Start the development server:

   ```bash
   npm run dev
   ```

4. Open your browser and navigate to `http://localhost:5173` (or the port shown in your terminal).
1. Select a model from the dropdown menu at the top of the page.
2. (Optional) Enter a system prompt to set the context for your conversation.
3. Adjust the temperature and top_p sliders if desired.
4. Type your message in the input box at the bottom of the page.
5. Press Enter or click the "Send" button to submit your message.
6. The model's response will stream in real-time below your message.
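Under the hood, the client streams these responses from Ollama's HTTP API. Assuming the standard `/api/chat` endpoint, the equivalent request with curl looks like this (`llama3` is a placeholder; substitute any model you have pulled):

```bash
# Each line of output is a JSON chunk carrying a fragment of the reply;
# the final chunk has "done": true
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{"role": "user", "content": "Why is the sky blue?"}],
  "stream": true
}'
```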
- Model Selection: Choose from available Ollama models in the dropdown menu.
- System Prompt: Set a context for your conversation by entering a system prompt.
- Temperature: Adjust the randomness of the model's outputs (0.0 to 1.0).
- Top P: Control the diversity of the model's outputs (0.0 to 1.0).
- Dark Mode: Toggle between light and dark themes using the sun/moon icon.
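These settings map directly onto fields of an Ollama API request: the system prompt is sent as a `system` message, while temperature and top_p travel in the `options` object. A minimal sketch, again with a placeholder model name:

```bash
# Combine a system prompt with explicit sampling parameters
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize CORS in one sentence."}
  ],
  "options": {"temperature": 0.7, "top_p": 0.9}
}'
```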
To create a production build:

```bash
npm run build
```

This will generate a `dist` folder with the compiled assets.
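If the project uses Vite's standard scripts (the dev server's default port 5173 suggests it does), you can sanity-check the production build locally with the preview server:

```bash
# Serves the compiled dist/ folder on a local preview server
npm run preview
```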
This project includes Docker support for easy deployment and distribution. You'll need Docker installed on your machine.
To build the Docker image, follow these steps:
1. Ensure you are in the project root directory containing the Dockerfile.

2. Run the following command to build the image:

   ```bash
   docker build -t ollama-web-client .
   ```

   This builds a Docker image tagged `ollama-web-client` using the Dockerfile in the current directory. The `.` at the end specifies the build context (the current directory).

3. Wait for the build process to complete. This may take a few minutes depending on your internet speed and machine performance.

4. Once it completes, verify the image was created by running:

   ```bash
   docker images
   ```

   You should see `ollama-web-client` in the list of images.
After building the image, you can run the container with:

```bash
docker run -p 8080:80 ollama-web-client
```

This starts the Ollama Web Client; you can access it at `http://localhost:8080` in your web browser.

By default, the client tries to connect to Ollama at `http://localhost:11434`. When running in Docker, you'll need to ensure that the container can reach your Ollama instance. If Ollama is running on your host machine, you can use host networking:

```bash
docker run --network host ollama-web-client
```

Alternatively, provide the host machine's IP address in the client configuration (you may need to modify the source code to accept a configurable Ollama URL).
When running the client in a Docker container and connecting to Ollama on the host machine, you may encounter CORS (Cross-Origin Resource Sharing) issues. To resolve this, you might need to configure Ollama to accept requests from the client's origin, or use a reverse proxy to handle CORS.
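For the first option, Ollama supports an `OLLAMA_ORIGINS` environment variable that whitelists additional origins. For example, to allow the Dockerized client above (adjust the origin to wherever the client is actually served):

```bash
# Allow cross-origin requests from the client's origin, then start Ollama
OLLAMA_ORIGINS="http://localhost:8080" ollama serve
```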
This project can be packaged as a desktop application using Electron, which bundles all dependencies together. You will need:
- Node.js (v14 or later)
- npm (usually comes with Node.js)
To build the desktop application:
1. Ensure you are in the project root directory.

2. Install the project dependencies:

   ```bash
   npm install
   ```

3. Build and package the application:

   ```bash
   npm run dist
   ```

4. Once the process completes, you'll find the packaged application in the `release` folder.
After building:
1. Navigate to the `release` folder.
2. Find the appropriate file for your operating system:
   - For Windows: look for an `.exe` file
   - For macOS: look for a `.dmg` file
   - For Linux: look for an `.AppImage` file
3. Double-click the file to install (Windows/macOS) or run (Linux) the application.
The desktop application still requires Ollama to be running on your machine. Ensure Ollama is installed and running before starting the Ollama Web Client desktop application.
- Ensure Ollama is running on your local machine (default port: 11434).
- If you encounter CORS issues, you may need to configure Ollama to allow cross-origin requests (see the `OLLAMA_ORIGINS` example above).
- Check the browser console for any error messages if the application is not behaving as expected.
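A quick way to rule out connectivity and CORS problems together is to query the Ollama API directly; the origin below is an example, so use the one your client is actually served from:

```bash
# Check that Ollama responds; the Origin header simulates a browser
# request, making any CORS headers in the response visible
curl -i -H "Origin: http://localhost:8080" http://localhost:11434/api/version
```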
Contributions to the Ollama Web Client are welcome! Please follow these steps:
1. Fork the repository.

2. Create a new branch:

   ```bash
   git checkout -b feature/your-feature-name
   ```

3. Make your changes and commit them:

   ```bash
   git commit -m 'Add some feature'
   ```

4. Push to the branch:

   ```bash
   git push origin feature/your-feature-name
   ```

5. Submit a pull request.
If you have any questions or feedback, please open an issue on the GitHub repository.
Happy chatting with Ollama!