Setting up the Ollama API

Dominic Feliton edited this page Sep 9, 2024 · 5 revisions

Ollama is the ninth service supported by WorldwideChat. It lets you run your own translation service, similar to LibreTranslate. The other major upside is that you can swap models on a whim if you gain access to more advanced hardware. This is great both for privacy enthusiasts and for people who already have the necessary equipment.

Llama (made by Meta) is one of the most capable model families that can be run on a local machine. By default, WorldwideChat uses Hermes 3, a fine-tune of Llama 3.1 that supports structured JSON outputs.

The Ollama page we specifically use is https://ollama.com/finalend/hermes-3-llama-3.1:8b. Feel free to use a different distribution or upgrade to the 70B/405B models if you have the appropriate hardware.
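To illustrate why structured JSON output matters, here is a minimal sketch of a translation request against Ollama's /api/generate endpoint, using format: "json" to constrain the reply to valid JSON. The prompt wording and helper names are our own inventions for illustration (WorldwideChat uses its own prompts internally); the endpoint, payload fields, and default model name come from Ollama's API and this page.

```python
import json
import urllib.request

# Defaults taken from this page; adjust if you changed ollamaURL/ollamaModel.
OLLAMA_URL = "http://localhost:11434"
MODEL = "finalend/hermes-3-llama-3.1:8b"

def build_translate_request(text: str, target_lang: str) -> dict:
    """Build an /api/generate payload that asks the model for JSON-only output."""
    return {
        "model": MODEL,
        # Hypothetical prompt for illustration only.
        "prompt": f'Translate to {target_lang}. Reply only as JSON like '
                  f'{{"translation": "..."}}. Text: {text}',
        "format": "json",   # tell Ollama to constrain the reply to valid JSON
        "stream": False,    # return one complete response instead of a stream
    }

def translate(text: str, target_lang: str) -> str:
    """Send the request to a running Ollama instance and parse the JSON reply."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_translate_request(text, target_lang)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    # Because of format: "json", body["response"] is itself a JSON string.
    return json.loads(body["response"])["translation"]
```

With `ollama serve` running and the model pulled, calling `translate("Hello!", "Spanish")` should return just the translated string, since the structured output makes the reply machine-parseable.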

If you want to use Ollama, please follow the official installation guide for your platform:

Windows and Linux: Download Ollama from https://ollama.com/download and follow the instructions for your platform. On Linux, you can also use the official install script: curl -fsSL https://ollama.com/install.sh | sh

macOS:

  • Install Homebrew from here: https://brew.sh/
  • Then install Ollama: brew install ollama

On all three platforms, open a terminal (Command Prompt or PowerShell on Windows) and run two commands:

  • ollama serve - Starts an Ollama instance accessible from your local machine. If you need a more advanced setup, please consult the documentation above.
  • ollama pull <modelname> - Downloads the model you want to use. If you skip this command, Ollama will return an error telling you to run it. Our current default model is finalend/hermes-3-llama-3.1:8b - make sure you write this down for ollamaModel. You only have to run this command once!

Ollama is now ready for requests! The default URL to access it is http://localhost:11434 - make sure you write this down for ollamaURL. Note that this will not work without the http:// prefix!
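The http:// requirement can be sanity-checked before you touch the config. Below is a small sketch (the helper names are our own, not part of WorldwideChat) that validates an ollamaURL value and, optionally, pings a running server:

```python
import urllib.request

def is_valid_ollama_url(url: str) -> bool:
    """WorldwideChat expects an explicit scheme, e.g. http://localhost:11434."""
    return url.startswith("http://") or url.startswith("https://")

def ollama_is_up(url: str, timeout: float = 2.0) -> bool:
    """GET / on a running Ollama instance returns the text 'Ollama is running'."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return b"Ollama is running" in resp.read()
    except OSError:
        return False
```

For example, `is_valid_ollama_url("localhost:11434")` is False because the scheme is missing, which is exactly the mistake the warning above is about.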

Enabling Ollama in WorldwideChat

If you do not want to use /wwcc, you can add these values manually:

Go to the WorldwideChat plugin folder on your Minecraft server, and open config.yml. Scroll down to this section:

```yaml
useOllama: true
ollamaURL: http://localhost:11434            # <---- Paste your URL one space after the colon
ollamaModel: finalend/hermes-3-llama-3.1:8b  # <---- Paste your selected model one space after the colon
```

Make sure that ONLY useOllama is set to true. Multiple translators cannot be enabled.
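As a quick illustration of the "only one translator" rule, here is a sketch that counts enabled useX toggles in a parsed config. The useLibreTranslate key below is a hypothetical example; check your own config.yml for the actual translator key names.

```python
def enabled_translators(config: dict) -> list:
    """Return the names of all translator toggles (keys like useOllama) set to true."""
    return [k for k, v in config.items() if k.startswith("use") and v is True]

# Example config where only Ollama is enabled (other translator keys hypothetical).
config = {
    "useOllama": True,
    "useLibreTranslate": False,   # hypothetical key, for illustration only
    "ollamaURL": "http://localhost:11434",
    "ollamaModel": "finalend/hermes-3-llama-3.1:8b",
}
assert len(enabled_translators(config)) == 1  # exactly one translator may be on
```

If the list ever contains more than one name, WorldwideChat's "multiple translators cannot be enabled" rule is violated and the plugin will not behave as expected.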

And finally, restart your Minecraft server or reload the plugin.