Open-source, high-performance LLM gateway written in Rust. Connect to any LLM provider with a single API. Observability Included.
Hub is a next-generation smart proxy for LLM applications. It centralizes control and tracing of all your LLM calls, and because it's built in Rust, it's fast and efficient. It's completely open source and free to use.
Built and maintained by Traceloop under the Apache 2.0 license.
Make sure to copy a `config.yaml` file from `config-example.yaml` and set the correct values, following the configuration instructions.
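For orientation, here is a minimal sketch of what a config might look like. The key names and layout below (`providers`, `models`, `pipelines`) are assumptions for illustration; `config-example.yaml` and the configuration instructions are the authoritative reference.

```yaml
# Hypothetical minimal config.yaml -- the schema shown here is an assumption;
# copy config-example.yaml and follow the configuration instructions instead.
providers:
  - key: openai
    type: openai
    api_key: "<your-openai-api-key>"

models:
  - key: gpt-4o-openai
    type: gpt-4o
    provider: openai

pipelines:
  - name: default
    type: chat
    plugins:
      - model-router:
          models:
            - gpt-4o-openai
```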
You can then run Hub locally with `cargo run` in the root directory, or use the Docker image:
```
docker run --rm -p 3000:3000 -v $(pwd)/config.yaml:/usr/local/bin/config.yaml:ro -t traceloop/hub
```
Connect to Hub by using the OpenAI SDK in any language and setting the base URL to `http://localhost:3000/api/v1`.
For example, in Python:
```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/api/v1",
    api_key=os.getenv("OPENAI_API_KEY"),
    # Optionally route requests through a specific pipeline from your config:
    # default_headers={"x-traceloop-pipeline": "azure-only"},
)

completion = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    max_tokens=1000,
)

print(completion.choices[0].message.content)
```
Whether big or small, we love contributions ❤️ Check out our guide to see how to get started.
Not sure where to get started? You can:
- Book a free pairing session with one of our teammates!
- Join our Slack, and ask us any questions there.
- Slack (For live discussion with the community and the Traceloop team)
- GitHub Discussions (For help with building and deeper conversations about features)
- GitHub Issues (For any bugs and errors you encounter using Hub)
- Twitter (Get news fast)