Traceloop Hub

Open-source, high-performance LLM gateway written in Rust. Connect to any LLM provider with a single API. Observability Included.

Hub is a next-generation smart proxy for LLM applications. It centralizes control and tracing of all your LLM calls in one place. It's built in Rust, so it's fast and efficient, and it's completely open source and free to use.

Built and maintained by Traceloop under the Apache 2.0 license.

🚀 Getting Started

Copy config-example.yaml to config.yaml and set the correct values, following the configuration instructions.

You can then run the hub locally with cargo run in the root directory, or use the Docker image:

docker run --rm -p 3000:3000 -v $(pwd)/config.yaml:/usr/local/bin/config.yaml:ro -t traceloop/hub

Connect to the hub with the OpenAI SDK in any language by setting the base URL to:

http://localhost:3000/api/v1

For example, in Python:

import os

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/api/v1",
    api_key=os.getenv("OPENAI_API_KEY"),
    # Optionally route requests through a specific pipeline defined in config.yaml:
    # default_headers={"x-traceloop-pipeline": "azure-only"},
)
completion = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    max_tokens=1000,
)
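
Since Hub exposes an OpenAI-compatible API, standard SDK features should pass through unchanged. Below is a minimal streaming sketch under that assumption (the model name is illustrative, reused from the example above):

stream = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    max_tokens=1000,
    stream=True,  # ask the gateway to stream the response chunk by chunk
)
for chunk in stream:
    # Each chunk carries an incremental delta of the assistant's reply.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)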

🌱 Contributing

Whether big or small, we love contributions ❤️ Check out our guide to see how to get started.

Not sure where to get started? Reach out on any of the channels below.

💚 Community & Support

  • Slack (For live discussion with the community and the Traceloop team)
  • GitHub Discussions (For help with building and deeper conversations about features)
  • GitHub Issues (For any bugs and errors you encounter using Hub)
  • Twitter (Get news fast)