---
title: Quickstart
description: Start shipping traces to Langtrace Cloud or your preferred OpenTelemetry-compatible backend in under 5 minutes!
---
Langtrace offers flexible options for trace collection and analysis:
- Langtrace Cloud ☁️: Our managed SaaS solution for easy setup and immediate insights.
- Self-hosted Langtrace 🏠: For organizations that prefer to host Langtrace on their own infrastructure.
- OpenTelemetry Integration 🔗: The Langtrace SDK supports sending traces to any OpenTelemetry-compatible backend (Datadog, New Relic, Grafana, etc.), allowing you to use your existing observability stack.
Choose the option that best fits your needs and follow the corresponding setup instructions below.
To use the managed SaaS version of Langtrace, follow these steps:
- Sign up by going to this link.
- Create a new project after signing up. Projects are containers for the traces and metrics your application generates; if you have only one application, a single project is enough.
- Generate an API key. This key will be used to authenticate your application with Langtrace Cloud. You may also create new projects and generate API keys for each of them later.
- In your application, install the Langtrace SDK:

```bash
# Install the SDK
pip install langtrace-python-sdk
```
- Initialize the Langtrace SDK with the API key you generated in step 3:

```python Python
# Import it into your project. Must precede any LLM module imports.
from langtrace_python_sdk import langtrace

langtrace.init(api_key='<LANGTRACE_API_KEY>')
```

```typescript Typescript
Langtrace.init({ api_key: '<LANGTRACE_API_KEY>' })
```
- You can now use Langtrace in your code. Here's a simple example:

```python Python
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message)
```
```typescript Typescript
import OpenAI from 'openai';
const openai = new OpenAI();
async function main() {
const completion = await openai.chat.completions.create({
messages: [
{ role: 'system', content: 'You are a helpful assistant.' },
{ role: 'user', content: 'Hello!' }
],
model: 'gpt-3.5-turbo',
});
console.log(completion.choices[0].message.content);
}
main();
```
If you prefer to use Langtrace with your existing OpenTelemetry-compatible backend:
- Install the Langtrace SDK as described in the Langtrace Cloud section.
- Instead of initializing with a Langtrace API key, configure the SDK to use your OpenTelemetry exporter. Refer to our OpenTelemetry tools documentation, or consult the documentation for your platform of choice.
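As a minimal sketch, the `custom_remote_exporter` option (documented in the parameter table below) can accept an OpenTelemetry span exporter such as the OTLP HTTP exporter. The collector endpoint shown here is a placeholder for your own backend's ingest URL:

```python Python
# Sketch: route Langtrace spans to an existing OTLP-compatible backend.
# Assumes the opentelemetry-exporter-otlp-proto-http package is installed;
# the endpoint URL is a placeholder for your collector.
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from langtrace_python_sdk import langtrace

otlp_exporter = OTLPSpanExporter(endpoint="https://collector.example.com/v1/traces")
langtrace.init(custom_remote_exporter=otlp_exporter, batch=True)
```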
For users and organizations that want to host Langtrace on their own infrastructure, follow these steps to get started.
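Once your self-hosted instance is running, initialization is the same as for Langtrace Cloud except that `api_host` points at your own deployment. A minimal sketch, with a placeholder hostname (for the TypeScript SDK, note that the parameter table below says the URL must be appended with `/api/trace`):

```python Python
# Sketch: initialize the SDK against a self-hosted Langtrace instance.
# The hostname is a placeholder for your own deployment.
from langtrace_python_sdk import langtrace

langtrace.init(
    api_key='<LANGTRACE_API_KEY>',
    api_host='https://langtrace.example.com/',
)
```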
For more details on configuring the Langtrace SDK, refer to the Langtrace SDK Features page.
Python SDK `init` parameters:

| Parameter | Type | Default Value | Description |
| --- | --- | --- | --- |
| `batch` | `bool` | `True` | Whether to batch spans before sending them. |
| `api_key` | `str` | `LANGTRACE_API_KEY` or `None` | The API key for authentication. |
| `write_spans_to_console` | `bool` | `False` | Whether to write spans to the console. |
| `custom_remote_exporter` | `Optional[Exporter]` | `None` | Custom remote exporter. If `None`, a default `LangTraceExporter` is used. |
| `api_host` | `Optional[str]` | `https://langtrace.ai/` | The API host for the remote exporter. |
| `service_name` | `Optional[str]` | `None` | The service name used when initializing Langtrace. |
| `disable_instrumentations` | `Optional[DisableInstrumentations]` | `None` | Disable instrumentation for specific vendors, e.g. `{'only': ['openai']}` or `{'all_except': ['openai']}`. |

TypeScript SDK `init` parameters:

| Parameter | Type | Default Value | Description |
| --- | --- | --- | --- |
| `api_key` | `string` | `LANGTRACE_API_KEY` or `None` | The API key for authentication. |
| `batch` | `boolean` | `false` | Whether to batch spans before sending them. |
| `write_spans_to_console` | `boolean` | `false` | Whether to write spans to the console. |
| `custom_remote_exporter` | `SpanExporter` | `undefined` | Custom remote exporter. If `undefined`, a default `LangTraceExporter` is used. |
| `api_host` | `string` | `https://langtrace.ai/` | The API host for the remote exporter. For self-hosted setups, the URL needs to be appended with `/api/trace`. |
| `instrumentations` | `{ [key in InstrumentationType]?: any }` | `undefined` | Required for Next.js applications; used to enable or disable instrumentations, e.g. `instrumentations: { openai: openai }` where the value is `import * as openai from 'openai'`. |
| `service_name` | `string` | `undefined` | The service name used when initializing Langtrace. Can also be set via the `OTEL_SERVICE_NAME` environment variable. |
| `disable_instrumentations` | `{ all_except?: InstrumentationType[], only?: InstrumentationType[] }` | `{}` | Disable instrumentation for specific vendors, e.g. `{'only': ['openai']}` or `{'all_except': ['openai']}`. |
| `disable_tracing_for_methods` | `Partial` | `undefined` | Disable tracing for specific methods, e.g. `disable_tracing_for_methods: { openai: ['openai.chat.completion'] }`. The full list of methods can be found [here](). |
| `disable_latest_version_check` | `boolean` | `false` | Disable the latest-version check, suppressing the warning shown when the SDK version is outdated. |
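To make the `only` / `all_except` semantics of `disable_instrumentations` concrete, here is a minimal sketch (not the SDK's actual implementation) of how those filters resolve to the set of vendors left instrumented; the vendor names are illustrative:

```python Python
# Sketch of disable_instrumentations filter semantics.
def instrumented_vendors(all_vendors, disable=None):
    """Return the vendors that remain instrumented after applying filters."""
    disable = disable or {}
    if "only" in disable:
        # Disable instrumentation only for the listed vendors.
        return set(all_vendors) - set(disable["only"])
    if "all_except" in disable:
        # Disable instrumentation for every vendor except those listed.
        return set(all_vendors) & set(disable["all_except"])
    return set(all_vendors)

vendors = {"openai", "anthropic", "cohere"}
print(instrumented_vendors(vendors, {"only": ["openai"]}))        # anthropic, cohere
print(instrumented_vendors(vendors, {"all_except": ["openai"]}))  # openai only
```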