feat(observe): implement OpenTelemetry instrumentation
Signed-off-by: GALLLASMILAN <[email protected]>
GALLLASMILAN authored Nov 11, 2024
1 parent 1c6d3c1 commit 2cc1ef4
Showing 29 changed files with 1,644 additions and 13 deletions.
5 changes: 5 additions & 0 deletions .env.template
@@ -2,6 +2,9 @@
BEE_FRAMEWORK_LOG_PRETTY=true
BEE_FRAMEWORK_LOG_LEVEL="info"
BEE_FRAMEWORK_LOG_SINGLE_LINE="false"
## Instrumentation
# BEE_FRAMEWORK_INSTRUMENTATION_ENABLED=true
# BEE_FRAMEWORK_INSTRUMENTATION_IGNORED_KEYS=

# For WatsonX LLM Adapter
# WATSONX_API_KEY=
@@ -21,3 +24,5 @@ BEE_FRAMEWORK_LOG_SINGLE_LINE="false"
# For Google Search Tool
# GOOGLE_API_KEY=your-google-api-key
# GOOGLE_CSE_ID=your-custom-search-engine-id


2 changes: 1 addition & 1 deletion README.md
@@ -18,7 +18,7 @@ The Bee Agent Framework makes it easy to build scalable agent-based workflows wi
- 👩‍💻 **Code interpreter**: Run code safely in a [sandbox container](https://github.com/i-am-bee/bee-code-interpreter).
- 💾 **Memory**: Multiple [strategies](/docs/memory.md) to optimize token spend.
- ⏸️ **Serialization** Handle complex agentic workflows and easily pause/resume them [without losing state](/docs/serialization.md).
- 🔍 **Traceability**: Get full [visibility](/docs/emitter.md) of your agent’s inner workings, [log](/docs/logger.md) all running events, and use our [MLFlow integration](https://github.com/i-am-bee/bee-observe-connector) to debug performance.
- 🔍 **Instrumentation**: Gain full visibility into your agent’s inner workings with [Instrumentation](/docs/instrumentation.md) built on top of the [Emitter](/docs/emitter.md).
- 🎛️ **Production-level** control with [caching](/docs/cache.md) and [error handling](/docs/errors.md).
- 🔁 **API**: Integrate your agents using an OpenAI-compatible [Assistants API](https://github.com/i-am-bee/bee-api) and [Python SDK](https://github.com/i-am-bee/bee-python-sdk).
- 🖥️ **Chat UI**: Serve your agent to users in a [delightful UI](https://github.com/i-am-bee/bee-ui) with built-in transparency, explainability, and user controls.
2 changes: 1 addition & 1 deletion docs/README.md
@@ -18,7 +18,7 @@ The Bee Agent Framework makes it easy to build scalable agent-based workflows wi
- 👩‍💻 **Code interpreter**: Run code safely in a [sandbox container](https://github.com/i-am-bee/bee-code-interpreter).
- 💾 **Memory**: Multiple [strategies](/docs/memory.md) to optimize token spend.
- ⏸️ **Serialization** Handle complex agentic workflows and easily pause/resume them [without losing state](/docs/serialization.md).
- 🔍 **Traceability**: Get full [visibility](/docs/emitter.md) of your agent’s inner workings, [log](/docs/logger.md) all running events, and use our [MLFlow integration](https://github.com/i-am-bee/bee-observe-connector) to debug performance.
- 🔍 **Instrumentation**: Gain full visibility into your agent’s inner workings with [Instrumentation](/docs/instrumentation.md) built on top of the [Emitter](/docs/emitter.md).
- 🎛️ **Production-level** control with [caching](/docs/cache.md) and [error handling](/docs/errors.md).
- 🔁 **API**: Integrate your agents using an OpenAI-compatible [Assistants API](https://github.com/i-am-bee/bee-api) and [Python SDK](https://github.com/i-am-bee/bee-python-sdk).
- 🖥️ **Chat UI**: Serve your agent to users in a [delightful UI](https://github.com/i-am-bee/bee-ui) with built-in transparency, explainability, and user controls.
98 changes: 98 additions & 0 deletions docs/instrumentation.md
@@ -0,0 +1,98 @@
# OpenTelemetry Instrumentation in Bee-Agent-Framework

This document provides an overview of the OpenTelemetry instrumentation setup in the Bee-Agent-Framework.
The implementation is designed to [create telemetry spans](https://opentelemetry.io/docs/languages/js/instrumentation/#create-spans) for observability when instrumentation is enabled.

## Overview

OpenTelemetry instrumentation allows you to collect telemetry data, such as traces and metrics, to monitor the performance of your services.
The framework creates this middleware for you and applies it automatically whenever the `BEE_FRAMEWORK_INSTRUMENTATION_ENABLED` flag is active.

## Setting up OpenTelemetry

Follow the official OpenTelemetry [Node.js Getting Started Guide](https://opentelemetry.io/docs/languages/js/getting-started/nodejs/) to initialize and configure OpenTelemetry in your application.
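
For a minimal starting point, the sketch below mirrors the bootstrap this repository ships in `examples/helpers/telemetry.ts`: it registers service metadata and exports finished spans to the console.

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { ConsoleSpanExporter } from "@opentelemetry/sdk-trace-node";
import { Resource } from "@opentelemetry/resources";
import { ATTR_SERVICE_NAME, ATTR_SERVICE_VERSION } from "@opentelemetry/semantic-conventions";

// Describe the service and print every finished span to stdout.
const sdk = new NodeSDK({
  resource: new Resource({
    [ATTR_SERVICE_NAME]: "bee-agent-framework",
    [ATTR_SERVICE_VERSION]: "0.0.1",
  }),
  traceExporter: new ConsoleSpanExporter(),
});

await sdk.start();
```

Swap the console exporter for an OTLP exporter once you have a collector running (see “Verifying Instrumentation” below).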

## Instrumentation Configuration

### Environment Variable

Use the environment variable `BEE_FRAMEWORK_INSTRUMENTATION_ENABLED` to enable or disable instrumentation.

```bash
# Enable instrumentation
export BEE_FRAMEWORK_INSTRUMENTATION_ENABLED=true
# Exclude sensitive keys from the collected event data
export BEE_FRAMEWORK_INSTRUMENTATION_IGNORED_KEYS="apiToken,accessToken"
```

If `BEE_FRAMEWORK_INSTRUMENTATION_ENABLED` is false or unset, the framework will run without instrumentation.
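
Internally, the framework only attaches its telemetry middleware when this flag is set. A simplified sketch of the pattern used in `src/agents/base.ts` is shown below (the `@/...` import paths are the framework's internal aliases):

```ts
import { doNothing } from "remeda";
import { INSTRUMENTATION_ENABLED } from "@/instrumentation/config.js";
import { createTelemetryMiddleware } from "@/instrumentation/create-telemetry-middleware.js";

// Pick the middleware once, based on the environment flag: a real telemetry
// middleware when enabled, otherwise a no-op that leaves the run pipeline unchanged.
const telemetryMiddleware = INSTRUMENTATION_ENABLED ? createTelemetryMiddleware() : doNothing();

// The agent applies it to every run, roughly like:
//   RunContext.enter(...).middleware(telemetryMiddleware);
```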

## Creating Custom Spans

You can manually create spans during the `run` process to track specific parts of the execution. This is useful for adding custom telemetry to enhance observability.

Example of creating a span:

```ts
import { trace } from "@opentelemetry/api";

const tracer = trace.getTracer("bee-agent-framework");

function exampleFunction() {
  const span = tracer.startSpan("example-function-span");
  try {
    // Your code logic here
  } catch (error) {
    span.recordException(error);
    throw error;
  } finally {
    span.end();
  }
}
```

## Verifying Instrumentation

Once instrumentation is enabled, you can view telemetry data in any [compatible OpenTelemetry backend](https://opentelemetry.io/docs/languages/js/exporters/), such as [Jaeger](https://www.jaegertracing.io/), [Zipkin](https://zipkin.io/), or [Prometheus](https://prometheus.io/docs/prometheus/latest/feature_flags/#otlp-receiver).
Ensure your OpenTelemetry setup is properly configured to export trace data to your chosen backend.
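
As a sketch, you could swap the console exporter for an OTLP exporter to ship traces to a local Jaeger (or any OTLP-compatible) backend. This assumes the `@opentelemetry/exporter-trace-otlp-http` package, which is not among this repository's dependencies, and a collector listening on the default OTLP/HTTP endpoint.

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Send finished spans to a collector over OTLP/HTTP instead of printing them to the console.
const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: "http://localhost:4318/v1/traces", // default OTLP/HTTP traces endpoint
  }),
});

sdk.start();
```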

## Run examples

> The correct Node.js version must be active (e.g., via `nvm`):

```bash
nvm use
```

### Agent instrumentation

Run the instrumented agent example (`examples/agents/bee_instrumentation.ts`):

```bash
# runs the example with instrumentation enabled and the OpenTelemetry SDK preloaded
yarn start:telemetry ./examples/agents/bee_instrumentation.ts
```

### LLM instrumentation

Run the LLM instrumentation example (`./examples/llms/instrumentation.ts`):

```bash
# runs the example with instrumentation enabled and the OpenTelemetry SDK preloaded
yarn start:telemetry ./examples/llms/instrumentation.ts
```

### Tool instrumentation

Run the tool instrumentation example (`./examples/tools/instrumentation.ts`):

```bash
# runs the example with instrumentation enabled and the OpenTelemetry SDK preloaded
yarn start:telemetry ./examples/tools/instrumentation.ts
```

## Conclusion

This setup provides basic OpenTelemetry instrumentation with the flexibility to enable or disable it as needed.
By creating custom spans and using `createTelemetryMiddleware`, you can capture detailed telemetry for better observability and performance insights.
5 changes: 5 additions & 0 deletions docs/overview.md
@@ -18,12 +18,17 @@ The source directory (`src`) provides numerous modules that one can use.
| [**serializer**](./serialization.md) | Core component for the ability to serialize/deserialize modules into the serialized format. |
| [**version**](./version.md) | Constants representing the framework (e.g., the latest version) |
| [**emitter**](./emitter.md) | Bringing visibility to the system by emitting events. |
| [**instrumentation**](./instrumentation.md) | Integrate monitoring tools into your application. |
| **internals** | Modules used by other modules within the framework. |

### Emitter

Moved to a [standalone page](emitter.md).

### Instrumentation

Moved to a [standalone page](instrumentation.md).

### LLMs

Moved to a [standalone page](llms.md).
5 changes: 4 additions & 1 deletion eslint.config.js
@@ -77,11 +77,14 @@ export default tseslint.config(
    },
  },
  {
    files: ["examples/**/*.ts"],
    files: ["examples/**/*.{ts,js}"],
    languageOptions: {
      parserOptions: {
        project: "./tsconfig.examples.json",
      },
      globals: {
        setTimeout: "readonly",
      },
    },
    rules: {
      "no-console": "off",
46 changes: 46 additions & 0 deletions examples/agents/bee_instrumentation.ts
@@ -0,0 +1,46 @@
////////////////////////////////////////////////////////////////////////////////////////////////////////
/////// RUN THIS EXAMPLE VIA `yarn start:telemetry ./examples/agents/bee_instrumentation.ts` ///////////
////////////////////////////////////////////////////////////////////////////////////////////////////////

import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { FrameworkError } from "bee-agent-framework/errors";
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory";
import { Logger } from "bee-agent-framework/logger/logger";
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch";
import { WikipediaTool } from "bee-agent-framework/tools/search/wikipedia";
import { OpenMeteoTool } from "bee-agent-framework/tools/weather/openMeteo";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";

Logger.root.level = "silent"; // disable internal logs
const logger = new Logger({ name: "app", level: "trace" });

const llm = new OllamaChatLLM({
  modelId: "llama3.1", // llama3.1:70b for better performance
});

const agent = new BeeAgent({
  llm,
  memory: new TokenMemory({ llm }),
  tools: [
    new DuckDuckGoSearchTool(),
    new WikipediaTool(),
    new OpenMeteoTool(), // weather tool
  ],
});

try {
  const response = await agent.run(
    { prompt: "what is the weather like in Granada?" },
    {
      execution: {
        maxRetriesPerStep: 3,
        totalMaxRetries: 10,
        maxIterations: 20,
      },
    },
  );

  logger.info(`Agent 🤖 : ${response.result.text}`);
} catch (error) {
  logger.error(FrameworkError.ensure(error).dump());
}
15 changes: 15 additions & 0 deletions examples/helpers/telemetry.ts
@@ -0,0 +1,15 @@
import "@opentelemetry/instrumentation/hook.mjs";
import { NodeSDK } from "@opentelemetry/sdk-node";
import { ConsoleSpanExporter } from "@opentelemetry/sdk-trace-node";
import { Resource } from "@opentelemetry/resources";
import { ATTR_SERVICE_NAME, ATTR_SERVICE_VERSION } from "@opentelemetry/semantic-conventions";

const sdk = new NodeSDK({
  resource: new Resource({
    [ATTR_SERVICE_NAME]: "bee-agent-framework",
    [ATTR_SERVICE_VERSION]: "0.0.1",
  }),
  traceExporter: new ConsoleSpanExporter(),
});

await sdk.start();
27 changes: 27 additions & 0 deletions examples/llms/instrumentation.ts
@@ -0,0 +1,27 @@
//////////////////////////////////////////////////////////////////////////////////////////////////
/////// RUN THIS EXAMPLE VIA `yarn start:telemetry ./examples/llms/instrumentation.ts` ///////////
//////////////////////////////////////////////////////////////////////////////////////////////////
import { BaseMessage, Role } from "bee-agent-framework/llms/primitives/message";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { Logger } from "bee-agent-framework/logger/logger";

Logger.root.level = "silent"; // disable internal logs
const logger = new Logger({ name: "app", level: "trace" });

const llm = new OllamaChatLLM({
  modelId: "llama3.1", // llama3.1:70b for better performance
});

const response = await llm.generate([
  BaseMessage.of({
    role: Role.USER,
    text: "hello",
  }),
]);

logger.info(`LLM 🤖 (txt) : ${response.getTextContent()}`);

// Wait briefly to ensure all telemetry data has been processed
setTimeout(() => {
  logger.info("Process exiting after OpenTelemetry flush.");
}, 5_000); // Adjust the delay as needed
21 changes: 21 additions & 0 deletions examples/tools/instrumentation.ts
@@ -0,0 +1,21 @@
///////////////////////////////////////////////////////////////////////////////////////////////////
/////// RUN THIS EXAMPLE VIA `yarn start:telemetry ./examples/tools/instrumentation.ts` ///////////
///////////////////////////////////////////////////////////////////////////////////////////////////
import { OpenMeteoTool } from "bee-agent-framework/tools/weather/openMeteo";
import { Logger } from "bee-agent-framework/logger/logger";

Logger.root.level = "silent"; // disable internal logs
const logger = new Logger({ name: "app", level: "trace" });

const tool = new OpenMeteoTool();
const result = await tool.run({
  location: { name: "New York" },
  start_date: "2024-10-10",
  end_date: "2024-10-10",
});
logger.info(`OpenMeteoTool 🤖 (txt) : ${result.getTextContent()}`);

// Wait briefly to ensure all telemetry data has been processed
setTimeout(() => {
  logger.info("Process exiting after OpenTelemetry flush.");
}, 5_000); // Adjust the delay as needed
7 changes: 7 additions & 0 deletions package.json
@@ -118,6 +118,7 @@
"ts:check": "tsc --noEmit && tsc -p tsconfig.examples.json --noEmit",
"start": "tsx --tsconfig tsconfig.examples.json",
"start:bee": "yarn start -- examples/agents/bee.ts",
"start:telemetry": "BEE_FRAMEWORK_INSTRUMENTATION_ENABLED=true yarn start --import ./examples/helpers/telemetry.ts",
"docs:build": "embedme --source-root=. docs/**/*.md && cp *.md docs/ && yarn lint:fix docs/ && yarn prettier --write docs/",
"docs:check": "embedme --source-root=. docs/**/*.md --verify && yarn docs:links",
"docs:links": "linkinator \"**/*.md\" --timeout=10000 --retry --markdown --directory-listing --skip node_modules --skip \"https://ai.meta.com/blog/meta-llama-3-1/\" ",
@@ -143,6 +144,7 @@
"@ai-zen/node-fetch-event-source": "^2.1.4",
"@connectrpc/connect": "^1.6.1",
"@connectrpc/connect-node": "^1.6.1",
"@opentelemetry/api": "^1.9.0",
"@streamparser/json": "^0.0.21",
"ajv": "^8.17.1",
"ajv-formats": "^3.0.1",
@@ -194,6 +196,11 @@
"@ibm-generative-ai/node-sdk": "~3.2.3",
"@langchain/community": "~0.3.12",
"@langchain/core": "~0.3.17",
"@opentelemetry/instrumentation": "^0.54.0",
"@opentelemetry/resources": "^1.27.0",
"@opentelemetry/sdk-node": "^0.54.0",
"@opentelemetry/sdk-trace-node": "^1.27.0",
"@opentelemetry/semantic-conventions": "^1.27.0",
"@release-it/conventional-changelog": "^8.0.2",
"@rollup/plugin-commonjs": "^28.0.1",
"@swc/core": "^1.7.36",
5 changes: 4 additions & 1 deletion src/agents/base.ts
@@ -20,6 +20,9 @@ import { Serializable } from "@/internals/serializable.js";
import { GetRunContext, RunContext } from "@/context.js";
import { Emitter } from "@/emitter/emitter.js";
import { BaseMemory } from "@/memory/base.js";
import { createTelemetryMiddleware } from "@/instrumentation/create-telemetry-middleware.js";
import { INSTRUMENTATION_ENABLED } from "@/instrumentation/config.js";
import { doNothing } from "remeda";

export class AgentError extends FrameworkError {}

@@ -62,7 +65,7 @@ export abstract class BaseAgent<
          this.isRunning = false;
        }
      },
    );
    ).middleware(INSTRUMENTATION_ENABLED ? createTelemetryMiddleware() : doNothing());
  }

  protected abstract _run(
2 changes: 1 addition & 1 deletion src/emitter/emitter.ts
@@ -61,7 +61,7 @@ export class Emitter<T = Record<keyof any, Callback<unknown>>> extends Serializa
  public readonly namespace: string[];
  public readonly creator?: object;
  public readonly context: object;
  protected readonly trace?: EventTrace;
  public readonly trace?: EventTrace;

  constructor(input: EmitterInput = {}) {
    super();
28 changes: 28 additions & 0 deletions src/instrumentation/config.ts
@@ -0,0 +1,28 @@
/**
* Copyright 2024 IBM Corp.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

import { parseEnv } from "@/internals/env.js";
import { z } from "zod";

export const INSTRUMENTATION_ENABLED = parseEnv.asBoolean("BEE_FRAMEWORK_INSTRUMENTATION_ENABLED");

export const INSTRUMENTATION_IGNORED_KEYS = parseEnv(
"BEE_FRAMEWORK_INSTRUMENTATION_IGNORED_KEYS",
z.string(),
"",
)
.split(",")
.filter(Boolean);