Commit

Add presence and frequency penalty

karthikscale3 committed May 22, 2024
1 parent 6dd78a1 commit 988a116
Showing 3 changed files with 20 additions and 1 deletion.
5 changes: 4 additions & 1 deletion docs/attributes-registry/gen-ai.md
@@ -19,7 +19,10 @@ This document defines the attributes used to describe telemetry in the context o
 | `gen_ai.request.temperature` | double | The temperature setting for the LLM request. | `0.0` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
 | `gen_ai.request.top_p` | double | The top_p sampling setting for the LLM request. | `1.0` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
 | `gen_ai.request.stop` | string | Up to 4 sequences where the API will stop generating further tokens. | `1.0` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
-| `gen_ai.request.top_k` | double | The top_k sampling setting for the LLM request. | `1.0`
+| `gen_ai.request.top_k` | double | The top_k sampling setting for the LLM request. | `1.0` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
+| `gen_ai.request.frequency_penalty` | double | The frequency penalty setting for the LLM request. | `1.0` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
+| `gen_ai.request.presence_penalty` | double | The presence penalty setting for the LLM request. | `1.0` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
+| `gen_ai.request.top_p` | double | The top_p sampling setting for the LLM request. | `1.0` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
 | `gen_ai.response.finish_reasons` | string[] | Array of reasons the model stopped generating tokens, corresponding to each generation received. | `stop` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
 | `gen_ai.response.id` | string | The unique identifier for the completion. | `chatcmpl-123` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
 | `gen_ai.response.model` | string | The name of the LLM a response was generated from. | `gpt-4-0613` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
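For illustration only (not part of this commit): a minimal sketch of how an instrumentation might record the two new request attributes on a span with the OpenTelemetry Python API. The tracer name, span name, and values are assumed.

```python
from opentelemetry import trace

tracer = trace.get_tracer("gen_ai.demo")  # tracer name is illustrative

# Record the new penalty settings alongside the existing request attributes.
with tracer.start_as_current_span("chat gpt-4") as span:
    span.set_attribute("gen_ai.request.model", "gpt-4")
    span.set_attribute("gen_ai.request.frequency_penalty", 0.1)
    span.set_attribute("gen_ai.request.presence_penalty", 0.1)
```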
12 changes: 12 additions & 0 deletions model/registry/gen-ai.yaml
@@ -57,6 +57,18 @@ groups:
         brief: The stop sequences to provide.
         examples: ['\n']
         tag: llm-generic-request
+      - id: request.frequency_penalty
+        stability: experimental
+        type: double
+        brief: The frequency penalty to provide.
+        examples: ['0.1']
+        tag: llm-generic-request
+      - id: request.presence_penalty
+        stability: experimental
+        type: double
+        brief: The presence penalty to provide.
+        examples: ['0.1']
+        tag: llm-generic-request
       - id: response.id
         stability: experimental
         type: string
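As an aside (an editor's sketch, not defined by this diff): registry attribute ids are namespaced by their group's prefix, assumed here to be `gen_ai`, so `request.frequency_penalty` resolves to the key `gen_ai.request.frequency_penalty` that the trace definitions below must reference exactly. A hypothetical helper showing the join:

```python
def full_attribute_name(prefix: str, attribute_id: str) -> str:
    """Join a registry group's prefix with an attribute id (assumed semconv behavior)."""
    return f"{prefix}.{attribute_id}"

# The refs in model/trace/gen-ai.yaml must match these fully qualified names.
assert full_attribute_name("gen_ai", "request.frequency_penalty") == "gen_ai.request.frequency_penalty"
assert full_attribute_name("gen_ai", "request.presence_penalty") == "gen_ai.request.presence_penalty"
```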
4 changes: 4 additions & 0 deletions model/trace/gen-ai.yaml
@@ -27,6 +27,10 @@ groups:
         requirement_level: recommended
       - ref: gen_ai.request.stop
         requirement_level: recommended
+      - ref: gen_ai.request.frequency_penalty
+        requirement_level: recommended
+      - ref: gen_ai.request.presence_penalty
+        requirement_level: recommended
       - ref: gen_ai.response.id
         requirement_level: recommended
       - ref: gen_ai.response.model
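A hedged sketch of how a client wrapper could populate these recommended attributes from request parameters; `traced_completion` and `client_call` are hypothetical and not part of any instrumentation referenced by this commit.

```python
from opentelemetry import trace

tracer = trace.get_tracer("gen_ai.demo")  # tracer name is illustrative


def traced_completion(client_call, model, **params):
    """Call a hypothetical LLM client and copy request parameters onto the span."""
    with tracer.start_as_current_span(f"chat {model}") as span:
        span.set_attribute("gen_ai.request.model", model)
        # Record the penalties only when the caller supplied them, matching
        # their "recommended" (rather than required) level above.
        if "frequency_penalty" in params:
            span.set_attribute("gen_ai.request.frequency_penalty", params["frequency_penalty"])
        if "presence_penalty" in params:
            span.set_attribute("gen_ai.request.presence_penalty", params["presence_penalty"])
        return client_call(model=model, **params)
```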
