Blog post for OpenTelemetry Generative AI updates #5575
Conversation
content/en/blog/2024/otel-generative-ai/aspire_dashboard_trace.png (outdated, resolved)
Cool one! Added some comments.
The below is about whether we can make this zero-code or not, and what's remaining to do so. I want to run the example with zero-code instrumentation; I have a similar example I've tried locally, and I don't see a way to implicitly configure the logging provider yet. I'm not sure if we want to make a hybrid to reduce the amount of code, or just leave the explicit tracing and logging setup in until logging can be configured via env. cc @anuraaga and @xrmx in case I got the below wrong. Also attached: requirements, env.
Best I could manage was to add hooks only for the log/event stuff:

```python
import os

from openai import OpenAI

# NOTE: OpenTelemetry Python Logs and Events APIs are in beta
from opentelemetry import _logs, _events
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

# Wire up the (beta) logs and events providers by hand, since there is
# no env-based configuration for these yet.
_logs.set_logger_provider(LoggerProvider())
_logs.get_logger_provider().add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter())
)
_events.set_event_logger_provider(EventLoggerProvider())


def main():
    client = OpenAI()
    messages = [
        {
            "role": "user",
            "content": "Answer in up to 3 words: Which ocean contains the falkland islands?",
        },
    ]

    model = os.getenv("CHAT_MODEL", "gpt-4o-mini")
    chat_completion = client.chat.completions.create(model=model, messages=messages)
    print(chat_completion.choices[0].message.content)


if __name__ == "__main__":
    main()
```

Then I get a warning about overriding the event provider, but at least the events do show up:

```console
$ dotenv run -- opentelemetry-instrument python main.py
Overriding of current EventLoggerProvider is not allowed
Indian Ocean
```
I guess what's out of scope is metrics, as they aren't implemented yet. One concern is that folks follow this, and then later, when metrics are implemented, they need more instructions, or people need to remember to go back and change the docs.
Since recent developments (like async API support) are going in fairly quickly, if metrics were added quickly as well, would it make sense to hold the blog until they are released? Or would it make more sense to do a second blog and revisit the setup instructions once metrics are supported?
Metrics are discussed as part of the semantic conventions, but correct, they are not yet implemented in the library. I think it's worth getting an article out there sooner rather than later. We might even attract some contributors for the metrics implementation.
Good point!
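For context, until the library emits these, recording the GenAI metrics from the semantic conventions by hand might look roughly like the sketch below. The metric name, unit, and `gen_ai.*` attribute keys come from the semantic conventions; the meter name and attribute values are illustrative assumptions.

```python
# Hypothetical sketch: record the GenAI token-usage metric manually per the
# semantic conventions until the instrumentation library emits it.
from opentelemetry import metrics

meter = metrics.get_meter("manual-genai-metrics")  # meter name is illustrative

# Metric name and unit come from the GenAI semantic conventions.
token_usage = meter.create_histogram(
    "gen_ai.client.token.usage",
    unit="{token}",
    description="Number of input and output tokens used",
)


def record_usage(chat_completion, model):
    # Attribute keys follow the GenAI semantic conventions; values here are
    # assumptions for an OpenAI chat-completions call.
    common = {"gen_ai.system": "openai", "gen_ai.request.model": model}
    token_usage.record(
        chat_completion.usage.prompt_tokens,
        {**common, "gen_ai.token.type": "input"},
    )
    token_usage.record(
        chat_completion.usage.completion_tokens,
        {**common, "gen_ai.token.type": "output"},
    )
```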
Great read, thanks for writing this!
content/en/blog/2024/otel-generative-ai/aspire-dashboard-content-capture.png (outdated, resolved)
content/en/blog/2024/otel-generative-ai/aspire-dashboard-trace.png (outdated, resolved)
Update on #5575 (comment), based on feedback from @lzchen: we can pare down the code impact to a few lines, assuming you have env configured (see the sketch after the snippet below). This part will have to stick around until we have an env variable to control the event logger provider (or it is enabled by default):

```python
# NOTE: the OpenTelemetry Python Events API is in beta
from opentelemetry import _events
from opentelemetry.sdk._events import EventLoggerProvider

_events.set_event_logger_provider(EventLoggerProvider())
```

Correct me if I'm wrong, but in any case this seems very close to making zero-code work.
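One plausible shape for that env, as a sketch only: the variable names are standard OpenTelemetry SDK and instrumentation settings, but the endpoint, the content-capture opt-in, and the model are assumptions about a local setup.

```sh
# Illustrative .env sketch -- values are assumptions, adjust to your setup
OTEL_SERVICE_NAME=opentelemetry-python-openai
OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_LOGS_EXPORTER=otlp
# Opt in to capturing message content on the emitted events
OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
OPENAI_API_KEY=sk-...
CHAT_MODEL=gpt-4o-mini
```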
Raised open-telemetry/opentelemetry-python#4269 to hopefully sort out the manual-code last mile.
Just to verify: is this issue blocking this PR?
No. I think it may get simpler with no-code instrumentation, but the code in the blog will continue to work.
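For illustration, the end state that issue aims at would be the same app with no OpenTelemetry setup in the code at all, wired up entirely by `opentelemetry-instrument` plus environment variables. A sketch of that goal, not current behavior:

```python
# Hypothetical zero-code target: no OpenTelemetry imports in the app itself;
# opentelemetry-instrument and environment variables do all the wiring.
import os

from openai import OpenAI


def main():
    client = OpenAI()
    messages = [
        {
            "role": "user",
            "content": "Answer in up to 3 words: Which ocean contains the falkland islands?",
        },
    ]
    model = os.getenv("CHAT_MODEL", "gpt-4o-mini")
    chat_completion = client.chat.completions.create(model=model, messages=messages)
    print(chat_completion.choices[0].message.content)


if __name__ == "__main__":
    main()
```

Run as in the earlier comment, e.g. `dotenv run -- opentelemetry-instrument python main.py`, once env-based configuration of the logging/event providers is available.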
Thanks for following up! It looks great!
@chalin can you give this one a review? I think we are ready to go once you approve.
Hey @drewby, it seems the screenshots are still out of date, as they don't match the code (as mentioned before). Do you need a hand getting new copies? Or you can grab one from open-telemetry/opentelemetry-python-contrib#3006.
/fix:refcache
You triggered fix:refcache action run at https://github.com/open-telemetry/opentelemetry.io/actions/runs/12167879073
fix:refcache failed or was cancelled. For details, see https://github.com/open-telemetry/opentelemetry.io/actions/runs/12167879073.
I defer to @svrnm et al. for the blog content. Infrastructure-wise, this LGTM once all GH actions pass.
Compare: 92c8bd6 to 1cbcc10
Signed-off-by: svrnm <[email protected]>
Title: OpenTelemetry for Generative AI
This blog post introduces enhancements to OpenTelemetry specifically tailored for generative AI technologies, focusing on the development of Semantic Conventions and the Python Instrumentation Library.
Samples are in Python
SIG: GenAI Observability
Sponsors: @tedsuo @lmolkova
Closes: #5581
Preview: https://deploy-preview-5575--opentelemetry.netlify.app/blog/2024/otel-generative-ai/