feat(examples): migrate AI examples to OpenTelemetry instrumentation #482
richardsolomou wants to merge 7 commits into master
Conversation
Switch from PostHog direct SDK wrappers to OpenTelemetry auto-instrumentation for all AI provider examples where OTel instrumentations are available. Uses opentelemetry-instrumentation-openai-v2 for OpenAI-compatible providers, opentelemetry-instrumentation-anthropic for Anthropic, opentelemetry-instrumentation-google-generativeai for Gemini, opentelemetry-instrumentation-langchain for LangChain/LangGraph, opentelemetry-instrumentation-llamaindex for LlamaIndex, and opentelemetry-instrumentation-crewai for CrewAI.
Switch from BatchSpanProcessor to SimpleSpanProcessor so spans are exported immediately. This removes the need for provider.shutdown() which could throw export errors on exit.
CrewAI manages its own TracerProvider internally, which conflicts with setting one externally. LiteLLM callbacks remain the correct integration approach for CrewAI.
```python
message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    posthog_distinct_id="example-user",
```
does the otel example show this?
andrewm4894 left a comment:
i think the otel examples should be as full featured as possible and show stuff like setting distinct id and any custom foo-bar properties - am i missing that or something?
Problem
Our AI examples use PostHog's direct SDK wrappers (`posthog.ai.openai`, `posthog.ai.anthropic`, etc.) for tracking LLM calls. We want to silently deprecate these in favor of standard OpenTelemetry instrumentation, which is more portable and follows industry conventions.

Changes
Migrates 28 Python AI examples from PostHog direct wrappers to OpenTelemetry auto-instrumentation:
- OpenAI-compatible providers: opentelemetry-instrumentation-openai-v2
- Anthropic: opentelemetry-instrumentation-anthropic
- Gemini: opentelemetry-instrumentation-google-generativeai
- LangChain/LangGraph: opentelemetry-instrumentation-langchain
- LlamaIndex: opentelemetry-instrumentation-llamaindex

Kept as-is: CrewAI (manages its own TracerProvider, uses LiteLLM callbacks), AWS Bedrock/Pydantic AI (already OTel), LiteLLM/DSPy (LiteLLM callbacks), OpenAI Agents/Claude Agent SDK (native PostHog integration).
Key implementation details:
- `SimpleSpanProcessor` instead of `BatchSpanProcessor`, so spans export immediately without needing `provider.shutdown()`
- Imports that must come after the `Instrumentor().instrument()` calls are marked with `# noqa: E402`