What Is OTLP Ingestion?
OpenTelemetry Protocol (OTLP) is a vendor-neutral standard for trace data. Maxim accepts OTLP traces over HTTP and maps supported semantic convention attributes into Maxim traces, spans, generations, and tool calls.
Before you begin
- A Maxim account and Log Repository
- Your Log Repository ID (for the `x-maxim-repo-id` header)
- Your Maxim API Key (for the `x-maxim-api-key` header)
- Learn how to obtain API keys
Ensure you have created a Log Repository in Maxim and have your Log Repository ID ready. You can find it in the Maxim Dashboard under Logs > Repositories.
Endpoint & Protocol Configuration
Endpoint: `https://api.getmaxim.ai/v1/otel`
Supported Protocols: HTTP with OTLP binary Protobuf or JSON
| Protocol | Content-Type |
|---|---|
| HTTP + Protobuf (binary) | application/x-protobuf or application/protobuf |
| HTTP + JSON | application/json |
- HTTPS/TLS is required.
Authentication Headers
Maxim's OTLP endpoint requires the following headers:
- `x-maxim-repo-id`: Your Maxim Log Repository ID
- `x-maxim-api-key`: Your Maxim API Key
- `Content-Type`: `application/json`, `application/x-protobuf`, or `application/protobuf`
Supported Trace Format
Maxim currently supports OTLP traces using the following semantic conventions:
- OpenTelemetry GenAI conventions (`gen_ai.*`)
- OpenInference conventions (`llm.*`)
- AI SDK conventions (`ai.*`)
Conventions and support
- OpenTelemetry GenAI: Generative AI Semantic Conventions
- OpenInference: OpenInference Semantic Conventions
- AI SDK: AI SDK Semantic Conventions
Quick start (OTLP JSON)
Use OTLP JSON with the required headers. Ingestion via OTLP also supports the latest version (v1.39.0) of the OpenTelemetry Semantic Conventions.
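A minimal request sketch (the endpoint and headers are as documented above; the environment variable names and `payload.json` are placeholders):

```shell
# Send an OTLP/JSON trace export to Maxim's OTLP endpoint.
curl -X POST "https://api.getmaxim.ai/v1/otel" \
  -H "Content-Type: application/json" \
  -H "x-maxim-repo-id: $MAXIM_REPO_ID" \
  -H "x-maxim-api-key: $MAXIM_API_KEY" \
  --data-binary @payload.json
```

On success, the endpoint responds with `{ "data": { "success": true } }` (see Error Codes and Responses below).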
Best Practices
- Use binary Protobuf (`application/x-protobuf`) for optimal performance and robustness
- Batch traces to reduce network overhead
- Include rich attributes following supported conventions (`gen_ai.*`, `llm.*`, or `ai.*`)
- Secure your headers and avoid exposing credentials
- Monitor attribute size limits and apply appropriate quotas
Error Codes and Responses
| HTTP Status | Condition | Description |
|---|---|---|
| 200 | Success | { "data": { "success": true } } |
| 403 | Missing or invalid headers - x-maxim-repo-id or x-maxim-api-key | { "code": 403, "message": "Invalid access error" } |
Examples
OpenTelemetry GenAI: simple chat
Save as `payload.json` and send with the curl command above:
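A minimal sketch of such a payload in OTLP/JSON encoding (trace/span ids, timestamps, and model values are illustrative; the `gen_ai.*` keys follow the OpenTelemetry GenAI conventions):

```json
{
  "resourceSpans": [
    {
      "resource": {
        "attributes": [
          { "key": "service.name", "value": { "stringValue": "my-llm-app" } }
        ]
      },
      "scopeSpans": [
        {
          "scope": { "name": "example-instrumentation" },
          "spans": [
            {
              "traceId": "5b8efff798038103d269b633813fc60c",
              "spanId": "eee19b7ec3c1b174",
              "name": "chat gpt-4o",
              "kind": 3,
              "startTimeUnixNano": "1717000000000000000",
              "endTimeUnixNano": "1717000001500000000",
              "attributes": [
                { "key": "gen_ai.operation.name", "value": { "stringValue": "chat" } },
                { "key": "gen_ai.system", "value": { "stringValue": "openai" } },
                { "key": "gen_ai.request.model", "value": { "stringValue": "gpt-4o" } },
                { "key": "gen_ai.usage.input_tokens", "value": { "intValue": "12" } },
                { "key": "gen_ai.usage.output_tokens", "value": { "intValue": "34" } }
              ]
            }
          ]
        }
      ]
    }
  ]
}
```

Note that OTLP/JSON encodes 64-bit integers (timestamps, token counts) as strings, per the Protobuf JSON mapping.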
OpenTelemetry GenAI: additional Maxim metadata
For GenAI spans, pass additional values in `maxim.metadata` JSON.
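For example, the span's attribute list could include a `maxim.metadata` entry whose value is a JSON string (the keys shown are from the Maxim-specific keys documented below; the values are illustrative):

```json
{
  "key": "maxim.metadata",
  "value": {
    "stringValue": "{\"maxim-trace-name\":\"support-chat\",\"maxim-tags\":{\"env\":\"prod\"},\"maxim-metrics\":{\"quality\":0.92}}"
  }
}
```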
OpenInference: minimal `llm.*` + `openinference.span.kind` payload
For OpenInference, include `llm.*` attributes and set `openinference.span.kind`.
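An illustrative attribute list for such a span (message values are placeholders; the flattened `llm.input_messages.N.message.*` keys follow OpenInference's convention for message lists):

```json
{
  "attributes": [
    { "key": "openinference.span.kind", "value": { "stringValue": "LLM" } },
    { "key": "llm.model_name", "value": { "stringValue": "gpt-4o" } },
    { "key": "llm.input_messages.0.message.role", "value": { "stringValue": "user" } },
    { "key": "llm.input_messages.0.message.content", "value": { "stringValue": "Hello!" } },
    { "key": "llm.output_messages.0.message.role", "value": { "stringValue": "assistant" } },
    { "key": "llm.output_messages.0.message.content", "value": { "stringValue": "Hi there!" } }
  ]
}
```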
AI SDK: simple chat with ai.* attributes
For AI SDK spans, use `ai.*` attributes. Messages can be provided via the `ai.prompt.messages` attribute or via events (`ai.prompt.system`, `ai.prompt.prompt`, `ai.response.text`, etc.).
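An illustrative attribute list (values are placeholders; `operation.name` is an assumption based on the AI SDK's telemetry output, and `ai.prompt.messages` carries the message array as a JSON string):

```json
{
  "attributes": [
    { "key": "operation.name", "value": { "stringValue": "ai.generateText" } },
    { "key": "ai.model.id", "value": { "stringValue": "gpt-4o" } },
    { "key": "ai.model.provider", "value": { "stringValue": "openai" } },
    { "key": "ai.prompt.messages", "value": { "stringValue": "[{\"role\":\"user\",\"content\":\"Hello!\"}]" } },
    { "key": "ai.response.text", "value": { "stringValue": "Hi there!" } }
  ]
}
```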
AI SDK: additional Maxim metadata
For AI SDK spans, pass additional values in `maxim.metadata` JSON.
OTLP: trace linked to a session via maxim.metadata
On the root span of a trace, set `maxim.metadata` (or `metadata`) to a JSON string that includes `maxim-session-id`. Maxim will create the session (with optional name, tags, and metrics) and attach the OTLP trace to that session. Use the same OTLP `traceId` for every span that belongs to one logical user interaction so traces group correctly in the UI.
OTLP: session end marker span (maxim.session.end)
To end a session from OTLP without going through the Logging API, send a root-level span whose only required Maxim-specific attribute is `maxim.session.end`, with a string value equal to the session id. The span's `endTimeUnixNano` is used as the session end time. That span does not emit ordinary trace or child-span logs; it only closes the session. If this span appears in the same OTLP payload as other spans for the same conversation, give it the same `traceId` as those spans so the export appears as a single trace in Maxim.

Linking traces to sessions via OTLP
When you ingest OTLP traces, Maxim can create a session and associate the trace with it if the root span (a span with no parent in the payload, or whose parent is missing from the batch) includes session details inside structured metadata.
Where to put session fields
Put session fields in a JSON object serialized as the string value of either:
- `maxim.metadata`, or
- `metadata`

Supported keys (all except `maxim-session-id` are optional):
| Key | Purpose |
|---|---|
| `maxim-session-id` | Required to link the trace to a session. Non-empty string; becomes the session id in Maxim. |
| `maxim-session-name` | Display name for the session. |
| `maxim-session-tags` | Object map of string tags applied to the session (values may be coerced to strings). |
| `maxim-session-metrics` | Object map of numeric metrics (numbers, or strings that parse as finite numbers). |
The same JSON object can also carry the trace- and span-level keys `maxim-trace-name`, `maxim-trace-tags`, `maxim-tags`, `maxim-trace-metrics`, and `maxim-metrics`.
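Put together, a root span could carry a `maxim.metadata` attribute like this (session id, name, tag, and metric values are illustrative; the keys are from the table above):

```json
{
  "key": "maxim.metadata",
  "value": {
    "stringValue": "{\"maxim-session-id\":\"sess-123\",\"maxim-session-name\":\"Support chat\",\"maxim-session-tags\":{\"tier\":\"pro\"},\"maxim-session-metrics\":{\"turns\":3}}"
  }
}
```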
Ending a session
A session can be closed in more than one way. Choose the option that fits how you instrument your app (pure OTLP, hybrid with HTTP APIs, or the SDK elsewhere).
Behavior you should know
- Trace end ≠ session end. When a root OTLP span finishes, Maxim records a trace end with that span’s end time. That does not end the session.
- Session end is a separate signal: it finalizes the session lifecycle (including triggers such as session-level evaluations, depending on your setup).
Option 1 — OTLP span attribute maxim.session.end
Add a span attribute:
- Key: `maxim.session.end`
- Value: string (or a value that resolves to a string), equal to the session id you want to close.
- The span's `endTimeUnixNano` is used as the session end timestamp.
- No Maxim trace or generation logs are produced from that span; it acts as a session-end marker only.
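A sketch of such a marker span in OTLP/JSON (ids and timestamps are illustrative; `sess-123` stands for the session id you want to close):

```json
{
  "traceId": "5b8efff798038103d269b633813fc60c",
  "spanId": "aaaabbbbccccdddd",
  "name": "session-end",
  "startTimeUnixNano": "1717000002000000000",
  "endTimeUnixNano": "1717000002000000000",
  "attributes": [
    { "key": "maxim.session.end", "value": { "stringValue": "sess-123" } }
  ]
}
```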
Option 2 — Public Logging API (POST /v1/logging)
You can end a session with the single-action public logging endpoint, which is the same API used by Maxim SDKs under the hood for structured logging.
Endpoint: https://api.getmaxim.ai/v1/logging
Headers:
- `Content-Type: application/json`
- `x-maxim-api-key`: your Maxim API key

Body fields:
- `repoId` — Log Repository id (the same repository you use for the OTLP `x-maxim-repo-id` header).
- `entityId` — The session id to end (must match the id you used when creating or linking the session).

The request body specifies `entity`, `action`, and `entityId`. Use this when an HTTP client or backend that does not speak OTLP needs to explicitly close a session (for example after a webhook, batch job, or admin action).
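A sketch of such a request, assuming the `entity` and `action` values for ending a session are `"session"` and `"end"` (confirm the exact body schema against the Logging API reference; `<your-log-repository-id>` and `sess-123` are placeholders):

```shell
# End a session via the public Logging API instead of OTLP.
curl -X POST "https://api.getmaxim.ai/v1/logging" \
  -H "Content-Type: application/json" \
  -H "x-maxim-api-key: $MAXIM_API_KEY" \
  -d '{
    "repoId": "<your-log-repository-id>",
    "entity": "session",
    "action": "end",
    "entityId": "sess-123"
  }'
```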
Other session actions (create, add-trace, add-tag, add-metric, feedback, evaluate, and more) use the same endpoint with different action and data fields. Refer to your OpenAPI / API reference for the full Logging schema if you need those operations.
Choosing OTLP vs Logging API for session end
| Approach | Best when |
|---|---|
| `maxim.session.end` on an OTLP span | Session boundaries are known inside the traced request flow; you already export OTLP to Maxim. |
| `POST /v1/logging` with session + end | A non-OTLP component must end the session, or you already use the public logging API for other entities. |
Maxim-specific additional attributes
Trace and span metadata (inside JSON)
Maxim supports the following additional keys inside the `maxim.metadata` or `metadata` JSON string (in addition to the session fields above):
- `maxim-trace-tags`: `Map<string, string>` for parent trace tags
- `maxim-tags`: `Map<string, string>` for current span/generation tags
- `maxim-trace-metrics`: `Map<string, number>` for parent trace metrics
- `maxim-metrics`: `Map<string, number>` for current span/generation metrics
- `maxim-trace-name`: trace display name (GenAI and AI SDK paths)
Span-level OTLP attributes
- `maxim.session.end` — String value = the session id to end; see Ending a session above. Not part of the JSON metadata blob; set it as a normal OTLP span attribute.
Where to place metadata keys:
- For OpenTelemetry GenAI (`gen_ai.*`): put them inside `maxim.metadata` (or `metadata`)
- For OpenInference (`llm.*`): put them inside `metadata`
- For AI SDK (`ai.*`): put them inside `maxim.metadata` (or `metadata`)