| Status | |
| ------------- |-----------|
| Stability | [alpha]: traces, metrics, logs |
| Distributions | contrib |
| Warnings | Identity Conflict |
| Code Owners | @jriguera |
The context processor modifies the context metadata of spans, logs, or metrics. Please refer to config.go for the config spec.
Typical use cases:
- Dynamically define tenants for Mimir/Cortex.
- Dynamically define metadata attributes in the context, to pass resource attributes on to extensions.
- Change metadata generated by the receivers.
It takes a list of actions which are performed in the order specified in the config. The supported actions are:
- `insert`: Inserts a new attribute in input data where the key does not already exist.
- `update`: Updates an attribute in input data where the key does exist.
- `upsert`: Performs insert or update. Inserts a new attribute in input data where the key does not already exist and updates an attribute in input data where the key does exist.
- `delete`: Deletes an attribute from the input data.
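As a sketch (the keys and values here are illustrative, not part of the spec), the four actions could be combined in a single processor:

```yaml
processors:
  context/demo:
    actions:
      # insert: adds x-tenant only if it is absent from the context metadata.
      - action: insert
        key: x-tenant
        value: default
      # update: overwrites x-env only when it already exists.
      - action: update
        key: x-env
        value: production
      # upsert: copies user-id into x-user, inserting or overwriting as needed.
      - action: upsert
        key: x-user
        from_attribute: user-id
      # delete: removes the authorization key from the context metadata.
      - action: delete
        key: authorization
```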
For the actions `insert`, `update` and `upsert`:

- `key` is required
- `value` and/or `from_attribute` are required
- `action` is required
```yaml
# Key specifies the attribute to act upon.
- key: <key>
  action: {insert, update, upsert}
  # Value specifies the value to populate for the key.
  value: <value>
```

```yaml
# Key specifies the attribute to act upon.
- key: <key>
  action: {insert, update, upsert}
  # FromAttribute specifies the attribute from the context metadata to use to populate
  # the value. If the attribute doesn't exist, value is used.
  from_attribute: <other key>
  value: <value>
```
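For example (the metadata key names here are illustrative), the following copies the tenant from an incoming metadata key and falls back to a literal value when that key is absent:

```yaml
- key: x-scope-orgid
  action: upsert
  from_attribute: x-tenant
  value: anonymous
```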
For the `delete` action:

- `key` is required
- `action: delete` is required
```yaml
# Key specifies the attribute to act upon.
- key: <key>
  action: delete
```
The list of actions can be composed to create rich scenarios, such as back-filling attributes, copying values to a new key, or redacting sensitive information. The following is a sample configuration:
```yaml
processors:
  context/example:
    actions:
      - action: upsert
        key: tenant
        value: anonymous
        from_attribute: service.name
      - action: delete
        key: tenant
```
It is highly recommended to use this processor together with the groupbyattrs processor; the batch processor can also be used. This is an example configuration:
```yaml
extensions:
  headers_setter:
    headers:
      - action: update
        key: X-Scope-OrgID
        from_context: x-scope-orgid

receivers:
  otlp:
    protocols:
      grpc:
        include_metadata: true
      http:
        include_metadata: true

processors:
  batch/tenant:
    send_batch_size: 1000
    send_batch_max_size: 2000
    metadata_keys:
      - x-scope-orgid
    timeout: 1s
  groupbyattrs/tenant:
    keys: [tenant]
  context/tenant:
    actions:
      - action: upsert
        key: x-scope-orgid
        value: anonymous
        from_attribute: tenant
  # In this example, each tenant has its own namespace. Data can come from different clusters!
  transform/tenant:
    error_mode: ignore
    metric_statements:
      - context: resource
        statements:
          - set(cache["tenant"], "anonymous")
          - set(cache["tenant"], attributes["k8s.namespace.name"])
          - set(attributes["tenant"], cache["tenant"]) where attributes["tenant"] == nil or attributes["tenant"] == ""
    log_statements:
      - context: resource
        statements:
          - set(cache["tenant"], "anonymous")
          - set(cache["tenant"], attributes["k8s.namespace.name"])
          - set(attributes["tenant"], cache["tenant"]) where attributes["tenant"] == nil or attributes["tenant"] == ""
    trace_statements:
      - context: resource
        statements:
          - set(cache["tenant"], "anonymous")
          - set(cache["tenant"], attributes["k8s.namespace.name"])
          - set(attributes["tenant"], cache["tenant"]) where attributes["tenant"] == nil or attributes["tenant"] == ""

exporters:
  prometheusremotewrite/mimir:
    endpoint: "http://mimir-gateway/api/v1/push"
    resource_to_telemetry_conversion:
      enabled: true
    auth:
      authenticator: headers_setter
  otlphttp/loki:
    endpoint: "http://loki-gateway/loki/otlp/v1/logs"
    tls:
      insecure: true
    auth:
      authenticator: headers_setter
  otlp/tempo:
    endpoint: "dns:///tempo-distributor-discovery.ns.svc.cluster.local:4317"
    compression: "gzip"
    tls:
      insecure: true
    auth:
      authenticator: headers_setter

service:
  extensions: [headers_setter]
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [transform/tenant, context/tenant, groupbyattrs/tenant, batch/tenant]
      exporters: [prometheusremotewrite/mimir]
    logs:
      receivers: [otlp]
      processors: [transform/tenant, context/tenant, groupbyattrs/tenant, batch/tenant]
      exporters: [otlphttp/loki]
    traces:
      receivers: [otlp]
      processors: [transform/tenant, context/tenant, groupbyattrs/tenant, batch/tenant]
      exporters: [otlp/tempo]
```
In general, the context processor is a very safe processor to use, but depending on the attribute used for the tenant and the receiver, it can cause a lot of fragmentation, which can affect performance when sending data to the next system. The recommendation is to use it together with the Group by Attributes processor and the Batch processor.