
Not getting metrics when using sidecar-based approach of OTel Collector for Python app #3445

Open
navnitkum opened this issue Nov 11, 2024 · 1 comment
Labels: area:collector (Issues for deploying collector), bug (Something isn't working)

Comments

@navnitkum

Component(s)

auto-instrumentation

Describe the issue you're reporting

collector logs (note: only TracesExporter entries appear; no metrics are ever exported)

2024-11-10T08:54:30.715Z	info	[email protected]/service.go:221	Everything is ready. Begin running and processing data.
2024-11-11T11:02:26.059Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:02:41.030Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-11-11T11:02:42.441Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:03:02.414Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-11-11T11:03:06.030Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:03:16.033Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:03:22.414Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:03:27.414Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:03:41.031Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:03:52.415Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-11-11T11:04:11.033Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:04:12.418Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 2}
2024-11-11T11:04:27.416Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:04:31.033Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:04:56.034Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:05:16.035Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:05:41.037Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:06:01.036Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}
2024-11-11T11:06:31.038Z	info	TracesExporter	{"kind": "exporter", "data_type": "traces", "name": "debug", "resource spans": 1, "spans": 1}

collector yaml

apiVersion: opentelemetry.io/v1beta1
kind: OpenTelemetryCollector
metadata:
  labels:
    app.kubernetes.io/managed-by: opentelemetry-operator
  name: apm
  namespace: kube-tracing
spec:
  args:
    feature-gates: '-component.UseLocalHostAsDefaultHost'
  config:
    exporters:
      debug: {}
      otlphttp/tempo:
        endpoint: http://tempo.monitoring.svc.cluster.local:4318/
        tls:
          insecure: true
      otlphttp/victoriametrics:
        compression: gzip
        encoding: proto
        endpoint: >-
          http://common-victoria-metrics-agent.monitoring.svc.cluster.local:8429/api/v1/push
        tls:
          insecure: true
    extensions:
      health_check: {}
    processors: {}
    receivers:
      otlp:
        protocols:
          grpc:
            endpoint: 0.0.0.0:4317
          http:
            endpoint: 0.0.0.0:4318
    service:
      extensions:
        - health_check
      pipelines:
        metrics:
          exporters:
            - debug
            - otlphttp/victoriametrics
          receivers:
            - otlp
        traces:
          exporters:
            - otlphttp/tempo
            - debug
          receivers:
            - otlp
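As an aside, the gateway config above declares `processors: {}` and neither pipeline lists any processors; the `batch` processor is commonly recommended for collectors. A hedged sketch of the same `service` section with batching added (an efficiency improvement only, unrelated to the missing metrics):

```yaml
# Sketch: adding the standard `batch` processor to both pipelines of the
# gateway config above. Receiver/exporter names match the config in this issue.
processors:
  batch: {}
service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug, otlphttp/victoriametrics]
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp/tempo, debug]
```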

sidecar.yaml

apiVersion: opentelemetry.io/v1beta1
kind: OpenTelemetryCollector
metadata:
  name: sidecar
  namespace: kube-tracing
spec:
  args:
    feature-gates: '-component.UseLocalHostAsDefaultHost'
  config:
    exporters:
      otlphttp:
        endpoint: http://apm-collector.kube-tracing.svc.cluster.local:4318
    receivers:
      otlp:
        protocols:
          grpc: null
          http: null
    service:
      pipelines:
        traces:
          exporters:
            - otlphttp
          receivers:
            - otlp
      telemetry:
        logs:
          level: debug
  configVersions: 3
  daemonSetUpdateStrategy: {}
  deploymentUpdateStrategy: {}
  ingress:
    route: {}
  ipFamilyPolicy: SingleStack
  managementState: managed
  mode: sidecar
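One thing worth noting: the sidecar config above only wires a traces pipeline, so even if the auto-instrumented app does send metrics to the sidecar over OTLP, the sidecar has nowhere to route them. A hedged sketch of the sidecar's `service` section with a metrics pipeline added, reusing the `otlp` receiver and `otlphttp` exporter already defined in that config:

```yaml
# Sketch only: adds a metrics pipeline alongside the existing traces pipeline.
# The `otlp` and `otlphttp` names match the sidecar config in this issue.
service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlphttp]
    metrics:            # without this, metrics reaching the sidecar are dropped
      receivers: [otlp]
      exporters: [otlphttp]
```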

I am not getting any auto-instrumented metrics for my Python app. Deployment:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: dice-server
  namespace: kube-tracing
spec:
  replicas: 1
  selector:
    matchLabels:
      app: dice-server
  template:
    metadata:
      creationTimestamp: null
      labels:
        app: dice-server
      annotations:
        instrumentation.opentelemetry.io/inject-python: 'true'
        instrumentation.opentelemetry.io/python-container-names: dice-server
        sidecar.opentelemetry.io/inject: 'true'
    spec:
      containers:
        - name: dice-server
          image: >-
            asia-southeast2-docker.pkg.dev/lip-prod-sre-3a7f/apm-golang/dice-server:v3
          ports:
            - containerPort: 8080
              protocol: TCP
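For reference, the `instrumentation.opentelemetry.io/inject-python` annotation only takes effect if an `Instrumentation` resource exists in (or is referenced from) the namespace. A minimal hedged sketch of such a resource; the name and the localhost endpoint are assumptions, not taken from this issue:

```yaml
# Hypothetical Instrumentation CR for the kube-tracing namespace; the name
# `python-instrumentation` and the endpoint are illustrative assumptions.
apiVersion: opentelemetry.io/v1alpha1
kind: Instrumentation
metadata:
  name: python-instrumentation
  namespace: kube-tracing
spec:
  exporter:
    # With a sidecar collector injected into the same pod, the app can export
    # to localhost; Python auto-instrumentation uses OTLP/HTTP (port 4318).
    endpoint: http://localhost:4318
  python: {}
```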
@jaronoff97
Contributor

I'm pretty sure this is because the service we create for sidecars doesn't target the collector sidecar correctly; I can look into this next week!

@jaronoff97 jaronoff97 self-assigned this Nov 23, 2024
@jaronoff97 jaronoff97 added bug Something isn't working area:collector Issues for deploying collector and removed needs triage labels Nov 23, 2024