
Sending logs with http to otlp receiver not working as expected #7009

Closed
noamisr opened this issue Jan 24, 2023 · 9 comments
Labels
area:receiver, bug

Comments

@noamisr

noamisr commented Jan 24, 2023

Describe the bug
After configuring a simple pipeline in my otel-collector and trying to send logs to the OTLP receiver via HTTP, the service returned 200 OK with an empty partialSuccess.

Steps to reproduce
Using the latest collector with the following config file:

receivers:
  otlp:
    protocols:
      grpc:
      http:

exporters:
  logging:
    loglevel: "debug"

service:
  telemetry:
    logs:
      level: "debug"
    metrics:
      level: detailed
      address: 0.0.0.0:8888
  pipelines:
    logs:
      receivers: [otlp]
      processors: []
      exporters: [logging]
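
For context, a minimal way to run the collector with the config above (a sketch, assuming the otelcol-contrib binary from the contrib distribution and that the config is saved as config.yaml; both names are illustrative):

# run the contrib collector with the reproduction config
otelcol-contrib --config ./config.yaml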

Then send a simple HTTP request, for example with curl:

curl -i http://otel-collector.opentelemetry-system:4318/v1/logs -X POST -H "Content-Type: application/json" -d '{}'

What did you expect to see?

Some log output in the collector's logs indicating that the OTLP receiver handled the HTTP request.
The collector's metrics also look fine:

# HELP otelcol_exporter_enqueue_failed_log_records Number of log records failed to be added to the sending queue.
# TYPE otelcol_exporter_enqueue_failed_log_records counter
otelcol_exporter_enqueue_failed_log_records{exporter="logging",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0
otelcol_exporter_enqueue_failed_log_records{exporter="otlp",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0
# HELP otelcol_exporter_enqueue_failed_metric_points Number of metric points failed to be added to the sending queue.
# TYPE otelcol_exporter_enqueue_failed_metric_points counter
otelcol_exporter_enqueue_failed_metric_points{exporter="logging",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0
otelcol_exporter_enqueue_failed_metric_points{exporter="otlp",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0
# HELP otelcol_exporter_enqueue_failed_spans Number of spans failed to be added to the sending queue.
# TYPE otelcol_exporter_enqueue_failed_spans counter
otelcol_exporter_enqueue_failed_spans{exporter="logging",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0
otelcol_exporter_enqueue_failed_spans{exporter="otlp",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0
# HELP otelcol_exporter_queue_capacity Fixed capacity of the retry queue (in batches)
# TYPE otelcol_exporter_queue_capacity gauge
otelcol_exporter_queue_capacity{exporter="otlp",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 5000
# HELP otelcol_exporter_queue_size Current size of the retry queue (in batches)
# TYPE otelcol_exporter_queue_size gauge
otelcol_exporter_queue_size{exporter="otlp",service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0
# HELP otelcol_otelsvc_k8s_pod_added Number of pod add events received
# TYPE otelcol_otelsvc_k8s_pod_added counter
otelcol_otelsvc_k8s_pod_added{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 10
# HELP otelcol_otelsvc_k8s_pod_table_size Size of table containing pod info
# TYPE otelcol_otelsvc_k8s_pod_table_size gauge
otelcol_otelsvc_k8s_pod_table_size{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 19
# HELP otelcol_process_cpu_seconds Total CPU user and system time in seconds
# TYPE otelcol_process_cpu_seconds counter
otelcol_process_cpu_seconds{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 0.69
# HELP otelcol_process_memory_rss Total physical memory (resident set size)
# TYPE otelcol_process_memory_rss gauge
otelcol_process_memory_rss{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 1.47718144e+08
# HELP otelcol_process_runtime_heap_alloc_bytes Bytes of allocated heap objects (see 'go doc runtime.MemStats.HeapAlloc')
# TYPE otelcol_process_runtime_heap_alloc_bytes gauge
otelcol_process_runtime_heap_alloc_bytes{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 7.37950728e+08
# HELP otelcol_process_runtime_total_alloc_bytes Cumulative bytes allocated for heap objects (see 'go doc runtime.MemStats.TotalAlloc')
# TYPE otelcol_process_runtime_total_alloc_bytes counter
otelcol_process_runtime_total_alloc_bytes{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 7.52518944e+08
# HELP otelcol_process_runtime_total_sys_memory_bytes Total bytes of memory obtained from the OS (see 'go doc runtime.MemStats.Sys')
# TYPE otelcol_process_runtime_total_sys_memory_bytes gauge
otelcol_process_runtime_total_sys_memory_bytes{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 7.79155896e+08
# HELP otelcol_process_uptime Uptime of the process
# TYPE otelcol_process_uptime counter
otelcol_process_uptime{service_instance_id="d8961e6e-991b-4b28-a29f-ed31e9089bb5",service_name="otelcol-contrib",service_version="0.70.0"} 277.293832365

What did you see instead?

No logs at all, neither from the logging exporter nor in the collector's own logs.

What version did you use?
Version: v0.70.0

What config did you use?
Config: (e.g. the yaml config file)

receivers:
  otlp:
    protocols:
      grpc:
      http:

exporters:
  logging:
    loglevel: "debug"

service:
  telemetry:
    logs:
      level: "debug"
    metrics:
      level: detailed
      address: 0.0.0.0:8888
  pipelines:
    logs:
      receivers: [otlp]
      processors: []
      exporters: [logging]

Environment
OS: "Ubuntu 20.04"

@noamisr added the bug label on Jan 24, 2023
@noamisr
Author

noamisr commented Jan 25, 2023

After investigating the same problem with traces, which also get an empty partialSuccess, it seems that when the format of the logs/traces payload is incorrect, the collector just returns an empty partialSuccess and nothing more: no error/warning logs from the collector, and no non-200 status code.

Is this on purpose?

@bogdandrutu
Member

You sent an empty request, is that an error? What should partialSuccess contain when the request is empty?

@evan-bradley
Contributor

@bogdandrutu The OTLP specification says that partial success messages can contain a rejected count of 0 and have a message that gives a warning when the request was fully accepted but is not entirely valid. Would a message indicating "received empty request" make sense here? Alternatively, would it make sense to print a debug log?

This also occurs with OTLP messages with JSON encoding when an unknown key is present in the payload (I have seen this come up when instrumentationLibrary is used instead of instrumentationScope). I think it would likely make sense to provide some information to the user that there are issues with their payload, whether through a log or in the response.
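
For illustration only, a partial-success response of the kind described above might look roughly like this (field names follow the OTLP/HTTP JSON encoding of ExportLogsPartialSuccess; the errorMessage text is a hypothetical example, and the exact number encoding can vary by implementation):

{
  "partialSuccess": {
    "rejectedLogRecords": 0,
    "errorMessage": "received empty request"
  }
}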

@noamisr
Author

noamisr commented Jan 27, 2023

@evan-bradley thank you for your answer!
Yes, the missing feature here is an indication to the user that something is not valid.
Some of the improvements you mentioned would be a big win for the UX.

Another thing I thought would help developers understand what is happening is adding some access logs to the HTTP receiver. It took me a while to figure out that the requests were actually reaching the collector's receiver.

@jpkrohling
Member

IMO, we need a troubleshooting guide showing people how to determine whether things are arriving at the collector and if they are flowing through components up to the final destination.
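
One existing option along these lines (a rough sketch, not something discussed in the thread) is the zpages extension, which serves the collector's own debug pages and can show spans for receiver request handling. A configuration sketch, assuming the conventional default endpoint:

extensions:
  zpages:
    endpoint: localhost:55679

service:
  extensions: [zpages]
  # existing pipelines stay unchanged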

@ff-sdesai

After investigating the same problem with traces, which also get an empty partialSuccess, it seems that when the format of the logs/traces payload is incorrect, the collector just returns an empty partialSuccess and nothing more: no error/warning logs from the collector, and no non-200 status code.

Is this on purpose?

@noamisr Can you please send me a sample json that can be used for logs?

@noamisr
Author

noamisr commented Jan 31, 2023

@noamisr Can you please send me a sample json that can be used for logs?

I still don't have a working sample of a JSON log. I assumed that the explanation for why the traces endpoint behaves like this also applies to the logs endpoint.

Does that answer your question?

@ff-sdesai

I still don't have a working sample of a JSON log. I assumed that the explanation for why the traces endpoint behaves like this also applies to the logs endpoint.

Does that answer your question?

Yes. Btw, I found sample payloads for both logs and traces at https://opentelemetry.io/docs/reference/specification/protocol/file-exporter/#examples
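
For later readers, a minimal OTLP/JSON logs request modeled on those examples might look roughly like this (the endpoint is the one from the reproduction above; the attribute, timestamp, and body values are purely illustrative):

curl -i http://otel-collector.opentelemetry-system:4318/v1/logs -X POST \
  -H "Content-Type: application/json" \
  -d '{"resourceLogs":[{"resource":{"attributes":[{"key":"service.name","value":{"stringValue":"curl-test"}}]},"scopeLogs":[{"scope":{},"logRecords":[{"timeUnixNano":"1674550000000000000","severityNumber":9,"severityText":"INFO","body":{"stringValue":"hello from curl"}}]}]}]}'

With the config above, a request along these lines should then appear in the logging exporter's debug output.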

@atoulme
Contributor

atoulme commented Dec 14, 2023

I am moving to close this issue as inactive. Please comment and reopen if more work is needed.

@atoulme closed this as not planned on Dec 14, 2023