
otlphttp exporter not able to export logs to the service endpoint #5779

Closed
navneet1075 opened this issue Oct 17, 2021 · 7 comments
Labels
question (Further information is requested), Stale, waiting for author

Comments

@navneet1075

Describe the bug
The otlphttp exporter does not work.

Steps to reproduce
Used the filelog receiver to read files and then sent the output to a sink using the otlphttp exporter.

What did you expect to see?
The exporter should be able to send the logs.

What did you see instead?
I don't see any logs being sent after the files are read.

This is the log output I see for the collector container:


2021-10-15T15:15:15.357Z	info	service/collector.go:176	Applying configuration...
2021-10-15T15:15:15.357Z	info	builder/exporters_builder.go:265	Exporter was built.	{"kind": "exporter", "name": "otlphttp"}
2021-10-15T15:15:15.357Z	info	builder/pipelines_builder.go:214	Pipeline was built.	{"pipeline_name": "logs", "pipeline_datatype": "logs"}
2021-10-15T15:15:15.358Z	info	builder/receivers_builder.go:228	Receiver was built.	{"kind": "receiver", "name": "filelog", "datatype": "logs"}
2021-10-15T15:15:15.358Z	info	service/service.go:101	Starting extensions...
2021-10-15T15:15:15.358Z	info	service/service.go:106	Starting exporters...
2021-10-15T15:15:15.358Z	info	builder/exporters_builder.go:92	Exporter is starting...	{"kind": "exporter", "name": "otlphttp"}
2021-10-15T15:15:15.358Z	info	builder/exporters_builder.go:97	Exporter started.	{"kind": "exporter", "name": "otlphttp"}
2021-10-15T15:15:15.358Z	info	service/service.go:111	Starting processors...
2021-10-15T15:15:15.358Z	info	builder/pipelines_builder.go:51	Pipeline is starting...	{"pipeline_name": "logs", "pipeline_datatype": "logs"}
2021-10-15T15:15:15.358Z	info	builder/pipelines_builder.go:62	Pipeline is started.	{"pipeline_name": "logs", "pipeline_datatype": "logs"}
2021-10-15T15:15:15.358Z	info	service/service.go:116	Starting receivers...
2021-10-15T15:15:15.358Z	info	builder/receivers_builder.go:70	Receiver is starting...	{"kind": "receiver", "name": "filelog"}
2021-10-15T15:15:15.358Z	info	[email protected]/receiver.go:51	Starting stanza receiver	{"kind": "receiver", "name": "filelog"}
2021-10-15T15:15:15.359Z	info	builder/receivers_builder.go:75	Receiver started.	{"kind": "receiver", "name": "filelog"}
2021-10-15T15:15:15.359Z	info	service/telemetry.go:65	Setting up own telemetry...
2021-10-15T15:15:15.360Z	info	service/telemetry.go:113	Serving Prometheus metrics	{"address": ":8888", "level": 0, "service.instance.id": "49496ae7-2395-4d05-9048-0eef1772bf13"}
2021-10-15T15:15:15.360Z	info	service/collector.go:230	Starting otelcontribcol...	{"Version": "v0.36.0", "NumCPU": 8}
2021-10-15T15:15:15.360Z	info	service/collector.go:134	Everything is ready. Begin running and processing data.
2021-10-15T15:15:15.559Z	warn	no files match the configured include patterns	{"kind": "receiver", "name": "filelog", "operator_id": "$.file_input", "operator_type": "file_input", "include": ["/var/logs/*.json"]}
2021-10-15T15:15:44.325Z	info	Started watching file	{"kind": "receiver", "name": "filelog", "operator_id": "$.file_input", "operator_type": "file_input", "path": "/var/logs/3UpsTCLI.json"}
2021-10-15T15:15:47.124Z	info	Started watching file	{"kind": "receiver", "name": "filelog", "operator_id": "$.file_input", "operator_type": "file_input", "path": "/var/logs/LpYaUN4J.json"}
2021-10-15T15:15:48.724Z	info	Started watching file	{"kind": "receiver", "name": "filelog", "operator_id": "$.file_input", "operator_type": "file_input", "path": "/var/logs/HEQkuviM.json"}

I don't see the exporter doing anything in the logs, even though I can clearly see that the exporter started and there is no error there.

The service is running as a ClusterIP service in the cluster and the URL is correct:

 k describe service backend-otel                                                                                                                                           (k3d-audit-logging/default)
Name:              backend-otel
Namespace:         default
Labels:            <none>
Annotations:       <none>
Selector:          app=backend-deploy-otel
Type:              ClusterIP
IP:                10.43.246.51
Port:              http  9303/TCP
TargetPort:        9303/TCP
Endpoints:         10.42.0.64:9303

What version did you use?
v0.36.0 (per the "Starting otelcontribcol" line in the collector log above).

What config did you use?

apiVersion: v1
kind: ConfigMap
metadata:
  name: otel-collector-config
data:
  config.yaml: |
    receivers:
      filelog:
        include: [ /var/logs/*.json ]
        start_at: beginning
        operators:
          - type: regex_parser
            regex: '^Host=(?P<host>[^,]+), Type=(?P<type>.*)$'
            parse_from: message

    exporters:
      otlphttp:
        logs_endpoint: "backend-otel.default:9303/audit-log/logging/v1/eventLogs"

This service points to a deployment that exposes the endpoint /audit-log/logging/v1/eventLogs. I am able to hit the endpoint when I port-forward the service and open it in the browser, but I am not able to reach it through the exporter.
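
For comparison, the otlphttp exporter endpoints are normally full URLs that include the scheme; a minimal sketch of the exporter section under that assumption (plain HTTP inside the cluster) would be:

exporters:
  otlphttp:
    # assumption: the backend listens on plain HTTP, so the endpoint carries an explicit http:// scheme
    logs_endpoint: "http://backend-otel.default:9303/audit-log/logging/v1/eventLogs"
    tls:
      insecure: true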

Please let me know what is wrong here and, if possible, can someone give a correct example of using the filelog receiver with the otlphttp exporter? Thanks.

Environment
OS: macOS

@navneet1075 added the bug (Something isn't working) label on Oct 17, 2021
@navneet1075 changed the title from "otlhttp exporter not able to export logs to the service endpoint" to "otlphttp exporter not able to export logs to the service endpoint" on Oct 18, 2021
@github-actions
Contributor

github-actions bot commented Nov 7, 2022

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions github-actions bot added the Stale label Nov 7, 2022
@jpkrohling
Member

It's hard to tell what's going on. Please add the logging exporter with verbosity: detailed and report back. It's unclear if the problem is the lack of received log entries or a connection to the remote endpoint.

For reference, I was testing a local service I'm building and I didn't have problems with the OTLP HTTP Exporter for Logs. My configuration was something like:

receivers:
  journald:
    directory: /var/log/journal/a04e3a44cdd740f88d6a7ae3bb8c70cf

exporters:
  logging:
    verbosity: detailed
  otlphttp:
    endpoint: http://localhost:8000/otlp
    tls:
      insecure: true
    headers:
      "X-Scope-OrgID": 1

processors:

service:
  extensions: []
  pipelines:
    logs:
      receivers: [journald]
      processors: []
      exporters: [logging, otlphttp]
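
Adapted to the original filelog setup, a rough sketch (untested; it assumes the backend-otel service above speaks plain HTTP) might be:

receivers:
  filelog:
    include: [ /var/logs/*.json ]
    start_at: beginning

exporters:
  logging:
    verbosity: detailed
  otlphttp:
    # assumption: explicit http:// scheme for the in-cluster, plain-HTTP backend
    logs_endpoint: "http://backend-otel.default:9303/audit-log/logging/v1/eventLogs"
    tls:
      insecure: true

service:
  pipelines:
    logs:
      receivers: [filelog]
      processors: []
      exporters: [logging, otlphttp]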

@jpkrohling added the question (Further information is requested) label and removed the Stale and bug (Something isn't working) labels on Nov 29, 2022
@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions github-actions bot added the Stale label Jan 30, 2023
@jpkrohling
Member

Perhaps related to open-telemetry/opentelemetry-collector#7009 ?

@github-actions
Contributor

github-actions bot commented Jun 2, 2023

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions github-actions bot added the Stale label Jun 2, 2023
@jpkrohling
Member

Closing, as we didn't hear back after a few months...

@yasha145

yasha145 commented Mar 7, 2024

Hello @jpkrohling,
I am facing the same issue:

2024-03-07T14:12:07.150+0530 info [email protected]/service.go:169 Everything is ready. Begin running and processing data.
2024-03-07T14:12:07.150+0530 warn localhostgate/featuregate.go:63 The default endpoints for all servers in components will change to use localhost instead of 0.0.0.0 in a future version. Use the feature gate to preview the new default. {"feature gate ID": "component.UseLocalHostAsDefaultHost"}
2024-03-07T14:12:07.352+0530 debug fileconsumer/file.go:133 matched files {"kind": "receiver", "name": "otlpjsonfile", "data_type": "metrics", "component": "fileconsumer", "paths": ["/Users/Documents/jsonFile.json"]}
2024-03-07T14:12:07.352+0530 debug fileconsumer/file.go:170 Consuming files{paths 1 0 [/Users/Documents/jsonFile.json]} {"kind": "receiver", "name": "otlpjsonfile", "data_type": "metrics", "component": "fileconsumer"}
2024-03-07T14:12:07.352+0530 info fileconsumer/file.go:260 Started watching file {"kind": "receiver", "name": "otlpjsonfile", "data_type": "metrics", "component": "fileconsumer", "path": "/Users/Documents/jsonFile.json"}

These log messages keep getting printed repeatedly, and I don't know how to proceed further.
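
Following the earlier suggestion in this thread, a sketch of wiring the logging exporter with detailed verbosity into the metrics pipeline (assuming the rest of the otlpjsonfile config stays as-is) could at least show whether any records are being consumed:

exporters:
  logging:
    verbosity: detailed

service:
  pipelines:
    metrics:
      receivers: [otlpjsonfile]
      processors: []
      # re-add the existing exporter(s) alongside logging once records show up
      exporters: [logging]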
