
Add support for traces in Cloud Foundry receiver #33914

Open
jriguera opened this issue Jul 4, 2024 · 3 comments
Labels
enhancement (New feature or request) · never stale (Issues marked with this label will never be staled and automatically removed) · receiver/cloudfoundry

Comments

@jriguera
Contributor

jriguera commented Jul 4, 2024

Component(s)

receiver/cloudfoundry

Is your feature request related to a problem? Please describe.

Applications running in Cloud Foundry can send traces directly to an OTLP endpoint, but the trace ID for the root span is generated (if not already present) by the Gorouters (the reverse proxy in front of the applications' HTTP routes). When a user queries spans or traces in the backend (Grafana Tempo in our case), they get a <root span not yet received> message because there is no way to ingest those root spans into the backend.

Following PR #33044, we would like to continue adding support for traces in Cloud Foundry.

Describe the solution you'd like

Implement parsing of CF Gorouter access logs (source_id RTR) to convert them to Spans if possible.

The Gorouter access log is a text-based format with fields separated by spaces; optional extra headers are appended in the format key:"value":

www.example.com - [2024-05-21T15:40:13.892179798Z] "GET /articles/ssdfws HTTP/1.1" 200 0 110563 "-" "python-requests/2.26.0" "20.191.2.244:52238" "10.88.195.81:61222" x_forwarded_for:"18.21.57.150, 10.28.45.29, 35.16.25.46, 20.191.2.244" x_forwarded_proto:"https" vcap_request_id:"766afb19-1779-4bb9-65d4-f01306f9f912" response_time:0.191835 gorouter_time:0.000139 app_id:"e3267823-0938-43ce-85ff-003e3e3a5a23" app_index:"4" instance_id:"918dd283-a0ed-48be-7f0c-253b" x_cf_routererror:"-" x_forwarded_host:"www.example.com" x_b3_traceid:"766afb1917794bb965d4f01306f9f912" x_b3_spanid:"65d4f01306f9f912" x_b3_parentspanid:"-" b3:"766afb1917794bb965d4f01306f9f912-65d4f01306f9f912" traceparent:"00-766afb1917794bb965d4f01306f9f912-65d4f01306f9f912-01" tracestate:"gorouter=65d4f01306f9f912"
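As a first step, the trailing key:"value" pairs can be pulled out of a line like the one above. This is a minimal sketch (the function and regex names are hypothetical, not part of the receiver's code):

```go
package main

import (
	"fmt"
	"regexp"
)

// extraHeaderRe matches the optional trailing key:"value" pairs that the
// Gorouter appends to each access log line.
var extraHeaderRe = regexp.MustCompile(`(\w+):"([^"]*)"`)

// parseExtraHeaders returns the key:"value" pairs found in a log line.
func parseExtraHeaders(line string) map[string]string {
	out := make(map[string]string)
	for _, m := range extraHeaderRe.FindAllStringSubmatch(line, -1) {
		out[m[1]] = m[2]
	}
	return out
}

func main() {
	line := `www.example.com - [2024-05-21T15:40:13.892179798Z] "GET /articles/ssdfws HTTP/1.1" 200 0 110563 "-" "python-requests/2.26.0" app_id:"e3267823-0938-43ce-85ff-003e3e3a5a23" traceparent:"00-766afb1917794bb965d4f01306f9f912-65d4f01306f9f912-01"`
	h := parseExtraHeaders(line)
	fmt.Println(h["traceparent"])
}
```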

The idea is to parse each log line, look for the W3C standard tracing headers, fall back to the Zipkin headers if those are not present, and convert all the information into a span.
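That fallback logic could be sketched roughly as below, assuming the extra headers have already been parsed into a map. The helper name and exact validation are illustrative, not the receiver's actual implementation:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// extractTraceIDs pulls the trace and span IDs from the parsed extra headers,
// preferring the W3C traceparent header and falling back to the Zipkin b3
// single header.
func extractTraceIDs(headers map[string]string) (traceID, spanID string, err error) {
	if tp := headers["traceparent"]; tp != "" && tp != "-" {
		// W3C format: version-traceid-spanid-flags
		parts := strings.Split(tp, "-")
		if len(parts) == 4 && len(parts[1]) == 32 && len(parts[2]) == 16 {
			return parts[1], parts[2], nil
		}
	}
	if b3 := headers["b3"]; b3 != "" && b3 != "-" {
		// Zipkin single-header format: traceid-spanid[-sampled[-parentspanid]]
		parts := strings.Split(b3, "-")
		if len(parts) >= 2 && len(parts[1]) == 16 {
			return parts[0], parts[1], nil
		}
	}
	return "", "", errors.New("no usable tracing headers in log line")
}

func main() {
	h := map[string]string{"traceparent": "00-766afb1917794bb965d4f01306f9f912-65d4f01306f9f912-01"}
	tid, sid, _ := extractTraceIDs(h)
	fmt.Println(tid, sid)
}
```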

Moreover, log records can get their traceID and spanID fields filled by the same implementation.

Describe alternatives you've considered

Currently there is no alternative way to get CF root spans into OTel.

Additional context

Enabling tracing in CF: https://docs.cloudfoundry.org/adminguide/w3c_tracing.html

CF loggregator architecture: https://docs.cloudfoundry.org/loggregator/architecture.html

@jriguera jriguera added enhancement New feature or request needs triage New item requiring triage labels Jul 4, 2024
Contributor

github-actions bot commented Jul 4, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

Contributor

github-actions bot commented Sep 9, 2024

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Sep 9, 2024
@crobert-1 crobert-1 added never stale Issues marked with this label will be never staled and automatically removed and removed Stale labels Sep 9, 2024
@jriguera
Contributor Author

Still working on it.
