Document how to work with fork process web server models (Gunicorn, uWSGI, etc.) #1609

Merged · 21 commits · Mar 8, 2021
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [Unreleased](https://github.com/open-telemetry/opentelemetry-python/compare/v0.18b0...HEAD)

### Added
- Document how to work with fork process web server models (Gunicorn, uWSGI, etc.)
([#1609](https://github.com/open-telemetry/opentelemetry-python/pull/1609))
- Add `max_attr_value_length` support to Jaeger exporter
  ([#1633](https://github.com/open-telemetry/opentelemetry-python/pull/1633))

8 changes: 6 additions & 2 deletions README.md
@@ -101,11 +101,15 @@ machine.

1. Install scalene using the following command

`pip install scalene`
```sh
pip install scalene
```

2. Run the `scalene` tests on any of the example Python programs

`scalene opentelemetry-<PACKAGE>/tests/performance/resource-usage/<PATH_TO_TEST>/profile_resource_usage_<NAME_OF_TEST>.py`
```sh
scalene opentelemetry-<PACKAGE>/tests/performance/resource-usage/<PATH_TO_TEST>/profile_resource_usage_<NAME_OF_TEST>.py
```


## Documentation
66 changes: 66 additions & 0 deletions docs/examples/fork-process-model/README.rst
@@ -0,0 +1,66 @@
Working With Fork Process Models
================================

The `BatchSpanProcessor` is not fork-safe and doesn't work well with application servers
(Gunicorn, uWSGI) that are based on the pre-fork web server model. The `BatchSpanProcessor`
spawns a background thread to export spans to the telemetry backend. When the server forks, the child
process inherits the lock in whatever state the parent process held it, and a deadlock can occur. We can
use fork hooks to get around this limitation of the span processor: initialize the tracer provider and
span processor in each worker process after the fork.

Please see http://bugs.python.org/issue6721 for background on the problems with Python locks in a
(multi)threaded context combined with fork.
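
For application servers that do not expose a post-fork hook of their own, Python's standard
library offers `os.register_at_fork` (Python 3.7+), which can run the same initialization in
each child process after a fork. The following is a minimal, illustrative sketch of that
approach, assuming the same OTLP setup used in the examples below:

.. code-block:: python

    import os

    from opentelemetry import trace
    from opentelemetry.exporter.otlp.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.resources import Resource
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor


    def init_tracing():
        # Configure the tracer provider and span processor in the child
        # process, after the fork, so the exporter thread belongs to the
        # child and no lock state is inherited from the parent.
        resource = Resource.create(attributes={"service.name": "api-service"})
        trace.set_tracer_provider(TracerProvider(resource=resource))
        span_processor = BatchSpanProcessor(
            OTLPSpanExporter(endpoint="localhost:4317")
        )
        trace.get_tracer_provider().add_span_processor(span_processor)


    # Register the hook before worker processes are forked.
    os.register_at_fork(after_in_child=init_tracing)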

Gunicorn post_fork hook
-----------------------

.. code-block:: python

    from opentelemetry import trace
    from opentelemetry.exporter.otlp.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.resources import Resource
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor


    def post_fork(server, worker):
        server.log.info("Worker spawned (pid: %s)", worker.pid)

        resource = Resource.create(attributes={
            "service.name": "api-service"
        })

        trace.set_tracer_provider(TracerProvider(resource=resource))
        span_processor = BatchSpanProcessor(
            OTLPSpanExporter(endpoint="localhost:4317")
        )
        trace.get_tracer_provider().add_span_processor(span_processor)
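
Gunicorn reads this hook from its configuration file, which is passed with the `-c` flag;
with the layout of the flask-gunicorn example below, that looks like:

.. code-block:: sh

    gunicorn app -c gunicorn.config.py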


uWSGI postfork decorator
------------------------

.. code-block:: python

    from uwsgidecorators import postfork

    from opentelemetry import trace
    from opentelemetry.exporter.otlp.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.resources import Resource
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor


    @postfork
    def init_tracing():
        resource = Resource.create(attributes={
            "service.name": "api-service"
        })

        trace.set_tracer_provider(TracerProvider(resource=resource))
        span_processor = BatchSpanProcessor(
            OTLPSpanExporter(endpoint="localhost:4317")
        )
        trace.get_tracer_provider().add_span_processor(span_processor)
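
Note that the `BatchSpanProcessor` relies on a background exporter thread, and uWSGI does not
enable Python threads unless asked to, so the server should be started with `--enable-threads`
(as in the flask-uwsgi example below):

.. code-block:: sh

    uwsgi --http :8000 --wsgi-file app.py --callable application --master --enable-threads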


The source code for the Flask app examples is available :scm_web:`here <docs/examples/fork-process-model/>`.
11 changes: 11 additions & 0 deletions docs/examples/fork-process-model/flask-gunicorn/README.rst
@@ -0,0 +1,11 @@
Installation
------------
.. code-block:: sh

    pip install -r requirements.txt

Run application
---------------
.. code-block:: sh

    gunicorn app -c gunicorn.config.py
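
Send a test request
-------------------
Assuming an OTLP-capable collector is listening on `localhost:4317` (the endpoint configured
in the post_fork hook), spans can be generated by calling the example endpoint:

.. code-block:: sh

    curl "http://127.0.0.1:8000/fibonacci?n=10"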
58 changes: 58 additions & 0 deletions docs/examples/fork-process-model/flask-gunicorn/app.py
@@ -0,0 +1,58 @@
# Copyright The OpenTelemetry Authors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import flask
from flask import request

from opentelemetry import trace
from opentelemetry.instrumentation.flask import FlaskInstrumentor

application = flask.Flask(__name__)

FlaskInstrumentor().instrument_app(application)


def fib_slow(n):
    if n <= 1:
        return n
    return fib_slow(n - 1) + fib_fast(n - 2)


def fib_fast(n):
    nth_fib = [0] * (n + 2)
    nth_fib[1] = 1
    for i in range(2, n + 1):
        nth_fib[i] = nth_fib[i - 1] + nth_fib[i - 2]
    return nth_fib[n]


@application.route("/fibonacci")
def fibonacci():
    tracer = trace.get_tracer(__name__)
    n = int(request.args.get("n", 1))
    with tracer.start_as_current_span("root"):
        with tracer.start_as_current_span("fib_slow") as slow_span:
            ans = fib_slow(n)
            slow_span.set_attribute("n", n)
            slow_span.set_attribute("nth_fibonacci", ans)
        with tracer.start_as_current_span("fib_fast") as fast_span:
            ans = fib_fast(n)
            fast_span.set_attribute("n", n)
            fast_span.set_attribute("nth_fibonacci", ans)

    return "F({}) is: ({})".format(n, ans)


if __name__ == "__main__":
    application.run()
@@ -0,0 +1,50 @@
# Copyright The OpenTelemetry Authors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from opentelemetry import trace
from opentelemetry.exporter.otlp.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

bind = "127.0.0.1:8000"

# Sample Worker processes
workers = 4
worker_class = "sync"
worker_connections = 1000
timeout = 30
keepalive = 2

# Sample logging
errorlog = "-"
loglevel = "info"
accesslog = "-"
access_log_format = (
    '%(h)s %(l)s %(u)s %(t)s "%(r)s" %(s)s %(b)s "%(f)s" "%(a)s"'
)


def post_fork(server, worker):
    server.log.info("Worker spawned (pid: %s)", worker.pid)

    resource = Resource.create(attributes={"service.name": "api-service"})

    trace.set_tracer_provider(TracerProvider(resource=resource))
    # This uses insecure connection for the purpose of example. Please see the
    # OTLP Exporter documentation for other options.
    span_processor = BatchSpanProcessor(
        OTLPSpanExporter(endpoint="localhost:4317", insecure=True)
    )
    trace.get_tracer_provider().add_span_processor(span_processor)
@@ -0,0 +1,20 @@
click==7.1.2
Flask==1.1.2
googleapis-common-protos==1.52.0
grpcio==1.35.0
gunicorn==20.0.4
itsdangerous==1.1.0
Jinja2==2.11.3
MarkupSafe==1.1.1
opentelemetry-api==0.18b0
opentelemetry-exporter-otlp==0.18b0
opentelemetry-instrumentation==0.18b0
opentelemetry-instrumentation-flask==0.18b1
opentelemetry-instrumentation-wsgi==0.18b1
opentelemetry-sdk==0.18b0
protobuf==3.14.0
six==1.15.0
thrift==0.13.0
uWSGI==2.0.19.1
Werkzeug==1.0.1
wrapt==1.12.1
12 changes: 12 additions & 0 deletions docs/examples/fork-process-model/flask-uwsgi/README.rst
@@ -0,0 +1,12 @@
Installation
------------
.. code-block:: sh

    pip install -r requirements.txt

Run application
---------------

.. code-block:: sh

    uwsgi --http :8000 --wsgi-file app.py --callable application --master --enable-threads
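
Send a test request
-------------------
With the server running on port 8000 and an OTLP collector assumed to be listening on
`localhost:4317` (the endpoint configured in `app.py`), the `/fibonacci` route can be
exercised with:

.. code-block:: sh

    curl "http://localhost:8000/fibonacci?n=10"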
76 changes: 76 additions & 0 deletions docs/examples/fork-process-model/flask-uwsgi/app.py
@@ -0,0 +1,76 @@
# Copyright The OpenTelemetry Authors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import flask
from flask import request
from uwsgidecorators import postfork

from opentelemetry import trace
from opentelemetry.exporter.otlp.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.flask import FlaskInstrumentor
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

application = flask.Flask(__name__)

FlaskInstrumentor().instrument_app(application)


@postfork
def init_tracing():
    resource = Resource.create(attributes={"service.name": "api-service"})

    trace.set_tracer_provider(TracerProvider(resource=resource))
    # This uses insecure connection for the purpose of example. Please see the
    # OTLP Exporter documentation for other options.
    span_processor = BatchSpanProcessor(
        OTLPSpanExporter(endpoint="localhost:4317", insecure=True)
    )
    trace.get_tracer_provider().add_span_processor(span_processor)


def fib_slow(n):
    if n <= 1:
        return n
    return fib_slow(n - 1) + fib_fast(n - 2)


def fib_fast(n):
    nth_fib = [0] * (n + 2)
    nth_fib[1] = 1
    for i in range(2, n + 1):
        nth_fib[i] = nth_fib[i - 1] + nth_fib[i - 2]
    return nth_fib[n]


@application.route("/fibonacci")
def fibonacci():
    tracer = trace.get_tracer(__name__)
    n = int(request.args.get("n", 1))
    with tracer.start_as_current_span("root"):
        with tracer.start_as_current_span("fib_slow") as slow_span:
            ans = fib_slow(n)
            slow_span.set_attribute("n", n)
            slow_span.set_attribute("nth_fibonacci", ans)
        with tracer.start_as_current_span("fib_fast") as fast_span:
            ans = fib_fast(n)
            fast_span.set_attribute("n", n)
            fast_span.set_attribute("nth_fibonacci", ans)

    return "F({}) is: ({})".format(n, ans)


if __name__ == "__main__":
    application.run()
20 changes: 20 additions & 0 deletions docs/examples/fork-process-model/flask-uwsgi/requirements.txt
@@ -0,0 +1,20 @@
click==7.1.2
Flask==1.1.2
googleapis-common-protos==1.52.0
grpcio==1.35.0
gunicorn==20.0.4
itsdangerous==1.1.0
Jinja2==2.11.3
MarkupSafe==1.1.1
opentelemetry-api==0.18b0
opentelemetry-exporter-otlp==0.18b0
opentelemetry-instrumentation==0.18b0
opentelemetry-instrumentation-flask==0.18b1
opentelemetry-instrumentation-wsgi==0.18b1
opentelemetry-sdk==0.18b0
protobuf==3.14.0
six==1.15.0
thrift==0.13.0
uWSGI==2.0.19.1
Werkzeug==1.0.1
wrapt==1.12.1