diff --git a/Introduction/SCR-20230813-nhkq.png b/Introduction/SCR-20230813-nhkq.png
new file mode 100644
index 0000000..51ca427
Binary files /dev/null and b/Introduction/SCR-20230813-nhkq.png differ
diff --git a/Introduction/SCR-20230813-nkzc.png b/Introduction/SCR-20230813-nkzc.png
new file mode 100644
index 0000000..1d60c1c
Binary files /dev/null and b/Introduction/SCR-20230813-nkzc.png differ
diff --git a/Introduction/SCR-20230813-nmcj.png b/Introduction/SCR-20230813-nmcj.png
new file mode 100644
index 0000000..5a4ba1c
Binary files /dev/null and b/Introduction/SCR-20230813-nmcj.png differ
diff --git a/Introduction/SCR-20230813-nngy.png b/Introduction/SCR-20230813-nngy.png
new file mode 100644
index 0000000..e887758
Binary files /dev/null and b/Introduction/SCR-20230813-nngy.png differ
diff --git "a/Introduction/\346\234\211\351\201\223\347\277\273\350\257\221.md" "b/Introduction/\346\234\211\351\201\223\347\277\273\350\257\221.md"
new file mode 100644
index 0000000..05baf4b
--- /dev/null
+++ "b/Introduction/\346\234\211\351\201\223\347\277\273\350\257\221.md"
@@ -0,0 +1,77 @@
+# Youdao Translate
+
+> Note
+>
+> The information in this guide may become outdated and is for reference only; always defer to the provider's latest official documentation.
+>
+> Official documentation: [https://ai.youdao.com/doc.s#guide](https://ai.youdao.com/doc.s#guide)
+
+## 0. Pricing
+
+[View pricing details](https://ai.youdao.com/DOCSIRMA/html/自然语言翻译/产品定价/文本翻译服务/文本翻译服务-产品定价.html)
+
+| Service | Free quota | Beyond the free quota | Concurrent requests |
+| :--------------- | :------- | :-------------- | :--------- |
+| Chinese ↔ Group 1 languages | None | ¥48 / 1M characters | - |
+| Chinese ↔ Group 2 languages | None | ¥100 / 1M characters | - |
+| Between other languages | None | ¥100 / 1M characters | - |
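+
+For example, translating 500,000 characters in a month between Chinese and a Group 1 language costs about 0.5 × 48 = 24 yuan, while the same volume between Chinese and a Group 2 language costs about 50 yuan.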
+
+> Tip
+>
+> Newly registered users can claim up to ¥100 of trial credit.
+>
+> New users receive ¥10 of trial credit on registration, another ¥40 after passing real-name verification, and a further ¥50 after adding customer service on WeChat!
+
+## 1. Sign up and log in
+
+[Click here to open the website](https://ai.youdao.com/)
+
+![youdao_translate_login](https://cdn.ripperhe.com/oss/master/2020/0428/youdao_translate_login.png)
+
+> After registering, follow the on-page instructions to add Youdao customer service on WeChat and send your account information to receive another ¥50 of trial credit.
+
+## 2. Create an application
+
+After logging in, go to [「业务指南-应用总览」 (Business Guide - Application Overview)](https://ai.youdao.com/console/#/app-overview) and click 「创建应用」 (Create Application).
+
+![youdao_translate_app_1](https://cdn.wwang.de/w/image/bob/202205032017733.webp)
+
+The application name can be anything. Under services, check 「文本翻译」 (Text Translation) and, if needed, 「语音合成」 (Speech Synthesis); choose 「API」 as the access method; pick any application category; leave the other fields empty, then click 「确定」 (Confirm).
+
+When translating a single word, Youdao can additionally provide American and British pronunciations. If you want to use Youdao's pronunciation feature, also check 「语音合成」 (Speech Synthesis) here. Speech synthesis is a paid service billed separately by usage; see [「智能语音合成服务-产品定价」 (Speech Synthesis Pricing)](https://ai.youdao.com/DOCSIRMA/html/语音合成TTS/产品定价/语音合成服务/语音合成服务-产品定价.html).
+
+![youdao_translate_app_2](./SCR-20230813-nhkq.png)
+
+> Note
+>
+> Do not fill in the 「服务器 IP」 (Server IP) field; setting it will very likely prevent you from accessing the service.
+
+## 3. Get the keys
+
+> Warning
+>
+> **None of the previous steps may be skipped unless explicitly marked as skippable; otherwise the keys you obtain will not work!**
+
+Also, keep your keys safe; a leaked key can cost you money!
+
+Go to [「业务指南-业务总览」 (Business Guide - Business Overview)](https://ai.youdao.com/console/#/), find the application with the 「文本翻译」 (Text Translation) service enabled under 「我的应用」 (My Applications), then use the copy buttons next to 「应用 ID」 (App ID) and 「应用密钥」 (App Secret) to copy each value.
+
+![youdao_translate_secret_1](./SCR-20230813-nkzc.png)
+
+
+
+## 4. Enter the keys
+
+In Alfred's Workflows list, select 「Youdao」 and click `Configure Workflow`.
+
+![](./SCR-20230813-nmcj.png)
+
+Then paste the App ID and App Secret you just obtained into the corresponding fields.
+
+![](./SCR-20230813-nngy.png)
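+
+The workflow makes the API calls for you, so nothing beyond the two fields above is required. For reference only, here is a minimal sketch of how an App ID and App Secret are typically used to sign a request to Youdao's text-translation API; the endpoint, parameter names, and the v3 signing formula below follow Youdao's publicly documented scheme as commonly described and are not taken from this workflow's source code:
+
+```python
+import hashlib
+import time
+import uuid
+
+import requests  # third-party HTTP client, assumed for this sketch
+
+APP_KEY = "your-app-id"         # 应用 ID copied in step 3
+APP_SECRET = "your-app-secret"  # 应用密钥 copied in step 3
+
+
+def _truncate(q):
+    # "input" rule of the v3 signing scheme: the full text if it is at most
+    # 20 characters, otherwise first 10 chars + length + last 10 chars.
+    return q if len(q) <= 20 else q[:10] + str(len(q)) + q[-10:]
+
+
+def translate(q, src="auto", dst="en"):
+    salt = str(uuid.uuid4())
+    curtime = str(int(time.time()))
+    # sign = sha256(appKey + input + salt + curtime + appSecret)
+    sign = hashlib.sha256(
+        (APP_KEY + _truncate(q) + salt + curtime + APP_SECRET).encode("utf-8")
+    ).hexdigest()
+    payload = {
+        "q": q, "from": src, "to": dst,
+        "appKey": APP_KEY, "salt": salt, "curtime": curtime,
+        "sign": sign, "signType": "v3",
+    }
+    return requests.post("https://openapi.youdao.com/api", data=payload).json()
+
+
+if __name__ == "__main__":
+    print(translate("你好"))
+```
+
+A successful response typically contains a `translation` list, and single-word lookups may also include a `basic` entry with dictionary and pronunciation data.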
+
+
+
+## 5. Error reference
+
+If you run into problems with the new Youdao Zhiyun service, see the [error code list](https://ai.youdao.com/DOCSIRMA/html/trans/api/wbfy/index.html#section-14).
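+
+As a rough sketch of what an error looks like in practice (assuming the commonly documented response format, where the JSON body carries an `errorCode` string and `"0"` means success), a failed call from the hypothetical helper in section 4 can be detected like this:
+
+```python
+result = translate("你好")  # hypothetical helper from the sketch in section 4
+if result.get("errorCode", "0") != "0":
+    # For example, "108" usually indicates an invalid 应用 ID and "202" a
+    # signature mismatch; see the error code list above for the full table.
+    raise RuntimeError("Youdao API error: %s" % result["errorCode"])
+```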
diff --git a/README.md b/README.md
index e3fc832..e468b74 100755
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
# whyliam.workflows.youdao
-## 有道翻译 workflow v3.0.0
+## 有道翻译 workflow v3.1.0
Default keyword `yd`; type it to view translation results.
@@ -24,31 +24,13 @@
### Download
-#### Python 3 版本
+[Python 3 version](https://github.com/whyliam/whyliam.workflows.youdao/releases/download/3.1.0/whyliam.workflows.youdao.alfredworkflow) - thanks to [Pid](https://github.com/zhugexiaobo)
-[Python 3 version](https://github.com/whyliam/whyliam.workflows.youdao/releases/download/3.0.0/whyliam.workflows.youdao.alfredworkflow) - thanks to [Pid](https://github.com/zhugexiaobo)
+### Usage guide
-#### Python 2 version
-Users on `macOS 12.3` or below should use the following version
+[Usage guide](Introduction/有道翻译.md)
-[Python 2 version](https://github.com/whyliam/whyliam.workflows.youdao/releases/download/2.2.5/whyliam.workflows.youdao.alfredworkflow)
-### Installation
-
-1\. [Download](https://github.com/whyliam/whyliam.workflows.youdao/releases) the latest release and double-click it to install
-
-2\. [Register](http://ai.youdao.com/appmgr.s) a Youdao Zhiyun application
-
-3\. Fill in the corresponding `应用ID` (App ID) and `应用密钥` (App Secret) in the Alfred settings
-
-![](https://tva1.sinaimg.cn/large/006tNbRwly1g9oapg37t0j31am0sgjxr.jpg)
-
-4\. Set the hotkey in the Alfred settings
-![](http://ww2.sinaimg.cn/large/006tNbRwgy1feno6pzaxdj31a60p0jsl.jpg)
-
-### Troubleshooting
-
-If you run into problems with the new Youdao Zhiyun version, see the [error code list](http://ai.youdao.com/docs/doc-trans-api.s#p08).
### Demo
diff --git a/info.plist b/info.plist
index a0a6bcd..956a718 100644
--- a/info.plist
+++ b/info.plist
@@ -152,12 +152,14 @@
browser
+ skipqueryencode
+
+ skipvarencode
+
spaces
url
- http://dict.youdao.com/search?q={query}
- utf8
-
+ https://dict.youdao.com/result?word={query}&lang=en&keyfrom=whyliam.workflows.youdao
type
alfred.workflow.action.openurl
@@ -399,7 +401,7 @@
readme
- 有道翻译 Workflow v3.0.0
+ 有道翻译 Workflow v3.1.0
默认快捷键 yd, 查看翻译结果。
@@ -426,113 +428,144 @@
0907BEF4-816F-48FF-B157-03F5C2AACEAB
xpos
- 830
+ 830
ypos
- 420
+ 420
27E60581-8105-41DD-8E29-4FE811179098
xpos
- 500
+ 500
ypos
- 290
+ 290
4473C9D3-7A15-4D31-84F6-A096A7CFF46C
xpos
- 830
+ 830
ypos
- 150
+ 150
5751065C-52C1-4D19-8F7D-03B730BFE440
note
双击设置快捷方式
xpos
- 270
+ 270
ypos
- 440
+ 440
6A03FDC5-89AC-4F9D-9456-3762ACA751FE
xpos
- 830
+ 830
ypos
- 40
+ 40
7C1ABC41-3B36-401F-96C7-30BCB39181FF
xpos
- 500
+ 500
ypos
- 410
+ 410
7CAE8B02-CE31-4941-AD5F-C36CC84D164B
xpos
- 500
+ 500
ypos
- 560
+ 560
91C343E7-50D8-4B0D-9034-1C16C20DA8D4
xpos
- 270
+ 270
ypos
- 250
+ 250
DA8E4597-4B45-4C4C-A8D0-755BD785BD7A
xpos
- 830
+ 830
ypos
- 560
+ 560
DBA62127-3B78-4B80-B82B-1C6AEC393003
xpos
- 830
+ 830
ypos
- 290
+ 290
F99C4C55-10F5-4D62-A77D-F27058629B21
xpos
- 500
+ 500
ypos
- 40
+ 40
+ userconfigurationconfig
+
+
+ config
+
+ default
+
+ placeholder
+ 请输入应用ID
+ required
+
+ trim
+
+
+ description
+ 应用ID
+ label
+ 应用ID
+ type
+ textfield
+ variable
+ zhiyun_id
+
+
+ config
+
+ default
+
+ placeholder
+ 请输入应用密钥
+ required
+
+ trim
+
+
+ description
+ 应用密钥
+ label
+ 应用密钥
+ type
+ textfield
+ variable
+ zhiyun_key
+
+
variables
filepath
~/Documents/Alfred-youdao-wordbook.xml
password
- sentry
- True
username
- youdao_key
-
- youdao_keyfrom
-
- zhiyun_id
-
- zhiyun_key
-
variablesdontexport
- youdao_key
username
password
- youdao_keyfrom
- zhiyun_id
- zhiyun_key
version
- 3.0.0
+ 3.1.0
webaddress
https://github.com/whyliam/whyliam.workflows.youdao
diff --git a/sentry_sdk/__init__.py b/sentry_sdk/__init__.py
deleted file mode 100644
index ab5123e..0000000
--- a/sentry_sdk/__init__.py
+++ /dev/null
@@ -1,40 +0,0 @@
-from sentry_sdk.hub import Hub, init
-from sentry_sdk.scope import Scope
-from sentry_sdk.transport import Transport, HttpTransport
-from sentry_sdk.client import Client
-
-from sentry_sdk.api import * # noqa
-
-from sentry_sdk.consts import VERSION # noqa
-
-__all__ = [ # noqa
- "Hub",
- "Scope",
- "Client",
- "Transport",
- "HttpTransport",
- "init",
- "integrations",
- # From sentry_sdk.api
- "capture_event",
- "capture_message",
- "capture_exception",
- "add_breadcrumb",
- "configure_scope",
- "push_scope",
- "flush",
- "last_event_id",
- "start_span",
- "start_transaction",
- "set_tag",
- "set_context",
- "set_extra",
- "set_user",
- "set_level",
-]
-
-# Initialize the debug support after everything is loaded
-from sentry_sdk.debug import init_debug_support
-
-init_debug_support()
-del init_debug_support
diff --git a/sentry_sdk/_compat.py b/sentry_sdk/_compat.py
deleted file mode 100644
index 49a5539..0000000
--- a/sentry_sdk/_compat.py
+++ /dev/null
@@ -1,89 +0,0 @@
-import sys
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Optional
- from typing import Tuple
- from typing import Any
- from typing import Type
- from typing import TypeVar
-
- T = TypeVar("T")
-
-
-PY2 = sys.version_info[0] == 2
-
-if PY2:
- import urlparse # noqa
-
- text_type = unicode # noqa
-
- string_types = (str, text_type)
- number_types = (int, long, float) # noqa
- int_types = (int, long) # noqa
- iteritems = lambda x: x.iteritems() # noqa: B301
-
- def implements_str(cls):
- # type: (T) -> T
- cls.__unicode__ = cls.__str__
- cls.__str__ = lambda x: unicode(x).encode("utf-8") # noqa
- return cls
-
- exec("def reraise(tp, value, tb=None):\n raise tp, value, tb")
-
-
-else:
- import urllib.parse as urlparse # noqa
-
- text_type = str
- string_types = (text_type,) # type: Tuple[type]
- number_types = (int, float) # type: Tuple[type, type]
- int_types = (int,) # noqa
- iteritems = lambda x: x.items()
-
- def implements_str(x):
- # type: (T) -> T
- return x
-
- def reraise(tp, value, tb=None):
- # type: (Optional[Type[BaseException]], Optional[BaseException], Optional[Any]) -> None
- assert value is not None
- if value.__traceback__ is not tb:
- raise value.with_traceback(tb)
- raise value
-
-
-def with_metaclass(meta, *bases):
- # type: (Any, *Any) -> Any
- class MetaClass(type):
- def __new__(metacls, name, this_bases, d):
- # type: (Any, Any, Any, Any) -> Any
- return meta(name, bases, d)
-
- return type.__new__(MetaClass, "temporary_class", (), {})
-
-
-def check_thread_support():
- # type: () -> None
- try:
- from uwsgi import opt # type: ignore
- except ImportError:
- return
-
- # When `threads` is passed in as a uwsgi option,
- # `enable-threads` is implied on.
- if "threads" in opt:
- return
-
- if str(opt.get("enable-threads", "0")).lower() in ("false", "off", "no", "0"):
- from warnings import warn
-
- warn(
- Warning(
- "We detected the use of uwsgi with disabled threads. "
- "This will cause issues with the transport you are "
- "trying to use. Please enable threading for uwsgi. "
- '(Add the "enable-threads" flag).'
- )
- )
diff --git a/sentry_sdk/_functools.py b/sentry_sdk/_functools.py
deleted file mode 100644
index 8dcf79c..0000000
--- a/sentry_sdk/_functools.py
+++ /dev/null
@@ -1,66 +0,0 @@
-"""
-A backport of Python 3 functools to Python 2/3. The only important change
-we rely upon is that `update_wrapper` handles AttributeError gracefully.
-"""
-
-from functools import partial
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Callable
-
-
-WRAPPER_ASSIGNMENTS = (
- "__module__",
- "__name__",
- "__qualname__",
- "__doc__",
- "__annotations__",
-)
-WRAPPER_UPDATES = ("__dict__",)
-
-
-def update_wrapper(
- wrapper, wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES
-):
- # type: (Any, Any, Any, Any) -> Any
- """Update a wrapper function to look like the wrapped function
-
- wrapper is the function to be updated
- wrapped is the original function
- assigned is a tuple naming the attributes assigned directly
- from the wrapped function to the wrapper function (defaults to
- functools.WRAPPER_ASSIGNMENTS)
- updated is a tuple naming the attributes of the wrapper that
- are updated with the corresponding attribute from the wrapped
- function (defaults to functools.WRAPPER_UPDATES)
- """
- for attr in assigned:
- try:
- value = getattr(wrapped, attr)
- except AttributeError:
- pass
- else:
- setattr(wrapper, attr, value)
- for attr in updated:
- getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
- # Issue #17482: set __wrapped__ last so we don't inadvertently copy it
- # from the wrapped function when updating __dict__
- wrapper.__wrapped__ = wrapped
- # Return the wrapper so this can be used as a decorator via partial()
- return wrapper
-
-
-def wraps(wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES):
- # type: (Callable[..., Any], Any, Any) -> Callable[[Callable[..., Any]], Callable[..., Any]]
- """Decorator factory to apply update_wrapper() to a wrapper function
-
- Returns a decorator that invokes update_wrapper() with the decorated
- function as the wrapper argument and the arguments to wraps() as the
- remaining arguments. Default arguments are as for update_wrapper().
- This is a convenience function to simplify applying partial() to
- update_wrapper().
- """
- return partial(update_wrapper, wrapped=wrapped, assigned=assigned, updated=updated)
diff --git a/sentry_sdk/_queue.py b/sentry_sdk/_queue.py
deleted file mode 100644
index e368da2..0000000
--- a/sentry_sdk/_queue.py
+++ /dev/null
@@ -1,227 +0,0 @@
-"""
-A fork of Python 3.6's stdlib queue with Lock swapped out for RLock to avoid a
-deadlock while garbage collecting.
-
-See
-https://codewithoutrules.com/2017/08/16/concurrency-python/
-https://bugs.python.org/issue14976
-https://github.com/sqlalchemy/sqlalchemy/blob/4eb747b61f0c1b1c25bdee3856d7195d10a0c227/lib/sqlalchemy/queue.py#L1
-
-We also vendor the code to evade eventlet's broken monkeypatching, see
-https://github.com/getsentry/sentry-python/pull/484
-"""
-
-import threading
-
-from collections import deque
-from time import time
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
-
-__all__ = ["Empty", "Full", "Queue"]
-
-
-class Empty(Exception):
- "Exception raised by Queue.get(block=0)/get_nowait()."
- pass
-
-
-class Full(Exception):
- "Exception raised by Queue.put(block=0)/put_nowait()."
- pass
-
-
-class Queue(object):
- """Create a queue object with a given maximum size.
-
- If maxsize is <= 0, the queue size is infinite.
- """
-
- def __init__(self, maxsize=0):
- self.maxsize = maxsize
- self._init(maxsize)
-
- # mutex must be held whenever the queue is mutating. All methods
- # that acquire mutex must release it before returning. mutex
- # is shared between the three conditions, so acquiring and
- # releasing the conditions also acquires and releases mutex.
- self.mutex = threading.RLock()
-
- # Notify not_empty whenever an item is added to the queue; a
- # thread waiting to get is notified then.
- self.not_empty = threading.Condition(self.mutex)
-
- # Notify not_full whenever an item is removed from the queue;
- # a thread waiting to put is notified then.
- self.not_full = threading.Condition(self.mutex)
-
- # Notify all_tasks_done whenever the number of unfinished tasks
- # drops to zero; thread waiting to join() is notified to resume
- self.all_tasks_done = threading.Condition(self.mutex)
- self.unfinished_tasks = 0
-
- def task_done(self):
- """Indicate that a formerly enqueued task is complete.
-
- Used by Queue consumer threads. For each get() used to fetch a task,
- a subsequent call to task_done() tells the queue that the processing
- on the task is complete.
-
- If a join() is currently blocking, it will resume when all items
- have been processed (meaning that a task_done() call was received
- for every item that had been put() into the queue).
-
- Raises a ValueError if called more times than there were items
- placed in the queue.
- """
- with self.all_tasks_done:
- unfinished = self.unfinished_tasks - 1
- if unfinished <= 0:
- if unfinished < 0:
- raise ValueError("task_done() called too many times")
- self.all_tasks_done.notify_all()
- self.unfinished_tasks = unfinished
-
- def join(self):
- """Blocks until all items in the Queue have been gotten and processed.
-
- The count of unfinished tasks goes up whenever an item is added to the
- queue. The count goes down whenever a consumer thread calls task_done()
- to indicate the item was retrieved and all work on it is complete.
-
- When the count of unfinished tasks drops to zero, join() unblocks.
- """
- with self.all_tasks_done:
- while self.unfinished_tasks:
- self.all_tasks_done.wait()
-
- def qsize(self):
- """Return the approximate size of the queue (not reliable!)."""
- with self.mutex:
- return self._qsize()
-
- def empty(self):
- """Return True if the queue is empty, False otherwise (not reliable!).
-
- This method is likely to be removed at some point. Use qsize() == 0
- as a direct substitute, but be aware that either approach risks a race
- condition where a queue can grow before the result of empty() or
- qsize() can be used.
-
- To create code that needs to wait for all queued tasks to be
- completed, the preferred technique is to use the join() method.
- """
- with self.mutex:
- return not self._qsize()
-
- def full(self):
- """Return True if the queue is full, False otherwise (not reliable!).
-
- This method is likely to be removed at some point. Use qsize() >= n
- as a direct substitute, but be aware that either approach risks a race
- condition where a queue can shrink before the result of full() or
- qsize() can be used.
- """
- with self.mutex:
- return 0 < self.maxsize <= self._qsize()
-
- def put(self, item, block=True, timeout=None):
- """Put an item into the queue.
-
- If optional args 'block' is true and 'timeout' is None (the default),
- block if necessary until a free slot is available. If 'timeout' is
- a non-negative number, it blocks at most 'timeout' seconds and raises
- the Full exception if no free slot was available within that time.
- Otherwise ('block' is false), put an item on the queue if a free slot
- is immediately available, else raise the Full exception ('timeout'
- is ignored in that case).
- """
- with self.not_full:
- if self.maxsize > 0:
- if not block:
- if self._qsize() >= self.maxsize:
- raise Full()
- elif timeout is None:
- while self._qsize() >= self.maxsize:
- self.not_full.wait()
- elif timeout < 0:
- raise ValueError("'timeout' must be a non-negative number")
- else:
- endtime = time() + timeout
- while self._qsize() >= self.maxsize:
- remaining = endtime - time()
- if remaining <= 0.0:
- raise Full
- self.not_full.wait(remaining)
- self._put(item)
- self.unfinished_tasks += 1
- self.not_empty.notify()
-
- def get(self, block=True, timeout=None):
- """Remove and return an item from the queue.
-
- If optional args 'block' is true and 'timeout' is None (the default),
- block if necessary until an item is available. If 'timeout' is
- a non-negative number, it blocks at most 'timeout' seconds and raises
- the Empty exception if no item was available within that time.
- Otherwise ('block' is false), return an item if one is immediately
- available, else raise the Empty exception ('timeout' is ignored
- in that case).
- """
- with self.not_empty:
- if not block:
- if not self._qsize():
- raise Empty()
- elif timeout is None:
- while not self._qsize():
- self.not_empty.wait()
- elif timeout < 0:
- raise ValueError("'timeout' must be a non-negative number")
- else:
- endtime = time() + timeout
- while not self._qsize():
- remaining = endtime - time()
- if remaining <= 0.0:
- raise Empty()
- self.not_empty.wait(remaining)
- item = self._get()
- self.not_full.notify()
- return item
-
- def put_nowait(self, item):
- """Put an item into the queue without blocking.
-
- Only enqueue the item if a free slot is immediately available.
- Otherwise raise the Full exception.
- """
- return self.put(item, block=False)
-
- def get_nowait(self):
- """Remove and return an item from the queue without blocking.
-
- Only get an item if one is immediately available. Otherwise
- raise the Empty exception.
- """
- return self.get(block=False)
-
- # Override these methods to implement other queue organizations
- # (e.g. stack or priority queue).
- # These will only be called with appropriate locks held
-
- # Initialize the queue representation
- def _init(self, maxsize):
- self.queue = deque() # type: Any
-
- def _qsize(self):
- return len(self.queue)
-
- # Put a new item in the queue
- def _put(self, item):
- self.queue.append(item)
-
- # Get an item from the queue
- def _get(self):
- return self.queue.popleft()
diff --git a/sentry_sdk/_types.py b/sentry_sdk/_types.py
deleted file mode 100644
index 7ce7e9e..0000000
--- a/sentry_sdk/_types.py
+++ /dev/null
@@ -1,50 +0,0 @@
-try:
- from typing import TYPE_CHECKING as MYPY
-except ImportError:
- MYPY = False
-
-
-if MYPY:
- from types import TracebackType
- from typing import Any
- from typing import Callable
- from typing import Dict
- from typing import Optional
- from typing import Tuple
- from typing import Type
- from typing import Union
- from typing_extensions import Literal
-
- ExcInfo = Tuple[
- Optional[Type[BaseException]], Optional[BaseException], Optional[TracebackType]
- ]
-
- Event = Dict[str, Any]
- Hint = Dict[str, Any]
-
- Breadcrumb = Dict[str, Any]
- BreadcrumbHint = Dict[str, Any]
-
- SamplingContext = Dict[str, Any]
-
- EventProcessor = Callable[[Event, Hint], Optional[Event]]
- ErrorProcessor = Callable[[Event, ExcInfo], Optional[Event]]
- BreadcrumbProcessor = Callable[[Breadcrumb, BreadcrumbHint], Optional[Breadcrumb]]
-
- TracesSampler = Callable[[SamplingContext], Union[float, int, bool]]
-
- # https://github.com/python/mypy/issues/5710
- NotImplementedType = Any
-
- EventDataCategory = Literal[
- "default",
- "error",
- "crash",
- "transaction",
- "security",
- "attachment",
- "session",
- "internal",
- ]
- SessionStatus = Literal["ok", "exited", "crashed", "abnormal"]
- EndpointType = Literal["store", "envelope"]
diff --git a/sentry_sdk/api.py b/sentry_sdk/api.py
deleted file mode 100644
index f4a44e4..0000000
--- a/sentry_sdk/api.py
+++ /dev/null
@@ -1,214 +0,0 @@
-import inspect
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.scope import Scope
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import Optional
- from typing import overload
- from typing import Callable
- from typing import TypeVar
- from typing import ContextManager
- from typing import Union
-
- from sentry_sdk._types import Event, Hint, Breadcrumb, BreadcrumbHint, ExcInfo
- from sentry_sdk.tracing import Span, Transaction
-
- T = TypeVar("T")
- F = TypeVar("F", bound=Callable[..., Any])
-else:
-
- def overload(x):
- # type: (T) -> T
- return x
-
-
-# When changing this, update __all__ in __init__.py too
-__all__ = [
- "capture_event",
- "capture_message",
- "capture_exception",
- "add_breadcrumb",
- "configure_scope",
- "push_scope",
- "flush",
- "last_event_id",
- "start_span",
- "start_transaction",
- "set_tag",
- "set_context",
- "set_extra",
- "set_user",
- "set_level",
-]
-
-
-def hubmethod(f):
- # type: (F) -> F
- f.__doc__ = "%s\n\n%s" % (
- "Alias for :py:meth:`sentry_sdk.Hub.%s`" % f.__name__,
- inspect.getdoc(getattr(Hub, f.__name__)),
- )
- return f
-
-
-def scopemethod(f):
- # type: (F) -> F
- f.__doc__ = "%s\n\n%s" % (
- "Alias for :py:meth:`sentry_sdk.Scope.%s`" % f.__name__,
- inspect.getdoc(getattr(Scope, f.__name__)),
- )
- return f
-
-
-@hubmethod
-def capture_event(
- event, # type: Event
- hint=None, # type: Optional[Hint]
- scope=None, # type: Optional[Any]
- **scope_args # type: Any
-):
- # type: (...) -> Optional[str]
- return Hub.current.capture_event(event, hint, scope=scope, **scope_args)
-
-
-@hubmethod
-def capture_message(
- message, # type: str
- level=None, # type: Optional[str]
- scope=None, # type: Optional[Any]
- **scope_args # type: Any
-):
- # type: (...) -> Optional[str]
- return Hub.current.capture_message(message, level, scope=scope, **scope_args)
-
-
-@hubmethod
-def capture_exception(
- error=None, # type: Optional[Union[BaseException, ExcInfo]]
- scope=None, # type: Optional[Any]
- **scope_args # type: Any
-):
- # type: (...) -> Optional[str]
- return Hub.current.capture_exception(error, scope=scope, **scope_args)
-
-
-@hubmethod
-def add_breadcrumb(
- crumb=None, # type: Optional[Breadcrumb]
- hint=None, # type: Optional[BreadcrumbHint]
- **kwargs # type: Any
-):
- # type: (...) -> None
- return Hub.current.add_breadcrumb(crumb, hint, **kwargs)
-
-
-@overload
-def configure_scope(): # noqa: F811
- # type: () -> ContextManager[Scope]
- pass
-
-
-@overload
-def configure_scope( # noqa: F811
- callback, # type: Callable[[Scope], None]
-):
- # type: (...) -> None
- pass
-
-
-@hubmethod
-def configure_scope( # noqa: F811
- callback=None, # type: Optional[Callable[[Scope], None]]
-):
- # type: (...) -> Optional[ContextManager[Scope]]
- return Hub.current.configure_scope(callback)
-
-
-@overload
-def push_scope(): # noqa: F811
- # type: () -> ContextManager[Scope]
- pass
-
-
-@overload
-def push_scope( # noqa: F811
- callback, # type: Callable[[Scope], None]
-):
- # type: (...) -> None
- pass
-
-
-@hubmethod
-def push_scope( # noqa: F811
- callback=None, # type: Optional[Callable[[Scope], None]]
-):
- # type: (...) -> Optional[ContextManager[Scope]]
- return Hub.current.push_scope(callback)
-
-
-@scopemethod # noqa
-def set_tag(key, value):
- # type: (str, Any) -> None
- return Hub.current.scope.set_tag(key, value)
-
-
-@scopemethod # noqa
-def set_context(key, value):
- # type: (str, Dict[str, Any]) -> None
- return Hub.current.scope.set_context(key, value)
-
-
-@scopemethod # noqa
-def set_extra(key, value):
- # type: (str, Any) -> None
- return Hub.current.scope.set_extra(key, value)
-
-
-@scopemethod # noqa
-def set_user(value):
- # type: (Optional[Dict[str, Any]]) -> None
- return Hub.current.scope.set_user(value)
-
-
-@scopemethod # noqa
-def set_level(value):
- # type: (str) -> None
- return Hub.current.scope.set_level(value)
-
-
-@hubmethod
-def flush(
- timeout=None, # type: Optional[float]
- callback=None, # type: Optional[Callable[[int, float], None]]
-):
- # type: (...) -> None
- return Hub.current.flush(timeout=timeout, callback=callback)
-
-
-@hubmethod
-def last_event_id():
- # type: () -> Optional[str]
- return Hub.current.last_event_id()
-
-
-@hubmethod
-def start_span(
- span=None, # type: Optional[Span]
- **kwargs # type: Any
-):
- # type: (...) -> Span
- return Hub.current.start_span(span=span, **kwargs)
-
-
-@hubmethod
-def start_transaction(
- transaction=None, # type: Optional[Transaction]
- **kwargs # type: Any
-):
- # type: (...) -> Transaction
- return Hub.current.start_transaction(transaction, **kwargs)
diff --git a/sentry_sdk/attachments.py b/sentry_sdk/attachments.py
deleted file mode 100644
index b7b6b0b..0000000
--- a/sentry_sdk/attachments.py
+++ /dev/null
@@ -1,55 +0,0 @@
-import os
-import mimetypes
-
-from sentry_sdk._types import MYPY
-from sentry_sdk.envelope import Item, PayloadRef
-
-if MYPY:
- from typing import Optional, Union, Callable
-
-
-class Attachment(object):
- def __init__(
- self,
- bytes=None, # type: Union[None, bytes, Callable[[], bytes]]
- filename=None, # type: Optional[str]
- path=None, # type: Optional[str]
- content_type=None, # type: Optional[str]
- add_to_transactions=False, # type: bool
- ):
- # type: (...) -> None
- if bytes is None and path is None:
- raise TypeError("path or raw bytes required for attachment")
- if filename is None and path is not None:
- filename = os.path.basename(path)
- if filename is None:
- raise TypeError("filename is required for attachment")
- if content_type is None:
- content_type = mimetypes.guess_type(filename)[0]
- self.bytes = bytes
- self.filename = filename
- self.path = path
- self.content_type = content_type
- self.add_to_transactions = add_to_transactions
-
- def to_envelope_item(self):
- # type: () -> Item
- """Returns an envelope item for this attachment."""
- payload = None # type: Union[None, PayloadRef, bytes]
- if self.bytes is not None:
- if callable(self.bytes):
- payload = self.bytes()
- else:
- payload = self.bytes
- else:
- payload = PayloadRef(path=self.path)
- return Item(
- payload=payload,
- type="attachment",
- content_type=self.content_type,
- filename=self.filename,
- )
-
- def __repr__(self):
- # type: () -> str
- return "" % (self.filename,)
diff --git a/sentry_sdk/client.py b/sentry_sdk/client.py
deleted file mode 100644
index 1720993..0000000
--- a/sentry_sdk/client.py
+++ /dev/null
@@ -1,462 +0,0 @@
-import os
-import uuid
-import random
-from datetime import datetime
-import socket
-
-from sentry_sdk._compat import string_types, text_type, iteritems
-from sentry_sdk.utils import (
- capture_internal_exceptions,
- current_stacktrace,
- disable_capture_event,
- format_timestamp,
- get_type_name,
- get_default_release,
- handle_in_app,
- logger,
-)
-from sentry_sdk.serializer import serialize
-from sentry_sdk.transport import make_transport
-from sentry_sdk.consts import DEFAULT_OPTIONS, SDK_INFO, ClientConstructor
-from sentry_sdk.integrations import setup_integrations
-from sentry_sdk.utils import ContextVar
-from sentry_sdk.sessions import SessionFlusher
-from sentry_sdk.envelope import Envelope
-from sentry_sdk.tracing_utils import has_tracestate_enabled, reinflate_tracestate
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import Dict
- from typing import Optional
-
- from sentry_sdk.scope import Scope
- from sentry_sdk._types import Event, Hint
- from sentry_sdk.session import Session
-
-
-_client_init_debug = ContextVar("client_init_debug")
-
-
-def _get_options(*args, **kwargs):
- # type: (*Optional[str], **Any) -> Dict[str, Any]
- if args and (isinstance(args[0], (text_type, bytes, str)) or args[0] is None):
- dsn = args[0] # type: Optional[str]
- args = args[1:]
- else:
- dsn = None
-
- rv = dict(DEFAULT_OPTIONS)
- options = dict(*args, **kwargs)
- if dsn is not None and options.get("dsn") is None:
- options["dsn"] = dsn
-
- for key, value in iteritems(options):
- if key not in rv:
- raise TypeError("Unknown option %r" % (key,))
- rv[key] = value
-
- if rv["dsn"] is None:
- rv["dsn"] = os.environ.get("SENTRY_DSN")
-
- if rv["release"] is None:
- rv["release"] = get_default_release()
-
- if rv["environment"] is None:
- rv["environment"] = os.environ.get("SENTRY_ENVIRONMENT") or "production"
-
- if rv["server_name"] is None and hasattr(socket, "gethostname"):
- rv["server_name"] = socket.gethostname()
-
- return rv
-
-
-class _Client(object):
- """The client is internally responsible for capturing the events and
- forwarding them to sentry through the configured transport. It takes
- the client options as keyword arguments and optionally the DSN as first
- argument.
- """
-
- def __init__(self, *args, **kwargs):
- # type: (*Any, **Any) -> None
- self.options = get_options(*args, **kwargs) # type: Dict[str, Any]
- self._init_impl()
-
- def __getstate__(self):
- # type: () -> Any
- return {"options": self.options}
-
- def __setstate__(self, state):
- # type: (Any) -> None
- self.options = state["options"]
- self._init_impl()
-
- def _init_impl(self):
- # type: () -> None
- old_debug = _client_init_debug.get(False)
-
- def _capture_envelope(envelope):
- # type: (Envelope) -> None
- if self.transport is not None:
- self.transport.capture_envelope(envelope)
-
- try:
- _client_init_debug.set(self.options["debug"])
- self.transport = make_transport(self.options)
-
- self.session_flusher = SessionFlusher(capture_func=_capture_envelope)
-
- request_bodies = ("always", "never", "small", "medium")
- if self.options["request_bodies"] not in request_bodies:
- raise ValueError(
- "Invalid value for request_bodies. Must be one of {}".format(
- request_bodies
- )
- )
-
- self.integrations = setup_integrations(
- self.options["integrations"],
- with_defaults=self.options["default_integrations"],
- with_auto_enabling_integrations=self.options[
- "auto_enabling_integrations"
- ],
- )
- finally:
- _client_init_debug.set(old_debug)
-
- @property
- def dsn(self):
- # type: () -> Optional[str]
- """Returns the configured DSN as string."""
- return self.options["dsn"]
-
- def _prepare_event(
- self,
- event, # type: Event
- hint, # type: Hint
- scope, # type: Optional[Scope]
- ):
- # type: (...) -> Optional[Event]
-
- if event.get("timestamp") is None:
- event["timestamp"] = datetime.utcnow()
-
- if scope is not None:
- is_transaction = event.get("type") == "transaction"
- event_ = scope.apply_to_event(event, hint)
-
- # one of the event/error processors returned None
- if event_ is None:
- if self.transport:
- self.transport.record_lost_event(
- "event_processor",
- data_category=("transaction" if is_transaction else "error"),
- )
- return None
-
- event = event_
-
- if (
- self.options["attach_stacktrace"]
- and "exception" not in event
- and "stacktrace" not in event
- and "threads" not in event
- ):
- with capture_internal_exceptions():
- event["threads"] = {
- "values": [
- {
- "stacktrace": current_stacktrace(
- self.options["with_locals"]
- ),
- "crashed": False,
- "current": True,
- }
- ]
- }
-
- for key in "release", "environment", "server_name", "dist":
- if event.get(key) is None and self.options[key] is not None:
- event[key] = text_type(self.options[key]).strip()
- if event.get("sdk") is None:
- sdk_info = dict(SDK_INFO)
- sdk_info["integrations"] = sorted(self.integrations.keys())
- event["sdk"] = sdk_info
-
- if event.get("platform") is None:
- event["platform"] = "python"
-
- event = handle_in_app(
- event, self.options["in_app_exclude"], self.options["in_app_include"]
- )
-
- # Postprocess the event here so that annotated types do
- # generally not surface in before_send
- if event is not None:
- event = serialize(
- event,
- smart_transaction_trimming=self.options["_experiments"].get(
- "smart_transaction_trimming"
- ),
- )
-
- before_send = self.options["before_send"]
- if before_send is not None and event.get("type") != "transaction":
- new_event = None
- with capture_internal_exceptions():
- new_event = before_send(event, hint or {})
- if new_event is None:
- logger.info("before send dropped event (%s)", event)
- if self.transport:
- self.transport.record_lost_event(
- "before_send", data_category="error"
- )
- event = new_event # type: ignore
-
- return event
-
- def _is_ignored_error(self, event, hint):
- # type: (Event, Hint) -> bool
- exc_info = hint.get("exc_info")
- if exc_info is None:
- return False
-
- type_name = get_type_name(exc_info[0])
- full_name = "%s.%s" % (exc_info[0].__module__, type_name)
-
- for errcls in self.options["ignore_errors"]:
- # String types are matched against the type name in the
- # exception only
- if isinstance(errcls, string_types):
- if errcls == full_name or errcls == type_name:
- return True
- else:
- if issubclass(exc_info[0], errcls):
- return True
-
- return False
-
- def _should_capture(
- self,
- event, # type: Event
- hint, # type: Hint
- scope=None, # type: Optional[Scope]
- ):
- # type: (...) -> bool
- if event.get("type") == "transaction":
- # Transactions are sampled independent of error events.
- return True
-
- if scope is not None and not scope._should_capture:
- return False
-
- if (
- self.options["sample_rate"] < 1.0
- and random.random() >= self.options["sample_rate"]
- ):
- # record a lost event if we did not sample this.
- if self.transport:
- self.transport.record_lost_event("sample_rate", data_category="error")
- return False
-
- if self._is_ignored_error(event, hint):
- return False
-
- return True
-
- def _update_session_from_event(
- self,
- session, # type: Session
- event, # type: Event
- ):
- # type: (...) -> None
-
- crashed = False
- errored = False
- user_agent = None
-
- exceptions = (event.get("exception") or {}).get("values")
- if exceptions:
- errored = True
- for error in exceptions:
- mechanism = error.get("mechanism")
- if mechanism and mechanism.get("handled") is False:
- crashed = True
- break
-
- user = event.get("user")
-
- if session.user_agent is None:
- headers = (event.get("request") or {}).get("headers")
- for (k, v) in iteritems(headers or {}):
- if k.lower() == "user-agent":
- user_agent = v
- break
-
- session.update(
- status="crashed" if crashed else None,
- user=user,
- user_agent=user_agent,
- errors=session.errors + (errored or crashed),
- )
-
- def capture_event(
- self,
- event, # type: Event
- hint=None, # type: Optional[Hint]
- scope=None, # type: Optional[Scope]
- ):
- # type: (...) -> Optional[str]
- """Captures an event.
-
- :param event: A ready-made event that can be directly sent to Sentry.
-
- :param hint: Contains metadata about the event that can be read from `before_send`, such as the original exception object or a HTTP request object.
-
- :returns: An event ID. May be `None` if there is no DSN set or of if the SDK decided to discard the event for other reasons. In such situations setting `debug=True` on `init()` may help.
- """
- if disable_capture_event.get(False):
- return None
-
- if self.transport is None:
- return None
- if hint is None:
- hint = {}
- event_id = event.get("event_id")
- hint = dict(hint or ()) # type: Hint
-
- if event_id is None:
- event["event_id"] = event_id = uuid.uuid4().hex
- if not self._should_capture(event, hint, scope):
- return None
-
- event_opt = self._prepare_event(event, hint, scope)
- if event_opt is None:
- return None
-
- # whenever we capture an event we also check if the session needs
- # to be updated based on that information.
- session = scope._session if scope else None
- if session:
- self._update_session_from_event(session, event)
-
- attachments = hint.get("attachments")
- is_transaction = event_opt.get("type") == "transaction"
-
- # this is outside of the `if` immediately below because even if we don't
- # use the value, we want to make sure we remove it before the event is
- # sent
- raw_tracestate = (
- event_opt.get("contexts", {}).get("trace", {}).pop("tracestate", "")
- )
-
- # Transactions or events with attachments should go to the /envelope/
- # endpoint.
- if is_transaction or attachments:
-
- headers = {
- "event_id": event_opt["event_id"],
- "sent_at": format_timestamp(datetime.utcnow()),
- }
-
- tracestate_data = raw_tracestate and reinflate_tracestate(
- raw_tracestate.replace("sentry=", "")
- )
- if tracestate_data and has_tracestate_enabled():
- headers["trace"] = tracestate_data
-
- envelope = Envelope(headers=headers)
-
- if is_transaction:
- envelope.add_transaction(event_opt)
- else:
- envelope.add_event(event_opt)
-
- for attachment in attachments or ():
- envelope.add_item(attachment.to_envelope_item())
- self.transport.capture_envelope(envelope)
- else:
- # All other events go to the /store/ endpoint.
- self.transport.capture_event(event_opt)
- return event_id
-
- def capture_session(
- self, session # type: Session
- ):
- # type: (...) -> None
- if not session.release:
- logger.info("Discarded session update because of missing release")
- else:
- self.session_flusher.add_session(session)
-
- def close(
- self,
- timeout=None, # type: Optional[float]
- callback=None, # type: Optional[Callable[[int, float], None]]
- ):
- # type: (...) -> None
- """
- Close the client and shut down the transport. Arguments have the same
- semantics as :py:meth:`Client.flush`.
- """
- if self.transport is not None:
- self.flush(timeout=timeout, callback=callback)
- self.session_flusher.kill()
- self.transport.kill()
- self.transport = None
-
- def flush(
- self,
- timeout=None, # type: Optional[float]
- callback=None, # type: Optional[Callable[[int, float], None]]
- ):
- # type: (...) -> None
- """
- Wait for the current events to be sent.
-
- :param timeout: Wait for at most `timeout` seconds. If no `timeout` is provided, the `shutdown_timeout` option value is used.
-
- :param callback: Is invoked with the number of pending events and the configured timeout.
- """
- if self.transport is not None:
- if timeout is None:
- timeout = self.options["shutdown_timeout"]
- self.session_flusher.flush()
- self.transport.flush(timeout=timeout, callback=callback)
-
- def __enter__(self):
- # type: () -> _Client
- return self
-
- def __exit__(self, exc_type, exc_value, tb):
- # type: (Any, Any, Any) -> None
- self.close()
-
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- # Make mypy, PyCharm and other static analyzers think `get_options` is a
- # type to have nicer autocompletion for params.
- #
- # Use `ClientConstructor` to define the argument types of `init` and
- # `Dict[str, Any]` to tell static analyzers about the return type.
-
- class get_options(ClientConstructor, Dict[str, Any]): # noqa: N801
- pass
-
- class Client(ClientConstructor, _Client):
- pass
-
-
-else:
- # Alias `get_options` for actual usage. Go through the lambda indirection
- # to throw PyCharm off of the weakly typed signature (it would otherwise
- # discover both the weakly typed signature of `_init` and our faked `init`
- # type).
-
- get_options = (lambda: _get_options)()
- Client = (lambda: _Client)()
diff --git a/sentry_sdk/consts.py b/sentry_sdk/consts.py
deleted file mode 100644
index fe3b2f0..0000000
--- a/sentry_sdk/consts.py
+++ /dev/null
@@ -1,109 +0,0 @@
-from sentry_sdk._types import MYPY
-
-if MYPY:
- import sentry_sdk
-
- from typing import Optional
- from typing import Callable
- from typing import Union
- from typing import List
- from typing import Type
- from typing import Dict
- from typing import Any
- from typing import Sequence
- from typing_extensions import TypedDict
-
- from sentry_sdk.integrations import Integration
-
- from sentry_sdk._types import (
- BreadcrumbProcessor,
- Event,
- EventProcessor,
- TracesSampler,
- )
-
- # Experiments are feature flags to enable and disable certain unstable SDK
- # functionality. Changing them from the defaults (`None`) in production
- # code is highly discouraged. They are not subject to any stability
- # guarantees such as the ones from semantic versioning.
- Experiments = TypedDict(
- "Experiments",
- {
- "max_spans": Optional[int],
- "record_sql_params": Optional[bool],
- "smart_transaction_trimming": Optional[bool],
- "propagate_tracestate": Optional[bool],
- },
- total=False,
- )
-
-DEFAULT_QUEUE_SIZE = 100
-DEFAULT_MAX_BREADCRUMBS = 100
-
-
-# This type exists to trick mypy and PyCharm into thinking `init` and `Client`
-# take these arguments (even though they take opaque **kwargs)
-class ClientConstructor(object):
- def __init__(
- self,
- dsn=None, # type: Optional[str]
- with_locals=True, # type: bool
- max_breadcrumbs=DEFAULT_MAX_BREADCRUMBS, # type: int
- release=None, # type: Optional[str]
- environment=None, # type: Optional[str]
- server_name=None, # type: Optional[str]
- shutdown_timeout=2, # type: float
- integrations=[], # type: Sequence[Integration] # noqa: B006
- in_app_include=[], # type: List[str] # noqa: B006
- in_app_exclude=[], # type: List[str] # noqa: B006
- default_integrations=True, # type: bool
- dist=None, # type: Optional[str]
- transport=None, # type: Optional[Union[sentry_sdk.transport.Transport, Type[sentry_sdk.transport.Transport], Callable[[Event], None]]]
- transport_queue_size=DEFAULT_QUEUE_SIZE, # type: int
- sample_rate=1.0, # type: float
- send_default_pii=False, # type: bool
- http_proxy=None, # type: Optional[str]
- https_proxy=None, # type: Optional[str]
- ignore_errors=[], # type: List[Union[type, str]] # noqa: B006
- request_bodies="medium", # type: str
- before_send=None, # type: Optional[EventProcessor]
- before_breadcrumb=None, # type: Optional[BreadcrumbProcessor]
- debug=False, # type: bool
- attach_stacktrace=False, # type: bool
- ca_certs=None, # type: Optional[str]
- propagate_traces=True, # type: bool
- traces_sample_rate=None, # type: Optional[float]
- traces_sampler=None, # type: Optional[TracesSampler]
- auto_enabling_integrations=True, # type: bool
- auto_session_tracking=True, # type: bool
- send_client_reports=True, # type: bool
- _experiments={}, # type: Experiments # noqa: B006
- ):
- # type: (...) -> None
- pass
-
-
-def _get_default_options():
- # type: () -> Dict[str, Any]
- import inspect
-
- if hasattr(inspect, "getfullargspec"):
- getargspec = inspect.getfullargspec
- else:
- getargspec = inspect.getargspec # type: ignore
-
- a = getargspec(ClientConstructor.__init__)
- defaults = a.defaults or ()
- return dict(zip(a.args[-len(defaults) :], defaults))
-
-
-DEFAULT_OPTIONS = _get_default_options()
-del _get_default_options
-
-
-VERSION = "1.5.8"
-SDK_INFO = {
- "name": "sentry.python",
- "version": VERSION,
- "packages": [{"name": "pypi:sentry-sdk", "version": VERSION}],
-}
diff --git a/sentry_sdk/debug.py b/sentry_sdk/debug.py
deleted file mode 100644
index fe8ae50..0000000
--- a/sentry_sdk/debug.py
+++ /dev/null
@@ -1,44 +0,0 @@
-import sys
-import logging
-
-from sentry_sdk import utils
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import logger
-from sentry_sdk.client import _client_init_debug
-from logging import LogRecord
-
-
-class _HubBasedClientFilter(logging.Filter):
- def filter(self, record):
- # type: (LogRecord) -> bool
- if _client_init_debug.get(False):
- return True
- hub = Hub.current
- if hub is not None and hub.client is not None:
- return hub.client.options["debug"]
- return False
-
-
-def init_debug_support():
- # type: () -> None
- if not logger.handlers:
- configure_logger()
- configure_debug_hub()
-
-
-def configure_logger():
- # type: () -> None
- _handler = logging.StreamHandler(sys.stderr)
- _handler.setFormatter(logging.Formatter(" [sentry] %(levelname)s: %(message)s"))
- logger.addHandler(_handler)
- logger.setLevel(logging.DEBUG)
- logger.addFilter(_HubBasedClientFilter())
-
-
-def configure_debug_hub():
- # type: () -> None
- def _get_debug_hub():
- # type: () -> Hub
- return Hub.current
-
- utils._get_debug_hub = _get_debug_hub
diff --git a/sentry_sdk/envelope.py b/sentry_sdk/envelope.py
deleted file mode 100644
index 928c691..0000000
--- a/sentry_sdk/envelope.py
+++ /dev/null
@@ -1,317 +0,0 @@
-import io
-import json
-import mimetypes
-
-from sentry_sdk._compat import text_type, PY2
-from sentry_sdk._types import MYPY
-from sentry_sdk.session import Session
-from sentry_sdk.utils import json_dumps, capture_internal_exceptions
-
-if MYPY:
- from typing import Any
- from typing import Optional
- from typing import Union
- from typing import Dict
- from typing import List
- from typing import Iterator
-
- from sentry_sdk._types import Event, EventDataCategory
-
-
-def parse_json(data):
- # type: (Union[bytes, text_type]) -> Any
- # on some python 3 versions this needs to be bytes
- if not PY2 and isinstance(data, bytes):
- data = data.decode("utf-8", "replace")
- return json.loads(data)
-
-
-class Envelope(object):
- def __init__(
- self,
- headers=None, # type: Optional[Dict[str, Any]]
- items=None, # type: Optional[List[Item]]
- ):
- # type: (...) -> None
- if headers is not None:
- headers = dict(headers)
- self.headers = headers or {}
- if items is None:
- items = []
- else:
- items = list(items)
- self.items = items
-
- @property
- def description(self):
- # type: (...) -> str
- return "envelope with %s items (%s)" % (
- len(self.items),
- ", ".join(x.data_category for x in self.items),
- )
-
- def add_event(
- self, event # type: Event
- ):
- # type: (...) -> None
- self.add_item(Item(payload=PayloadRef(json=event), type="event"))
-
- def add_transaction(
- self, transaction # type: Event
- ):
- # type: (...) -> None
- self.add_item(Item(payload=PayloadRef(json=transaction), type="transaction"))
-
- def add_session(
- self, session # type: Union[Session, Any]
- ):
- # type: (...) -> None
- if isinstance(session, Session):
- session = session.to_json()
- self.add_item(Item(payload=PayloadRef(json=session), type="session"))
-
- def add_sessions(
- self, sessions # type: Any
- ):
- # type: (...) -> None
- self.add_item(Item(payload=PayloadRef(json=sessions), type="sessions"))
-
- def add_item(
- self, item # type: Item
- ):
- # type: (...) -> None
- self.items.append(item)
-
- def get_event(self):
- # type: (...) -> Optional[Event]
- for items in self.items:
- event = items.get_event()
- if event is not None:
- return event
- return None
-
- def get_transaction_event(self):
- # type: (...) -> Optional[Event]
- for item in self.items:
- event = item.get_transaction_event()
- if event is not None:
- return event
- return None
-
- def __iter__(self):
- # type: (...) -> Iterator[Item]
- return iter(self.items)
-
- def serialize_into(
- self, f # type: Any
- ):
- # type: (...) -> None
- f.write(json_dumps(self.headers))
- f.write(b"\n")
- for item in self.items:
- item.serialize_into(f)
-
- def serialize(self):
- # type: (...) -> bytes
- out = io.BytesIO()
- self.serialize_into(out)
- return out.getvalue()
-
- @classmethod
- def deserialize_from(
- cls, f # type: Any
- ):
- # type: (...) -> Envelope
- headers = parse_json(f.readline())
- items = []
- while 1:
- item = Item.deserialize_from(f)
- if item is None:
- break
- items.append(item)
- return cls(headers=headers, items=items)
-
- @classmethod
- def deserialize(
- cls, bytes # type: bytes
- ):
- # type: (...) -> Envelope
- return cls.deserialize_from(io.BytesIO(bytes))
-
- def __repr__(self):
- # type: (...) -> str
- return "" % (self.headers, self.items)
-
-
-class PayloadRef(object):
- def __init__(
- self,
- bytes=None, # type: Optional[bytes]
- path=None, # type: Optional[Union[bytes, text_type]]
- json=None, # type: Optional[Any]
- ):
- # type: (...) -> None
- self.json = json
- self.bytes = bytes
- self.path = path
-
- def get_bytes(self):
- # type: (...) -> bytes
- if self.bytes is None:
- if self.path is not None:
- with capture_internal_exceptions():
- with open(self.path, "rb") as f:
- self.bytes = f.read()
- elif self.json is not None:
- self.bytes = json_dumps(self.json)
- else:
- self.bytes = b""
- return self.bytes
-
- @property
- def inferred_content_type(self):
- # type: (...) -> str
- if self.json is not None:
- return "application/json"
- elif self.path is not None:
- path = self.path
- if isinstance(path, bytes):
- path = path.decode("utf-8", "replace")
- ty = mimetypes.guess_type(path)[0]
- if ty:
- return ty
- return "application/octet-stream"
-
- def __repr__(self):
- # type: (...) -> str
- return "" % (self.inferred_content_type,)
-
-
-class Item(object):
- def __init__(
- self,
- payload, # type: Union[bytes, text_type, PayloadRef]
- headers=None, # type: Optional[Dict[str, Any]]
- type=None, # type: Optional[str]
- content_type=None, # type: Optional[str]
- filename=None, # type: Optional[str]
- ):
- if headers is not None:
- headers = dict(headers)
- elif headers is None:
- headers = {}
- self.headers = headers
- if isinstance(payload, bytes):
- payload = PayloadRef(bytes=payload)
- elif isinstance(payload, text_type):
- payload = PayloadRef(bytes=payload.encode("utf-8"))
- else:
- payload = payload
-
- if filename is not None:
- headers["filename"] = filename
- if type is not None:
- headers["type"] = type
- if content_type is not None:
- headers["content_type"] = content_type
- elif "content_type" not in headers:
- headers["content_type"] = payload.inferred_content_type
-
- self.payload = payload
-
- def __repr__(self):
- # type: (...) -> str
- return "- " % (
- self.headers,
- self.payload,
- self.data_category,
- )
-
- @property
- def type(self):
- # type: (...) -> Optional[str]
- return self.headers.get("type")
-
- @property
- def data_category(self):
- # type: (...) -> EventDataCategory
- ty = self.headers.get("type")
- if ty == "session":
- return "session"
- elif ty == "attachment":
- return "attachment"
- elif ty == "transaction":
- return "transaction"
- elif ty == "event":
- return "error"
- elif ty == "client_report":
- return "internal"
- else:
- return "default"
-
- def get_bytes(self):
- # type: (...) -> bytes
- return self.payload.get_bytes()
-
- def get_event(self):
- # type: (...) -> Optional[Event]
- """
- Returns an error event if there is one.
- """
- if self.type == "event" and self.payload.json is not None:
- return self.payload.json
- return None
-
- def get_transaction_event(self):
- # type: (...) -> Optional[Event]
- if self.type == "transaction" and self.payload.json is not None:
- return self.payload.json
- return None
-
- def serialize_into(
- self, f # type: Any
- ):
- # type: (...) -> None
- headers = dict(self.headers)
- bytes = self.get_bytes()
- headers["length"] = len(bytes)
- f.write(json_dumps(headers))
- f.write(b"\n")
- f.write(bytes)
- f.write(b"\n")
-
- def serialize(self):
- # type: (...) -> bytes
- out = io.BytesIO()
- self.serialize_into(out)
- return out.getvalue()
-
- @classmethod
- def deserialize_from(
- cls, f # type: Any
- ):
- # type: (...) -> Optional[Item]
- line = f.readline().rstrip()
- if not line:
- return None
- headers = parse_json(line)
- length = headers.get("length")
- if length is not None:
- payload = f.read(length)
- f.readline()
- else:
- # if no length was specified we need to read up to the end of line
- # and remove it (if it is present, i.e. not the very last char in an eof terminated envelope)
- payload = f.readline().rstrip(b"\n")
- if headers.get("type") in ("event", "transaction", "metric_buckets"):
- rv = cls(headers=headers, payload=PayloadRef(json=parse_json(payload)))
- else:
- rv = cls(headers=headers, payload=payload)
- return rv
-
- @classmethod
- def deserialize(
- cls, bytes # type: bytes
- ):
- # type: (...) -> Optional[Item]
- return cls.deserialize_from(io.BytesIO(bytes))
diff --git a/sentry_sdk/hub.py b/sentry_sdk/hub.py
deleted file mode 100644
index addca57..0000000
--- a/sentry_sdk/hub.py
+++ /dev/null
@@ -1,708 +0,0 @@
-import copy
-import sys
-
-from datetime import datetime
-from contextlib import contextmanager
-
-from sentry_sdk._compat import with_metaclass
-from sentry_sdk.scope import Scope
-from sentry_sdk.client import Client
-from sentry_sdk.tracing import Span, Transaction
-from sentry_sdk.session import Session
-from sentry_sdk.utils import (
- exc_info_from_error,
- event_from_exception,
- logger,
- ContextVar,
-)
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Union
- from typing import Any
- from typing import Optional
- from typing import Tuple
- from typing import Dict
- from typing import List
- from typing import Callable
- from typing import Generator
- from typing import Type
- from typing import TypeVar
- from typing import overload
- from typing import ContextManager
-
- from sentry_sdk.integrations import Integration
- from sentry_sdk._types import (
- Event,
- Hint,
- Breadcrumb,
- BreadcrumbHint,
- ExcInfo,
- )
- from sentry_sdk.consts import ClientConstructor
-
- T = TypeVar("T")
-
-else:
-
- def overload(x):
- # type: (T) -> T
- return x
-
-
-_local = ContextVar("sentry_current_hub")
-
-
-def _update_scope(base, scope_change, scope_kwargs):
- # type: (Scope, Optional[Any], Dict[str, Any]) -> Scope
- if scope_change and scope_kwargs:
- raise TypeError("cannot provide scope and kwargs")
- if scope_change is not None:
- final_scope = copy.copy(base)
- if callable(scope_change):
- scope_change(final_scope)
- else:
- final_scope.update_from_scope(scope_change)
- elif scope_kwargs:
- final_scope = copy.copy(base)
- final_scope.update_from_kwargs(**scope_kwargs)
- else:
- final_scope = base
- return final_scope
-
-
-def _should_send_default_pii():
- # type: () -> bool
- client = Hub.current.client
- if not client:
- return False
- return client.options["send_default_pii"]
-
-
-class _InitGuard(object):
- def __init__(self, client):
- # type: (Client) -> None
- self._client = client
-
- def __enter__(self):
- # type: () -> _InitGuard
- return self
-
- def __exit__(self, exc_type, exc_value, tb):
- # type: (Any, Any, Any) -> None
- c = self._client
- if c is not None:
- c.close()
-
-
-def _init(*args, **kwargs):
- # type: (*Optional[str], **Any) -> ContextManager[Any]
- """Initializes the SDK and optionally integrations.
-
- This takes the same arguments as the client constructor.
- """
- client = Client(*args, **kwargs) # type: ignore
- Hub.current.bind_client(client)
- rv = _InitGuard(client)
- return rv
-
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- # Make mypy, PyCharm and other static analyzers think `init` is a type to
- # have nicer autocompletion for params.
- #
- # Use `ClientConstructor` to define the argument types of `init` and
- # `ContextManager[Any]` to tell static analyzers about the return type.
-
- class init(ClientConstructor, ContextManager[Any]): # noqa: N801
- pass
-
-
-else:
- # Alias `init` for actual usage. Go through the lambda indirection to throw
- # PyCharm off of the weakly typed signature (it would otherwise discover
- # both the weakly typed signature of `_init` and our faked `init` type).
-
- init = (lambda: _init)()
-
-
-class HubMeta(type):
- @property
- def current(cls):
- # type: () -> Hub
- """Returns the current instance of the hub."""
- rv = _local.get(None)
- if rv is None:
- rv = Hub(GLOBAL_HUB)
- _local.set(rv)
- return rv
-
- @property
- def main(cls):
- # type: () -> Hub
- """Returns the main instance of the hub."""
- return GLOBAL_HUB
-
-
-class _ScopeManager(object):
- def __init__(self, hub):
- # type: (Hub) -> None
- self._hub = hub
- self._original_len = len(hub._stack)
- self._layer = hub._stack[-1]
-
- def __enter__(self):
- # type: () -> Scope
- scope = self._layer[1]
- assert scope is not None
- return scope
-
- def __exit__(self, exc_type, exc_value, tb):
- # type: (Any, Any, Any) -> None
- current_len = len(self._hub._stack)
- if current_len < self._original_len:
- logger.error(
- "Scope popped too soon. Popped %s scopes too many.",
- self._original_len - current_len,
- )
- return
- elif current_len > self._original_len:
- logger.warning(
- "Leaked %s scopes: %s",
- current_len - self._original_len,
- self._hub._stack[self._original_len :],
- )
-
- layer = self._hub._stack[self._original_len - 1]
- del self._hub._stack[self._original_len - 1 :]
-
- if layer[1] != self._layer[1]:
- logger.error(
- "Wrong scope found. Meant to pop %s, but popped %s.",
- layer[1],
- self._layer[1],
- )
- elif layer[0] != self._layer[0]:
- warning = (
- "init() called inside of pushed scope. This might be entirely "
- "legitimate but usually occurs when initializing the SDK inside "
- "a request handler or task/job function. Try to initialize the "
- "SDK as early as possible instead."
- )
- logger.warning(warning)
-
-
-class Hub(with_metaclass(HubMeta)): # type: ignore
- """The hub wraps the concurrency management of the SDK. Each thread has
- its own hub but the hub might transfer with the flow of execution if
- context vars are available.
-
- If the hub is used with a with statement it's temporarily activated.
- """
-
- _stack = None # type: List[Tuple[Optional[Client], Scope]]
-
- # Mypy doesn't pick up on the metaclass.
-
- if MYPY:
- current = None # type: Hub
- main = None # type: Hub
-
- def __init__(
- self,
- client_or_hub=None, # type: Optional[Union[Hub, Client]]
- scope=None, # type: Optional[Any]
- ):
- # type: (...) -> None
- if isinstance(client_or_hub, Hub):
- hub = client_or_hub
- client, other_scope = hub._stack[-1]
- if scope is None:
- scope = copy.copy(other_scope)
- else:
- client = client_or_hub
- if scope is None:
- scope = Scope()
-
- self._stack = [(client, scope)]
- self._last_event_id = None # type: Optional[str]
- self._old_hubs = [] # type: List[Hub]
-
- def __enter__(self):
- # type: () -> Hub
- self._old_hubs.append(Hub.current)
- _local.set(self)
- return self
-
- def __exit__(
- self,
- exc_type, # type: Optional[type]
- exc_value, # type: Optional[BaseException]
- tb, # type: Optional[Any]
- ):
- # type: (...) -> None
- old = self._old_hubs.pop()
- _local.set(old)
-
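# Usage sketch (illustrative): entering a Hub makes it Hub.current for the
# duration of the block, which is how per-thread / per-task isolation works.
from sentry_sdk.hub import Hub

with Hub(Hub.current) as hub:
    hub.add_breadcrumb(message="recorded on the temporary hub's scope")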
- def run(
- self, callback # type: Callable[[], T]
- ):
- # type: (...) -> T
- """Runs a callback in the context of the hub. Alternatively the
- with statement can be used on the hub directly.
- """
- with self:
- return callback()
-
- def get_integration(
- self, name_or_class # type: Union[str, Type[Integration]]
- ):
- # type: (...) -> Any
- """Returns the integration for this hub by name or class. If there
- is no client bound or the client does not have that integration
- then `None` is returned.
-
- If the return value is not `None` the hub is guaranteed to have a
- client attached.
- """
- if isinstance(name_or_class, str):
- integration_name = name_or_class
- elif name_or_class.identifier is not None:
- integration_name = name_or_class.identifier
- else:
- raise ValueError("Integration has no name")
-
- client = self.client
- if client is not None:
- rv = client.integrations.get(integration_name)
- if rv is not None:
- return rv
-
- @property
- def client(self):
- # type: () -> Optional[Client]
- """Returns the current client on the hub."""
- return self._stack[-1][0]
-
- @property
- def scope(self):
- # type: () -> Scope
- """Returns the current scope on the hub."""
- return self._stack[-1][1]
-
- def last_event_id(self):
- # type: () -> Optional[str]
- """Returns the last event ID."""
- return self._last_event_id
-
- def bind_client(
- self, new # type: Optional[Client]
- ):
- # type: (...) -> None
- """Binds a new client to the hub."""
- top = self._stack[-1]
- self._stack[-1] = (new, top[1])
-
- def capture_event(
- self,
- event, # type: Event
- hint=None, # type: Optional[Hint]
- scope=None, # type: Optional[Any]
- **scope_args # type: Any
- ):
- # type: (...) -> Optional[str]
- """Captures an event. Alias of :py:meth:`sentry_sdk.Client.capture_event`."""
- client, top_scope = self._stack[-1]
- scope = _update_scope(top_scope, scope, scope_args)
- if client is not None:
- is_transaction = event.get("type") == "transaction"
- rv = client.capture_event(event, hint, scope)
- if rv is not None and not is_transaction:
- self._last_event_id = rv
- return rv
- return None
-
- def capture_message(
- self,
- message, # type: str
- level=None, # type: Optional[str]
- scope=None, # type: Optional[Any]
- **scope_args # type: Any
- ):
- # type: (...) -> Optional[str]
- """Captures a message. The message is just a string. If no level
- is provided the default level is `info`.
-
- :returns: An `event_id` if the SDK decided to send the event (see :py:meth:`sentry_sdk.Client.capture_event`).
- """
- if self.client is None:
- return None
- if level is None:
- level = "info"
- return self.capture_event(
- {"message": message, "level": level}, scope=scope, **scope_args
- )
-
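# Usage sketch (illustrative): capturing a plain message through the current
# hub; the return value is the event id, or None when no client is bound.
from sentry_sdk.hub import Hub

event_id = Hub.current.capture_message("cache hit rate dropped below 80%", level="warning")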
- def capture_exception(
- self,
- error=None, # type: Optional[Union[BaseException, ExcInfo]]
- scope=None, # type: Optional[Any]
- **scope_args # type: Any
- ):
- # type: (...) -> Optional[str]
- """Captures an exception.
-
- :param error: An exception to catch. If `None`, `sys.exc_info()` will be used.
-
- :returns: An `event_id` if the SDK decided to send the event (see :py:meth:`sentry_sdk.Client.capture_event`).
- """
- client = self.client
- if client is None:
- return None
- if error is not None:
- exc_info = exc_info_from_error(error)
- else:
- exc_info = sys.exc_info()
-
- event, hint = event_from_exception(exc_info, client_options=client.options)
- try:
- return self.capture_event(event, hint=hint, scope=scope, **scope_args)
- except Exception:
- self._capture_internal_exception(sys.exc_info())
-
- return None
-
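# Usage sketch (illustrative): with no explicit error, sys.exc_info() is used,
# so the call must happen inside an except block.
from sentry_sdk.hub import Hub

try:
    1 / 0
except ZeroDivisionError:
    Hub.current.capture_exception()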
- def _capture_internal_exception(
- self, exc_info # type: Any
- ):
- # type: (...) -> Any
- """
- Capture an exception that is likely caused by a bug in the SDK
- itself.
-
- These exceptions do not end up in Sentry and are just logged instead.
- """
- logger.error("Internal error in sentry_sdk", exc_info=exc_info)
-
- def add_breadcrumb(
- self,
- crumb=None, # type: Optional[Breadcrumb]
- hint=None, # type: Optional[BreadcrumbHint]
- **kwargs # type: Any
- ):
- # type: (...) -> None
- """
- Adds a breadcrumb.
-
- :param crumb: Dictionary with the data as the sentry v7/v8 protocol expects.
-
- :param hint: An optional value that can be used by `before_breadcrumb`
- to customize the breadcrumbs that are emitted.
- """
- client, scope = self._stack[-1]
- if client is None:
- logger.info("Dropped breadcrumb because no client bound")
- return
-
- crumb = dict(crumb or ()) # type: Breadcrumb
- crumb.update(kwargs)
- if not crumb:
- return
-
- hint = dict(hint or ()) # type: Hint
-
- if crumb.get("timestamp") is None:
- crumb["timestamp"] = datetime.utcnow()
- if crumb.get("type") is None:
- crumb["type"] = "default"
-
- if client.options["before_breadcrumb"] is not None:
- new_crumb = client.options["before_breadcrumb"](crumb, hint)
- else:
- new_crumb = crumb
-
- if new_crumb is not None:
- scope._breadcrumbs.append(new_crumb)
- else:
- logger.info("before breadcrumb dropped breadcrumb (%s)", crumb)
-
- max_breadcrumbs = client.options["max_breadcrumbs"] # type: int
- while len(scope._breadcrumbs) > max_breadcrumbs:
- scope._breadcrumbs.popleft()
-
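# Usage sketch (illustrative): breadcrumbs are stored on the scope and attached
# to later events, trimmed to the client's `max_breadcrumbs` option as above.
from sentry_sdk.hub import Hub

Hub.current.add_breadcrumb(category="cache", message="evicted 120 stale entries", level="info")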
- def start_span(
- self,
- span=None, # type: Optional[Span]
- **kwargs # type: Any
- ):
- # type: (...) -> Span
- """
- Create and start timing a new span whose parent is the currently active
- span or transaction, if any. The return value is a span instance,
- typically used as a context manager to start and stop timing in a `with`
- block.
-
- Only spans contained in a transaction are sent to Sentry. Most
- integrations start a transaction at the appropriate time, for example
- for every incoming HTTP request. Use `start_transaction` to start a new
- transaction when one is not already in progress.
- """
- # TODO: consider removing this in a future release.
- # This is for backwards compatibility with releases before
- # start_transaction existed, to allow for a smoother transition.
- if isinstance(span, Transaction) or "transaction" in kwargs:
- deprecation_msg = (
- "Deprecated: use start_transaction to start transactions and "
- "Transaction.start_child to start spans."
- )
- if isinstance(span, Transaction):
- logger.warning(deprecation_msg)
- return self.start_transaction(span)
- if "transaction" in kwargs:
- logger.warning(deprecation_msg)
- name = kwargs.pop("transaction")
- return self.start_transaction(name=name, **kwargs)
-
- if span is not None:
- return span
-
- kwargs.setdefault("hub", self)
-
- span = self.scope.span
- if span is not None:
- return span.start_child(**kwargs)
-
- return Span(**kwargs)
-
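# Usage sketch (illustrative): start_span creates a child of the currently
# active span or transaction; spans outside a transaction are never sent.
# The "db.system" tag name is only an example, not a required convention.
from sentry_sdk.hub import Hub

with Hub.current.start_span(op="db.query", description="SELECT 1") as span:
    span.set_tag("db.system", "postgresql")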
- def start_transaction(
- self,
- transaction=None, # type: Optional[Transaction]
- **kwargs # type: Any
- ):
- # type: (...) -> Transaction
- """
- Start and return a transaction.
-
- Start an existing transaction if given, otherwise create and start a new
- transaction with kwargs.
-
- This is the entry point to manual tracing instrumentation.
-
- A tree structure can be built by adding child spans to the transaction,
- and child spans to other spans. To start a new child span within the
- transaction or any span, call the respective `.start_child()` method.
-
- Every child span must be finished before the transaction is finished,
- otherwise the unfinished spans are discarded.
-
- When used as context managers, spans and transactions are automatically
- finished at the end of the `with` block. If not using context managers,
- call the `.finish()` method.
-
- When the transaction is finished, it will be sent to Sentry with all its
- finished child spans.
- """
- custom_sampling_context = kwargs.pop("custom_sampling_context", {})
-
- # if we haven't been given a transaction, make one
- if transaction is None:
- kwargs.setdefault("hub", self)
- transaction = Transaction(**kwargs)
-
- # use traces_sample_rate, traces_sampler, and/or inheritance to make a
- # sampling decision
- sampling_context = {
- "transaction_context": transaction.to_json(),
- "parent_sampled": transaction.parent_sampled,
- }
- sampling_context.update(custom_sampling_context)
- transaction._set_initial_sampling_decision(sampling_context=sampling_context)
-
- # we don't bother to keep spans if we already know we're not going to
- # send the transaction
- if transaction.sampled:
- max_spans = (
- self.client and self.client.options["_experiments"].get("max_spans")
- ) or 1000
- transaction.init_span_recorder(maxlen=max_spans)
-
- return transaction
-
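# Usage sketch (illustrative): manual tracing with a transaction and one child
# span, both finished automatically by the context managers as described above.
from sentry_sdk.hub import Hub

with Hub.current.start_transaction(op="task", name="nightly-cleanup") as transaction:
    with transaction.start_child(op="db", description="delete expired rows"):
        pass  # the timed work would go here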
- @overload
- def push_scope( # noqa: F811
- self, callback=None # type: Optional[None]
- ):
- # type: (...) -> ContextManager[Scope]
- pass
-
- @overload
- def push_scope( # noqa: F811
- self, callback # type: Callable[[Scope], None]
- ):
- # type: (...) -> None
- pass
-
- def push_scope( # noqa
- self, callback=None # type: Optional[Callable[[Scope], None]]
- ):
- # type: (...) -> Optional[ContextManager[Scope]]
- """
- Pushes a new layer on the scope stack.
-
- :param callback: If provided, this method pushes a scope, calls
- `callback`, and pops the scope again.
-
- :returns: If no `callback` is provided, a context manager that should
- be used to pop the scope again.
- """
- if callback is not None:
- with self.push_scope() as scope:
- callback(scope)
- return None
-
- client, scope = self._stack[-1]
- new_layer = (client, copy.copy(scope))
- self._stack.append(new_layer)
-
- return _ScopeManager(self)
-
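# Usage sketch (illustrative): both call styles of push_scope described in the
# docstring above.
from sentry_sdk.hub import Hub

# Context-manager form: the pushed scope is popped at the end of the block.
with Hub.current.push_scope() as scope:
    scope.set_tag("section", "checkout")
    Hub.current.capture_message("tagged only while this scope is pushed")

# Callback form: pushes a scope, runs the callback, then pops it again.
Hub.current.push_scope(lambda scope: scope.set_tag("section", "checkout"))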
- def pop_scope_unsafe(self):
- # type: () -> Tuple[Optional[Client], Scope]
- """
- Pops a scope layer from the stack.
-
- Try to use the context manager :py:meth:`push_scope` instead.
- """
- rv = self._stack.pop()
- assert self._stack, "stack must have at least one layer"
- return rv
-
- @overload
- def configure_scope( # noqa: F811
- self, callback=None # type: Optional[None]
- ):
- # type: (...) -> ContextManager[Scope]
- pass
-
- @overload
- def configure_scope( # noqa: F811
- self, callback # type: Callable[[Scope], None]
- ):
- # type: (...) -> None
- pass
-
- def configure_scope( # noqa
- self, callback=None # type: Optional[Callable[[Scope], None]]
- ): # noqa
- # type: (...) -> Optional[ContextManager[Scope]]
-
- """
- Reconfigures the scope.
-
- :param callback: If provided, call the callback with the current scope.
-
- :returns: If no callback is provided, returns a context manager that returns the scope.
- """
-
- client, scope = self._stack[-1]
- if callback is not None:
- if client is not None:
- callback(scope)
-
- return None
-
- @contextmanager
- def inner():
- # type: () -> Generator[Scope, None, None]
- if client is not None:
- yield scope
- else:
- yield Scope()
-
- return inner()
-
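# Usage sketch (illustrative): configure_scope hands out the current scope for
# in-place modification instead of pushing a new layer.
from sentry_sdk.hub import Hub

with Hub.current.configure_scope() as scope:
    scope.set_tag("region", "eu-west-1")
    scope.user = {"id": "42"}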
- def start_session(
- self, session_mode="application" # type: str
- ):
- # type: (...) -> None
- """Starts a new session."""
- self.end_session()
- client, scope = self._stack[-1]
- scope._session = Session(
- release=client.options["release"] if client else None,
- environment=client.options["environment"] if client else None,
- user=scope._user,
- session_mode=session_mode,
- )
-
- def end_session(self):
- # type: (...) -> None
- """Ends the current session if there is one."""
- client, scope = self._stack[-1]
- session = scope._session
- self.scope._session = None
-
- if session is not None:
- session.close()
- if client is not None:
- client.capture_session(session)
-
- def stop_auto_session_tracking(self):
- # type: (...) -> None
- """Stops automatic session tracking.
-
-        This temporarily disables session tracking for the current scope when called.
- To resume session tracking call `resume_auto_session_tracking`.
- """
- self.end_session()
- client, scope = self._stack[-1]
- scope._force_auto_session_tracking = False
-
- def resume_auto_session_tracking(self):
- # type: (...) -> None
- """Resumes automatic session tracking for the current scope if
- disabled earlier. This requires that generally automatic session
- tracking is enabled.
- """
- client, scope = self._stack[-1]
- scope._force_auto_session_tracking = None
-
- def flush(
- self,
- timeout=None, # type: Optional[float]
- callback=None, # type: Optional[Callable[[int, float], None]]
- ):
- # type: (...) -> None
- """
- Alias for :py:meth:`sentry_sdk.Client.flush`
- """
- client, scope = self._stack[-1]
- if client is not None:
- return client.flush(timeout=timeout, callback=callback)
-
- def iter_trace_propagation_headers(self, span=None):
- # type: (Optional[Span]) -> Generator[Tuple[str, str], None, None]
- """
- Return HTTP headers which allow propagation of trace data. Data taken
- from the span representing the request, if available, or the current
- span on the scope if not.
- """
- span = span or self.scope.span
- if not span:
- return
-
- client = self._stack[-1][0]
-
- propagate_traces = client and client.options["propagate_traces"]
- if not propagate_traces:
- return
-
- for header in span.iter_headers():
- yield header
-
-
-GLOBAL_HUB = Hub()
-_local.set(GLOBAL_HUB)
diff --git a/sentry_sdk/integrations/__init__.py b/sentry_sdk/integrations/__init__.py
deleted file mode 100644
index 777c363..0000000
--- a/sentry_sdk/integrations/__init__.py
+++ /dev/null
@@ -1,183 +0,0 @@
-"""This package"""
-from __future__ import absolute_import
-
-from threading import Lock
-
-from sentry_sdk._compat import iteritems
-from sentry_sdk.utils import logger
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Callable
- from typing import Dict
- from typing import Iterator
- from typing import List
- from typing import Set
- from typing import Tuple
- from typing import Type
-
-
-_installer_lock = Lock()
-_installed_integrations = set() # type: Set[str]
-
-
-def _generate_default_integrations_iterator(integrations, auto_enabling_integrations):
- # type: (Tuple[str, ...], Tuple[str, ...]) -> Callable[[bool], Iterator[Type[Integration]]]
-
- def iter_default_integrations(with_auto_enabling_integrations):
- # type: (bool) -> Iterator[Type[Integration]]
- """Returns an iterator of the default integration classes:"""
- from importlib import import_module
-
- if with_auto_enabling_integrations:
- all_import_strings = integrations + auto_enabling_integrations
- else:
- all_import_strings = integrations
-
- for import_string in all_import_strings:
- try:
- module, cls = import_string.rsplit(".", 1)
- yield getattr(import_module(module), cls)
- except (DidNotEnable, SyntaxError) as e:
- logger.debug(
- "Did not import default integration %s: %s", import_string, e
- )
-
- if isinstance(iter_default_integrations.__doc__, str):
- for import_string in integrations:
- iter_default_integrations.__doc__ += "\n- `{}`".format(import_string)
-
- return iter_default_integrations
-
-
-_AUTO_ENABLING_INTEGRATIONS = (
- "sentry_sdk.integrations.django.DjangoIntegration",
- "sentry_sdk.integrations.flask.FlaskIntegration",
- "sentry_sdk.integrations.bottle.BottleIntegration",
- "sentry_sdk.integrations.falcon.FalconIntegration",
- "sentry_sdk.integrations.sanic.SanicIntegration",
- "sentry_sdk.integrations.celery.CeleryIntegration",
- "sentry_sdk.integrations.rq.RqIntegration",
- "sentry_sdk.integrations.aiohttp.AioHttpIntegration",
- "sentry_sdk.integrations.tornado.TornadoIntegration",
- "sentry_sdk.integrations.sqlalchemy.SqlalchemyIntegration",
- "sentry_sdk.integrations.boto3.Boto3Integration",
-)
-
-
-iter_default_integrations = _generate_default_integrations_iterator(
- integrations=(
- # stdlib/base runtime integrations
- "sentry_sdk.integrations.logging.LoggingIntegration",
- "sentry_sdk.integrations.stdlib.StdlibIntegration",
- "sentry_sdk.integrations.excepthook.ExcepthookIntegration",
- "sentry_sdk.integrations.dedupe.DedupeIntegration",
- "sentry_sdk.integrations.atexit.AtexitIntegration",
- "sentry_sdk.integrations.modules.ModulesIntegration",
- "sentry_sdk.integrations.argv.ArgvIntegration",
- "sentry_sdk.integrations.threading.ThreadingIntegration",
- ),
- auto_enabling_integrations=_AUTO_ENABLING_INTEGRATIONS,
-)
-
-del _generate_default_integrations_iterator
-
-
-def setup_integrations(
- integrations, with_defaults=True, with_auto_enabling_integrations=False
-):
- # type: (List[Integration], bool, bool) -> Dict[str, Integration]
- """Given a list of integration instances this installs them all. When
- `with_defaults` is set to `True` then all default integrations are added
- unless they were already provided before.
- """
- integrations = dict(
- (integration.identifier, integration) for integration in integrations or ()
- )
-
- logger.debug("Setting up integrations (with default = %s)", with_defaults)
-
- # Integrations that are not explicitly set up by the user.
- used_as_default_integration = set()
-
- if with_defaults:
- for integration_cls in iter_default_integrations(
- with_auto_enabling_integrations
- ):
- if integration_cls.identifier not in integrations:
- instance = integration_cls()
- integrations[instance.identifier] = instance
- used_as_default_integration.add(instance.identifier)
-
- for identifier, integration in iteritems(integrations):
- with _installer_lock:
- if identifier not in _installed_integrations:
- logger.debug(
- "Setting up previously not enabled integration %s", identifier
- )
- try:
- type(integration).setup_once()
- except NotImplementedError:
- if getattr(integration, "install", None) is not None:
- logger.warning(
- "Integration %s: The install method is "
- "deprecated. Use `setup_once`.",
- identifier,
- )
- integration.install()
- else:
- raise
- except DidNotEnable as e:
- if identifier not in used_as_default_integration:
- raise
-
- logger.debug(
- "Did not enable default integration %s: %s", identifier, e
- )
-
- _installed_integrations.add(identifier)
-
- for identifier in integrations:
- logger.debug("Enabling integration %s", identifier)
-
- return integrations
-
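# Usage sketch (illustrative): integrations passed to init() end up in
# setup_integrations() above; defaults are still added unless an instance with
# the same identifier was provided. The DSN is a placeholder.
import sentry_sdk
from sentry_sdk.integrations.aiohttp import AioHttpIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    integrations=[AioHttpIntegration(transaction_style="method_and_path_pattern")],
)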
-
-class DidNotEnable(Exception):
- """
- The integration could not be enabled due to a trivial user error like
- `flask` not being installed for the `FlaskIntegration`.
-
- This exception is silently swallowed for default integrations, but reraised
- for explicitly enabled integrations.
- """
-
-
-class Integration(object):
- """Baseclass for all integrations.
-
- To accept options for an integration, implement your own constructor that
- saves those options on `self`.
- """
-
- install = None
- """Legacy method, do not implement."""
-
- identifier = None # type: str
- """String unique ID of integration type"""
-
- @staticmethod
- def setup_once():
- # type: () -> None
- """
- Initialize the integration.
-
- This function is only called once, ever. Configuration is not available
- at this point, so the only thing to do here is to hook into exception
- handlers, and perhaps do monkeypatches.
-
- Inside those hooks `Integration.current` can be used to access the
- instance again.
- """
- raise NotImplementedError()
diff --git a/sentry_sdk/integrations/_wsgi_common.py b/sentry_sdk/integrations/_wsgi_common.py
deleted file mode 100644
index f874663..0000000
--- a/sentry_sdk/integrations/_wsgi_common.py
+++ /dev/null
@@ -1,180 +0,0 @@
-import json
-
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.utils import AnnotatedValue
-from sentry_sdk._compat import text_type, iteritems
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- import sentry_sdk
-
- from typing import Any
- from typing import Dict
- from typing import Optional
- from typing import Union
-
-
-SENSITIVE_ENV_KEYS = (
- "REMOTE_ADDR",
- "HTTP_X_FORWARDED_FOR",
- "HTTP_SET_COOKIE",
- "HTTP_COOKIE",
- "HTTP_AUTHORIZATION",
- "HTTP_X_FORWARDED_FOR",
- "HTTP_X_REAL_IP",
-)
-
-SENSITIVE_HEADERS = tuple(
- x[len("HTTP_") :] for x in SENSITIVE_ENV_KEYS if x.startswith("HTTP_")
-)
-
-
-def request_body_within_bounds(client, content_length):
- # type: (Optional[sentry_sdk.Client], int) -> bool
- if client is None:
- return False
-
- bodies = client.options["request_bodies"]
- return not (
- bodies == "never"
- or (bodies == "small" and content_length > 10 ** 3)
- or (bodies == "medium" and content_length > 10 ** 4)
- )
-
-
-class RequestExtractor(object):
- def __init__(self, request):
- # type: (Any) -> None
- self.request = request
-
- def extract_into_event(self, event):
- # type: (Dict[str, Any]) -> None
- client = Hub.current.client
- if client is None:
- return
-
- data = None # type: Optional[Union[AnnotatedValue, Dict[str, Any]]]
-
- content_length = self.content_length()
- request_info = event.get("request", {})
-
- if _should_send_default_pii():
- request_info["cookies"] = dict(self.cookies())
-
- if not request_body_within_bounds(client, content_length):
- data = AnnotatedValue(
- "",
- {"rem": [["!config", "x", 0, content_length]], "len": content_length},
- )
- else:
- parsed_body = self.parsed_body()
- if parsed_body is not None:
- data = parsed_body
- elif self.raw_data():
- data = AnnotatedValue(
- "",
- {"rem": [["!raw", "x", 0, content_length]], "len": content_length},
- )
- else:
- data = None
-
- if data is not None:
- request_info["data"] = data
-
- event["request"] = request_info
-
- def content_length(self):
- # type: () -> int
- try:
- return int(self.env().get("CONTENT_LENGTH", 0))
- except ValueError:
- return 0
-
- def cookies(self):
- # type: () -> Dict[str, Any]
- raise NotImplementedError()
-
- def raw_data(self):
- # type: () -> Optional[Union[str, bytes]]
- raise NotImplementedError()
-
- def form(self):
- # type: () -> Optional[Dict[str, Any]]
- raise NotImplementedError()
-
- def parsed_body(self):
- # type: () -> Optional[Dict[str, Any]]
- form = self.form()
- files = self.files()
- if form or files:
- data = dict(iteritems(form))
- for k, v in iteritems(files):
- size = self.size_of_file(v)
- data[k] = AnnotatedValue(
- "", {"len": size, "rem": [["!raw", "x", 0, size]]}
- )
-
- return data
-
- return self.json()
-
- def is_json(self):
- # type: () -> bool
- return _is_json_content_type(self.env().get("CONTENT_TYPE"))
-
- def json(self):
- # type: () -> Optional[Any]
- try:
- if not self.is_json():
- return None
-
- raw_data = self.raw_data()
- if raw_data is None:
- return None
-
- if isinstance(raw_data, text_type):
- return json.loads(raw_data)
- else:
- return json.loads(raw_data.decode("utf-8"))
- except ValueError:
- pass
-
- return None
-
- def files(self):
- # type: () -> Optional[Dict[str, Any]]
- raise NotImplementedError()
-
- def size_of_file(self, file):
- # type: (Any) -> int
- raise NotImplementedError()
-
- def env(self):
- # type: () -> Dict[str, Any]
- raise NotImplementedError()
-
-
-def _is_json_content_type(ct):
- # type: (Optional[str]) -> bool
- mt = (ct or "").split(";", 1)[0]
-    return mt == "application/json" or (
-        mt.startswith("application/") and mt.endswith("+json")
-    )
-
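# Behaviour sketch (illustrative) for the helper above: content-type parameters
# after ";" are stripped, and structured "+json" suffixes are accepted.
from sentry_sdk.integrations._wsgi_common import _is_json_content_type

assert _is_json_content_type("application/json")
assert _is_json_content_type("application/json; charset=utf-8")
assert _is_json_content_type("application/vnd.api+json")
assert not _is_json_content_type("text/plain")
assert not _is_json_content_type(None)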
-
-def _filter_headers(headers):
- # type: (Dict[str, str]) -> Dict[str, str]
- if _should_send_default_pii():
- return headers
-
- return {
- k: (
- v
- if k.upper().replace("-", "_") not in SENSITIVE_HEADERS
- else AnnotatedValue("", {"rem": [["!config", "x", 0, len(v)]]})
- )
- for k, v in iteritems(headers)
- }
diff --git a/sentry_sdk/integrations/aiohttp.py b/sentry_sdk/integrations/aiohttp.py
deleted file mode 100644
index 8a828b2..0000000
--- a/sentry_sdk/integrations/aiohttp.py
+++ /dev/null
@@ -1,230 +0,0 @@
-import sys
-import weakref
-
-from sentry_sdk._compat import reraise
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations.logging import ignore_logger
-from sentry_sdk.integrations._wsgi_common import (
- _filter_headers,
- request_body_within_bounds,
-)
-from sentry_sdk.tracing import Transaction
-from sentry_sdk.utils import (
- capture_internal_exceptions,
- event_from_exception,
- transaction_from_function,
- HAS_REAL_CONTEXTVARS,
- CONTEXTVARS_ERROR_MESSAGE,
- AnnotatedValue,
-)
-
-try:
- import asyncio
-
- from aiohttp import __version__ as AIOHTTP_VERSION
- from aiohttp.web import Application, HTTPException, UrlDispatcher
-except ImportError:
- raise DidNotEnable("AIOHTTP not installed")
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from aiohttp.web_request import Request
- from aiohttp.abc import AbstractMatchInfo
- from typing import Any
- from typing import Dict
- from typing import Optional
- from typing import Tuple
- from typing import Callable
- from typing import Union
-
- from sentry_sdk.utils import ExcInfo
- from sentry_sdk._types import EventProcessor
-
-
-TRANSACTION_STYLE_VALUES = ("handler_name", "method_and_path_pattern")
-
-
-class AioHttpIntegration(Integration):
- identifier = "aiohttp"
-
- def __init__(self, transaction_style="handler_name"):
- # type: (str) -> None
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- try:
- version = tuple(map(int, AIOHTTP_VERSION.split(".")[:2]))
- except (TypeError, ValueError):
- raise DidNotEnable("AIOHTTP version unparsable: {}".format(AIOHTTP_VERSION))
-
- if version < (3, 4):
- raise DidNotEnable("AIOHTTP 3.4 or newer required.")
-
- if not HAS_REAL_CONTEXTVARS:
- # We better have contextvars or we're going to leak state between
- # requests.
- raise DidNotEnable(
- "The aiohttp integration for Sentry requires Python 3.7+ "
- " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
- )
-
- ignore_logger("aiohttp.server")
-
- old_handle = Application._handle
-
- async def sentry_app_handle(self, request, *args, **kwargs):
- # type: (Any, Request, *Any, **Any) -> Any
- hub = Hub.current
- if hub.get_integration(AioHttpIntegration) is None:
- return await old_handle(self, request, *args, **kwargs)
-
- weak_request = weakref.ref(request)
-
- with Hub(hub) as hub:
-                # Scope data will not leak between requests because aiohttp
-                # creates a task to wrap each request.
- with hub.configure_scope() as scope:
- scope.clear_breadcrumbs()
- scope.add_event_processor(_make_request_processor(weak_request))
-
- transaction = Transaction.continue_from_headers(
- request.headers,
- op="http.server",
- # If this transaction name makes it to the UI, AIOHTTP's
- # URL resolver did not find a route or died trying.
- name="generic AIOHTTP request",
- )
- with hub.start_transaction(
- transaction, custom_sampling_context={"aiohttp_request": request}
- ):
- try:
- response = await old_handle(self, request)
- except HTTPException as e:
- transaction.set_http_status(e.status_code)
- raise
- except (asyncio.CancelledError, ConnectionResetError):
- transaction.set_status("cancelled")
- raise
- except Exception:
- # This will probably map to a 500 but seems like we
- # have no way to tell. Do not set span status.
- reraise(*_capture_exception(hub))
-
- transaction.set_http_status(response.status)
- return response
-
- Application._handle = sentry_app_handle
-
- old_urldispatcher_resolve = UrlDispatcher.resolve
-
- async def sentry_urldispatcher_resolve(self, request):
- # type: (UrlDispatcher, Request) -> AbstractMatchInfo
- rv = await old_urldispatcher_resolve(self, request)
-
- hub = Hub.current
- integration = hub.get_integration(AioHttpIntegration)
-
- name = None
-
- try:
- if integration.transaction_style == "handler_name":
- name = transaction_from_function(rv.handler)
- elif integration.transaction_style == "method_and_path_pattern":
- route_info = rv.get_info()
- pattern = route_info.get("path") or route_info.get("formatter")
- name = "{} {}".format(request.method, pattern)
- except Exception:
- pass
-
- if name is not None:
- with Hub.current.configure_scope() as scope:
- scope.transaction = name
-
- return rv
-
- UrlDispatcher.resolve = sentry_urldispatcher_resolve
-
-
-def _make_request_processor(weak_request):
- # type: (Callable[[], Request]) -> EventProcessor
- def aiohttp_processor(
- event, # type: Dict[str, Any]
- hint, # type: Dict[str, Tuple[type, BaseException, Any]]
- ):
- # type: (...) -> Dict[str, Any]
- request = weak_request()
- if request is None:
- return event
-
- with capture_internal_exceptions():
- request_info = event.setdefault("request", {})
-
- request_info["url"] = "%s://%s%s" % (
- request.scheme,
- request.host,
- request.path,
- )
-
- request_info["query_string"] = request.query_string
- request_info["method"] = request.method
- request_info["env"] = {"REMOTE_ADDR": request.remote}
-
- hub = Hub.current
- request_info["headers"] = _filter_headers(dict(request.headers))
-
- # Just attach raw data here if it is within bounds, if available.
- # Unfortunately there's no way to get structured data from aiohttp
- # without awaiting on some coroutine.
- request_info["data"] = get_aiohttp_request_data(hub, request)
-
- return event
-
- return aiohttp_processor
-
-
-def _capture_exception(hub):
- # type: (Hub) -> ExcInfo
- exc_info = sys.exc_info()
- event, hint = event_from_exception(
- exc_info,
- client_options=hub.client.options, # type: ignore
- mechanism={"type": "aiohttp", "handled": False},
- )
- hub.capture_event(event, hint=hint)
- return exc_info
-
-
-BODY_NOT_READ_MESSAGE = "[Can't show request body due to implementation details.]"
-
-
-def get_aiohttp_request_data(hub, request):
- # type: (Hub, Request) -> Union[Optional[str], AnnotatedValue]
- bytes_body = request._read_bytes
-
- if bytes_body is not None:
- # we have body to show
- if not request_body_within_bounds(hub.client, len(bytes_body)):
-
- return AnnotatedValue(
- "",
- {"rem": [["!config", "x", 0, len(bytes_body)]], "len": len(bytes_body)},
- )
- encoding = request.charset or "utf-8"
- return bytes_body.decode(encoding, "replace")
-
- if request.can_read_body:
- # body exists but we can't show it
- return BODY_NOT_READ_MESSAGE
-
- # request has no body
- return None
diff --git a/sentry_sdk/integrations/argv.py b/sentry_sdk/integrations/argv.py
deleted file mode 100644
index f005521..0000000
--- a/sentry_sdk/integrations/argv.py
+++ /dev/null
@@ -1,33 +0,0 @@
-from __future__ import absolute_import
-
-import sys
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration
-from sentry_sdk.scope import add_global_event_processor
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Optional
-
- from sentry_sdk._types import Event, Hint
-
-
-class ArgvIntegration(Integration):
- identifier = "argv"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- @add_global_event_processor
- def processor(event, hint):
- # type: (Event, Optional[Hint]) -> Optional[Event]
- if Hub.current.get_integration(ArgvIntegration) is not None:
- extra = event.setdefault("extra", {})
- # If some event processor decided to set extra to e.g. an
- # `int`, don't crash. Not here.
- if isinstance(extra, dict):
- extra["sys.argv"] = sys.argv
-
- return event
diff --git a/sentry_sdk/integrations/asgi.py b/sentry_sdk/integrations/asgi.py
deleted file mode 100644
index 5f78107..0000000
--- a/sentry_sdk/integrations/asgi.py
+++ /dev/null
@@ -1,278 +0,0 @@
-"""
-An ASGI middleware.
-
-Based on Tom Christie's ``sentry-asgi``.
-"""
-
-import asyncio
-import inspect
-import urllib
-
-from sentry_sdk._functools import partial
-from sentry_sdk._types import MYPY
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.integrations._wsgi_common import _filter_headers
-from sentry_sdk.sessions import auto_session_tracking
-from sentry_sdk.utils import (
- ContextVar,
- event_from_exception,
- transaction_from_function,
- HAS_REAL_CONTEXTVARS,
- CONTEXTVARS_ERROR_MESSAGE,
-)
-from sentry_sdk.tracing import Transaction
-
-if MYPY:
- from typing import Dict
- from typing import Any
- from typing import Optional
- from typing import Callable
-
- from typing_extensions import Literal
-
- from sentry_sdk._types import Event, Hint
-
-
-_asgi_middleware_applied = ContextVar("sentry_asgi_middleware_applied")
-
-_DEFAULT_TRANSACTION_NAME = "generic ASGI request"
-
-TRANSACTION_STYLE_VALUES = ("endpoint", "url")
-
-
-def _capture_exception(hub, exc):
- # type: (Hub, Any) -> None
-
- # Check client here as it might have been unset while streaming response
- if hub.client is not None:
- event, hint = event_from_exception(
- exc,
- client_options=hub.client.options,
- mechanism={"type": "asgi", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
-
-def _looks_like_asgi3(app):
- # type: (Any) -> bool
- """
- Try to figure out if an application object supports ASGI3.
-
- This is how uvicorn figures out the application version as well.
- """
- if inspect.isclass(app):
- return hasattr(app, "__await__")
- elif inspect.isfunction(app):
- return asyncio.iscoroutinefunction(app)
- else:
- call = getattr(app, "__call__", None) # noqa
- return asyncio.iscoroutinefunction(call)
-
-
-class SentryAsgiMiddleware:
- __slots__ = ("app", "__call__", "transaction_style")
-
- def __init__(self, app, unsafe_context_data=False, transaction_style="endpoint"):
- # type: (Any, bool, str) -> None
- """
-        Instrument an ASGI application with Sentry. Attaches HTTP/websocket
-        request data to the events that are sent and provides basic handling
-        for exceptions bubbling up through the middleware.
-
- :param unsafe_context_data: Disable errors when a proper contextvars installation could not be found. We do not recommend changing this from the default.
- """
-
- if not unsafe_context_data and not HAS_REAL_CONTEXTVARS:
- # We better have contextvars or we're going to leak state between
- # requests.
- raise RuntimeError(
- "The ASGI middleware for Sentry requires Python 3.7+ "
- "or the aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
- )
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
- self.app = app
-
- if _looks_like_asgi3(app):
- self.__call__ = self._run_asgi3 # type: Callable[..., Any]
- else:
- self.__call__ = self._run_asgi2
-
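# Usage sketch (illustrative): wrapping a minimal ASGI 3 application;
# _looks_like_asgi3 above detects the coroutine callable and picks the ASGI 3
# code path. `plain_app` is a throwaway example app, not part of the SDK.
from sentry_sdk.integrations.asgi import SentryAsgiMiddleware

async def plain_app(scope, receive, send):
    await send({"type": "http.response.start", "status": 200, "headers": []})
    await send({"type": "http.response.body", "body": b"ok"})

app = SentryAsgiMiddleware(plain_app, transaction_style="endpoint")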
- def _run_asgi2(self, scope):
- # type: (Any) -> Any
- async def inner(receive, send):
- # type: (Any, Any) -> Any
- return await self._run_app(scope, lambda: self.app(scope)(receive, send))
-
- return inner
-
- async def _run_asgi3(self, scope, receive, send):
- # type: (Any, Any, Any) -> Any
- return await self._run_app(scope, lambda: self.app(scope, receive, send))
-
- async def _run_app(self, scope, callback):
- # type: (Any, Any) -> Any
- is_recursive_asgi_middleware = _asgi_middleware_applied.get(False)
-
- if is_recursive_asgi_middleware:
- try:
- return await callback()
- except Exception as exc:
- _capture_exception(Hub.current, exc)
- raise exc from None
-
- _asgi_middleware_applied.set(True)
- try:
- hub = Hub(Hub.current)
- with auto_session_tracking(hub, session_mode="request"):
- with hub:
- with hub.configure_scope() as sentry_scope:
- sentry_scope.clear_breadcrumbs()
- sentry_scope._name = "asgi"
- processor = partial(self.event_processor, asgi_scope=scope)
- sentry_scope.add_event_processor(processor)
-
- ty = scope["type"]
-
- if ty in ("http", "websocket"):
- transaction = Transaction.continue_from_headers(
- self._get_headers(scope),
- op="{}.server".format(ty),
- )
- else:
- transaction = Transaction(op="asgi.server")
-
- transaction.name = _DEFAULT_TRANSACTION_NAME
- transaction.set_tag("asgi.type", ty)
-
- with hub.start_transaction(
- transaction, custom_sampling_context={"asgi_scope": scope}
- ):
- # XXX: Would be cool to have correct span status, but we
- # would have to wrap send(). That is a bit hard to do with
- # the current abstraction over ASGI 2/3.
- try:
- return await callback()
- except Exception as exc:
- _capture_exception(hub, exc)
- raise exc from None
- finally:
- _asgi_middleware_applied.set(False)
-
- def event_processor(self, event, hint, asgi_scope):
- # type: (Event, Hint, Any) -> Optional[Event]
- request_info = event.get("request", {})
-
- ty = asgi_scope["type"]
- if ty in ("http", "websocket"):
- request_info["method"] = asgi_scope.get("method")
- request_info["headers"] = headers = _filter_headers(
- self._get_headers(asgi_scope)
- )
- request_info["query_string"] = self._get_query(asgi_scope)
-
- request_info["url"] = self._get_url(
- asgi_scope, "http" if ty == "http" else "ws", headers.get("host")
- )
-
- client = asgi_scope.get("client")
- if client and _should_send_default_pii():
- request_info["env"] = {"REMOTE_ADDR": self._get_ip(asgi_scope)}
-
- if (
- event.get("transaction", _DEFAULT_TRANSACTION_NAME)
- == _DEFAULT_TRANSACTION_NAME
- ):
- if self.transaction_style == "endpoint":
- endpoint = asgi_scope.get("endpoint")
-                # Web frameworks like Starlette mutate the ASGI env once routing
-                # is done, which is some time after the request has started. If we
-                # have an endpoint, overwrite our generic transaction name.
- if endpoint:
- event["transaction"] = transaction_from_function(endpoint)
- elif self.transaction_style == "url":
- # FastAPI includes the route object in the scope to let Sentry extract the
- # path from it for the transaction name
- route = asgi_scope.get("route")
- if route:
- path = getattr(route, "path", None)
- if path is not None:
- event["transaction"] = path
-
- event["request"] = request_info
-
- return event
-
- # Helper functions for extracting request data.
- #
- # Note: Those functions are not public API. If you want to mutate request
- # data to your liking it's recommended to use the `before_send` callback
- # for that.
-
- def _get_url(self, scope, default_scheme, host):
- # type: (Dict[str, Any], Literal["ws", "http"], Optional[str]) -> str
- """
- Extract URL from the ASGI scope, without also including the querystring.
- """
- scheme = scope.get("scheme", default_scheme)
-
- server = scope.get("server", None)
- path = scope.get("root_path", "") + scope.get("path", "")
-
- if host:
- return "%s://%s%s" % (scheme, host, path)
-
- if server is not None:
- host, port = server
- default_port = {"http": 80, "https": 443, "ws": 80, "wss": 443}[scheme]
- if port != default_port:
- return "%s://%s:%s%s" % (scheme, host, port, path)
- return "%s://%s%s" % (scheme, host, path)
- return path
-
- def _get_query(self, scope):
- # type: (Any) -> Any
- """
- Extract querystring from the ASGI scope, in the format that the Sentry protocol expects.
- """
- qs = scope.get("query_string")
- if not qs:
- return None
- return urllib.parse.unquote(qs.decode("latin-1"))
-
- def _get_ip(self, scope):
- # type: (Any) -> str
- """
- Extract IP Address from the ASGI scope based on request headers with fallback to scope client.
- """
- headers = self._get_headers(scope)
- try:
- return headers["x-forwarded-for"].split(",")[0].strip()
- except (KeyError, IndexError):
- pass
-
- try:
- return headers["x-real-ip"]
- except KeyError:
- pass
-
- return scope.get("client")[0]
-
- def _get_headers(self, scope):
- # type: (Any) -> Dict[str, str]
- """
- Extract headers from the ASGI scope, in the format that the Sentry protocol expects.
- """
- headers = {} # type: Dict[str, str]
- for raw_key, raw_value in scope["headers"]:
- key = raw_key.decode("latin-1")
- value = raw_value.decode("latin-1")
- if key in headers:
- headers[key] = headers[key] + ", " + value
- else:
- headers[key] = value
- return headers
diff --git a/sentry_sdk/integrations/atexit.py b/sentry_sdk/integrations/atexit.py
deleted file mode 100644
index 18fe657..0000000
--- a/sentry_sdk/integrations/atexit.py
+++ /dev/null
@@ -1,62 +0,0 @@
-from __future__ import absolute_import
-
-import os
-import sys
-import atexit
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import logger
-from sentry_sdk.integrations import Integration
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
-
- from typing import Any
- from typing import Optional
-
-
-def default_callback(pending, timeout):
- # type: (int, int) -> None
- """This is the default shutdown callback that is set on the options.
- It prints out a message to stderr that informs the user that some events
- are still pending and the process is waiting for them to flush out.
- """
-
- def echo(msg):
- # type: (str) -> None
- sys.stderr.write(msg + "\n")
-
- echo("Sentry is attempting to send %i pending error messages" % pending)
- echo("Waiting up to %s seconds" % timeout)
- echo("Press Ctrl-%s to quit" % (os.name == "nt" and "Break" or "C"))
- sys.stderr.flush()
-
-
-class AtexitIntegration(Integration):
- identifier = "atexit"
-
- def __init__(self, callback=None):
- # type: (Optional[Any]) -> None
- if callback is None:
- callback = default_callback
- self.callback = callback
-
- @staticmethod
- def setup_once():
- # type: () -> None
- @atexit.register
- def _shutdown():
- # type: () -> None
- logger.debug("atexit: got shutdown signal")
- hub = Hub.main
- integration = hub.get_integration(AtexitIntegration)
- if integration is not None:
- logger.debug("atexit: shutting down client")
-
- # If there is a session on the hub, close it now.
- hub.end_session()
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
- client.close(callback=integration.callback)
diff --git a/sentry_sdk/integrations/aws_lambda.py b/sentry_sdk/integrations/aws_lambda.py
deleted file mode 100644
index 0eae710..0000000
--- a/sentry_sdk/integrations/aws_lambda.py
+++ /dev/null
@@ -1,428 +0,0 @@
-from datetime import datetime, timedelta
-from os import environ
-import sys
-
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.tracing import Transaction
-from sentry_sdk._compat import reraise
-from sentry_sdk.utils import (
- AnnotatedValue,
- capture_internal_exceptions,
- event_from_exception,
- logger,
- TimeoutThread,
-)
-from sentry_sdk.integrations import Integration
-from sentry_sdk.integrations._wsgi_common import _filter_headers
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import TypeVar
- from typing import Callable
- from typing import Optional
-
- from sentry_sdk._types import EventProcessor, Event, Hint
-
- F = TypeVar("F", bound=Callable[..., Any])
-
-# Constants
-TIMEOUT_WARNING_BUFFER = 1500 # Buffer time required to send timeout warning to Sentry
-MILLIS_TO_SECONDS = 1000.0
-
-
-def _wrap_init_error(init_error):
- # type: (F) -> F
- def sentry_init_error(*args, **kwargs):
- # type: (*Any, **Any) -> Any
-
- hub = Hub.current
- integration = hub.get_integration(AwsLambdaIntegration)
- if integration is None:
- return init_error(*args, **kwargs)
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- with capture_internal_exceptions():
- with hub.configure_scope() as scope:
- scope.clear_breadcrumbs()
-
- exc_info = sys.exc_info()
- if exc_info and all(exc_info):
- sentry_event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "aws_lambda", "handled": False},
- )
- hub.capture_event(sentry_event, hint=hint)
-
- return init_error(*args, **kwargs)
-
- return sentry_init_error # type: ignore
-
-
-def _wrap_handler(handler):
- # type: (F) -> F
- def sentry_handler(aws_event, aws_context, *args, **kwargs):
- # type: (Any, Any, *Any, **Any) -> Any
-
- # Per https://docs.aws.amazon.com/lambda/latest/dg/python-handler.html,
- # `event` here is *likely* a dictionary, but also might be a number of
- # other types (str, int, float, None).
- #
- # In some cases, it is a list (if the user is batch-invoking their
- # function, for example), in which case we'll use the first entry as a
- # representative from which to try pulling request data. (Presumably it
- # will be the same for all events in the list, since they're all hitting
- # the lambda in the same request.)
-
- if isinstance(aws_event, list):
- request_data = aws_event[0]
- batch_size = len(aws_event)
- else:
- request_data = aws_event
- batch_size = 1
-
- if not isinstance(request_data, dict):
- # If we're not dealing with a dictionary, we won't be able to get
- # headers, path, http method, etc in any case, so it's fine that
- # this is empty
- request_data = {}
-
- hub = Hub.current
- integration = hub.get_integration(AwsLambdaIntegration)
- if integration is None:
- return handler(aws_event, aws_context, *args, **kwargs)
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
- configured_time = aws_context.get_remaining_time_in_millis()
-
- with hub.push_scope() as scope:
- timeout_thread = None
- with capture_internal_exceptions():
- scope.clear_breadcrumbs()
- scope.add_event_processor(
- _make_request_event_processor(
- request_data, aws_context, configured_time
- )
- )
- scope.set_tag(
- "aws_region", aws_context.invoked_function_arn.split(":")[3]
- )
- if batch_size > 1:
- scope.set_tag("batch_request", True)
- scope.set_tag("batch_size", batch_size)
-
-                # Start the timeout thread only if the configured time is greater
-                # than the timeout warning buffer and the timeout_warning parameter
-                # is set to True.
- if (
- integration.timeout_warning
- and configured_time > TIMEOUT_WARNING_BUFFER
- ):
- waiting_time = (
- configured_time - TIMEOUT_WARNING_BUFFER
- ) / MILLIS_TO_SECONDS
-
- timeout_thread = TimeoutThread(
- waiting_time,
- configured_time / MILLIS_TO_SECONDS,
- )
-
-                    # Start the thread that raises the timeout warning exception
- timeout_thread.start()
-
- headers = request_data.get("headers")
- # AWS Service may set an explicit `{headers: None}`, we can't rely on `.get()`'s default.
- if headers is None:
- headers = {}
- transaction = Transaction.continue_from_headers(
- headers, op="serverless.function", name=aws_context.function_name
- )
- with hub.start_transaction(
- transaction,
- custom_sampling_context={
- "aws_event": aws_event,
- "aws_context": aws_context,
- },
- ):
- try:
- return handler(aws_event, aws_context, *args, **kwargs)
- except Exception:
- exc_info = sys.exc_info()
- sentry_event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "aws_lambda", "handled": False},
- )
- hub.capture_event(sentry_event, hint=hint)
- reraise(*exc_info)
- finally:
- if timeout_thread:
- timeout_thread.stop()
-
- return sentry_handler # type: ignore
-
-
-def _drain_queue():
- # type: () -> None
- with capture_internal_exceptions():
- hub = Hub.current
- integration = hub.get_integration(AwsLambdaIntegration)
- if integration is not None:
- # Flush out the event queue before AWS kills the
- # process.
- hub.flush()
-
-
-class AwsLambdaIntegration(Integration):
- identifier = "aws_lambda"
-
- def __init__(self, timeout_warning=False):
- # type: (bool) -> None
- self.timeout_warning = timeout_warning
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- lambda_bootstrap = get_lambda_bootstrap()
- if not lambda_bootstrap:
- logger.warning(
- "Not running in AWS Lambda environment, "
- "AwsLambdaIntegration disabled (could not find bootstrap module)"
- )
- return
-
- if not hasattr(lambda_bootstrap, "handle_event_request"):
- logger.warning(
- "Not running in AWS Lambda environment, "
- "AwsLambdaIntegration disabled (could not find handle_event_request)"
- )
- return
-
- pre_37 = hasattr(lambda_bootstrap, "handle_http_request") # Python 3.6 or 2.7
-
- if pre_37:
- old_handle_event_request = lambda_bootstrap.handle_event_request
-
- def sentry_handle_event_request(request_handler, *args, **kwargs):
- # type: (Any, *Any, **Any) -> Any
- request_handler = _wrap_handler(request_handler)
- return old_handle_event_request(request_handler, *args, **kwargs)
-
- lambda_bootstrap.handle_event_request = sentry_handle_event_request
-
- old_handle_http_request = lambda_bootstrap.handle_http_request
-
- def sentry_handle_http_request(request_handler, *args, **kwargs):
- # type: (Any, *Any, **Any) -> Any
- request_handler = _wrap_handler(request_handler)
- return old_handle_http_request(request_handler, *args, **kwargs)
-
- lambda_bootstrap.handle_http_request = sentry_handle_http_request
-
- # Patch to_json to drain the queue. This should work even when the
- # SDK is initialized inside of the handler
-
- old_to_json = lambda_bootstrap.to_json
-
- def sentry_to_json(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- _drain_queue()
- return old_to_json(*args, **kwargs)
-
- lambda_bootstrap.to_json = sentry_to_json
- else:
- lambda_bootstrap.LambdaRuntimeClient.post_init_error = _wrap_init_error(
- lambda_bootstrap.LambdaRuntimeClient.post_init_error
- )
-
- old_handle_event_request = lambda_bootstrap.handle_event_request
-
- def sentry_handle_event_request( # type: ignore
- lambda_runtime_client, request_handler, *args, **kwargs
- ):
- request_handler = _wrap_handler(request_handler)
- return old_handle_event_request(
- lambda_runtime_client, request_handler, *args, **kwargs
- )
-
- lambda_bootstrap.handle_event_request = sentry_handle_event_request
-
- # Patch the runtime client to drain the queue. This should work
- # even when the SDK is initialized inside of the handler
-
- def _wrap_post_function(f):
- # type: (F) -> F
- def inner(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- _drain_queue()
- return f(*args, **kwargs)
-
- return inner # type: ignore
-
- lambda_bootstrap.LambdaRuntimeClient.post_invocation_result = (
- _wrap_post_function(
- lambda_bootstrap.LambdaRuntimeClient.post_invocation_result
- )
- )
- lambda_bootstrap.LambdaRuntimeClient.post_invocation_error = (
- _wrap_post_function(
- lambda_bootstrap.LambdaRuntimeClient.post_invocation_error
- )
- )
-
-
-def get_lambda_bootstrap():
- # type: () -> Optional[Any]
-
- # Python 2.7: Everything is in `__main__`.
- #
- # Python 3.7: If the bootstrap module is *already imported*, it is the
- # one we actually want to use (no idea what's in __main__)
- #
- # Python 3.8: bootstrap is also importable, but will be the same file
- # as __main__ imported under a different name:
- #
- # sys.modules['__main__'].__file__ == sys.modules['bootstrap'].__file__
- # sys.modules['__main__'] is not sys.modules['bootstrap']
- #
- # Python 3.9: bootstrap is in __main__.awslambdaricmain
- #
- # On container builds using the `aws-lambda-python-runtime-interface-client`
-    # (awslambdaric) module, bootstrap is located in sys.modules['__main__'].bootstrap
- #
- # Such a setup would then make all monkeypatches useless.
- if "bootstrap" in sys.modules:
- return sys.modules["bootstrap"]
- elif "__main__" in sys.modules:
- module = sys.modules["__main__"]
- # python3.9 runtime
- if hasattr(module, "awslambdaricmain") and hasattr(
- module.awslambdaricmain, "bootstrap" # type: ignore
- ):
- return module.awslambdaricmain.bootstrap # type: ignore
- elif hasattr(module, "bootstrap"):
- # awslambdaric python module in container builds
- return module.bootstrap # type: ignore
-
- # python3.8 runtime
- return module
- else:
- return None
-
-
-def _make_request_event_processor(aws_event, aws_context, configured_timeout):
- # type: (Any, Any, Any) -> EventProcessor
- start_time = datetime.utcnow()
-
- def event_processor(sentry_event, hint, start_time=start_time):
- # type: (Event, Hint, datetime) -> Optional[Event]
- remaining_time_in_milis = aws_context.get_remaining_time_in_millis()
- exec_duration = configured_timeout - remaining_time_in_milis
-
- extra = sentry_event.setdefault("extra", {})
- extra["lambda"] = {
- "function_name": aws_context.function_name,
- "function_version": aws_context.function_version,
- "invoked_function_arn": aws_context.invoked_function_arn,
- "aws_request_id": aws_context.aws_request_id,
- "execution_duration_in_millis": exec_duration,
- "remaining_time_in_millis": remaining_time_in_milis,
- }
-
- extra["cloudwatch logs"] = {
- "url": _get_cloudwatch_logs_url(aws_context, start_time),
- "log_group": aws_context.log_group_name,
- "log_stream": aws_context.log_stream_name,
- }
-
- request = sentry_event.get("request", {})
-
- if "httpMethod" in aws_event:
- request["method"] = aws_event["httpMethod"]
-
- request["url"] = _get_url(aws_event, aws_context)
-
- if "queryStringParameters" in aws_event:
- request["query_string"] = aws_event["queryStringParameters"]
-
- if "headers" in aws_event:
- request["headers"] = _filter_headers(aws_event["headers"])
-
- if _should_send_default_pii():
- user_info = sentry_event.setdefault("user", {})
-
- identity = aws_event.get("identity")
- if identity is None:
- identity = {}
-
- id = identity.get("userArn")
- if id is not None:
- user_info.setdefault("id", id)
-
- ip = identity.get("sourceIp")
- if ip is not None:
- user_info.setdefault("ip_address", ip)
-
- if "body" in aws_event:
- request["data"] = aws_event.get("body", "")
- else:
- if aws_event.get("body", None):
-                # Unfortunately there is no way to get a structured body from the
-                # AWS event, so every body is treated as unstructured here.
- request["data"] = AnnotatedValue("", {"rem": [["!raw", "x", 0, 0]]})
-
- sentry_event["request"] = request
-
- return sentry_event
-
- return event_processor
-
-
-def _get_url(aws_event, aws_context):
- # type: (Any, Any) -> str
- path = aws_event.get("path", None)
-
- headers = aws_event.get("headers")
- if headers is None:
- headers = {}
-
- host = headers.get("Host", None)
- proto = headers.get("X-Forwarded-Proto", None)
- if proto and host and path:
- return "{}://{}{}".format(proto, host, path)
- return "awslambda:///{}".format(aws_context.function_name)
-
-
-def _get_cloudwatch_logs_url(aws_context, start_time):
- # type: (Any, datetime) -> str
- """
-    Generates a CloudWatch Logs console URL based on the context object.
-
-    Arguments:
-        aws_context {Any} -- context from the lambda handler
-        start_time {datetime} -- invocation start time, used for the log window start
-
- Returns:
- str -- AWS Console URL to logs.
- """
- formatstring = "%Y-%m-%dT%H:%M:%SZ"
- region = environ.get("AWS_REGION", "")
-
- url = (
- "https://console.{domain}/cloudwatch/home?region={region}"
- "#logEventViewer:group={log_group};stream={log_stream}"
- ";start={start_time};end={end_time}"
- ).format(
- domain="amazonaws.cn" if region.startswith("cn-") else "aws.amazon.com",
- region=region,
- log_group=aws_context.log_group_name,
- log_stream=aws_context.log_stream_name,
- start_time=(start_time - timedelta(seconds=1)).strftime(formatstring),
- end_time=(datetime.utcnow() + timedelta(seconds=2)).strftime(formatstring),
- )
-
- return url
diff --git a/sentry_sdk/integrations/beam.py b/sentry_sdk/integrations/beam.py
deleted file mode 100644
index 30faa38..0000000
--- a/sentry_sdk/integrations/beam.py
+++ /dev/null
@@ -1,185 +0,0 @@
-from __future__ import absolute_import
-
-import sys
-import types
-from sentry_sdk._functools import wraps
-
-from sentry_sdk.hub import Hub
-from sentry_sdk._compat import reraise
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-from sentry_sdk.integrations import Integration
-from sentry_sdk.integrations.logging import ignore_logger
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Iterator
- from typing import TypeVar
- from typing import Optional
- from typing import Callable
-
- from sentry_sdk.client import Client
- from sentry_sdk._types import ExcInfo
-
- T = TypeVar("T")
- F = TypeVar("F", bound=Callable[..., Any])
-
-
-WRAPPED_FUNC = "_wrapped_{}_"
-INSPECT_FUNC = "_inspect_{}" # Required format per apache_beam/transforms/core.py
-USED_FUNC = "_sentry_used_"
-
-
-class BeamIntegration(Integration):
- identifier = "beam"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- from apache_beam.transforms.core import DoFn, ParDo # type: ignore
-
- ignore_logger("root")
- ignore_logger("bundle_processor.create")
-
- function_patches = ["process", "start_bundle", "finish_bundle", "setup"]
- for func_name in function_patches:
- setattr(
- DoFn,
- INSPECT_FUNC.format(func_name),
- _wrap_inspect_call(DoFn, func_name),
- )
-
- old_init = ParDo.__init__
-
- def sentry_init_pardo(self, fn, *args, **kwargs):
- # type: (ParDo, Any, *Any, **Any) -> Any
- # Do not monkey patch init twice
- if not getattr(self, "_sentry_is_patched", False):
- for func_name in function_patches:
- if not hasattr(fn, func_name):
- continue
- wrapped_func = WRAPPED_FUNC.format(func_name)
-
- # Check to see if inspect is set and process is not
- # to avoid monkey patching process twice.
- # Check to see if function is part of object for
- # backwards compatibility.
- process_func = getattr(fn, func_name)
- inspect_func = getattr(fn, INSPECT_FUNC.format(func_name))
- if not getattr(inspect_func, USED_FUNC, False) and not getattr(
- process_func, USED_FUNC, False
- ):
- setattr(fn, wrapped_func, process_func)
- setattr(fn, func_name, _wrap_task_call(process_func))
-
- self._sentry_is_patched = True
- old_init(self, fn, *args, **kwargs)
-
- ParDo.__init__ = sentry_init_pardo
-
-
-def _wrap_inspect_call(cls, func_name):
- # type: (Any, Any) -> Any
-
- if not hasattr(cls, func_name):
- return None
-
- def _inspect(self):
- # type: (Any) -> Any
- """
- Inspect function overrides the way Beam gets argspec.
- """
- wrapped_func = WRAPPED_FUNC.format(func_name)
- if hasattr(self, wrapped_func):
- process_func = getattr(self, wrapped_func)
- else:
- process_func = getattr(self, func_name)
- setattr(self, func_name, _wrap_task_call(process_func))
- setattr(self, wrapped_func, process_func)
-
- # getfullargspec is deprecated in more recent beam versions and get_function_args_defaults
- # (which uses Signatures internally) should be used instead.
- try:
- from apache_beam.transforms.core import get_function_args_defaults
-
- return get_function_args_defaults(process_func)
- except ImportError:
- from apache_beam.typehints.decorators import getfullargspec # type: ignore
-
- return getfullargspec(process_func)
-
- setattr(_inspect, USED_FUNC, True)
- return _inspect
-
-
-def _wrap_task_call(func):
- # type: (F) -> F
- """
- Wrap task call with a try catch to get exceptions.
-    Pass the client on to raise_exception so it can get rebound.
- """
- client = Hub.current.client
-
- @wraps(func)
- def _inner(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- try:
- gen = func(*args, **kwargs)
- except Exception:
- raise_exception(client)
-
- if not isinstance(gen, types.GeneratorType):
- return gen
- return _wrap_generator_call(gen, client)
-
- setattr(_inner, USED_FUNC, True)
- return _inner # type: ignore
-
-
-def _capture_exception(exc_info, hub):
- # type: (ExcInfo, Hub) -> None
- """
- Send Beam exception to Sentry.
- """
- integration = hub.get_integration(BeamIntegration)
- if integration is None:
- return
-
- client = hub.client
- if client is None:
- return
-
- event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "beam", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
-
-def raise_exception(client):
- # type: (Optional[Client]) -> None
- """
- Raise an exception. If the client is not in the hub, rebind it.
- """
- hub = Hub.current
- if hub.client is None:
- hub.bind_client(client)
- exc_info = sys.exc_info()
- with capture_internal_exceptions():
- _capture_exception(exc_info, hub)
- reraise(*exc_info)
-
-
-def _wrap_generator_call(gen, client):
- # type: (Iterator[T], Optional[Client]) -> Iterator[T]
- """
- Wrap the generator to handle any failures.
- """
- while True:
- try:
- yield next(gen)
- except StopIteration:
- break
- except Exception:
- raise_exception(client)
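
If downstream code still needs the Beam instrumentation being deleted here, it was enabled through the usual `sentry_sdk.init()` call before constructing the pipeline. A hedged sketch, with a placeholder DSN:

```python
# Hypothetical usage of the Beam integration removed above; the DSN is a
# placeholder value.
import sentry_sdk
from sentry_sdk.integrations.beam import BeamIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    integrations=[BeamIntegration()],
)
# After init, ParDo.__init__ is patched so that exceptions raised in a DoFn's
# process/start_bundle/finish_bundle/setup methods are reported to Sentry and
# then re-raised.
```
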
diff --git a/sentry_sdk/integrations/boto3.py b/sentry_sdk/integrations/boto3.py
deleted file mode 100644
index e65f5a7..0000000
--- a/sentry_sdk/integrations/boto3.py
+++ /dev/null
@@ -1,130 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk import Hub
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.tracing import Span
-
-from sentry_sdk._functools import partial
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import Optional
- from typing import Type
-
-try:
- from botocore import __version__ as BOTOCORE_VERSION # type: ignore
- from botocore.client import BaseClient # type: ignore
- from botocore.response import StreamingBody # type: ignore
- from botocore.awsrequest import AWSRequest # type: ignore
-except ImportError:
- raise DidNotEnable("botocore is not installed")
-
-
-class Boto3Integration(Integration):
- identifier = "boto3"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- try:
- version = tuple(map(int, BOTOCORE_VERSION.split(".")[:3]))
- except (ValueError, TypeError):
- raise DidNotEnable(
- "Unparsable botocore version: {}".format(BOTOCORE_VERSION)
- )
- if version < (1, 12):
- raise DidNotEnable("Botocore 1.12 or newer is required.")
- orig_init = BaseClient.__init__
-
- def sentry_patched_init(self, *args, **kwargs):
- # type: (Type[BaseClient], *Any, **Any) -> None
- orig_init(self, *args, **kwargs)
- meta = self.meta
- service_id = meta.service_model.service_id.hyphenize()
- meta.events.register(
- "request-created",
- partial(_sentry_request_created, service_id=service_id),
- )
- meta.events.register("after-call", _sentry_after_call)
- meta.events.register("after-call-error", _sentry_after_call_error)
-
- BaseClient.__init__ = sentry_patched_init
-
-
-def _sentry_request_created(service_id, request, operation_name, **kwargs):
- # type: (str, AWSRequest, str, **Any) -> None
- hub = Hub.current
- if hub.get_integration(Boto3Integration) is None:
- return
-
- description = "aws.%s.%s" % (service_id, operation_name)
- span = hub.start_span(
- hub=hub,
- op="aws.request",
- description=description,
- )
- span.set_tag("aws.service_id", service_id)
- span.set_tag("aws.operation_name", operation_name)
- span.set_data("aws.request.url", request.url)
-
- # We do it in order for subsequent http calls/retries be
- # attached to this span.
- span.__enter__()
-
- # request.context is an open-ended data-structure
- # where we can add anything useful in request life cycle.
- request.context["_sentrysdk_span"] = span
-
-
-def _sentry_after_call(context, parsed, **kwargs):
- # type: (Dict[str, Any], Dict[str, Any], **Any) -> None
- span = context.pop("_sentrysdk_span", None) # type: Optional[Span]
-
- # Span could be absent if the integration is disabled.
- if span is None:
- return
- span.__exit__(None, None, None)
-
- body = parsed.get("Body")
- if not isinstance(body, StreamingBody):
- return
-
- streaming_span = span.start_child(
- op="aws.request.stream",
- description=span.description,
- )
-
- orig_read = body.read
- orig_close = body.close
-
- def sentry_streaming_body_read(*args, **kwargs):
- # type: (*Any, **Any) -> bytes
- try:
- ret = orig_read(*args, **kwargs)
- if not ret:
- streaming_span.finish()
- return ret
- except Exception:
- streaming_span.finish()
- raise
-
- body.read = sentry_streaming_body_read
-
- def sentry_streaming_body_close(*args, **kwargs):
- # type: (*Any, **Any) -> None
- streaming_span.finish()
- orig_close(*args, **kwargs)
-
- body.close = sentry_streaming_body_close
-
-
-def _sentry_after_call_error(context, exception, **kwargs):
- # type: (Dict[str, Any], Type[BaseException], **Any) -> None
- span = context.pop("_sentrysdk_span", None) # type: Optional[Span]
-
- # Span could be absent if the integration is disabled.
- if span is None:
- return
- span.__exit__(type(exception), exception, None)
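
As a reference for the removed Boto3 instrumentation: once enabled, each botocore request emits an `aws.request` span, and reading a `StreamingBody` adds an `aws.request.stream` child span. A rough usage sketch, assuming botocore 1.12+; the DSN, bucket, and key are placeholders:

```python
# Hypothetical usage of the Boto3 integration removed above; DSN, bucket and
# key are placeholders.
import boto3
import sentry_sdk
from sentry_sdk.integrations.boto3 import Boto3Integration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    traces_sample_rate=1.0,
    integrations=[Boto3Integration()],
)

with sentry_sdk.start_transaction(op="task", name="download-report"):
    s3 = boto3.client("s3")
    # The request below shows up as an "aws.request" span; reading the body
    # adds a child "aws.request.stream" span.
    obj = s3.get_object(Bucket="example-bucket", Key="report.csv")
    data = obj["Body"].read()
```
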
diff --git a/sentry_sdk/integrations/bottle.py b/sentry_sdk/integrations/bottle.py
deleted file mode 100644
index 4fa077e..0000000
--- a/sentry_sdk/integrations/bottle.py
+++ /dev/null
@@ -1,199 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import (
- capture_internal_exceptions,
- event_from_exception,
- transaction_from_function,
-)
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations.wsgi import SentryWsgiMiddleware
-from sentry_sdk.integrations._wsgi_common import RequestExtractor
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from sentry_sdk.integrations.wsgi import _ScopedResponse
- from typing import Any
- from typing import Dict
- from typing import Callable
- from typing import Optional
- from bottle import FileUpload, FormsDict, LocalRequest # type: ignore
-
- from sentry_sdk._types import EventProcessor
-
-try:
- from bottle import (
- Bottle,
- Route,
- request as bottle_request,
- HTTPResponse,
- __version__ as BOTTLE_VERSION,
- )
-except ImportError:
- raise DidNotEnable("Bottle not installed")
-
-
-TRANSACTION_STYLE_VALUES = ("endpoint", "url")
-
-
-class BottleIntegration(Integration):
- identifier = "bottle"
-
- transaction_style = None
-
- def __init__(self, transaction_style="endpoint"):
- # type: (str) -> None
-
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- try:
- version = tuple(map(int, BOTTLE_VERSION.replace("-dev", "").split(".")))
- except (TypeError, ValueError):
- raise DidNotEnable("Unparsable Bottle version: {}".format(version))
-
- if version < (0, 12):
- raise DidNotEnable("Bottle 0.12 or newer required.")
-
- # monkey patch method Bottle.__call__
- old_app = Bottle.__call__
-
- def sentry_patched_wsgi_app(self, environ, start_response):
- # type: (Any, Dict[str, str], Callable[..., Any]) -> _ScopedResponse
-
- hub = Hub.current
- integration = hub.get_integration(BottleIntegration)
- if integration is None:
- return old_app(self, environ, start_response)
-
- return SentryWsgiMiddleware(lambda *a, **kw: old_app(self, *a, **kw))(
- environ, start_response
- )
-
- Bottle.__call__ = sentry_patched_wsgi_app
-
- # monkey patch method Bottle._handle
- old_handle = Bottle._handle
-
- def _patched_handle(self, environ):
- # type: (Bottle, Dict[str, Any]) -> Any
- hub = Hub.current
- integration = hub.get_integration(BottleIntegration)
- if integration is None:
- return old_handle(self, environ)
-
- # create new scope
- scope_manager = hub.push_scope()
-
- with scope_manager:
- app = self
- with hub.configure_scope() as scope:
- scope._name = "bottle"
- scope.add_event_processor(
- _make_request_event_processor(app, bottle_request, integration)
- )
- res = old_handle(self, environ)
-
- # scope cleanup
- return res
-
- Bottle._handle = _patched_handle
-
- # monkey patch method Route._make_callback
- old_make_callback = Route._make_callback
-
- def patched_make_callback(self, *args, **kwargs):
- # type: (Route, *object, **object) -> Any
- hub = Hub.current
- integration = hub.get_integration(BottleIntegration)
- prepared_callback = old_make_callback(self, *args, **kwargs)
- if integration is None:
- return prepared_callback
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- def wrapped_callback(*args, **kwargs):
- # type: (*object, **object) -> Any
-
- try:
- res = prepared_callback(*args, **kwargs)
- except HTTPResponse:
- raise
- except Exception as exception:
- event, hint = event_from_exception(
- exception,
- client_options=client.options,
- mechanism={"type": "bottle", "handled": False},
- )
- hub.capture_event(event, hint=hint)
- raise exception
-
- return res
-
- return wrapped_callback
-
- Route._make_callback = patched_make_callback
-
-
-class BottleRequestExtractor(RequestExtractor):
- def env(self):
- # type: () -> Dict[str, str]
- return self.request.environ
-
- def cookies(self):
- # type: () -> Dict[str, str]
- return self.request.cookies
-
- def raw_data(self):
- # type: () -> bytes
- return self.request.body.read()
-
- def form(self):
- # type: () -> FormsDict
- if self.is_json():
- return None
- return self.request.forms.decode()
-
- def files(self):
- # type: () -> Optional[Dict[str, str]]
- if self.is_json():
- return None
-
- return self.request.files
-
- def size_of_file(self, file):
- # type: (FileUpload) -> int
- return file.content_length
-
-
-def _make_request_event_processor(app, request, integration):
- # type: (Bottle, LocalRequest, BottleIntegration) -> EventProcessor
- def inner(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
-
- try:
- if integration.transaction_style == "endpoint":
- event["transaction"] = request.route.name or transaction_from_function(
- request.route.callback
- )
- elif integration.transaction_style == "url":
- event["transaction"] = request.route.rule
- except Exception:
- pass
-
- with capture_internal_exceptions():
- BottleRequestExtractor(request).extract_into_event(event)
-
- return event
-
- return inner
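
For reference, the deleted Bottle support hooked `Bottle.__call__`, `Bottle._handle`, and `Route._make_callback`, so enabling it only required passing the integration to `sentry_sdk.init()`. A sketch with a placeholder DSN and route:

```python
# Hypothetical usage of the Bottle integration removed above; the DSN is a
# placeholder. transaction_style may be "endpoint" (the default) or "url".
import bottle
import sentry_sdk
from sentry_sdk.integrations.bottle import BottleIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    integrations=[BottleIntegration(transaction_style="url")],
)

app = bottle.Bottle()


@app.route("/divide/<a:int>/<b:int>")
def divide(a, b):
    return str(a / b)  # a ZeroDivisionError here is captured and re-raised


# bottle.run(app, host="localhost", port=8080)
```
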
diff --git a/sentry_sdk/integrations/celery.py b/sentry_sdk/integrations/celery.py
deleted file mode 100644
index 40a2dfb..0000000
--- a/sentry_sdk/integrations/celery.py
+++ /dev/null
@@ -1,289 +0,0 @@
-from __future__ import absolute_import
-
-import sys
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-from sentry_sdk.tracing import Transaction
-from sentry_sdk._compat import reraise
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations.logging import ignore_logger
-from sentry_sdk._types import MYPY
-from sentry_sdk._functools import wraps
-
-if MYPY:
- from typing import Any
- from typing import TypeVar
- from typing import Callable
- from typing import Optional
-
- from sentry_sdk._types import EventProcessor, Event, Hint, ExcInfo
-
- F = TypeVar("F", bound=Callable[..., Any])
-
-
-try:
- from celery import VERSION as CELERY_VERSION # type: ignore
- from celery.exceptions import ( # type: ignore
- SoftTimeLimitExceeded,
- Retry,
- Ignore,
- Reject,
- )
- from celery.app.trace import task_has_custom
-except ImportError:
- raise DidNotEnable("Celery not installed")
-
-
-CELERY_CONTROL_FLOW_EXCEPTIONS = (Retry, Ignore, Reject)
-
-
-class CeleryIntegration(Integration):
- identifier = "celery"
-
- def __init__(self, propagate_traces=True):
- # type: (bool) -> None
- self.propagate_traces = propagate_traces
-
- @staticmethod
- def setup_once():
- # type: () -> None
- if CELERY_VERSION < (3,):
- raise DidNotEnable("Celery 3 or newer required.")
-
- import celery.app.trace as trace # type: ignore
-
- old_build_tracer = trace.build_tracer
-
- def sentry_build_tracer(name, task, *args, **kwargs):
- # type: (Any, Any, *Any, **Any) -> Any
- if not getattr(task, "_sentry_is_patched", False):
- # determine whether Celery will use __call__ or run and patch
- # accordingly
- if task_has_custom(task, "__call__"):
- type(task).__call__ = _wrap_task_call(task, type(task).__call__)
- else:
- task.run = _wrap_task_call(task, task.run)
-
- # `build_tracer` is apparently called for every task
- # invocation. Can't wrap every celery task for every invocation
- # or we will get infinitely nested wrapper functions.
- task._sentry_is_patched = True
-
- return _wrap_tracer(task, old_build_tracer(name, task, *args, **kwargs))
-
- trace.build_tracer = sentry_build_tracer
-
- from celery.app.task import Task # type: ignore
-
- Task.apply_async = _wrap_apply_async(Task.apply_async)
-
- _patch_worker_exit()
-
- # This logger logs every status of every task that ran on the worker.
- # Meaning that every task's breadcrumbs are full of stuff like "Task
-        # <task> raised unexpected <exception>".
- ignore_logger("celery.worker.job")
- ignore_logger("celery.app.trace")
-
- # This is stdout/err redirected to a logger, can't deal with this
- # (need event_level=logging.WARN to reproduce)
- ignore_logger("celery.redirected")
-
-
-def _wrap_apply_async(f):
- # type: (F) -> F
- @wraps(f)
- def apply_async(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- hub = Hub.current
- integration = hub.get_integration(CeleryIntegration)
- if integration is not None and integration.propagate_traces:
- with hub.start_span(op="celery.submit", description=args[0].name) as span:
- with capture_internal_exceptions():
- headers = dict(hub.iter_trace_propagation_headers(span))
-
- if headers:
- # Note: kwargs can contain headers=None, so no setdefault!
- # Unsure which backend though.
- kwarg_headers = kwargs.get("headers") or {}
- kwarg_headers.update(headers)
-
- # https://github.com/celery/celery/issues/4875
- #
- # Need to setdefault the inner headers too since other
- # tracing tools (dd-trace-py) also employ this exact
- # workaround and we don't want to break them.
- kwarg_headers.setdefault("headers", {}).update(headers)
- kwargs["headers"] = kwarg_headers
-
- return f(*args, **kwargs)
- else:
- return f(*args, **kwargs)
-
- return apply_async # type: ignore
-
-
-def _wrap_tracer(task, f):
- # type: (Any, F) -> F
-
- # Need to wrap tracer for pushing the scope before prerun is sent, and
- # popping it after postrun is sent.
- #
- # This is the reason we don't use signals for hooking in the first place.
- # Also because in Celery 3, signal dispatch returns early if one handler
- # crashes.
- @wraps(f)
- def _inner(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- hub = Hub.current
- if hub.get_integration(CeleryIntegration) is None:
- return f(*args, **kwargs)
-
- with hub.push_scope() as scope:
- scope._name = "celery"
- scope.clear_breadcrumbs()
- scope.add_event_processor(_make_event_processor(task, *args, **kwargs))
-
- transaction = None
-
- # Celery task objects are not a thing to be trusted. Even
- # something such as attribute access can fail.
- with capture_internal_exceptions():
- transaction = Transaction.continue_from_headers(
- args[3].get("headers") or {},
- op="celery.task",
- name="unknown celery task",
- )
-
- transaction.name = task.name
- transaction.set_status("ok")
-
- if transaction is None:
- return f(*args, **kwargs)
-
- with hub.start_transaction(
- transaction,
- custom_sampling_context={
- "celery_job": {
- "task": task.name,
- # for some reason, args[1] is a list if non-empty but a
- # tuple if empty
- "args": list(args[1]),
- "kwargs": args[2],
- }
- },
- ):
- return f(*args, **kwargs)
-
- return _inner # type: ignore
-
-
-def _wrap_task_call(task, f):
- # type: (Any, F) -> F
-
- # Need to wrap task call because the exception is caught before we get to
- # see it. Also celery's reported stacktrace is untrustworthy.
-
- # functools.wraps is important here because celery-once looks at this
- # method's name.
- # https://github.com/getsentry/sentry-python/issues/421
- @wraps(f)
- def _inner(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- try:
- return f(*args, **kwargs)
- except Exception:
- exc_info = sys.exc_info()
- with capture_internal_exceptions():
- _capture_exception(task, exc_info)
- reraise(*exc_info)
-
- return _inner # type: ignore
-
-
-def _make_event_processor(task, uuid, args, kwargs, request=None):
- # type: (Any, Any, Any, Any, Optional[Any]) -> EventProcessor
- def event_processor(event, hint):
- # type: (Event, Hint) -> Optional[Event]
-
- with capture_internal_exceptions():
- tags = event.setdefault("tags", {})
- tags["celery_task_id"] = uuid
- extra = event.setdefault("extra", {})
- extra["celery-job"] = {
- "task_name": task.name,
- "args": args,
- "kwargs": kwargs,
- }
-
- if "exc_info" in hint:
- with capture_internal_exceptions():
- if issubclass(hint["exc_info"][0], SoftTimeLimitExceeded):
- event["fingerprint"] = [
- "celery",
- "SoftTimeLimitExceeded",
- getattr(task, "name", task),
- ]
-
- return event
-
- return event_processor
-
-
-def _capture_exception(task, exc_info):
- # type: (Any, ExcInfo) -> None
- hub = Hub.current
-
- if hub.get_integration(CeleryIntegration) is None:
- return
- if isinstance(exc_info[1], CELERY_CONTROL_FLOW_EXCEPTIONS):
- # ??? Doesn't map to anything
- _set_status(hub, "aborted")
- return
-
- _set_status(hub, "internal_error")
-
- if hasattr(task, "throws") and isinstance(exc_info[1], task.throws):
- return
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "celery", "handled": False},
- )
-
- hub.capture_event(event, hint=hint)
-
-
-def _set_status(hub, status):
- # type: (Hub, str) -> None
- with capture_internal_exceptions():
- with hub.configure_scope() as scope:
- if scope.span is not None:
- scope.span.set_status(status)
-
-
-def _patch_worker_exit():
- # type: () -> None
-
- # Need to flush queue before worker shutdown because a crashing worker will
- # call os._exit
- from billiard.pool import Worker # type: ignore
-
- old_workloop = Worker.workloop
-
- def sentry_workloop(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- try:
- return old_workloop(*args, **kwargs)
- finally:
- with capture_internal_exceptions():
- hub = Hub.current
- if hub.get_integration(CeleryIntegration) is not None:
- hub.flush()
-
- Worker.workloop = sentry_workloop
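
For reference, the removed Celery support patched `build_tracer`, `Task.apply_async`, and the billiard worker loop; usage was a plain `sentry_sdk.init()` in both the producer and the worker process. A sketch with placeholder DSN and broker URL:

```python
# Hypothetical usage of the Celery integration removed above; DSN and broker
# URL are placeholders. propagate_traces=True (the default) forwards the
# sentry-trace headers from apply_async into the worker.
import sentry_sdk
from celery import Celery
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    integrations=[CeleryIntegration(propagate_traces=True)],
)

app = Celery("tasks", broker="redis://localhost:6379/0")


@app.task
def divide(a, b):
    return a / b  # a ZeroDivisionError raised here is captured on the worker
```
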
diff --git a/sentry_sdk/integrations/chalice.py b/sentry_sdk/integrations/chalice.py
deleted file mode 100644
index 109862b..0000000
--- a/sentry_sdk/integrations/chalice.py
+++ /dev/null
@@ -1,128 +0,0 @@
-import sys
-
-from sentry_sdk._compat import reraise
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations.aws_lambda import _make_request_event_processor
-from sentry_sdk.utils import (
- capture_internal_exceptions,
- event_from_exception,
-)
-from sentry_sdk._types import MYPY
-from sentry_sdk._functools import wraps
-
-import chalice # type: ignore
-from chalice import Chalice, ChaliceViewError
-from chalice.app import EventSourceHandler as ChaliceEventSourceHandler # type: ignore
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import TypeVar
- from typing import Callable
-
- F = TypeVar("F", bound=Callable[..., Any])
-
-try:
- from chalice import __version__ as CHALICE_VERSION
-except ImportError:
- raise DidNotEnable("Chalice is not installed")
-
-
-class EventSourceHandler(ChaliceEventSourceHandler): # type: ignore
- def __call__(self, event, context):
- # type: (Any, Any) -> Any
- hub = Hub.current
- client = hub.client # type: Any
-
- with hub.push_scope() as scope:
- with capture_internal_exceptions():
- configured_time = context.get_remaining_time_in_millis()
- scope.add_event_processor(
- _make_request_event_processor(event, context, configured_time)
- )
- try:
- return ChaliceEventSourceHandler.__call__(self, event, context)
- except Exception:
- exc_info = sys.exc_info()
- event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "chalice", "handled": False},
- )
- hub.capture_event(event, hint=hint)
- hub.flush()
- reraise(*exc_info)
-
-
-def _get_view_function_response(app, view_function, function_args):
- # type: (Any, F, Any) -> F
- @wraps(view_function)
- def wrapped_view_function(**function_args):
- # type: (**Any) -> Any
- hub = Hub.current
- client = hub.client # type: Any
- with hub.push_scope() as scope:
- with capture_internal_exceptions():
- configured_time = app.lambda_context.get_remaining_time_in_millis()
- scope.transaction = app.lambda_context.function_name
- scope.add_event_processor(
- _make_request_event_processor(
- app.current_request.to_dict(),
- app.lambda_context,
- configured_time,
- )
- )
- try:
- return view_function(**function_args)
- except Exception as exc:
- if isinstance(exc, ChaliceViewError):
- raise
- exc_info = sys.exc_info()
- event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "chalice", "handled": False},
- )
- hub.capture_event(event, hint=hint)
- hub.flush()
- raise
-
- return wrapped_view_function # type: ignore
-
-
-class ChaliceIntegration(Integration):
- identifier = "chalice"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- try:
- version = tuple(map(int, CHALICE_VERSION.split(".")[:3]))
- except (ValueError, TypeError):
- raise DidNotEnable("Unparsable Chalice version: {}".format(CHALICE_VERSION))
- if version < (1, 20):
- old_get_view_function_response = Chalice._get_view_function_response
- else:
- from chalice.app import RestAPIEventHandler
-
- old_get_view_function_response = (
- RestAPIEventHandler._get_view_function_response
- )
-
- def sentry_event_response(app, view_function, function_args):
- # type: (Any, F, Dict[str, Any]) -> Any
- wrapped_view_function = _get_view_function_response(
- app, view_function, function_args
- )
-
- return old_get_view_function_response(
- app, wrapped_view_function, function_args
- )
-
- if version < (1, 20):
- Chalice._get_view_function_response = sentry_event_response
- else:
- RestAPIEventHandler._get_view_function_response = sentry_event_response
- # for everything else (like events)
- chalice.app.EventSourceHandler = EventSourceHandler
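
For reference, the removed Chalice support wrapped the REST view dispatch and the generic event-source handler; enabling it looked roughly like the sketch below. The app name and DSN are placeholders:

```python
# Hypothetical usage of the Chalice integration removed above; app name and
# DSN are placeholders.
import sentry_sdk
from chalice import Chalice
from sentry_sdk.integrations.chalice import ChaliceIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    integrations=[ChaliceIntegration()],
)

app = Chalice(app_name="example-app")


@app.route("/boom")
def boom():
    # Captured by the wrapped view function, then re-raised so Chalice still
    # returns its normal 500 response.
    raise RuntimeError("boom")
```
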
diff --git a/sentry_sdk/integrations/dedupe.py b/sentry_sdk/integrations/dedupe.py
deleted file mode 100644
index b023df2..0000000
--- a/sentry_sdk/integrations/dedupe.py
+++ /dev/null
@@ -1,43 +0,0 @@
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import ContextVar
-from sentry_sdk.integrations import Integration
-from sentry_sdk.scope import add_global_event_processor
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Optional
-
- from sentry_sdk._types import Event, Hint
-
-
-class DedupeIntegration(Integration):
- identifier = "dedupe"
-
- def __init__(self):
- # type: () -> None
- self._last_seen = ContextVar("last-seen")
-
- @staticmethod
- def setup_once():
- # type: () -> None
- @add_global_event_processor
- def processor(event, hint):
- # type: (Event, Optional[Hint]) -> Optional[Event]
- if hint is None:
- return event
-
- integration = Hub.current.get_integration(DedupeIntegration)
-
- if integration is None:
- return event
-
- exc_info = hint.get("exc_info", None)
- if exc_info is None:
- return event
-
- exc = exc_info[1]
- if integration._last_seen.get(None) is exc:
- return None
- integration._last_seen.set(exc)
- return event
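
The dedupe logic above boils down to remembering the identity of the last exception object and dropping the event when the very same object is reported again. A standalone sketch of that idea (the names below are illustrative, not the SDK's public API):

```python
# Standalone illustration of the deduplication idea removed above; dedupe()
# and _last_seen are made-up names, not SDK API.
_last_seen = None


def dedupe(event, exc):
    global _last_seen
    if exc is not None and exc is _last_seen:
        return None  # same exception object as last time: drop the event
    _last_seen = exc
    return event


err = ValueError("boom")
assert dedupe({"event_id": 1}, err) is not None  # first report goes through
assert dedupe({"event_id": 2}, err) is None      # duplicate is suppressed
```
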
diff --git a/sentry_sdk/integrations/django/__init__.py b/sentry_sdk/integrations/django/__init__.py
deleted file mode 100644
index db90918..0000000
--- a/sentry_sdk/integrations/django/__init__.py
+++ /dev/null
@@ -1,573 +0,0 @@
-# -*- coding: utf-8 -*-
-from __future__ import absolute_import
-
-import sys
-import threading
-import weakref
-
-from sentry_sdk._types import MYPY
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.scope import add_global_event_processor
-from sentry_sdk.serializer import add_global_repr_processor
-from sentry_sdk.tracing_utils import RecordSqlQueries
-from sentry_sdk.utils import (
- HAS_REAL_CONTEXTVARS,
- CONTEXTVARS_ERROR_MESSAGE,
- logger,
- capture_internal_exceptions,
- event_from_exception,
- transaction_from_function,
- walk_exception_chain,
-)
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations.logging import ignore_logger
-from sentry_sdk.integrations.wsgi import SentryWsgiMiddleware
-from sentry_sdk.integrations._wsgi_common import RequestExtractor
-
-try:
- from django import VERSION as DJANGO_VERSION
- from django.core import signals
-
- try:
- from django.urls import resolve
- except ImportError:
- from django.core.urlresolvers import resolve
-except ImportError:
- raise DidNotEnable("Django not installed")
-
-
-from sentry_sdk.integrations.django.transactions import LEGACY_RESOLVER
-from sentry_sdk.integrations.django.templates import (
- get_template_frame_from_exception,
- patch_templates,
-)
-from sentry_sdk.integrations.django.middleware import patch_django_middlewares
-from sentry_sdk.integrations.django.views import patch_views
-
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import Dict
- from typing import Optional
- from typing import Union
- from typing import List
-
- from django.core.handlers.wsgi import WSGIRequest
- from django.http.response import HttpResponse
- from django.http.request import QueryDict
- from django.utils.datastructures import MultiValueDict
-
- from sentry_sdk.scope import Scope
- from sentry_sdk.integrations.wsgi import _ScopedResponse
- from sentry_sdk._types import Event, Hint, EventProcessor, NotImplementedType
-
-
-if DJANGO_VERSION < (1, 10):
-
- def is_authenticated(request_user):
- # type: (Any) -> bool
- return request_user.is_authenticated()
-
-
-else:
-
- def is_authenticated(request_user):
- # type: (Any) -> bool
- return request_user.is_authenticated
-
-
-TRANSACTION_STYLE_VALUES = ("function_name", "url")
-
-
-class DjangoIntegration(Integration):
- identifier = "django"
-
- transaction_style = None
- middleware_spans = None
-
- def __init__(self, transaction_style="url", middleware_spans=True):
- # type: (str, bool) -> None
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
- self.middleware_spans = middleware_spans
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- if DJANGO_VERSION < (1, 8):
- raise DidNotEnable("Django 1.8 or newer is required.")
-
- install_sql_hook()
- # Patch in our custom middleware.
-
- # logs an error for every 500
- ignore_logger("django.server")
- ignore_logger("django.request")
-
- from django.core.handlers.wsgi import WSGIHandler
-
- old_app = WSGIHandler.__call__
-
- def sentry_patched_wsgi_handler(self, environ, start_response):
- # type: (Any, Dict[str, str], Callable[..., Any]) -> _ScopedResponse
- if Hub.current.get_integration(DjangoIntegration) is None:
- return old_app(self, environ, start_response)
-
- bound_old_app = old_app.__get__(self, WSGIHandler)
-
- from django.conf import settings
-
- use_x_forwarded_for = settings.USE_X_FORWARDED_HOST
-
- return SentryWsgiMiddleware(bound_old_app, use_x_forwarded_for)(
- environ, start_response
- )
-
- WSGIHandler.__call__ = sentry_patched_wsgi_handler
-
- _patch_get_response()
-
- _patch_django_asgi_handler()
-
- signals.got_request_exception.connect(_got_request_exception)
-
- @add_global_event_processor
- def process_django_templates(event, hint):
- # type: (Event, Optional[Hint]) -> Optional[Event]
- if hint is None:
- return event
-
- exc_info = hint.get("exc_info", None)
-
- if exc_info is None:
- return event
-
- exception = event.get("exception", None)
-
- if exception is None:
- return event
-
- values = exception.get("values", None)
-
- if values is None:
- return event
-
- for exception, (_, exc_value, _) in zip(
- reversed(values), walk_exception_chain(exc_info)
- ):
- frame = get_template_frame_from_exception(exc_value)
- if frame is not None:
- frames = exception.get("stacktrace", {}).get("frames", [])
-
- for i in reversed(range(len(frames))):
- f = frames[i]
- if (
- f.get("function") in ("Parser.parse", "parse", "render")
- and f.get("module") == "django.template.base"
- ):
- i += 1
- break
- else:
- i = len(frames)
-
- frames.insert(i, frame)
-
- return event
-
- @add_global_repr_processor
- def _django_queryset_repr(value, hint):
- # type: (Any, Dict[str, Any]) -> Union[NotImplementedType, str]
- try:
- # Django 1.6 can fail to import `QuerySet` when Django settings
- # have not yet been initialized.
- #
- # If we fail to import, return `NotImplemented`. It's at least
- # unlikely that we have a query set in `value` when importing
- # `QuerySet` fails.
- from django.db.models.query import QuerySet
- except Exception:
- return NotImplemented
-
- if not isinstance(value, QuerySet) or value._result_cache:
- return NotImplemented
-
- # Do not call Hub.get_integration here. It is intentional that
- # running under a new hub does not suddenly start executing
- # querysets. This might be surprising to the user but it's likely
- # less annoying.
-
- return u"<%s from %s at 0x%x>" % (
- value.__class__.__name__,
- value.__module__,
- id(value),
- )
-
- _patch_channels()
- patch_django_middlewares()
- patch_views()
- patch_templates()
-
-
-_DRF_PATCHED = False
-_DRF_PATCH_LOCK = threading.Lock()
-
-
-def _patch_drf():
- # type: () -> None
- """
- Patch Django Rest Framework for more/better request data. DRF's request
- type is a wrapper around Django's request type. The attribute we're
- interested in is `request.data`, which is a cached property containing a
- parsed request body. Reading a request body from that property is more
- reliable than reading from any of Django's own properties, as those don't
- hold payloads in memory and therefore can only be accessed once.
-
- We patch the Django request object to include a weak backreference to the
- DRF request object, such that we can later use either in
- `DjangoRequestExtractor`.
-
- This function is not called directly on SDK setup, because importing almost
- any part of Django Rest Framework will try to access Django settings (where
- `sentry_sdk.init()` might be called from in the first place). Instead we
- run this function on every request and do the patching on the first
- request.
- """
-
- global _DRF_PATCHED
-
- if _DRF_PATCHED:
- # Double-checked locking
- return
-
- with _DRF_PATCH_LOCK:
- if _DRF_PATCHED:
- return
-
- # We set this regardless of whether the code below succeeds or fails.
- # There is no point in trying to patch again on the next request.
- _DRF_PATCHED = True
-
- with capture_internal_exceptions():
- try:
- from rest_framework.views import APIView # type: ignore
- except ImportError:
- pass
- else:
- old_drf_initial = APIView.initial
-
- def sentry_patched_drf_initial(self, request, *args, **kwargs):
- # type: (APIView, Any, *Any, **Any) -> Any
- with capture_internal_exceptions():
- request._request._sentry_drf_request_backref = weakref.ref(
- request
- )
- pass
- return old_drf_initial(self, request, *args, **kwargs)
-
- APIView.initial = sentry_patched_drf_initial
-
-
-def _patch_channels():
- # type: () -> None
- try:
- from channels.http import AsgiHandler # type: ignore
- except ImportError:
- return
-
- if not HAS_REAL_CONTEXTVARS:
- # We better have contextvars or we're going to leak state between
- # requests.
- #
- # We cannot hard-raise here because channels may not be used at all in
- # the current process. That is the case when running traditional WSGI
- # workers in gunicorn+gevent and the websocket stuff in a separate
- # process.
- logger.warning(
- "We detected that you are using Django channels 2.0."
- + CONTEXTVARS_ERROR_MESSAGE
- )
-
- from sentry_sdk.integrations.django.asgi import patch_channels_asgi_handler_impl
-
- patch_channels_asgi_handler_impl(AsgiHandler)
-
-
-def _patch_django_asgi_handler():
- # type: () -> None
- try:
- from django.core.handlers.asgi import ASGIHandler
- except ImportError:
- return
-
- if not HAS_REAL_CONTEXTVARS:
- # We better have contextvars or we're going to leak state between
- # requests.
- #
- # We cannot hard-raise here because Django's ASGI stuff may not be used
- # at all.
- logger.warning(
- "We detected that you are using Django 3." + CONTEXTVARS_ERROR_MESSAGE
- )
-
- from sentry_sdk.integrations.django.asgi import patch_django_asgi_handler_impl
-
- patch_django_asgi_handler_impl(ASGIHandler)
-
-
-def _before_get_response(request):
- # type: (WSGIRequest) -> None
- hub = Hub.current
- integration = hub.get_integration(DjangoIntegration)
- if integration is None:
- return
-
- _patch_drf()
-
- with hub.configure_scope() as scope:
- # Rely on WSGI middleware to start a trace
- try:
- if integration.transaction_style == "function_name":
- fn = resolve(request.path).func
- scope.transaction = transaction_from_function(
- getattr(fn, "view_class", fn)
- )
- elif integration.transaction_style == "url":
- scope.transaction = LEGACY_RESOLVER.resolve(request.path_info)
- except Exception:
- pass
-
- scope.add_event_processor(
- _make_event_processor(weakref.ref(request), integration)
- )
-
-
-def _attempt_resolve_again(request, scope):
- # type: (WSGIRequest, Scope) -> None
- """
-    Some Django middlewares overwrite request.urlconf,
-    so we need to respect that contract
-    and try to resolve the URL again.
- """
- if not hasattr(request, "urlconf"):
- return
-
- try:
- scope.transaction = LEGACY_RESOLVER.resolve(
- request.path_info,
- urlconf=request.urlconf,
- )
- except Exception:
- pass
-
-
-def _after_get_response(request):
- # type: (WSGIRequest) -> None
- hub = Hub.current
- integration = hub.get_integration(DjangoIntegration)
- if integration is None or integration.transaction_style != "url":
- return
-
- with hub.configure_scope() as scope:
- _attempt_resolve_again(request, scope)
-
-
-def _patch_get_response():
- # type: () -> None
- """
-    Patch get_response, because at that point we have the Django request object.
- """
- from django.core.handlers.base import BaseHandler
-
- old_get_response = BaseHandler.get_response
-
- def sentry_patched_get_response(self, request):
- # type: (Any, WSGIRequest) -> Union[HttpResponse, BaseException]
- _before_get_response(request)
- rv = old_get_response(self, request)
- _after_get_response(request)
- return rv
-
- BaseHandler.get_response = sentry_patched_get_response
-
- if hasattr(BaseHandler, "get_response_async"):
- from sentry_sdk.integrations.django.asgi import patch_get_response_async
-
- patch_get_response_async(BaseHandler, _before_get_response)
-
-
-def _make_event_processor(weak_request, integration):
- # type: (Callable[[], WSGIRequest], DjangoIntegration) -> EventProcessor
- def event_processor(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- # if the request is gone we are fine not logging the data from
- # it. This might happen if the processor is pushed away to
- # another thread.
- request = weak_request()
- if request is None:
- return event
-
- try:
- drf_request = request._sentry_drf_request_backref()
- if drf_request is not None:
- request = drf_request
- except AttributeError:
- pass
-
- with capture_internal_exceptions():
- DjangoRequestExtractor(request).extract_into_event(event)
-
- if _should_send_default_pii():
- with capture_internal_exceptions():
- _set_user_info(request, event)
-
- return event
-
- return event_processor
-
-
-def _got_request_exception(request=None, **kwargs):
- # type: (WSGIRequest, **Any) -> None
- hub = Hub.current
- integration = hub.get_integration(DjangoIntegration)
- if integration is not None:
-
- if request is not None and integration.transaction_style == "url":
- with hub.configure_scope() as scope:
- _attempt_resolve_again(request, scope)
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- sys.exc_info(),
- client_options=client.options,
- mechanism={"type": "django", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
-
-class DjangoRequestExtractor(RequestExtractor):
- def env(self):
- # type: () -> Dict[str, str]
- return self.request.META
-
- def cookies(self):
- # type: () -> Dict[str, str]
- return self.request.COOKIES
-
- def raw_data(self):
- # type: () -> bytes
- return self.request.body
-
- def form(self):
- # type: () -> QueryDict
- return self.request.POST
-
- def files(self):
- # type: () -> MultiValueDict
- return self.request.FILES
-
- def size_of_file(self, file):
- # type: (Any) -> int
- return file.size
-
- def parsed_body(self):
- # type: () -> Optional[Dict[str, Any]]
- try:
- return self.request.data
- except AttributeError:
- return RequestExtractor.parsed_body(self)
-
-
-def _set_user_info(request, event):
- # type: (WSGIRequest, Dict[str, Any]) -> None
- user_info = event.setdefault("user", {})
-
- user = getattr(request, "user", None)
-
- if user is None or not is_authenticated(user):
- return
-
- try:
- user_info.setdefault("id", str(user.pk))
- except Exception:
- pass
-
- try:
- user_info.setdefault("email", user.email)
- except Exception:
- pass
-
- try:
- user_info.setdefault("username", user.get_username())
- except Exception:
- pass
-
-
-def install_sql_hook():
- # type: () -> None
- """If installed this causes Django's queries to be captured."""
- try:
- from django.db.backends.utils import CursorWrapper
- except ImportError:
- from django.db.backends.util import CursorWrapper
-
- try:
-        # Django 1.6 and 1.7 compatibility
- from django.db.backends import BaseDatabaseWrapper
- except ImportError:
- # django 1.8 or later
- from django.db.backends.base.base import BaseDatabaseWrapper
-
- try:
- real_execute = CursorWrapper.execute
- real_executemany = CursorWrapper.executemany
- real_connect = BaseDatabaseWrapper.connect
- except AttributeError:
- # This won't work on Django versions < 1.6
- return
-
- def execute(self, sql, params=None):
- # type: (CursorWrapper, Any, Optional[Any]) -> Any
- hub = Hub.current
- if hub.get_integration(DjangoIntegration) is None:
- return real_execute(self, sql, params)
-
- with RecordSqlQueries(
- hub, self.cursor, sql, params, paramstyle="format", executemany=False
- ):
- return real_execute(self, sql, params)
-
- def executemany(self, sql, param_list):
- # type: (CursorWrapper, Any, List[Any]) -> Any
- hub = Hub.current
- if hub.get_integration(DjangoIntegration) is None:
- return real_executemany(self, sql, param_list)
-
- with RecordSqlQueries(
- hub, self.cursor, sql, param_list, paramstyle="format", executemany=True
- ):
- return real_executemany(self, sql, param_list)
-
- def connect(self):
- # type: (BaseDatabaseWrapper) -> None
- hub = Hub.current
- if hub.get_integration(DjangoIntegration) is None:
- return real_connect(self)
-
- with capture_internal_exceptions():
- hub.add_breadcrumb(message="connect", category="query")
-
- with hub.start_span(op="db", description="connect"):
- return real_connect(self)
-
- CursorWrapper.execute = execute
- CursorWrapper.executemany = executemany
- BaseDatabaseWrapper.connect = connect
- ignore_logger("django.db.backends")
diff --git a/sentry_sdk/integrations/django/asgi.py b/sentry_sdk/integrations/django/asgi.py
deleted file mode 100644
index 79916e9..0000000
--- a/sentry_sdk/integrations/django/asgi.py
+++ /dev/null
@@ -1,151 +0,0 @@
-"""
-Instrumentation for Django 3.0
-
-Since this file contains `async def` it is conditionally imported in
-`sentry_sdk.integrations.django` (depending on the existence of
-`django.core.handlers.asgi`).
-"""
-
-import asyncio
-
-from sentry_sdk import Hub, _functools
-from sentry_sdk._types import MYPY
-
-from sentry_sdk.integrations.asgi import SentryAsgiMiddleware
-
-if MYPY:
- from typing import Any
- from typing import Union
- from typing import Callable
-
- from django.http.response import HttpResponse
-
-
-def patch_django_asgi_handler_impl(cls):
- # type: (Any) -> None
-
- from sentry_sdk.integrations.django import DjangoIntegration
-
- old_app = cls.__call__
-
- async def sentry_patched_asgi_handler(self, scope, receive, send):
- # type: (Any, Any, Any, Any) -> Any
- if Hub.current.get_integration(DjangoIntegration) is None:
- return await old_app(self, scope, receive, send)
-
- middleware = SentryAsgiMiddleware(
- old_app.__get__(self, cls), unsafe_context_data=True
- )._run_asgi3
- return await middleware(scope, receive, send)
-
- cls.__call__ = sentry_patched_asgi_handler
-
-
-def patch_get_response_async(cls, _before_get_response):
- # type: (Any, Any) -> None
- old_get_response_async = cls.get_response_async
-
- async def sentry_patched_get_response_async(self, request):
- # type: (Any, Any) -> Union[HttpResponse, BaseException]
- _before_get_response(request)
- return await old_get_response_async(self, request)
-
- cls.get_response_async = sentry_patched_get_response_async
-
-
-def patch_channels_asgi_handler_impl(cls):
- # type: (Any) -> None
-
- import channels # type: ignore
- from sentry_sdk.integrations.django import DjangoIntegration
-
- if channels.__version__ < "3.0.0":
-
- old_app = cls.__call__
-
- async def sentry_patched_asgi_handler(self, receive, send):
- # type: (Any, Any, Any) -> Any
- if Hub.current.get_integration(DjangoIntegration) is None:
- return await old_app(self, receive, send)
-
- middleware = SentryAsgiMiddleware(
- lambda _scope: old_app.__get__(self, cls), unsafe_context_data=True
- )
-
- return await middleware(self.scope)(receive, send)
-
- cls.__call__ = sentry_patched_asgi_handler
-
- else:
- # The ASGI handler in Channels >= 3 has the same signature as
- # the Django handler.
- patch_django_asgi_handler_impl(cls)
-
-
-def wrap_async_view(hub, callback):
- # type: (Hub, Any) -> Any
- @_functools.wraps(callback)
- async def sentry_wrapped_callback(request, *args, **kwargs):
- # type: (Any, *Any, **Any) -> Any
-
- with hub.start_span(
- op="django.view", description=request.resolver_match.view_name
- ):
- return await callback(request, *args, **kwargs)
-
- return sentry_wrapped_callback
-
-
-def _asgi_middleware_mixin_factory(_check_middleware_span):
- # type: (Callable[..., Any]) -> Any
- """
- Mixin class factory that generates a middleware mixin for handling requests
- in async mode.
- """
-
- class SentryASGIMixin:
- if MYPY:
- _inner = None
-
- def __init__(self, get_response):
- # type: (Callable[..., Any]) -> None
- self.get_response = get_response
- self._acall_method = None
- self._async_check()
-
- def _async_check(self):
- # type: () -> None
- """
-            If get_response is a coroutine function, switch into async mode so
-            a thread is not consumed for the whole request.
- Taken from django.utils.deprecation::MiddlewareMixin._async_check
- """
- if asyncio.iscoroutinefunction(self.get_response):
- self._is_coroutine = asyncio.coroutines._is_coroutine # type: ignore
-
- def async_route_check(self):
- # type: () -> bool
- """
-            Check whether we are in async mode and, if so,
-            forward the handling of requests to __acall__.
- """
- return asyncio.iscoroutinefunction(self.get_response)
-
- async def __acall__(self, *args, **kwargs):
- # type: (*Any, **Any) -> Any
- f = self._acall_method
- if f is None:
- if hasattr(self._inner, "__acall__"):
- self._acall_method = f = self._inner.__acall__ # type: ignore
- else:
- self._acall_method = f = self._inner
-
- middleware_span = _check_middleware_span(old_method=f)
-
- if middleware_span is None:
- return await f(*args, **kwargs)
-
- with middleware_span:
- return await f(*args, **kwargs)
-
- return SentryASGIMixin
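
The mixin above decides between sync and async handling purely by inspecting `get_response`. A standalone sketch of that check; the handler functions are made-up examples:

```python
# Standalone sketch of the coroutine check used by the ASGI middleware mixin
# removed above; the handlers below are illustrative only.
import asyncio


def async_route_check(get_response):
    # Only switch to async mode when get_response is a coroutine function.
    return asyncio.iscoroutinefunction(get_response)


def sync_handler(request):
    return "sync response"


async def async_handler(request):
    return "async response"


print(async_route_check(sync_handler))   # False -> handle via __call__
print(async_route_check(async_handler))  # True  -> forward to __acall__
```
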
diff --git a/sentry_sdk/integrations/django/middleware.py b/sentry_sdk/integrations/django/middleware.py
deleted file mode 100644
index c9001cd..0000000
--- a/sentry_sdk/integrations/django/middleware.py
+++ /dev/null
@@ -1,185 +0,0 @@
-"""
-Create spans from Django middleware invocations
-"""
-
-from django import VERSION as DJANGO_VERSION
-
-from sentry_sdk import Hub
-from sentry_sdk._functools import wraps
-from sentry_sdk._types import MYPY
-from sentry_sdk.utils import (
- ContextVar,
- transaction_from_function,
- capture_internal_exceptions,
-)
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import Optional
- from typing import TypeVar
-
- from sentry_sdk.tracing import Span
-
- F = TypeVar("F", bound=Callable[..., Any])
-
-_import_string_should_wrap_middleware = ContextVar(
- "import_string_should_wrap_middleware"
-)
-
-if DJANGO_VERSION < (1, 7):
- import_string_name = "import_by_path"
-else:
- import_string_name = "import_string"
-
-
-if DJANGO_VERSION < (3, 1):
- _asgi_middleware_mixin_factory = lambda _: object
-else:
- from .asgi import _asgi_middleware_mixin_factory
-
-
-def patch_django_middlewares():
- # type: () -> None
- from django.core.handlers import base
-
- old_import_string = getattr(base, import_string_name)
-
- def sentry_patched_import_string(dotted_path):
- # type: (str) -> Any
- rv = old_import_string(dotted_path)
-
- if _import_string_should_wrap_middleware.get(None):
- rv = _wrap_middleware(rv, dotted_path)
-
- return rv
-
- setattr(base, import_string_name, sentry_patched_import_string)
-
- old_load_middleware = base.BaseHandler.load_middleware
-
- def sentry_patched_load_middleware(*args, **kwargs):
- # type: (Any, Any) -> Any
- _import_string_should_wrap_middleware.set(True)
- try:
- return old_load_middleware(*args, **kwargs)
- finally:
- _import_string_should_wrap_middleware.set(False)
-
- base.BaseHandler.load_middleware = sentry_patched_load_middleware
-
-
-def _wrap_middleware(middleware, middleware_name):
- # type: (Any, str) -> Any
- from sentry_sdk.integrations.django import DjangoIntegration
-
- def _check_middleware_span(old_method):
- # type: (Callable[..., Any]) -> Optional[Span]
- hub = Hub.current
- integration = hub.get_integration(DjangoIntegration)
- if integration is None or not integration.middleware_spans:
- return None
-
- function_name = transaction_from_function(old_method)
-
- description = middleware_name
- function_basename = getattr(old_method, "__name__", None)
- if function_basename:
- description = "{}.{}".format(description, function_basename)
-
- middleware_span = hub.start_span(
- op="django.middleware", description=description
- )
- middleware_span.set_tag("django.function_name", function_name)
- middleware_span.set_tag("django.middleware_name", middleware_name)
-
- return middleware_span
-
- def _get_wrapped_method(old_method):
- # type: (F) -> F
- with capture_internal_exceptions():
-
- def sentry_wrapped_method(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- middleware_span = _check_middleware_span(old_method)
-
- if middleware_span is None:
- return old_method(*args, **kwargs)
-
- with middleware_span:
- return old_method(*args, **kwargs)
-
- try:
- # fails for __call__ of function on Python 2 (see py2.7-django-1.11)
- sentry_wrapped_method = wraps(old_method)(sentry_wrapped_method)
-
- # Necessary for Django 3.1
- sentry_wrapped_method.__self__ = old_method.__self__ # type: ignore
- except Exception:
- pass
-
- return sentry_wrapped_method # type: ignore
-
- return old_method
-
- class SentryWrappingMiddleware(
- _asgi_middleware_mixin_factory(_check_middleware_span) # type: ignore
- ):
-
- async_capable = getattr(middleware, "async_capable", False)
-
- def __init__(self, get_response=None, *args, **kwargs):
- # type: (Optional[Callable[..., Any]], *Any, **Any) -> None
- if get_response:
- self._inner = middleware(get_response, *args, **kwargs)
- else:
- self._inner = middleware(*args, **kwargs)
- self.get_response = get_response
- self._call_method = None
- if self.async_capable:
- super(SentryWrappingMiddleware, self).__init__(get_response)
-
- # We need correct behavior for `hasattr()`, which we can only determine
- # when we have an instance of the middleware we're wrapping.
- def __getattr__(self, method_name):
- # type: (str) -> Any
- if method_name not in (
- "process_request",
- "process_view",
- "process_template_response",
- "process_response",
- "process_exception",
- ):
- raise AttributeError()
-
- old_method = getattr(self._inner, method_name)
- rv = _get_wrapped_method(old_method)
- self.__dict__[method_name] = rv
- return rv
-
- def __call__(self, *args, **kwargs):
- # type: (*Any, **Any) -> Any
- if hasattr(self, "async_route_check") and self.async_route_check():
- return self.__acall__(*args, **kwargs)
-
- f = self._call_method
- if f is None:
- self._call_method = f = self._inner.__call__
-
- middleware_span = _check_middleware_span(old_method=f)
-
- if middleware_span is None:
- return f(*args, **kwargs)
-
- with middleware_span:
- return f(*args, **kwargs)
-
- for attr in (
- "__name__",
- "__module__",
- "__qualname__",
- ):
- if hasattr(middleware, attr):
- setattr(SentryWrappingMiddleware, attr, getattr(middleware, attr))
-
- return SentryWrappingMiddleware
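
The wrapping strategy above is the classic "look up the method lazily, wrap it once, cache the wrapper on the instance" pattern. A simplified standalone sketch of that pattern with made-up names, using logging instead of spans:

```python
# Simplified illustration of the middleware-wrapping pattern removed above;
# wrap_with_logging and ExampleMiddleware are made-up names.
import functools


def wrap_with_logging(old_method, middleware_name):
    @functools.wraps(old_method)
    def wrapped(*args, **kwargs):
        print("enter {}.{}".format(middleware_name, old_method.__name__))
        try:
            return old_method(*args, **kwargs)
        finally:
            print("leave {}.{}".format(middleware_name, old_method.__name__))

    return wrapped


class ExampleMiddleware:
    def process_request(self, request):
        return None


mw = ExampleMiddleware()
# Wrap once and cache the wrapper on the instance, mirroring __getattr__ above.
mw.process_request = wrap_with_logging(mw.process_request, "ExampleMiddleware")
mw.process_request(object())
```
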
diff --git a/sentry_sdk/integrations/django/templates.py b/sentry_sdk/integrations/django/templates.py
deleted file mode 100644
index 2ff9d1b..0000000
--- a/sentry_sdk/integrations/django/templates.py
+++ /dev/null
@@ -1,178 +0,0 @@
-from django.template import TemplateSyntaxError
-from django import VERSION as DJANGO_VERSION
-
-from sentry_sdk import _functools, Hub
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import Optional
- from typing import Iterator
- from typing import Tuple
-
-try:
- # support Django 1.9
- from django.template.base import Origin
-except ImportError:
- # backward compatibility
- from django.template.loader import LoaderOrigin as Origin
-
-
-def get_template_frame_from_exception(exc_value):
- # type: (Optional[BaseException]) -> Optional[Dict[str, Any]]
-
-    # As of Django 1.9 or so, exceptions may carry a ``template_debug`` attribute.
- if hasattr(exc_value, "template_debug"):
- return _get_template_frame_from_debug(exc_value.template_debug) # type: ignore
-
- # As of r16833 (Django) all exceptions may contain a
- # ``django_template_source`` attribute (rather than the legacy
- # ``TemplateSyntaxError.source`` check)
- if hasattr(exc_value, "django_template_source"):
- return _get_template_frame_from_source(
- exc_value.django_template_source # type: ignore
- )
-
- if isinstance(exc_value, TemplateSyntaxError) and hasattr(exc_value, "source"):
- source = exc_value.source
- if isinstance(source, (tuple, list)) and isinstance(source[0], Origin):
- return _get_template_frame_from_source(source) # type: ignore
-
- return None
-
-
-def _get_template_name_description(template_name):
- # type: (str) -> str
- if isinstance(template_name, (list, tuple)):
- if template_name:
- return "[{}, ...]".format(template_name[0])
- else:
- return template_name
-
-
-def patch_templates():
- # type: () -> None
- from django.template.response import SimpleTemplateResponse
- from sentry_sdk.integrations.django import DjangoIntegration
-
- real_rendered_content = SimpleTemplateResponse.rendered_content
-
- @property # type: ignore
- def rendered_content(self):
- # type: (SimpleTemplateResponse) -> str
- hub = Hub.current
- if hub.get_integration(DjangoIntegration) is None:
- return real_rendered_content.fget(self)
-
- with hub.start_span(
- op="django.template.render",
- description=_get_template_name_description(self.template_name),
- ) as span:
- span.set_data("context", self.context_data)
- return real_rendered_content.fget(self)
-
- SimpleTemplateResponse.rendered_content = rendered_content
-
- if DJANGO_VERSION < (1, 7):
- return
- import django.shortcuts
-
- real_render = django.shortcuts.render
-
- @_functools.wraps(real_render)
- def render(request, template_name, context=None, *args, **kwargs):
- # type: (django.http.HttpRequest, str, Optional[Dict[str, Any]], *Any, **Any) -> django.http.HttpResponse
- hub = Hub.current
- if hub.get_integration(DjangoIntegration) is None:
- return real_render(request, template_name, context, *args, **kwargs)
-
- with hub.start_span(
- op="django.template.render",
- description=_get_template_name_description(template_name),
- ) as span:
- span.set_data("context", context)
- return real_render(request, template_name, context, *args, **kwargs)
-
- django.shortcuts.render = render
-
-
-def _get_template_frame_from_debug(debug):
- # type: (Dict[str, Any]) -> Dict[str, Any]
- if debug is None:
- return None
-
- lineno = debug["line"]
- filename = debug["name"]
- if filename is None:
- filename = ""
-
- pre_context = []
- post_context = []
- context_line = None
-
- for i, line in debug["source_lines"]:
- if i < lineno:
- pre_context.append(line)
- elif i > lineno:
- post_context.append(line)
- else:
- context_line = line
-
- return {
- "filename": filename,
- "lineno": lineno,
- "pre_context": pre_context[-5:],
- "post_context": post_context[:5],
- "context_line": context_line,
- "in_app": True,
- }
-
-
-def _linebreak_iter(template_source):
- # type: (str) -> Iterator[int]
- yield 0
- p = template_source.find("\n")
- while p >= 0:
- yield p + 1
- p = template_source.find("\n", p + 1)
-
-
-def _get_template_frame_from_source(source):
- # type: (Tuple[Origin, Tuple[int, int]]) -> Optional[Dict[str, Any]]
- if not source:
- return None
-
- origin, (start, end) = source
- filename = getattr(origin, "loadname", None)
- if filename is None:
- filename = ""
- template_source = origin.reload()
- lineno = None
- upto = 0
- pre_context = []
- post_context = []
- context_line = None
-
- for num, next in enumerate(_linebreak_iter(template_source)):
- line = template_source[upto:next]
- if start >= upto and end <= next:
- lineno = num
- context_line = line
- elif lineno is None:
- pre_context.append(line)
- else:
- post_context.append(line)
-
- upto = next
-
- if context_line is None or lineno is None:
- return None
-
- return {
- "filename": filename,
- "lineno": lineno,
- "pre_context": pre_context[-5:],
- "post_context": post_context[:5],
- "context_line": context_line,
- }
diff --git a/sentry_sdk/integrations/django/transactions.py b/sentry_sdk/integrations/django/transactions.py
deleted file mode 100644
index b0f88e9..0000000
--- a/sentry_sdk/integrations/django/transactions.py
+++ /dev/null
@@ -1,136 +0,0 @@
-"""
-Copied from raven-python. Used for
-`DjangoIntegration(transaction_style="raven_legacy")`.
-"""
-
-from __future__ import absolute_import
-
-import re
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from django.urls.resolvers import URLResolver
- from typing import Dict
- from typing import List
- from typing import Optional
- from django.urls.resolvers import URLPattern
- from typing import Tuple
- from typing import Union
- from re import Pattern
-
-try:
- from django.urls import get_resolver
-except ImportError:
- from django.core.urlresolvers import get_resolver
-
-
-def get_regex(resolver_or_pattern):
- # type: (Union[URLPattern, URLResolver]) -> Pattern[str]
- """Utility method for django's deprecated resolver.regex"""
- try:
- regex = resolver_or_pattern.regex
- except AttributeError:
- regex = resolver_or_pattern.pattern.regex
- return regex
-
-
-class RavenResolver(object):
- _optional_group_matcher = re.compile(r"\(\?\:([^\)]+)\)")
- _named_group_matcher = re.compile(r"\(\?P<(\w+)>[^\)]+\)+")
- _non_named_group_matcher = re.compile(r"\([^\)]+\)")
- # [foo|bar|baz]
- _either_option_matcher = re.compile(r"\[([^\]]+)\|([^\]]+)\]")
- _camel_re = re.compile(r"([A-Z]+)([a-z])")
-
- _cache = {} # type: Dict[URLPattern, str]
-
- def _simplify(self, pattern):
- # type: (str) -> str
- r"""
- Clean up urlpattern regexes into something readable by humans:
-
- From:
- > "^(?P\w+)/athletes/(?P\w+)/$"
-
- To:
- > "{sport_slug}/athletes/{athlete_slug}/"
- """
- # remove optional params
- # TODO(dcramer): it'd be nice to change these into [%s] but it currently
- # conflicts with the other rules because we're doing regexp matches
- # rather than parsing tokens
- result = self._optional_group_matcher.sub(lambda m: "%s" % m.group(1), pattern)
-
- # handle named groups first
- result = self._named_group_matcher.sub(lambda m: "{%s}" % m.group(1), result)
-
- # handle non-named groups
- result = self._non_named_group_matcher.sub("{var}", result)
-
- # handle optional params
- result = self._either_option_matcher.sub(lambda m: m.group(1), result)
-
- # clean up any outstanding regex-y characters.
- result = (
- result.replace("^", "")
- .replace("$", "")
- .replace("?", "")
- .replace("\\A", "")
- .replace("\\Z", "")
- .replace("//", "/")
- .replace("\\", "")
- )
-
- return result
-
- def _resolve(self, resolver, path, parents=None):
- # type: (URLResolver, str, Optional[List[URLResolver]]) -> Optional[str]
-
- match = get_regex(resolver).search(path) # Django < 2.0
-
- if not match:
- return None
-
- if parents is None:
- parents = [resolver]
- elif resolver not in parents:
- parents = parents + [resolver]
-
- new_path = path[match.end() :]
- for pattern in resolver.url_patterns:
- # this is an include()
- if not pattern.callback:
- match_ = self._resolve(pattern, new_path, parents)
- if match_:
- return match_
- continue
- elif not get_regex(pattern).search(new_path):
- continue
-
- try:
- return self._cache[pattern]
- except KeyError:
- pass
-
- prefix = "".join(self._simplify(get_regex(p).pattern) for p in parents)
- result = prefix + self._simplify(get_regex(pattern).pattern)
- if not result.startswith("/"):
- result = "/" + result
- self._cache[pattern] = result
- return result
-
- return None
-
- def resolve(
- self,
- path, # type: str
- urlconf=None, # type: Union[None, Tuple[URLPattern, URLPattern, URLResolver], Tuple[URLPattern]]
- ):
- # type: (...) -> str
- resolver = get_resolver(urlconf)
- match = self._resolve(resolver, path)
- return match or path
-
-
-LEGACY_RESOLVER = RavenResolver()
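
# A minimal usage sketch (not taken from the diff) of the legacy resolver shown
# above, assuming the module as it exists before this removal and that Django is
# installed. The urlpattern regex is a made-up example.
from sentry_sdk.integrations.django.transactions import RavenResolver

resolver = RavenResolver()
# Collapses a raw urlpattern regex into a human-readable transaction name.
readable = resolver._simplify(r"^(?P<sport_slug>\w+)/athletes/(?P<athlete_slug>\w+)/$")
print(readable)  # "{sport_slug}/athletes/{athlete_slug}/"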
diff --git a/sentry_sdk/integrations/django/views.py b/sentry_sdk/integrations/django/views.py
deleted file mode 100644
index 51f1abc..0000000
--- a/sentry_sdk/integrations/django/views.py
+++ /dev/null
@@ -1,69 +0,0 @@
-from sentry_sdk.hub import Hub
-from sentry_sdk._types import MYPY
-from sentry_sdk import _functools
-
-if MYPY:
- from typing import Any
-
-
-try:
- from asyncio import iscoroutinefunction
-except ImportError:
- iscoroutinefunction = None # type: ignore
-
-
-try:
- from sentry_sdk.integrations.django.asgi import wrap_async_view
-except (ImportError, SyntaxError):
- wrap_async_view = None # type: ignore
-
-
-def patch_views():
- # type: () -> None
-
- from django.core.handlers.base import BaseHandler
- from sentry_sdk.integrations.django import DjangoIntegration
-
- old_make_view_atomic = BaseHandler.make_view_atomic
-
- @_functools.wraps(old_make_view_atomic)
- def sentry_patched_make_view_atomic(self, *args, **kwargs):
- # type: (Any, *Any, **Any) -> Any
- callback = old_make_view_atomic(self, *args, **kwargs)
-
- # XXX: The wrapper function is created for every request. Find more
- # efficient way to wrap views (or build a cache?)
-
- hub = Hub.current
- integration = hub.get_integration(DjangoIntegration)
-
- if integration is not None and integration.middleware_spans:
-
- if (
- iscoroutinefunction is not None
- and wrap_async_view is not None
- and iscoroutinefunction(callback)
- ):
- sentry_wrapped_callback = wrap_async_view(hub, callback)
- else:
- sentry_wrapped_callback = _wrap_sync_view(hub, callback)
-
- else:
- sentry_wrapped_callback = callback
-
- return sentry_wrapped_callback
-
- BaseHandler.make_view_atomic = sentry_patched_make_view_atomic
-
-
-def _wrap_sync_view(hub, callback):
- # type: (Hub, Any) -> Any
- @_functools.wraps(callback)
- def sentry_wrapped_callback(request, *args, **kwargs):
- # type: (Any, *Any, **Any) -> Any
- with hub.start_span(
- op="django.view", description=request.resolver_match.view_name
- ):
- return callback(request, *args, **kwargs)
-
- return sentry_wrapped_callback
diff --git a/sentry_sdk/integrations/excepthook.py b/sentry_sdk/integrations/excepthook.py
deleted file mode 100644
index 1e8597e..0000000
--- a/sentry_sdk/integrations/excepthook.py
+++ /dev/null
@@ -1,77 +0,0 @@
-import sys
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-from sentry_sdk.integrations import Integration
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Callable
- from typing import Any
- from typing import Type
-
- from types import TracebackType
-
- Excepthook = Callable[
- [Type[BaseException], BaseException, TracebackType],
- Any,
- ]
-
-
-class ExcepthookIntegration(Integration):
- identifier = "excepthook"
-
- always_run = False
-
- def __init__(self, always_run=False):
- # type: (bool) -> None
-
- if not isinstance(always_run, bool):
- raise ValueError(
- "Invalid value for always_run: %s (must be type boolean)"
- % (always_run,)
- )
- self.always_run = always_run
-
- @staticmethod
- def setup_once():
- # type: () -> None
- sys.excepthook = _make_excepthook(sys.excepthook)
-
-
-def _make_excepthook(old_excepthook):
- # type: (Excepthook) -> Excepthook
- def sentry_sdk_excepthook(type_, value, traceback):
- # type: (Type[BaseException], BaseException, TracebackType) -> None
- hub = Hub.current
- integration = hub.get_integration(ExcepthookIntegration)
-
- if integration is not None and _should_send(integration.always_run):
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- with capture_internal_exceptions():
- event, hint = event_from_exception(
- (type_, value, traceback),
- client_options=client.options,
- mechanism={"type": "excepthook", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
- return old_excepthook(type_, value, traceback)
-
- return sentry_sdk_excepthook
-
-
-def _should_send(always_run=False):
- # type: (bool) -> bool
- if always_run:
- return True
-
- if hasattr(sys, "ps1"):
- # Disable the excepthook for interactive Python shells, otherwise
- # every typo gets sent to Sentry.
- return False
-
- return True
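
# A minimal sketch (not part of the diff), assuming the module as shown above:
# always_run=True reports uncaught exceptions even from interactive shells,
# which _should_send would otherwise skip. The DSN is a placeholder.
import sentry_sdk
from sentry_sdk.integrations.excepthook import ExcepthookIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[ExcepthookIntegration(always_run=True)],
)

raise RuntimeError("uncaught -> reported through the patched sys.excepthook")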
diff --git a/sentry_sdk/integrations/executing.py b/sentry_sdk/integrations/executing.py
deleted file mode 100644
index 4fbf729..0000000
--- a/sentry_sdk/integrations/executing.py
+++ /dev/null
@@ -1,68 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk import Hub
-from sentry_sdk._types import MYPY
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.scope import add_global_event_processor
-from sentry_sdk.utils import walk_exception_chain, iter_stacks
-
-if MYPY:
- from typing import Optional
-
- from sentry_sdk._types import Event, Hint
-
-try:
- import executing
-except ImportError:
- raise DidNotEnable("executing is not installed")
-
-
-class ExecutingIntegration(Integration):
- identifier = "executing"
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- @add_global_event_processor
- def add_executing_info(event, hint):
- # type: (Event, Optional[Hint]) -> Optional[Event]
- if Hub.current.get_integration(ExecutingIntegration) is None:
- return event
-
- if hint is None:
- return event
-
- exc_info = hint.get("exc_info", None)
-
- if exc_info is None:
- return event
-
- exception = event.get("exception", None)
-
- if exception is None:
- return event
-
- values = exception.get("values", None)
-
- if values is None:
- return event
-
- for exception, (_exc_type, _exc_value, exc_tb) in zip(
- reversed(values), walk_exception_chain(exc_info)
- ):
- sentry_frames = [
- frame
- for frame in exception.get("stacktrace", {}).get("frames", [])
- if frame.get("function")
- ]
- tbs = list(iter_stacks(exc_tb))
- if len(sentry_frames) != len(tbs):
- continue
-
- for sentry_frame, tb in zip(sentry_frames, tbs):
- frame = tb.tb_frame
- source = executing.Source.for_frame(frame)
- sentry_frame["function"] = source.code_qualname(frame.f_code)
-
- return event
diff --git a/sentry_sdk/integrations/falcon.py b/sentry_sdk/integrations/falcon.py
deleted file mode 100644
index 8129fab..0000000
--- a/sentry_sdk/integrations/falcon.py
+++ /dev/null
@@ -1,215 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations._wsgi_common import RequestExtractor
-from sentry_sdk.integrations.wsgi import SentryWsgiMiddleware
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import Optional
-
- from sentry_sdk._types import EventProcessor
-
-try:
- import falcon # type: ignore
- import falcon.api_helpers # type: ignore
-
- from falcon import __version__ as FALCON_VERSION
-except ImportError:
- raise DidNotEnable("Falcon not installed")
-
-
-class FalconRequestExtractor(RequestExtractor):
- def env(self):
- # type: () -> Dict[str, Any]
- return self.request.env
-
- def cookies(self):
- # type: () -> Dict[str, Any]
- return self.request.cookies
-
- def form(self):
- # type: () -> None
- return None # No such concept in Falcon
-
- def files(self):
- # type: () -> None
- return None # No such concept in Falcon
-
- def raw_data(self):
- # type: () -> Optional[str]
-
- # As request data can only be read once we won't make this available
- # to Sentry. Just send back a dummy string in case there was a
- # content length.
- # TODO(jmagnusson): Figure out if there's a way to support this
- content_length = self.content_length()
- if content_length > 0:
- return "[REQUEST_CONTAINING_RAW_DATA]"
- else:
- return None
-
- def json(self):
- # type: () -> Optional[Dict[str, Any]]
- try:
- return self.request.media
- except falcon.errors.HTTPBadRequest:
- # NOTE(jmagnusson): We return `falcon.Request._media` here because
- # falcon 1.4 doesn't do proper type checking in
- # `falcon.Request.media`. This has been fixed in 2.0.
- # Relevant code: https://github.com/falconry/falcon/blob/1.4.1/falcon/request.py#L953
- return self.request._media
-
-
-class SentryFalconMiddleware(object):
- """Captures exceptions in Falcon requests and send to Sentry"""
-
- def process_request(self, req, resp, *args, **kwargs):
- # type: (Any, Any, *Any, **Any) -> None
- hub = Hub.current
- integration = hub.get_integration(FalconIntegration)
- if integration is None:
- return
-
- with hub.configure_scope() as scope:
- scope._name = "falcon"
- scope.add_event_processor(_make_request_event_processor(req, integration))
-
-
-TRANSACTION_STYLE_VALUES = ("uri_template", "path")
-
-
-class FalconIntegration(Integration):
- identifier = "falcon"
-
- transaction_style = None
-
- def __init__(self, transaction_style="uri_template"):
- # type: (str) -> None
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
-
- @staticmethod
- def setup_once():
- # type: () -> None
- try:
- version = tuple(map(int, FALCON_VERSION.split(".")))
- except (ValueError, TypeError):
- raise DidNotEnable("Unparsable Falcon version: {}".format(FALCON_VERSION))
-
- if version < (1, 4):
- raise DidNotEnable("Falcon 1.4 or newer required.")
-
- _patch_wsgi_app()
- _patch_handle_exception()
- _patch_prepare_middleware()
-
-
-def _patch_wsgi_app():
- # type: () -> None
- original_wsgi_app = falcon.API.__call__
-
- def sentry_patched_wsgi_app(self, env, start_response):
- # type: (falcon.API, Any, Any) -> Any
- hub = Hub.current
- integration = hub.get_integration(FalconIntegration)
- if integration is None:
- return original_wsgi_app(self, env, start_response)
-
- sentry_wrapped = SentryWsgiMiddleware(
- lambda envi, start_resp: original_wsgi_app(self, envi, start_resp)
- )
-
- return sentry_wrapped(env, start_response)
-
- falcon.API.__call__ = sentry_patched_wsgi_app
-
-
-def _patch_handle_exception():
- # type: () -> None
- original_handle_exception = falcon.API._handle_exception
-
- def sentry_patched_handle_exception(self, *args):
- # type: (falcon.API, *Any) -> Any
- # NOTE(jmagnusson): falcon 2.0 changed falcon.API._handle_exception
- # method signature from `(ex, req, resp, params)` to
- # `(req, resp, ex, params)`
- if isinstance(args[0], Exception):
- ex = args[0]
- else:
- ex = args[2]
-
- was_handled = original_handle_exception(self, *args)
-
- hub = Hub.current
- integration = hub.get_integration(FalconIntegration)
-
- if integration is not None and _exception_leads_to_http_5xx(ex):
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- ex,
- client_options=client.options,
- mechanism={"type": "falcon", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
- return was_handled
-
- falcon.API._handle_exception = sentry_patched_handle_exception
-
-
-def _patch_prepare_middleware():
- # type: () -> None
- original_prepare_middleware = falcon.api_helpers.prepare_middleware
-
- def sentry_patched_prepare_middleware(
- middleware=None, independent_middleware=False
- ):
- # type: (Any, Any) -> Any
- hub = Hub.current
- integration = hub.get_integration(FalconIntegration)
- if integration is not None:
- middleware = [SentryFalconMiddleware()] + (middleware or [])
- return original_prepare_middleware(middleware, independent_middleware)
-
- falcon.api_helpers.prepare_middleware = sentry_patched_prepare_middleware
-
-
-def _exception_leads_to_http_5xx(ex):
- # type: (Exception) -> bool
- is_server_error = isinstance(ex, falcon.HTTPError) and (ex.status or "").startswith(
- "5"
- )
- is_unhandled_error = not isinstance(
- ex, (falcon.HTTPError, falcon.http_status.HTTPStatus)
- )
- return is_server_error or is_unhandled_error
-
-
-def _make_request_event_processor(req, integration):
- # type: (falcon.Request, FalconIntegration) -> EventProcessor
-
- def inner(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- if integration.transaction_style == "uri_template":
- event["transaction"] = req.uri_template
- elif integration.transaction_style == "path":
- event["transaction"] = req.path
-
- with capture_internal_exceptions():
- FalconRequestExtractor(req).extract_into_event(event)
-
- return event
-
- return inner
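
# A minimal sketch (not part of the diff), assuming the module as shown above
# and Falcon 1.4-2.x (which still exposes falcon.API, the object patched here):
# transaction_style="path" groups events by raw URL path instead of the default
# "uri_template". The DSN is a placeholder.
import falcon
import sentry_sdk
from sentry_sdk.integrations.falcon import FalconIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[FalconIntegration(transaction_style="path")],
)

app = falcon.API()  # SentryFalconMiddleware is prepended by the patched prepare_middleware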
diff --git a/sentry_sdk/integrations/flask.py b/sentry_sdk/integrations/flask.py
deleted file mode 100644
index 8883cbb..0000000
--- a/sentry_sdk/integrations/flask.py
+++ /dev/null
@@ -1,262 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations.wsgi import SentryWsgiMiddleware
-from sentry_sdk.integrations._wsgi_common import RequestExtractor
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from sentry_sdk.integrations.wsgi import _ScopedResponse
- from typing import Any
- from typing import Dict
- from werkzeug.datastructures import ImmutableMultiDict
- from werkzeug.datastructures import FileStorage
- from typing import Union
- from typing import Callable
-
- from sentry_sdk._types import EventProcessor
-
-
-try:
- import flask_login # type: ignore
-except ImportError:
- flask_login = None
-
-try:
- from flask import ( # type: ignore
- Markup,
- Request,
- Flask,
- _request_ctx_stack,
- _app_ctx_stack,
- __version__ as FLASK_VERSION,
- )
- from flask.signals import (
- before_render_template,
- got_request_exception,
- request_started,
- )
-except ImportError:
- raise DidNotEnable("Flask is not installed")
-
-try:
- import blinker # noqa
-except ImportError:
- raise DidNotEnable("blinker is not installed")
-
-TRANSACTION_STYLE_VALUES = ("endpoint", "url")
-
-
-class FlaskIntegration(Integration):
- identifier = "flask"
-
- transaction_style = None
-
- def __init__(self, transaction_style="endpoint"):
- # type: (str) -> None
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- # This version parsing is absolutely naive but the alternative is to
- # import pkg_resources which slows down the SDK a lot.
- try:
- version = tuple(map(int, FLASK_VERSION.split(".")[:3]))
- except (ValueError, TypeError):
- # It's probably a release candidate, we assume it's fine.
- pass
- else:
- if version < (0, 10):
- raise DidNotEnable("Flask 0.10 or newer is required.")
-
- before_render_template.connect(_add_sentry_trace)
- request_started.connect(_request_started)
- got_request_exception.connect(_capture_exception)
-
- old_app = Flask.__call__
-
- def sentry_patched_wsgi_app(self, environ, start_response):
- # type: (Any, Dict[str, str], Callable[..., Any]) -> _ScopedResponse
- if Hub.current.get_integration(FlaskIntegration) is None:
- return old_app(self, environ, start_response)
-
- return SentryWsgiMiddleware(lambda *a, **kw: old_app(self, *a, **kw))(
- environ, start_response
- )
-
- Flask.__call__ = sentry_patched_wsgi_app # type: ignore
-
-
-def _add_sentry_trace(sender, template, context, **extra):
- # type: (Flask, Any, Dict[str, Any], **Any) -> None
-
- if "sentry_trace" in context:
- return
-
- sentry_span = Hub.current.scope.span
- context["sentry_trace"] = (
- Markup(
- '<meta name="sentry-trace" content="%s" />'
- % (sentry_span.to_traceparent(),)
- )
- if sentry_span
- else ""
- )
-
-
-def _request_started(sender, **kwargs):
- # type: (Flask, **Any) -> None
- hub = Hub.current
- integration = hub.get_integration(FlaskIntegration)
- if integration is None:
- return
-
- app = _app_ctx_stack.top.app
- with hub.configure_scope() as scope:
- request = _request_ctx_stack.top.request
-
- # Set the transaction name here, but rely on WSGI middleware to actually
- # start the transaction
- try:
- if integration.transaction_style == "endpoint":
- scope.transaction = request.url_rule.endpoint
- elif integration.transaction_style == "url":
- scope.transaction = request.url_rule.rule
- except Exception:
- pass
-
- evt_processor = _make_request_event_processor(app, request, integration)
- scope.add_event_processor(evt_processor)
-
-
-class FlaskRequestExtractor(RequestExtractor):
- def env(self):
- # type: () -> Dict[str, str]
- return self.request.environ
-
- def cookies(self):
- # type: () -> Dict[Any, Any]
- return {
- k: v[0] if isinstance(v, list) and len(v) == 1 else v
- for k, v in self.request.cookies.items()
- }
-
- def raw_data(self):
- # type: () -> bytes
- return self.request.get_data()
-
- def form(self):
- # type: () -> ImmutableMultiDict[str, Any]
- return self.request.form
-
- def files(self):
- # type: () -> ImmutableMultiDict[str, Any]
- return self.request.files
-
- def is_json(self):
- # type: () -> bool
- return self.request.is_json
-
- def json(self):
- # type: () -> Any
- return self.request.get_json()
-
- def size_of_file(self, file):
- # type: (FileStorage) -> int
- return file.content_length
-
-
-def _make_request_event_processor(app, request, integration):
- # type: (Flask, Callable[[], Request], FlaskIntegration) -> EventProcessor
-
- def inner(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
-
- # if the request is gone we are fine not logging the data from
- # it. This might happen if the processor is pushed away to
- # another thread.
- if request is None:
- return event
-
- with capture_internal_exceptions():
- FlaskRequestExtractor(request).extract_into_event(event)
-
- if _should_send_default_pii():
- with capture_internal_exceptions():
- _add_user_to_event(event)
-
- return event
-
- return inner
-
-
-def _capture_exception(sender, exception, **kwargs):
- # type: (Flask, Union[ValueError, BaseException], **Any) -> None
- hub = Hub.current
- if hub.get_integration(FlaskIntegration) is None:
- return
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- exception,
- client_options=client.options,
- mechanism={"type": "flask", "handled": False},
- )
-
- hub.capture_event(event, hint=hint)
-
-
-def _add_user_to_event(event):
- # type: (Dict[str, Any]) -> None
- if flask_login is None:
- return
-
- user = flask_login.current_user
- if user is None:
- return
-
- with capture_internal_exceptions():
- # Access this object as late as possible as accessing the user
- # is relatively costly
-
- user_info = event.setdefault("user", {})
-
- try:
- user_info.setdefault("id", user.get_id())
- # TODO: more configurable user attrs here
- except AttributeError:
- # might happen if:
- # - flask_login could not be imported
- # - flask_login is not configured
- # - no user is logged in
- pass
-
- # The following attribute accesses are ineffective for the general
- # Flask-Login case, because the User interface of Flask-Login does not
- # care about anything but the ID. However, Flask-User (based on
- # Flask-Login) documents a few optional extra attributes.
- #
- # https://github.com/lingthio/Flask-User/blob/a379fa0a281789618c484b459cb41236779b95b1/docs/source/data_models.rst#fixed-data-model-property-names
-
- try:
- user_info.setdefault("email", user.email)
- except Exception:
- pass
-
- try:
- user_info.setdefault("username", user.username)
- user_info.setdefault("username", user.email)
- except Exception:
- pass
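
# A minimal sketch (not part of the diff), assuming the module as shown above:
# transaction_style="url" names transactions after the route rule (for example
# "/users/<user_id>") instead of the endpoint name. The DSN is a placeholder.
import sentry_sdk
from flask import Flask
from sentry_sdk.integrations.flask import FlaskIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[FlaskIntegration(transaction_style="url")],
)

app = Flask(__name__)  # requests are wrapped by SentryWsgiMiddleware via the patched Flask.__call__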
diff --git a/sentry_sdk/integrations/gcp.py b/sentry_sdk/integrations/gcp.py
deleted file mode 100644
index e92422d..0000000
--- a/sentry_sdk/integrations/gcp.py
+++ /dev/null
@@ -1,225 +0,0 @@
-from datetime import datetime, timedelta
-from os import environ
-import sys
-
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.tracing import Transaction
-from sentry_sdk._compat import reraise
-from sentry_sdk.utils import (
- AnnotatedValue,
- capture_internal_exceptions,
- event_from_exception,
- logger,
- TimeoutThread,
-)
-from sentry_sdk.integrations import Integration
-from sentry_sdk.integrations._wsgi_common import _filter_headers
-
-from sentry_sdk._types import MYPY
-
-# Constants
-TIMEOUT_WARNING_BUFFER = 1.5 # Buffer time required to send timeout warning to Sentry
-MILLIS_TO_SECONDS = 1000.0
-
-if MYPY:
- from typing import Any
- from typing import TypeVar
- from typing import Callable
- from typing import Optional
-
- from sentry_sdk._types import EventProcessor, Event, Hint
-
- F = TypeVar("F", bound=Callable[..., Any])
-
-
-def _wrap_func(func):
- # type: (F) -> F
- def sentry_func(functionhandler, gcp_event, *args, **kwargs):
- # type: (Any, Any, *Any, **Any) -> Any
-
- hub = Hub.current
- integration = hub.get_integration(GcpIntegration)
- if integration is None:
- return func(functionhandler, gcp_event, *args, **kwargs)
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- configured_time = environ.get("FUNCTION_TIMEOUT_SEC")
- if not configured_time:
- logger.debug(
- "The configured timeout could not be fetched from Cloud Functions configuration."
- )
- return func(functionhandler, gcp_event, *args, **kwargs)
-
- configured_time = int(configured_time)
-
- initial_time = datetime.utcnow()
-
- with hub.push_scope() as scope:
- with capture_internal_exceptions():
- scope.clear_breadcrumbs()
- scope.add_event_processor(
- _make_request_event_processor(
- gcp_event, configured_time, initial_time
- )
- )
- scope.set_tag("gcp_region", environ.get("FUNCTION_REGION"))
- timeout_thread = None
- if (
- integration.timeout_warning
- and configured_time > TIMEOUT_WARNING_BUFFER
- ):
- waiting_time = configured_time - TIMEOUT_WARNING_BUFFER
-
- timeout_thread = TimeoutThread(waiting_time, configured_time)
-
- # Starting the thread to raise timeout warning exception
- timeout_thread.start()
-
- headers = {}
- if hasattr(gcp_event, "headers"):
- headers = gcp_event.headers
- transaction = Transaction.continue_from_headers(
- headers, op="serverless.function", name=environ.get("FUNCTION_NAME", "")
- )
- sampling_context = {
- "gcp_env": {
- "function_name": environ.get("FUNCTION_NAME"),
- "function_entry_point": environ.get("ENTRY_POINT"),
- "function_identity": environ.get("FUNCTION_IDENTITY"),
- "function_region": environ.get("FUNCTION_REGION"),
- "function_project": environ.get("GCP_PROJECT"),
- },
- "gcp_event": gcp_event,
- }
- with hub.start_transaction(
- transaction, custom_sampling_context=sampling_context
- ):
- try:
- return func(functionhandler, gcp_event, *args, **kwargs)
- except Exception:
- exc_info = sys.exc_info()
- sentry_event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "gcp", "handled": False},
- )
- hub.capture_event(sentry_event, hint=hint)
- reraise(*exc_info)
- finally:
- if timeout_thread:
- timeout_thread.stop()
- # Flush out the event queue
- hub.flush()
-
- return sentry_func # type: ignore
-
-
-class GcpIntegration(Integration):
- identifier = "gcp"
-
- def __init__(self, timeout_warning=False):
- # type: (bool) -> None
- self.timeout_warning = timeout_warning
-
- @staticmethod
- def setup_once():
- # type: () -> None
- import __main__ as gcp_functions # type: ignore
-
- if not hasattr(gcp_functions, "worker_v1"):
- logger.warning(
- "GcpIntegration currently supports only Python 3.7 runtime environment."
- )
- return
-
- worker1 = gcp_functions.worker_v1
-
- worker1.FunctionHandler.invoke_user_function = _wrap_func(
- worker1.FunctionHandler.invoke_user_function
- )
-
-
-def _make_request_event_processor(gcp_event, configured_timeout, initial_time):
- # type: (Any, Any, Any) -> EventProcessor
-
- def event_processor(event, hint):
- # type: (Event, Hint) -> Optional[Event]
-
- final_time = datetime.utcnow()
- time_diff = final_time - initial_time
-
- # Use the full elapsed time; .microseconds alone would drop whole seconds.
- execution_duration_in_millis = time_diff / timedelta(milliseconds=1)
-
- extra = event.setdefault("extra", {})
- extra["google cloud functions"] = {
- "function_name": environ.get("FUNCTION_NAME"),
- "function_entry_point": environ.get("ENTRY_POINT"),
- "function_identity": environ.get("FUNCTION_IDENTITY"),
- "function_region": environ.get("FUNCTION_REGION"),
- "function_project": environ.get("GCP_PROJECT"),
- "execution_duration_in_millis": execution_duration_in_millis,
- "configured_timeout_in_seconds": configured_timeout,
- }
-
- extra["google cloud logs"] = {
- "url": _get_google_cloud_logs_url(final_time),
- }
-
- request = event.get("request", {})
-
- request["url"] = "gcp:///{}".format(environ.get("FUNCTION_NAME"))
-
- if hasattr(gcp_event, "method"):
- request["method"] = gcp_event.method
-
- if hasattr(gcp_event, "query_string"):
- request["query_string"] = gcp_event.query_string.decode("utf-8")
-
- if hasattr(gcp_event, "headers"):
- request["headers"] = _filter_headers(gcp_event.headers)
-
- if _should_send_default_pii():
- if hasattr(gcp_event, "data"):
- request["data"] = gcp_event.data
- else:
- if hasattr(gcp_event, "data"):
- # Unfortunately couldn't find a way to get structured body from GCP
- # event. Meaning every body is unstructured to us.
- request["data"] = AnnotatedValue("", {"rem": [["!raw", "x", 0, 0]]})
-
- event["request"] = request
-
- return event
-
- return event_processor
-
-
-def _get_google_cloud_logs_url(final_time):
- # type: (datetime) -> str
- """
- Generates a Google Cloud Logs console URL based on the environment variables
- Arguments:
- final_time {datetime} -- Final time
- Returns:
- str -- Google Cloud Logs Console URL to logs.
- """
- hour_ago = final_time - timedelta(hours=1)
- formatstring = "%Y-%m-%dT%H:%M:%SZ"
-
- url = (
- "https://console.cloud.google.com/logs/viewer?project={project}&resource=cloud_function"
- "%2Ffunction_name%2F{function_name}%2Fregion%2F{region}&minLogLevel=0&expandAll=false"
- "×tamp={timestamp_end}&customFacets=&limitCustomFacetWidth=true"
- "&dateRangeStart={timestamp_start}&dateRangeEnd={timestamp_end}"
- "&interval=PT1H&scrollTimestamp={timestamp_end}"
- ).format(
- project=environ.get("GCP_PROJECT"),
- function_name=environ.get("FUNCTION_NAME"),
- region=environ.get("FUNCTION_REGION"),
- timestamp_end=final_time.strftime(formatstring),
- timestamp_start=hour_ago.strftime(formatstring),
- )
-
- return url
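
# A minimal sketch (not part of the diff), assuming the module as shown above:
# a hypothetical Cloud Function entry point. timeout_warning only takes effect
# when the runtime sets FUNCTION_TIMEOUT_SEC, as _wrap_func checks. The DSN is
# a placeholder.
import sentry_sdk
from sentry_sdk.integrations.gcp import GcpIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[GcpIntegration(timeout_warning=True)],
)

def handler(event, context):
    # Exceptions raised here are captured with mechanism {"type": "gcp"}.
    return "ok"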
diff --git a/sentry_sdk/integrations/gnu_backtrace.py b/sentry_sdk/integrations/gnu_backtrace.py
deleted file mode 100644
index e0ec110..0000000
--- a/sentry_sdk/integrations/gnu_backtrace.py
+++ /dev/null
@@ -1,107 +0,0 @@
-import re
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration
-from sentry_sdk.scope import add_global_event_processor
-from sentry_sdk.utils import capture_internal_exceptions
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Dict
-
-
-MODULE_RE = r"[a-zA-Z0-9/._:\\-]+"
-TYPE_RE = r"[a-zA-Z0-9._:<>,-]+"
-HEXVAL_RE = r"[A-Fa-f0-9]+"
-
-
-FRAME_RE = r"""
-^(?P<index>\d+)\.\s
-(?P<package>{MODULE_RE})\(
- (?P<retval>{TYPE_RE}\ )?
- ((?P<function>{TYPE_RE})
- (?P<args>\(.*\))?
- )?
- ((?P<constness>\ const)?\+0x(?P<offset>{HEXVAL_RE}))?
-\)\s
-\[0x(?P<retaddr>{HEXVAL_RE})\]$
-""".format(
- MODULE_RE=MODULE_RE, HEXVAL_RE=HEXVAL_RE, TYPE_RE=TYPE_RE
-)
-
-FRAME_RE = re.compile(FRAME_RE, re.MULTILINE | re.VERBOSE)
-
-
-class GnuBacktraceIntegration(Integration):
- identifier = "gnu_backtrace"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- @add_global_event_processor
- def process_gnu_backtrace(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- with capture_internal_exceptions():
- return _process_gnu_backtrace(event, hint)
-
-
-def _process_gnu_backtrace(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- if Hub.current.get_integration(GnuBacktraceIntegration) is None:
- return event
-
- exc_info = hint.get("exc_info", None)
-
- if exc_info is None:
- return event
-
- exception = event.get("exception", None)
-
- if exception is None:
- return event
-
- values = exception.get("values", None)
-
- if values is None:
- return event
-
- for exception in values:
- frames = exception.get("stacktrace", {}).get("frames", [])
- if not frames:
- continue
-
- msg = exception.get("value", None)
- if not msg:
- continue
-
- additional_frames = []
- new_msg = []
-
- for line in msg.splitlines():
- match = FRAME_RE.match(line)
- if match:
- additional_frames.append(
- (
- int(match.group("index")),
- {
- "package": match.group("package") or None,
- "function": match.group("function") or None,
- "platform": "native",
- },
- )
- )
- else:
- # Put garbage lines back into message, not sure what to do with them.
- new_msg.append(line)
-
- if additional_frames:
- additional_frames.sort(key=lambda x: -x[0])
- for _, frame in additional_frames:
- frames.append(frame)
-
- new_msg.append("")
- exception["value"] = "\n".join(new_msg)
-
- return event
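
# A minimal sketch (not part of the diff), assuming the module as shown above:
# FRAME_RE applied to a made-up GNU/ClickHouse-style backtrace line of the kind
# this processor folds back into stack frames.
from sentry_sdk.integrations.gnu_backtrace import FRAME_RE

line = "7. /usr/bin/clickhouse-server(StackTrace::StackTrace()+0x16) [0x8c05d76]"
match = FRAME_RE.match(line)
if match:
    # -> 7 /usr/bin/clickhouse-server StackTrace::StackTrace
    print(match.group("index"), match.group("package"), match.group("function"))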
diff --git a/sentry_sdk/integrations/httpx.py b/sentry_sdk/integrations/httpx.py
deleted file mode 100644
index 3d4bbf8..0000000
--- a/sentry_sdk/integrations/httpx.py
+++ /dev/null
@@ -1,94 +0,0 @@
-from sentry_sdk import Hub
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.utils import logger
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
-
-
-try:
- from httpx import AsyncClient, Client, Request, Response # type: ignore
-except ImportError:
- raise DidNotEnable("httpx is not installed")
-
-__all__ = ["HttpxIntegration"]
-
-
-class HttpxIntegration(Integration):
- identifier = "httpx"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- """
- httpx has its own transport layer and can be customized when needed,
- so patch Client.send and AsyncClient.send to support both synchronous and async interfaces.
- """
- _install_httpx_client()
- _install_httpx_async_client()
-
-
-def _install_httpx_client():
- # type: () -> None
- real_send = Client.send
-
- def send(self, request, **kwargs):
- # type: (Client, Request, **Any) -> Response
- hub = Hub.current
- if hub.get_integration(HttpxIntegration) is None:
- return real_send(self, request, **kwargs)
-
- with hub.start_span(
- op="http", description="%s %s" % (request.method, request.url)
- ) as span:
- span.set_data("method", request.method)
- span.set_data("url", str(request.url))
- for key, value in hub.iter_trace_propagation_headers():
- logger.debug(
- "[Tracing] Adding `{key}` header {value} to outgoing request to {url}.".format(
- key=key, value=value, url=request.url
- )
- )
- request.headers[key] = value
- rv = real_send(self, request, **kwargs)
-
- span.set_data("status_code", rv.status_code)
- span.set_http_status(rv.status_code)
- span.set_data("reason", rv.reason_phrase)
- return rv
-
- Client.send = send
-
-
-def _install_httpx_async_client():
- # type: () -> None
- real_send = AsyncClient.send
-
- async def send(self, request, **kwargs):
- # type: (AsyncClient, Request, **Any) -> Response
- hub = Hub.current
- if hub.get_integration(HttpxIntegration) is None:
- return await real_send(self, request, **kwargs)
-
- with hub.start_span(
- op="http", description="%s %s" % (request.method, request.url)
- ) as span:
- span.set_data("method", request.method)
- span.set_data("url", str(request.url))
- for key, value in hub.iter_trace_propagation_headers():
- logger.debug(
- "[Tracing] Adding `{key}` header {value} to outgoing request to {url}.".format(
- key=key, value=value, url=request.url
- )
- )
- request.headers[key] = value
- rv = await real_send(self, request, **kwargs)
-
- span.set_data("status_code", rv.status_code)
- span.set_http_status(rv.status_code)
- span.set_data("reason", rv.reason_phrase)
- return rv
-
- AsyncClient.send = send
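
# A minimal sketch (not part of the diff), assuming the module as shown above:
# with the integration enabled, outgoing httpx requests become "http" spans and
# carry the trace propagation headers. The DSN and URL are placeholders.
import httpx
import sentry_sdk
from sentry_sdk.integrations.httpx import HttpxIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,
    integrations=[HttpxIntegration()],
)

with sentry_sdk.start_transaction(op="task", name="fetch-example"):
    httpx.Client().get("https://example.com")  # recorded via the patched Client.send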
diff --git a/sentry_sdk/integrations/logging.py b/sentry_sdk/integrations/logging.py
deleted file mode 100644
index 31c7b87..0000000
--- a/sentry_sdk/integrations/logging.py
+++ /dev/null
@@ -1,275 +0,0 @@
-from __future__ import absolute_import
-
-import logging
-import datetime
-from fnmatch import fnmatch
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import (
- to_string,
- event_from_exception,
- current_stacktrace,
- capture_internal_exceptions,
-)
-from sentry_sdk.integrations import Integration
-from sentry_sdk._compat import iteritems
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from logging import LogRecord
- from typing import Any
- from typing import Dict
- from typing import Optional
-
-DEFAULT_LEVEL = logging.INFO
-DEFAULT_EVENT_LEVEL = logging.ERROR
-
-# Capturing events from those loggers causes recursion errors. We cannot allow
-# the user to unconditionally create events from those loggers under any
-# circumstances.
-#
-# Note: Ignoring by logger name here is better than mucking with thread-locals.
-# We do not necessarily know whether thread-locals work 100% correctly in the user's environment.
-_IGNORED_LOGGERS = set(
- ["sentry_sdk.errors", "urllib3.connectionpool", "urllib3.connection"]
-)
-
-
-def ignore_logger(
- name, # type: str
-):
- # type: (...) -> None
- """This disables recording (both in breadcrumbs and as events) calls to
- a logger of a specific name. Among other uses, many of our integrations
- use this to prevent their actions being recorded as breadcrumbs. Exposed
- to users as a way to quiet spammy loggers.
-
- :param name: The name of the logger to ignore (same string you would pass to ``logging.getLogger``).
- """
- _IGNORED_LOGGERS.add(name)
-
-
-class LoggingIntegration(Integration):
- identifier = "logging"
-
- def __init__(self, level=DEFAULT_LEVEL, event_level=DEFAULT_EVENT_LEVEL):
- # type: (Optional[int], Optional[int]) -> None
- self._handler = None
- self._breadcrumb_handler = None
-
- if level is not None:
- self._breadcrumb_handler = BreadcrumbHandler(level=level)
-
- if event_level is not None:
- self._handler = EventHandler(level=event_level)
-
- def _handle_record(self, record):
- # type: (LogRecord) -> None
- if self._handler is not None and record.levelno >= self._handler.level:
- self._handler.handle(record)
-
- if (
- self._breadcrumb_handler is not None
- and record.levelno >= self._breadcrumb_handler.level
- ):
- self._breadcrumb_handler.handle(record)
-
- @staticmethod
- def setup_once():
- # type: () -> None
- old_callhandlers = logging.Logger.callHandlers # type: ignore
-
- def sentry_patched_callhandlers(self, record):
- # type: (Any, LogRecord) -> Any
- try:
- return old_callhandlers(self, record)
- finally:
- # This check is done twice, once also here before we even get
- # the integration. Otherwise we have a high chance of getting
- # into a recursion error when the integration is resolved
- # (this also is slower).
- if record.name not in _IGNORED_LOGGERS:
- integration = Hub.current.get_integration(LoggingIntegration)
- if integration is not None:
- integration._handle_record(record)
-
- logging.Logger.callHandlers = sentry_patched_callhandlers # type: ignore
-
-
-def _can_record(record):
- # type: (LogRecord) -> bool
- """Prevents ignored loggers from recording"""
- for logger in _IGNORED_LOGGERS:
- if fnmatch(record.name, logger):
- return False
- return True
-
-
-def _breadcrumb_from_record(record):
- # type: (LogRecord) -> Dict[str, Any]
- return {
- "type": "log",
- "level": _logging_to_event_level(record.levelname),
- "category": record.name,
- "message": record.message,
- "timestamp": datetime.datetime.utcfromtimestamp(record.created),
- "data": _extra_from_record(record),
- }
-
-
-def _logging_to_event_level(levelname):
- # type: (str) -> str
- return {"critical": "fatal"}.get(levelname.lower(), levelname.lower())
-
-
-COMMON_RECORD_ATTRS = frozenset(
- (
- "args",
- "created",
- "exc_info",
- "exc_text",
- "filename",
- "funcName",
- "levelname",
- "levelno",
- "linenno",
- "lineno",
- "message",
- "module",
- "msecs",
- "msg",
- "name",
- "pathname",
- "process",
- "processName",
- "relativeCreated",
- "stack",
- "tags",
- "thread",
- "threadName",
- "stack_info",
- )
-)
-
-
-def _extra_from_record(record):
- # type: (LogRecord) -> Dict[str, None]
- return {
- k: v
- for k, v in iteritems(vars(record))
- if k not in COMMON_RECORD_ATTRS
- and (not isinstance(k, str) or not k.startswith("_"))
- }
-
-
-class EventHandler(logging.Handler, object):
- """
- A logging handler that emits Sentry events for each log record
-
- Note that you do not have to use this class if the logging integration is enabled, which it is by default.
- """
-
- def emit(self, record):
- # type: (LogRecord) -> Any
- with capture_internal_exceptions():
- self.format(record)
- return self._emit(record)
-
- def _emit(self, record):
- # type: (LogRecord) -> None
- if not _can_record(record):
- return
-
- hub = Hub.current
- if hub.client is None:
- return
-
- client_options = hub.client.options
-
- # exc_info might be None or (None, None, None)
- #
- # exc_info may also be any falsy value due to Python stdlib being
- # liberal with what it receives and Celery's billiard being "liberal"
- # with what it sends. See
- # https://github.com/getsentry/sentry-python/issues/904
- if record.exc_info and record.exc_info[0] is not None:
- event, hint = event_from_exception(
- record.exc_info,
- client_options=client_options,
- mechanism={"type": "logging", "handled": True},
- )
- elif record.exc_info and record.exc_info[0] is None:
- event = {}
- hint = {}
- with capture_internal_exceptions():
- event["threads"] = {
- "values": [
- {
- "stacktrace": current_stacktrace(
- client_options["with_locals"]
- ),
- "crashed": False,
- "current": True,
- }
- ]
- }
- else:
- event = {}
- hint = {}
-
- hint["log_record"] = record
-
- event["level"] = _logging_to_event_level(record.levelname)
- event["logger"] = record.name
-
- # Log records from `warnings` module as separate issues
- record_captured_from_warnings_module = (
- record.name == "py.warnings" and record.msg == "%s"
- )
- if record_captured_from_warnings_module:
- # use the actual message and not "%s" as the message
- # this prevents grouping all warnings under one "%s" issue
- msg = record.args[0] # type: ignore
-
- event["logentry"] = {
- "message": msg,
- "params": (),
- }
-
- else:
- event["logentry"] = {
- "message": to_string(record.msg),
- "params": record.args,
- }
-
- event["extra"] = _extra_from_record(record)
-
- hub.capture_event(event, hint=hint)
-
-
-# Legacy name
-SentryHandler = EventHandler
-
-
-class BreadcrumbHandler(logging.Handler, object):
- """
- A logging handler that records breadcrumbs for each log record.
-
- Note that you do not have to use this class if the logging integration is enabled, which it is by default.
- """
-
- def emit(self, record):
- # type: (LogRecord) -> Any
- with capture_internal_exceptions():
- self.format(record)
- return self._emit(record)
-
- def _emit(self, record):
- # type: (LogRecord) -> None
- if not _can_record(record):
- return
-
- Hub.current.add_breadcrumb(
- _breadcrumb_from_record(record), hint={"log_record": record}
- )
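
# A minimal sketch (not part of the diff), assuming the module as shown above.
# The integration is enabled by default; this is the explicit form plus
# ignore_logger() for a noisy logger (the logger name is made up). The DSN is a
# placeholder.
import logging
import sentry_sdk
from sentry_sdk.integrations.logging import LoggingIntegration, ignore_logger

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[LoggingIntegration(level=logging.INFO, event_level=logging.ERROR)],
)
ignore_logger("some.chatty.library")

logging.getLogger(__name__).warning("recorded as a breadcrumb only")
logging.getLogger(__name__).error("captured as a Sentry event")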
diff --git a/sentry_sdk/integrations/modules.py b/sentry_sdk/integrations/modules.py
deleted file mode 100644
index 3d78cb8..0000000
--- a/sentry_sdk/integrations/modules.py
+++ /dev/null
@@ -1,56 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration
-from sentry_sdk.scope import add_global_event_processor
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import Tuple
- from typing import Iterator
-
- from sentry_sdk._types import Event
-
-
-_installed_modules = None
-
-
-def _generate_installed_modules():
- # type: () -> Iterator[Tuple[str, str]]
- try:
- import pkg_resources
- except ImportError:
- return
-
- for info in pkg_resources.working_set:
- yield info.key, info.version
-
-
-def _get_installed_modules():
- # type: () -> Dict[str, str]
- global _installed_modules
- if _installed_modules is None:
- _installed_modules = dict(_generate_installed_modules())
- return _installed_modules
-
-
-class ModulesIntegration(Integration):
- identifier = "modules"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- @add_global_event_processor
- def processor(event, hint):
- # type: (Event, Any) -> Dict[str, Any]
- if event.get("type") == "transaction":
- return event
-
- if Hub.current.get_integration(ModulesIntegration) is None:
- return event
-
- event["modules"] = _get_installed_modules()
- return event
diff --git a/sentry_sdk/integrations/pure_eval.py b/sentry_sdk/integrations/pure_eval.py
deleted file mode 100644
index 9d3fe66..0000000
--- a/sentry_sdk/integrations/pure_eval.py
+++ /dev/null
@@ -1,138 +0,0 @@
-from __future__ import absolute_import
-
-import ast
-
-from sentry_sdk import Hub, serializer
-from sentry_sdk._types import MYPY
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.scope import add_global_event_processor
-from sentry_sdk.utils import walk_exception_chain, iter_stacks
-
-if MYPY:
- from typing import Optional, Dict, Any, Tuple, List
- from types import FrameType
-
- from sentry_sdk._types import Event, Hint
-
-try:
- import executing
-except ImportError:
- raise DidNotEnable("executing is not installed")
-
-try:
- import pure_eval
-except ImportError:
- raise DidNotEnable("pure_eval is not installed")
-
-try:
- # Used implicitly, just testing it's available
- import asttokens # noqa
-except ImportError:
- raise DidNotEnable("asttokens is not installed")
-
-
-class PureEvalIntegration(Integration):
- identifier = "pure_eval"
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- @add_global_event_processor
- def add_executing_info(event, hint):
- # type: (Event, Optional[Hint]) -> Optional[Event]
- if Hub.current.get_integration(PureEvalIntegration) is None:
- return event
-
- if hint is None:
- return event
-
- exc_info = hint.get("exc_info", None)
-
- if exc_info is None:
- return event
-
- exception = event.get("exception", None)
-
- if exception is None:
- return event
-
- values = exception.get("values", None)
-
- if values is None:
- return event
-
- for exception, (_exc_type, _exc_value, exc_tb) in zip(
- reversed(values), walk_exception_chain(exc_info)
- ):
- sentry_frames = [
- frame
- for frame in exception.get("stacktrace", {}).get("frames", [])
- if frame.get("function")
- ]
- tbs = list(iter_stacks(exc_tb))
- if len(sentry_frames) != len(tbs):
- continue
-
- for sentry_frame, tb in zip(sentry_frames, tbs):
- sentry_frame["vars"] = (
- pure_eval_frame(tb.tb_frame) or sentry_frame["vars"]
- )
- return event
-
-
-def pure_eval_frame(frame):
- # type: (FrameType) -> Dict[str, Any]
- source = executing.Source.for_frame(frame)
- if not source.tree:
- return {}
-
- statements = source.statements_at_line(frame.f_lineno)
- if not statements:
- return {}
-
- scope = stmt = list(statements)[0]
- while True:
- # Get the parent first in case the original statement is already
- # a function definition, e.g. if we're calling a decorator
- # In that case we still want the surrounding scope, not that function
- scope = scope.parent
- if isinstance(scope, (ast.FunctionDef, ast.ClassDef, ast.Module)):
- break
-
- evaluator = pure_eval.Evaluator.from_frame(frame)
- expressions = evaluator.interesting_expressions_grouped(scope)
-
- def closeness(expression):
- # type: (Tuple[List[Any], Any]) -> Tuple[int, int]
- # Prioritise expressions with a node closer to the statement executed
- # without being after that statement
- # A higher return value is better - the expression will appear
- # earlier in the list of values and is less likely to be trimmed
- nodes, _value = expression
-
- def start(n):
- # type: (ast.expr) -> Tuple[int, int]
- return (n.lineno, n.col_offset)
-
- nodes_before_stmt = [
- node for node in nodes if start(node) < stmt.last_token.end
- ]
- if nodes_before_stmt:
- # The position of the last node before or in the statement
- return max(start(node) for node in nodes_before_stmt)
- else:
- # The position of the first node after the statement
- # Negative means it's always lower priority than nodes that come before
- # Less negative means closer to the statement and higher priority
- lineno, col_offset = min(start(node) for node in nodes)
- return (-lineno, -col_offset)
-
- # This adds the first_token and last_token attributes to nodes
- atok = source.asttokens()
-
- expressions.sort(key=closeness, reverse=True)
- return {
- atok.get_text(nodes[0]): value
- for nodes, value in expressions[: serializer.MAX_DATABAG_BREADTH]
- }
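
# A minimal sketch (not part of the diff), assuming the module as shown above:
# with PureEvalIntegration enabled, each frame's "vars" are filled from
# pure_eval's evaluated expressions instead of the default local-variable
# capture. The DSN is a placeholder.
import sentry_sdk
from sentry_sdk.integrations.pure_eval import PureEvalIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[PureEvalIntegration()],
)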
diff --git a/sentry_sdk/integrations/pyramid.py b/sentry_sdk/integrations/pyramid.py
deleted file mode 100644
index a974d29..0000000
--- a/sentry_sdk/integrations/pyramid.py
+++ /dev/null
@@ -1,218 +0,0 @@
-from __future__ import absolute_import
-
-import os
-import sys
-import weakref
-
-from pyramid.httpexceptions import HTTPException
-from pyramid.request import Request
-
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-from sentry_sdk._compat import reraise, iteritems
-
-from sentry_sdk.integrations import Integration
-from sentry_sdk.integrations._wsgi_common import RequestExtractor
-from sentry_sdk.integrations.wsgi import SentryWsgiMiddleware
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from pyramid.response import Response
- from typing import Any
- from sentry_sdk.integrations.wsgi import _ScopedResponse
- from typing import Callable
- from typing import Dict
- from typing import Optional
- from webob.cookies import RequestCookies # type: ignore
- from webob.compat import cgi_FieldStorage # type: ignore
-
- from sentry_sdk.utils import ExcInfo
- from sentry_sdk._types import EventProcessor
-
-
-if getattr(Request, "authenticated_userid", None):
-
- def authenticated_userid(request):
- # type: (Request) -> Optional[Any]
- return request.authenticated_userid
-
-
-else:
- # bw-compat for pyramid < 1.5
- from pyramid.security import authenticated_userid # type: ignore
-
-
-TRANSACTION_STYLE_VALUES = ("route_name", "route_pattern")
-
-
-class PyramidIntegration(Integration):
- identifier = "pyramid"
-
- transaction_style = None
-
- def __init__(self, transaction_style="route_name"):
- # type: (str) -> None
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
-
- @staticmethod
- def setup_once():
- # type: () -> None
- from pyramid import router
- from pyramid.request import Request
-
- old_call_view = router._call_view
-
- def sentry_patched_call_view(registry, request, *args, **kwargs):
- # type: (Any, Request, *Any, **Any) -> Response
- hub = Hub.current
- integration = hub.get_integration(PyramidIntegration)
-
- if integration is not None:
- with hub.configure_scope() as scope:
- try:
- if integration.transaction_style == "route_name":
- scope.transaction = request.matched_route.name
- elif integration.transaction_style == "route_pattern":
- scope.transaction = request.matched_route.pattern
- except Exception:
- pass
-
- scope.add_event_processor(
- _make_event_processor(weakref.ref(request), integration)
- )
-
- return old_call_view(registry, request, *args, **kwargs)
-
- router._call_view = sentry_patched_call_view
-
- if hasattr(Request, "invoke_exception_view"):
- old_invoke_exception_view = Request.invoke_exception_view
-
- def sentry_patched_invoke_exception_view(self, *args, **kwargs):
- # type: (Request, *Any, **Any) -> Any
- rv = old_invoke_exception_view(self, *args, **kwargs)
-
- if (
- self.exc_info
- and all(self.exc_info)
- and rv.status_int == 500
- and Hub.current.get_integration(PyramidIntegration) is not None
- ):
- _capture_exception(self.exc_info)
-
- return rv
-
- Request.invoke_exception_view = sentry_patched_invoke_exception_view
-
- old_wsgi_call = router.Router.__call__
-
- def sentry_patched_wsgi_call(self, environ, start_response):
- # type: (Any, Dict[str, str], Callable[..., Any]) -> _ScopedResponse
- hub = Hub.current
- integration = hub.get_integration(PyramidIntegration)
- if integration is None:
- return old_wsgi_call(self, environ, start_response)
-
- def sentry_patched_inner_wsgi_call(environ, start_response):
- # type: (Dict[str, Any], Callable[..., Any]) -> Any
- try:
- return old_wsgi_call(self, environ, start_response)
- except Exception:
- einfo = sys.exc_info()
- _capture_exception(einfo)
- reraise(*einfo)
-
- return SentryWsgiMiddleware(sentry_patched_inner_wsgi_call)(
- environ, start_response
- )
-
- router.Router.__call__ = sentry_patched_wsgi_call
-
-
-def _capture_exception(exc_info):
- # type: (ExcInfo) -> None
- if exc_info[0] is None or issubclass(exc_info[0], HTTPException):
- return
- hub = Hub.current
- if hub.get_integration(PyramidIntegration) is None:
- return
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "pyramid", "handled": False},
- )
-
- hub.capture_event(event, hint=hint)
-
-
-class PyramidRequestExtractor(RequestExtractor):
- def url(self):
- # type: () -> str
- return self.request.path_url
-
- def env(self):
- # type: () -> Dict[str, str]
- return self.request.environ
-
- def cookies(self):
- # type: () -> RequestCookies
- return self.request.cookies
-
- def raw_data(self):
- # type: () -> str
- return self.request.text
-
- def form(self):
- # type: () -> Dict[str, str]
- return {
- key: value
- for key, value in iteritems(self.request.POST)
- if not getattr(value, "filename", None)
- }
-
- def files(self):
- # type: () -> Dict[str, cgi_FieldStorage]
- return {
- key: value
- for key, value in iteritems(self.request.POST)
- if getattr(value, "filename", None)
- }
-
- def size_of_file(self, postdata):
- # type: (cgi_FieldStorage) -> int
- file = postdata.file
- try:
- return os.fstat(file.fileno()).st_size
- except Exception:
- return 0
-
-
-def _make_event_processor(weak_request, integration):
- # type: (Callable[[], Request], PyramidIntegration) -> EventProcessor
- def event_processor(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- request = weak_request()
- if request is None:
- return event
-
- with capture_internal_exceptions():
- PyramidRequestExtractor(request).extract_into_event(event)
-
- if _should_send_default_pii():
- with capture_internal_exceptions():
- user_info = event.setdefault("user", {})
- user_info.setdefault("id", authenticated_userid(request))
-
- return event
-
- return event_processor
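
# A minimal sketch (not part of the diff), assuming the module as shown above:
# a hypothetical Pyramid app. transaction_style="route_pattern" names
# transactions after the matched pattern (e.g. "/todos/{id}") instead of the
# route name. The DSN is a placeholder.
import sentry_sdk
from pyramid.config import Configurator
from sentry_sdk.integrations.pyramid import PyramidIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[PyramidIntegration(transaction_style="route_pattern")],
)

config = Configurator()
app = config.make_wsgi_app()  # router.Router.__call__ is already patched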
diff --git a/sentry_sdk/integrations/quart.py b/sentry_sdk/integrations/quart.py
deleted file mode 100644
index 411817c..0000000
--- a/sentry_sdk/integrations/quart.py
+++ /dev/null
@@ -1,171 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk.hub import _should_send_default_pii, Hub
-from sentry_sdk.integrations import DidNotEnable, Integration
-from sentry_sdk.integrations._wsgi_common import _filter_headers
-from sentry_sdk.integrations.asgi import SentryAsgiMiddleware
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import Union
-
- from sentry_sdk._types import EventProcessor
-
-try:
- import quart_auth # type: ignore
-except ImportError:
- quart_auth = None
-
-try:
- from quart import ( # type: ignore
- Request,
- Quart,
- _request_ctx_stack,
- _websocket_ctx_stack,
- _app_ctx_stack,
- )
- from quart.signals import ( # type: ignore
- got_background_exception,
- got_request_exception,
- got_websocket_exception,
- request_started,
- websocket_started,
- )
-except ImportError:
- raise DidNotEnable("Quart is not installed")
-
-TRANSACTION_STYLE_VALUES = ("endpoint", "url")
-
-
-class QuartIntegration(Integration):
- identifier = "quart"
-
- transaction_style = None
-
- def __init__(self, transaction_style="endpoint"):
- # type: (str) -> None
- if transaction_style not in TRANSACTION_STYLE_VALUES:
- raise ValueError(
- "Invalid value for transaction_style: %s (must be in %s)"
- % (transaction_style, TRANSACTION_STYLE_VALUES)
- )
- self.transaction_style = transaction_style
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- request_started.connect(_request_websocket_started)
- websocket_started.connect(_request_websocket_started)
- got_background_exception.connect(_capture_exception)
- got_request_exception.connect(_capture_exception)
- got_websocket_exception.connect(_capture_exception)
-
- old_app = Quart.__call__
-
- async def sentry_patched_asgi_app(self, scope, receive, send):
- # type: (Any, Any, Any, Any) -> Any
- if Hub.current.get_integration(QuartIntegration) is None:
- return await old_app(self, scope, receive, send)
-
- middleware = SentryAsgiMiddleware(lambda *a, **kw: old_app(self, *a, **kw))
- middleware.__call__ = middleware._run_asgi3
- return await middleware(scope, receive, send)
-
- Quart.__call__ = sentry_patched_asgi_app
-
-
-def _request_websocket_started(sender, **kwargs):
- # type: (Quart, **Any) -> None
- hub = Hub.current
- integration = hub.get_integration(QuartIntegration)
- if integration is None:
- return
-
- app = _app_ctx_stack.top.app
- with hub.configure_scope() as scope:
- if _request_ctx_stack.top is not None:
- request_websocket = _request_ctx_stack.top.request
- if _websocket_ctx_stack.top is not None:
- request_websocket = _websocket_ctx_stack.top.websocket
-
- # Set the transaction name here, but rely on ASGI middleware
- # to actually start the transaction
- try:
- if integration.transaction_style == "endpoint":
- scope.transaction = request_websocket.url_rule.endpoint
- elif integration.transaction_style == "url":
- scope.transaction = request_websocket.url_rule.rule
- except Exception:
- pass
-
- evt_processor = _make_request_event_processor(
- app, request_websocket, integration
- )
- scope.add_event_processor(evt_processor)
-
-
-def _make_request_event_processor(app, request, integration):
- # type: (Quart, Request, QuartIntegration) -> EventProcessor
- def inner(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- # if the request is gone we are fine not logging the data from
- # it. This might happen if the processor is pushed away to
- # another thread.
- if request is None:
- return event
-
- with capture_internal_exceptions():
- # TODO: Figure out what to do with request body. Methods on request
- # are async, but event processors are not.
-
- request_info = event.setdefault("request", {})
- request_info["url"] = request.url
- request_info["query_string"] = request.query_string
- request_info["method"] = request.method
- request_info["headers"] = _filter_headers(dict(request.headers))
-
- if _should_send_default_pii():
- request_info["env"] = {"REMOTE_ADDR": request.access_route[0]}
- _add_user_to_event(event)
-
- return event
-
- return inner
-
-
-def _capture_exception(sender, exception, **kwargs):
- # type: (Quart, Union[ValueError, BaseException], **Any) -> None
- hub = Hub.current
- if hub.get_integration(QuartIntegration) is None:
- return
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- exception,
- client_options=client.options,
- mechanism={"type": "quart", "handled": False},
- )
-
- hub.capture_event(event, hint=hint)
-
-
-def _add_user_to_event(event):
- # type: (Dict[str, Any]) -> None
- if quart_auth is None:
- return
-
- user = quart_auth.current_user
- if user is None:
- return
-
- with capture_internal_exceptions():
- user_info = event.setdefault("user", {})
-
- user_info["id"] = quart_auth.current_user._auth_id
diff --git a/sentry_sdk/integrations/redis.py b/sentry_sdk/integrations/redis.py
deleted file mode 100644
index 6475d15..0000000
--- a/sentry_sdk/integrations/redis.py
+++ /dev/null
@@ -1,103 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk import Hub
-from sentry_sdk.utils import capture_internal_exceptions, logger
-from sentry_sdk.integrations import Integration
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
-
-_SINGLE_KEY_COMMANDS = frozenset(
- ["decr", "decrby", "get", "incr", "incrby", "pttl", "set", "setex", "setnx", "ttl"]
-)
-_MULTI_KEY_COMMANDS = frozenset(["del", "touch", "unlink"])
-
-
-def _patch_rediscluster():
- # type: () -> None
- try:
- import rediscluster # type: ignore
- except ImportError:
- return
-
- patch_redis_client(rediscluster.RedisCluster)
-
- # up to v1.3.6, __version__ attribute is a tuple
- # from v2.0.0, __version__ is a string and VERSION a tuple
- version = getattr(rediscluster, "VERSION", rediscluster.__version__)
-
- # StrictRedisCluster was introduced in v0.2.0 and removed in v2.0.0
- # https://github.com/Grokzen/redis-py-cluster/blob/master/docs/release-notes.rst
- if (0, 2, 0) < version < (2, 0, 0):
- patch_redis_client(rediscluster.StrictRedisCluster)
-
-
-class RedisIntegration(Integration):
- identifier = "redis"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- import redis
-
- patch_redis_client(redis.StrictRedis)
-
- try:
- import rb.clients # type: ignore
- except ImportError:
- pass
- else:
- patch_redis_client(rb.clients.FanoutClient)
- patch_redis_client(rb.clients.MappingClient)
- patch_redis_client(rb.clients.RoutingClient)
-
- try:
- _patch_rediscluster()
- except Exception:
- logger.exception("Error occurred while patching `rediscluster` library")
-
-
-def patch_redis_client(cls):
- # type: (Any) -> None
- """
- This function can be used to instrument custom redis client classes or
- subclasses.
- """
-
- old_execute_command = cls.execute_command
-
- def sentry_patched_execute_command(self, name, *args, **kwargs):
- # type: (Any, str, *Any, **Any) -> Any
- hub = Hub.current
-
- if hub.get_integration(RedisIntegration) is None:
- return old_execute_command(self, name, *args, **kwargs)
-
- description = name
-
- with capture_internal_exceptions():
- description_parts = [name]
- for i, arg in enumerate(args):
- if i > 10:
- break
-
- description_parts.append(repr(arg))
-
- description = " ".join(description_parts)
-
- with hub.start_span(op="redis", description=description) as span:
- if name:
- span.set_tag("redis.command", name)
-
- if name and args:
- name_low = name.lower()
- if (name_low in _SINGLE_KEY_COMMANDS) or (
- name_low in _MULTI_KEY_COMMANDS and len(args) == 1
- ):
- span.set_tag("redis.key", args[0])
-
- return old_execute_command(self, name, *args, **kwargs)
-
- cls.execute_command = sentry_patched_execute_command
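
A minimal sketch of the Redis integration above in use; the DSN, host, and key names are placeholders. With tracing enabled, each `execute_command` call becomes a `redis` span whose description is built from the command name and its first few arguments:

import redis
import sentry_sdk

from sentry_sdk.integrations.redis import RedisIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[RedisIntegration()],
    traces_sample_rate=1.0,
)

client = redis.StrictRedis(host="localhost", port=6379)

with sentry_sdk.start_transaction(op="task", name="cache-demo"):
    client.set("greeting", "hello")  # span description: set 'greeting' 'hello'
    client.get("greeting")           # single-key command, so the redis.key tag is set to "greeting"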
diff --git a/sentry_sdk/integrations/rq.py b/sentry_sdk/integrations/rq.py
deleted file mode 100644
index f4c77d7..0000000
--- a/sentry_sdk/integrations/rq.py
+++ /dev/null
@@ -1,155 +0,0 @@
-from __future__ import absolute_import
-
-import weakref
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import DidNotEnable, Integration
-from sentry_sdk.integrations.logging import ignore_logger
-from sentry_sdk.tracing import Transaction
-from sentry_sdk.utils import capture_internal_exceptions, event_from_exception
-
-try:
- from rq.queue import Queue
- from rq.timeouts import JobTimeoutException
- from rq.version import VERSION as RQ_VERSION
- from rq.worker import Worker
-except ImportError:
- raise DidNotEnable("RQ not installed")
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any, Callable, Dict
-
- from sentry_sdk._types import EventProcessor
- from sentry_sdk.utils import ExcInfo
-
- from rq.job import Job
-
-
-class RqIntegration(Integration):
- identifier = "rq"
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- try:
- version = tuple(map(int, RQ_VERSION.split(".")[:3]))
- except (ValueError, TypeError):
- raise DidNotEnable("Unparsable RQ version: {}".format(RQ_VERSION))
-
- if version < (0, 6):
- raise DidNotEnable("RQ 0.6 or newer is required.")
-
- old_perform_job = Worker.perform_job
-
- def sentry_patched_perform_job(self, job, *args, **kwargs):
- # type: (Any, Job, *Queue, **Any) -> bool
- hub = Hub.current
- integration = hub.get_integration(RqIntegration)
-
- if integration is None:
- return old_perform_job(self, job, *args, **kwargs)
-
- client = hub.client
- assert client is not None
-
- with hub.push_scope() as scope:
- scope.clear_breadcrumbs()
- scope.add_event_processor(_make_event_processor(weakref.ref(job)))
-
- transaction = Transaction.continue_from_headers(
- job.meta.get("_sentry_trace_headers") or {},
- op="rq.task",
- name="unknown RQ task",
- )
-
- with capture_internal_exceptions():
- transaction.name = job.func_name
-
- with hub.start_transaction(
- transaction, custom_sampling_context={"rq_job": job}
- ):
- rv = old_perform_job(self, job, *args, **kwargs)
-
- if self.is_horse:
- # We're inside of a forked process and RQ is
- # about to call `os._exit`. Make sure that our
- # events get sent out.
- client.flush()
-
- return rv
-
- Worker.perform_job = sentry_patched_perform_job
-
- old_handle_exception = Worker.handle_exception
-
- def sentry_patched_handle_exception(self, job, *exc_info, **kwargs):
- # type: (Worker, Any, *Any, **Any) -> Any
- if job.is_failed:
- _capture_exception(exc_info) # type: ignore
-
- return old_handle_exception(self, job, *exc_info, **kwargs)
-
- Worker.handle_exception = sentry_patched_handle_exception
-
- old_enqueue_job = Queue.enqueue_job
-
- def sentry_patched_enqueue_job(self, job, **kwargs):
- # type: (Queue, Any, **Any) -> Any
- hub = Hub.current
- if hub.get_integration(RqIntegration) is not None:
- job.meta["_sentry_trace_headers"] = dict(
- hub.iter_trace_propagation_headers()
- )
-
- return old_enqueue_job(self, job, **kwargs)
-
- Queue.enqueue_job = sentry_patched_enqueue_job
-
- ignore_logger("rq.worker")
-
-
-def _make_event_processor(weak_job):
- # type: (Callable[[], Job]) -> EventProcessor
- def event_processor(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- job = weak_job()
- if job is not None:
- with capture_internal_exceptions():
- extra = event.setdefault("extra", {})
- extra["rq-job"] = {
- "job_id": job.id,
- "func": job.func_name,
- "args": job.args,
- "kwargs": job.kwargs,
- "description": job.description,
- }
-
- if "exc_info" in hint:
- with capture_internal_exceptions():
- if issubclass(hint["exc_info"][0], JobTimeoutException):
- event["fingerprint"] = ["rq", "JobTimeoutException", job.func_name]
-
- return event
-
- return event_processor
-
-
-def _capture_exception(exc_info, **kwargs):
- # type: (ExcInfo, **Any) -> None
- hub = Hub.current
- if hub.get_integration(RqIntegration) is None:
- return
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "rq", "handled": False},
- )
-
- hub.capture_event(event, hint=hint)
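
A minimal sketch of the RQ integration above from the enqueuing side; the DSN, the Redis connection, and the task module are placeholders. Trace headers are attached to `job.meta` at enqueue time, and the worker process (started separately, e.g. with `rq worker`) continues the trace inside the patched `perform_job`:

import sentry_sdk
from redis import Redis
from rq import Queue

from sentry_sdk.integrations.rq import RqIntegration
from myapp.tasks import send_email  # hypothetical task module

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[RqIntegration()],
    traces_sample_rate=1.0,
)

queue = Queue(connection=Redis())
queue.enqueue(send_email, "user@example.com")  # _sentry_trace_headers is stored in job.meta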
diff --git a/sentry_sdk/integrations/sanic.py b/sentry_sdk/integrations/sanic.py
deleted file mode 100644
index 4e20cc9..0000000
--- a/sentry_sdk/integrations/sanic.py
+++ /dev/null
@@ -1,331 +0,0 @@
-import sys
-import weakref
-from inspect import isawaitable
-
-from sentry_sdk._compat import urlparse, reraise
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import (
- capture_internal_exceptions,
- event_from_exception,
- HAS_REAL_CONTEXTVARS,
- CONTEXTVARS_ERROR_MESSAGE,
-)
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations._wsgi_common import RequestExtractor, _filter_headers
-from sentry_sdk.integrations.logging import ignore_logger
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import Optional
- from typing import Union
- from typing import Tuple
- from typing import Dict
-
- from sanic.request import Request, RequestParameters
-
- from sentry_sdk._types import Event, EventProcessor, Hint
- from sanic.router import Route
-
-try:
- from sanic import Sanic, __version__ as SANIC_VERSION
- from sanic.exceptions import SanicException
- from sanic.router import Router
- from sanic.handlers import ErrorHandler
-except ImportError:
- raise DidNotEnable("Sanic not installed")
-
-old_error_handler_lookup = ErrorHandler.lookup
-old_handle_request = Sanic.handle_request
-old_router_get = Router.get
-
-try:
- # This method was introduced in Sanic v21.9
- old_startup = Sanic._startup
-except AttributeError:
- pass
-
-
-class SanicIntegration(Integration):
- identifier = "sanic"
- version = (0, 0) # type: Tuple[int, ...]
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- try:
- SanicIntegration.version = tuple(map(int, SANIC_VERSION.split(".")))
- except (TypeError, ValueError):
- raise DidNotEnable("Unparsable Sanic version: {}".format(SANIC_VERSION))
-
- if SanicIntegration.version < (0, 8):
- raise DidNotEnable("Sanic 0.8 or newer required.")
-
- if not HAS_REAL_CONTEXTVARS:
- # We better have contextvars or we're going to leak state between
- # requests.
- raise DidNotEnable(
- "The sanic integration for Sentry requires Python 3.7+ "
- " or the aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
- )
-
- if SANIC_VERSION.startswith("0.8."):
- # Sanic 0.8 and older creates a logger named "root" and puts a
- # stringified version of every exception in there (without exc_info),
- # which our error deduplication can't detect.
- #
- # We explicitly check the version here because it is a very
- # invasive step to ignore this logger and not necessary in newer
- # versions at all.
- #
- # https://github.com/huge-success/sanic/issues/1332
- ignore_logger("root")
-
- if SanicIntegration.version < (21, 9):
- _setup_legacy_sanic()
- return
-
- _setup_sanic()
-
-
-class SanicRequestExtractor(RequestExtractor):
- def content_length(self):
- # type: () -> int
- if self.request.body is None:
- return 0
- return len(self.request.body)
-
- def cookies(self):
- # type: () -> Dict[str, str]
- return dict(self.request.cookies)
-
- def raw_data(self):
- # type: () -> bytes
- return self.request.body
-
- def form(self):
- # type: () -> RequestParameters
- return self.request.form
-
- def is_json(self):
- # type: () -> bool
- raise NotImplementedError()
-
- def json(self):
- # type: () -> Optional[Any]
- return self.request.json
-
- def files(self):
- # type: () -> RequestParameters
- return self.request.files
-
- def size_of_file(self, file):
- # type: (Any) -> int
- return len(file.body or ())
-
-
-def _setup_sanic():
- # type: () -> None
- Sanic._startup = _startup
- ErrorHandler.lookup = _sentry_error_handler_lookup
-
-
-def _setup_legacy_sanic():
- # type: () -> None
- Sanic.handle_request = _legacy_handle_request
- Router.get = _legacy_router_get
- ErrorHandler.lookup = _sentry_error_handler_lookup
-
-
-async def _startup(self):
- # type: (Sanic) -> None
- # This happens about as early in the lifecycle as possible, just after the
- # Request object is created. The body has not yet been consumed.
- self.signal("http.lifecycle.request")(_hub_enter)
-
- # This happens after the handler is complete. In v21.9 this signal is not
- # dispatched when there is an exception. Therefore we need to close out
- # and call _hub_exit from the custom exception handler as well.
- # See https://github.com/sanic-org/sanic/issues/2297
- self.signal("http.lifecycle.response")(_hub_exit)
-
- # This happens inside of request handling immediately after the route
- # has been identified by the router.
- self.signal("http.routing.after")(_set_transaction)
-
- # The above signals need to be declared before this can be called.
- await old_startup(self)
-
-
-async def _hub_enter(request):
- # type: (Request) -> None
- hub = Hub.current
- request.ctx._sentry_do_integration = (
- hub.get_integration(SanicIntegration) is not None
- )
-
- if not request.ctx._sentry_do_integration:
- return
-
- weak_request = weakref.ref(request)
- request.ctx._sentry_hub = Hub(hub)
- request.ctx._sentry_hub.__enter__()
-
- with request.ctx._sentry_hub.configure_scope() as scope:
- scope.clear_breadcrumbs()
- scope.add_event_processor(_make_request_processor(weak_request))
-
-
-async def _hub_exit(request, **_):
- # type: (Request, **Any) -> None
- request.ctx._sentry_hub.__exit__(None, None, None)
-
-
-async def _set_transaction(request, route, **kwargs):
- # type: (Request, Route, **Any) -> None
- hub = Hub.current
- if hub.get_integration(SanicIntegration) is not None:
- with capture_internal_exceptions():
- with hub.configure_scope() as scope:
- route_name = route.name.replace(request.app.name, "").strip(".")
- scope.transaction = route_name
-
-
-def _sentry_error_handler_lookup(self, exception, *args, **kwargs):
- # type: (Any, Exception, *Any, **Any) -> Optional[object]
- _capture_exception(exception)
- old_error_handler = old_error_handler_lookup(self, exception, *args, **kwargs)
-
- if old_error_handler is None:
- return None
-
- if Hub.current.get_integration(SanicIntegration) is None:
- return old_error_handler
-
- async def sentry_wrapped_error_handler(request, exception):
- # type: (Request, Exception) -> Any
- try:
- response = old_error_handler(request, exception)
- if isawaitable(response):
- response = await response
- return response
- except Exception:
- # Report errors that occur in Sanic error handler. These
- # exceptions will not even show up in Sanic's
- # `sanic.exceptions` logger.
- exc_info = sys.exc_info()
- _capture_exception(exc_info)
- reraise(*exc_info)
- finally:
- # As mentioned in previous comment in _startup, this can be removed
- # after https://github.com/sanic-org/sanic/issues/2297 is resolved
- if SanicIntegration.version == (21, 9):
- await _hub_exit(request)
-
- return sentry_wrapped_error_handler
-
-
-async def _legacy_handle_request(self, request, *args, **kwargs):
- # type: (Any, Request, *Any, **Any) -> Any
- hub = Hub.current
- if hub.get_integration(SanicIntegration) is None:
- return old_handle_request(self, request, *args, **kwargs)
-
- weak_request = weakref.ref(request)
-
- with Hub(hub) as hub:
- with hub.configure_scope() as scope:
- scope.clear_breadcrumbs()
- scope.add_event_processor(_make_request_processor(weak_request))
-
- response = old_handle_request(self, request, *args, **kwargs)
- if isawaitable(response):
- response = await response
-
- return response
-
-
-def _legacy_router_get(self, *args):
- # type: (Any, Union[Any, Request]) -> Any
- rv = old_router_get(self, *args)
- hub = Hub.current
- if hub.get_integration(SanicIntegration) is not None:
- with capture_internal_exceptions():
- with hub.configure_scope() as scope:
- if SanicIntegration.version and SanicIntegration.version >= (21, 3):
-                    # Sanic 21.3 and newer append the app name to the route name,
-                    # so we strip it from the route name to keep the transaction
-                    # name consistent across all versions

- sanic_app_name = self.ctx.app.name
- sanic_route = rv[0].name
-
- if sanic_route.startswith("%s." % sanic_app_name):
-                    # Add 1 to the length of sanic_app_name to account for the dot
-                    # that joins the app name and the route name
-                    # Format: app_name.route_name
- sanic_route = sanic_route[len(sanic_app_name) + 1 :]
-
- scope.transaction = sanic_route
- else:
- scope.transaction = rv[0].__name__
- return rv
-
-
-def _capture_exception(exception):
- # type: (Union[Tuple[Optional[type], Optional[BaseException], Any], BaseException]) -> None
- hub = Hub.current
- integration = hub.get_integration(SanicIntegration)
- if integration is None:
- return
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- with capture_internal_exceptions():
- event, hint = event_from_exception(
- exception,
- client_options=client.options,
- mechanism={"type": "sanic", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
-
-def _make_request_processor(weak_request):
- # type: (Callable[[], Request]) -> EventProcessor
- def sanic_processor(event, hint):
- # type: (Event, Optional[Hint]) -> Optional[Event]
-
- try:
- if hint and issubclass(hint["exc_info"][0], SanicException):
- return None
- except KeyError:
- pass
-
- request = weak_request()
- if request is None:
- return event
-
- with capture_internal_exceptions():
- extractor = SanicRequestExtractor(request)
- extractor.extract_into_event(event)
-
- request_info = event["request"]
- urlparts = urlparse.urlsplit(request.url)
-
- request_info["url"] = "%s://%s%s" % (
- urlparts.scheme,
- urlparts.netloc,
- urlparts.path,
- )
-
- request_info["query_string"] = urlparts.query
- request_info["method"] = request.method
- request_info["env"] = {"REMOTE_ADDR": request.remote_addr}
- request_info["headers"] = _filter_headers(dict(request.headers))
-
- return event
-
- return sanic_processor
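
A minimal sketch of how the Sanic integration above is typically enabled; the DSN, app name, and route are placeholders (Python 3.7+ or aiocontextvars is required, as enforced in setup_once):

import sentry_sdk
from sanic import Sanic
from sanic.response import text

from sentry_sdk.integrations.sanic import SanicIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[SanicIntegration()],
)

app = Sanic("demo-app")

@app.route("/")
async def index(request):
    # Exceptions raised here are captured via the patched ErrorHandler.lookup.
    return text("ok")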
diff --git a/sentry_sdk/integrations/serverless.py b/sentry_sdk/integrations/serverless.py
deleted file mode 100644
index c46f8ce..0000000
--- a/sentry_sdk/integrations/serverless.py
+++ /dev/null
@@ -1,85 +0,0 @@
-import sys
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.utils import event_from_exception
-from sentry_sdk._compat import reraise
-from sentry_sdk._functools import wraps
-
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import TypeVar
- from typing import Union
- from typing import Optional
-
- from typing import overload
-
- F = TypeVar("F", bound=Callable[..., Any])
-
-else:
-
- def overload(x):
- # type: (F) -> F
- return x
-
-
-@overload
-def serverless_function(f, flush=True): # noqa: F811
- # type: (F, bool) -> F
- pass
-
-
-@overload
-def serverless_function(f=None, flush=True): # noqa: F811
- # type: (None, bool) -> Callable[[F], F]
- pass
-
-
-def serverless_function(f=None, flush=True): # noqa
- # type: (Optional[F], bool) -> Union[F, Callable[[F], F]]
- def wrapper(f):
- # type: (F) -> F
- @wraps(f)
- def inner(*args, **kwargs):
- # type: (*Any, **Any) -> Any
- with Hub(Hub.current) as hub:
- with hub.configure_scope() as scope:
- scope.clear_breadcrumbs()
-
- try:
- return f(*args, **kwargs)
- except Exception:
- _capture_and_reraise()
- finally:
- if flush:
- _flush_client()
-
- return inner # type: ignore
-
- if f is None:
- return wrapper
- else:
- return wrapper(f)
-
-
-def _capture_and_reraise():
- # type: () -> None
- exc_info = sys.exc_info()
- hub = Hub.current
- if hub.client is not None:
- event, hint = event_from_exception(
- exc_info,
- client_options=hub.client.options,
- mechanism={"type": "serverless", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
- reraise(*exc_info)
-
-
-def _flush_client():
- # type: () -> None
- return Hub.current.flush()
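
A minimal sketch of the `serverless_function` decorator above; the DSN and handler names are placeholders. The two decorator forms match the two overloads in the file:

import sentry_sdk
from sentry_sdk.integrations.serverless import serverless_function

sentry_sdk.init(dsn="https://examplePublicKey@o0.ingest.sentry.io/0")  # placeholder DSN

@serverless_function
def handler(event, context):
    # Exceptions are captured, the client is flushed, and the exception is re-raised.
    return {"statusCode": 200}

@serverless_function(flush=False)
def background_handler(event, context):
    # Same capture behaviour, but without flushing after every invocation.
    return None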
diff --git a/sentry_sdk/integrations/spark/__init__.py b/sentry_sdk/integrations/spark/__init__.py
deleted file mode 100644
index 10d9416..0000000
--- a/sentry_sdk/integrations/spark/__init__.py
+++ /dev/null
@@ -1,4 +0,0 @@
-from sentry_sdk.integrations.spark.spark_driver import SparkIntegration
-from sentry_sdk.integrations.spark.spark_worker import SparkWorkerIntegration
-
-__all__ = ["SparkIntegration", "SparkWorkerIntegration"]
diff --git a/sentry_sdk/integrations/spark/spark_driver.py b/sentry_sdk/integrations/spark/spark_driver.py
deleted file mode 100644
index ea43c37..0000000
--- a/sentry_sdk/integrations/spark/spark_driver.py
+++ /dev/null
@@ -1,263 +0,0 @@
-from sentry_sdk import configure_scope
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration
-from sentry_sdk.utils import capture_internal_exceptions
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Optional
-
- from sentry_sdk._types import Event, Hint
-
-
-class SparkIntegration(Integration):
- identifier = "spark"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- patch_spark_context_init()
-
-
-def _set_app_properties():
- # type: () -> None
- """
-    Set properties on the driver that propagate to the worker processes, so that
-    the worker integration has access to app_name and application_id.
- """
- from pyspark import SparkContext
-
- spark_context = SparkContext._active_spark_context
- if spark_context:
- spark_context.setLocalProperty("sentry_app_name", spark_context.appName)
- spark_context.setLocalProperty(
- "sentry_application_id", spark_context.applicationId
- )
-
-
-def _start_sentry_listener(sc):
- # type: (Any) -> None
- """
-    Start the Py4J callback server and register a custom `SparkListener` on the SparkContext.
- """
- from pyspark.java_gateway import ensure_callback_server_started
-
- gw = sc._gateway
- ensure_callback_server_started(gw)
- listener = SentryListener()
- sc._jsc.sc().addSparkListener(listener)
-
-
-def patch_spark_context_init():
- # type: () -> None
- from pyspark import SparkContext
-
- spark_context_init = SparkContext._do_init
-
- def _sentry_patched_spark_context_init(self, *args, **kwargs):
- # type: (SparkContext, *Any, **Any) -> Optional[Any]
- init = spark_context_init(self, *args, **kwargs)
-
- if Hub.current.get_integration(SparkIntegration) is None:
- return init
-
- _start_sentry_listener(self)
- _set_app_properties()
-
- with configure_scope() as scope:
-
- @scope.add_event_processor
- def process_event(event, hint):
- # type: (Event, Hint) -> Optional[Event]
- with capture_internal_exceptions():
- if Hub.current.get_integration(SparkIntegration) is None:
- return event
-
- event.setdefault("user", {}).setdefault("id", self.sparkUser())
-
- event.setdefault("tags", {}).setdefault(
- "executor.id", self._conf.get("spark.executor.id")
- )
- event["tags"].setdefault(
- "spark-submit.deployMode",
- self._conf.get("spark.submit.deployMode"),
- )
- event["tags"].setdefault(
- "driver.host", self._conf.get("spark.driver.host")
- )
- event["tags"].setdefault(
- "driver.port", self._conf.get("spark.driver.port")
- )
- event["tags"].setdefault("spark_version", self.version)
- event["tags"].setdefault("app_name", self.appName)
- event["tags"].setdefault("application_id", self.applicationId)
- event["tags"].setdefault("master", self.master)
- event["tags"].setdefault("spark_home", self.sparkHome)
-
- event.setdefault("extra", {}).setdefault("web_url", self.uiWebUrl)
-
- return event
-
- return init
-
- SparkContext._do_init = _sentry_patched_spark_context_init
-
-
-class SparkListener(object):
- def onApplicationEnd(self, applicationEnd): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onApplicationStart(self, applicationStart): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onBlockManagerAdded(self, blockManagerAdded): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onBlockManagerRemoved(self, blockManagerRemoved): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onBlockUpdated(self, blockUpdated): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onEnvironmentUpdate(self, environmentUpdate): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onExecutorAdded(self, executorAdded): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onExecutorBlacklisted(self, executorBlacklisted): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onExecutorBlacklistedForStage( # noqa: N802
- self, executorBlacklistedForStage # noqa: N803
- ):
- # type: (Any) -> None
- pass
-
- def onExecutorMetricsUpdate(self, executorMetricsUpdate): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onExecutorRemoved(self, executorRemoved): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onJobEnd(self, jobEnd): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onJobStart(self, jobStart): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onNodeBlacklisted(self, nodeBlacklisted): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onNodeBlacklistedForStage(self, nodeBlacklistedForStage): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onNodeUnblacklisted(self, nodeUnblacklisted): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onOtherEvent(self, event): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onSpeculativeTaskSubmitted(self, speculativeTask): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onStageCompleted(self, stageCompleted): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onStageSubmitted(self, stageSubmitted): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onTaskEnd(self, taskEnd): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onTaskGettingResult(self, taskGettingResult): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onTaskStart(self, taskStart): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- def onUnpersistRDD(self, unpersistRDD): # noqa: N802,N803
- # type: (Any) -> None
- pass
-
- class Java:
- implements = ["org.apache.spark.scheduler.SparkListenerInterface"]
-
-
-class SentryListener(SparkListener):
- def __init__(self):
- # type: () -> None
- self.hub = Hub.current
-
- def onJobStart(self, jobStart): # noqa: N802,N803
- # type: (Any) -> None
- message = "Job {} Started".format(jobStart.jobId())
- self.hub.add_breadcrumb(level="info", message=message)
- _set_app_properties()
-
- def onJobEnd(self, jobEnd): # noqa: N802,N803
- # type: (Any) -> None
- level = ""
- message = ""
- data = {"result": jobEnd.jobResult().toString()}
-
- if jobEnd.jobResult().toString() == "JobSucceeded":
- level = "info"
- message = "Job {} Ended".format(jobEnd.jobId())
- else:
- level = "warning"
- message = "Job {} Failed".format(jobEnd.jobId())
-
- self.hub.add_breadcrumb(level=level, message=message, data=data)
-
- def onStageSubmitted(self, stageSubmitted): # noqa: N802,N803
- # type: (Any) -> None
- stage_info = stageSubmitted.stageInfo()
- message = "Stage {} Submitted".format(stage_info.stageId())
- data = {"attemptId": stage_info.attemptId(), "name": stage_info.name()}
- self.hub.add_breadcrumb(level="info", message=message, data=data)
- _set_app_properties()
-
- def onStageCompleted(self, stageCompleted): # noqa: N802,N803
- # type: (Any) -> None
- from py4j.protocol import Py4JJavaError # type: ignore
-
- stage_info = stageCompleted.stageInfo()
- message = ""
- level = ""
- data = {"attemptId": stage_info.attemptId(), "name": stage_info.name()}
-
-        # Use try/except because stageInfo.failureReason() returns a Scala Option
- try:
- data["reason"] = stage_info.failureReason().get()
- message = "Stage {} Failed".format(stage_info.stageId())
- level = "warning"
- except Py4JJavaError:
- message = "Stage {} Completed".format(stage_info.stageId())
- level = "info"
-
- self.hub.add_breadcrumb(level=level, message=message, data=data)
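
A minimal driver-side sketch for the Spark integration above; the DSN and app name are placeholders. `sentry_sdk.init` has to run before the `SparkContext` is created, because the listener and app properties are attached from the patched `SparkContext._do_init`:

import sentry_sdk
from sentry_sdk.integrations.spark import SparkIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[SparkIntegration()],
)

from pyspark import SparkContext

sc = SparkContext(appName="sentry-spark-demo")  # SentryListener and sentry_* local properties are set up here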
diff --git a/sentry_sdk/integrations/spark/spark_worker.py b/sentry_sdk/integrations/spark/spark_worker.py
deleted file mode 100644
index 2c27647..0000000
--- a/sentry_sdk/integrations/spark/spark_worker.py
+++ /dev/null
@@ -1,124 +0,0 @@
-from __future__ import absolute_import
-
-import sys
-
-from sentry_sdk import configure_scope
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration
-from sentry_sdk.utils import (
- capture_internal_exceptions,
- exc_info_from_error,
- single_exception_from_error_tuple,
- walk_exception_chain,
- event_hint_with_exc_info,
-)
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Optional
-
- from sentry_sdk._types import ExcInfo, Event, Hint
-
-
-class SparkWorkerIntegration(Integration):
- identifier = "spark_worker"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- import pyspark.daemon as original_daemon
-
- original_daemon.worker_main = _sentry_worker_main
-
-
-def _capture_exception(exc_info, hub):
- # type: (ExcInfo, Hub) -> None
- client = hub.client
-
- client_options = client.options # type: ignore
-
- mechanism = {"type": "spark", "handled": False}
-
- exc_info = exc_info_from_error(exc_info)
-
- exc_type, exc_value, tb = exc_info
- rv = []
-
-    # On exception, the worker calls sys.exit(-1), so we can ignore SystemExit and similar errors
- for exc_type, exc_value, tb in walk_exception_chain(exc_info):
- if exc_type not in (SystemExit, EOFError, ConnectionResetError):
- rv.append(
- single_exception_from_error_tuple(
- exc_type, exc_value, tb, client_options, mechanism
- )
- )
-
- if rv:
- rv.reverse()
- hint = event_hint_with_exc_info(exc_info)
- event = {"level": "error", "exception": {"values": rv}}
-
- _tag_task_context()
-
- hub.capture_event(event, hint=hint)
-
-
-def _tag_task_context():
- # type: () -> None
- from pyspark.taskcontext import TaskContext
-
- with configure_scope() as scope:
-
- @scope.add_event_processor
- def process_event(event, hint):
- # type: (Event, Hint) -> Optional[Event]
- with capture_internal_exceptions():
- integration = Hub.current.get_integration(SparkWorkerIntegration)
- task_context = TaskContext.get()
-
- if integration is None or task_context is None:
- return event
-
- event.setdefault("tags", {}).setdefault(
- "stageId", str(task_context.stageId())
- )
- event["tags"].setdefault("partitionId", str(task_context.partitionId()))
- event["tags"].setdefault(
- "attemptNumber", str(task_context.attemptNumber())
- )
- event["tags"].setdefault(
- "taskAttemptId", str(task_context.taskAttemptId())
- )
-
- if task_context._localProperties:
- if "sentry_app_name" in task_context._localProperties:
- event["tags"].setdefault(
- "app_name", task_context._localProperties["sentry_app_name"]
- )
- event["tags"].setdefault(
- "application_id",
- task_context._localProperties["sentry_application_id"],
- )
-
- if "callSite.short" in task_context._localProperties:
- event.setdefault("extra", {}).setdefault(
- "callSite", task_context._localProperties["callSite.short"]
- )
-
- return event
-
-
-def _sentry_worker_main(*args, **kwargs):
- # type: (*Optional[Any], **Optional[Any]) -> None
- import pyspark.worker as original_worker
-
- try:
- original_worker.main(*args, **kwargs)
- except SystemExit:
- if Hub.current.get_integration(SparkWorkerIntegration) is not None:
- hub = Hub.current
- exc_info = sys.exc_info()
- with capture_internal_exceptions():
- _capture_exception(exc_info, hub)
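
A worker-side sketch for the integration above. It only shows the init call; how this code is injected into the worker-side Python processes (for example via a custom pyspark daemon module) is an assumption, not something this file prescribes:

import sentry_sdk
from sentry_sdk.integrations.spark import SparkWorkerIntegration

# Run inside the worker-side Python process, before pyspark.daemon.worker_main executes tasks.
sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[SparkWorkerIntegration()],
)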
diff --git a/sentry_sdk/integrations/sqlalchemy.py b/sentry_sdk/integrations/sqlalchemy.py
deleted file mode 100644
index 6f776e4..0000000
--- a/sentry_sdk/integrations/sqlalchemy.py
+++ /dev/null
@@ -1,97 +0,0 @@
-from __future__ import absolute_import
-
-from sentry_sdk._types import MYPY
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.tracing_utils import RecordSqlQueries
-
-try:
- from sqlalchemy.engine import Engine # type: ignore
- from sqlalchemy.event import listen # type: ignore
- from sqlalchemy import __version__ as SQLALCHEMY_VERSION # type: ignore
-except ImportError:
- raise DidNotEnable("SQLAlchemy not installed.")
-
-if MYPY:
- from typing import Any
- from typing import ContextManager
- from typing import Optional
-
- from sentry_sdk.tracing import Span
-
-
-class SqlalchemyIntegration(Integration):
- identifier = "sqlalchemy"
-
- @staticmethod
- def setup_once():
- # type: () -> None
-
- try:
- version = tuple(map(int, SQLALCHEMY_VERSION.split("b")[0].split(".")))
- except (TypeError, ValueError):
- raise DidNotEnable(
- "Unparsable SQLAlchemy version: {}".format(SQLALCHEMY_VERSION)
- )
-
- if version < (1, 2):
- raise DidNotEnable("SQLAlchemy 1.2 or newer required.")
-
- listen(Engine, "before_cursor_execute", _before_cursor_execute)
- listen(Engine, "after_cursor_execute", _after_cursor_execute)
- listen(Engine, "handle_error", _handle_error)
-
-
-def _before_cursor_execute(
- conn, cursor, statement, parameters, context, executemany, *args
-):
- # type: (Any, Any, Any, Any, Any, bool, *Any) -> None
- hub = Hub.current
- if hub.get_integration(SqlalchemyIntegration) is None:
- return
-
- ctx_mgr = RecordSqlQueries(
- hub,
- cursor,
- statement,
- parameters,
- paramstyle=context and context.dialect and context.dialect.paramstyle or None,
- executemany=executemany,
- )
- conn._sentry_sql_span_manager = ctx_mgr
-
- span = ctx_mgr.__enter__()
-
- if span is not None:
- conn._sentry_sql_span = span
-
-
-def _after_cursor_execute(conn, cursor, statement, *args):
- # type: (Any, Any, Any, *Any) -> None
- ctx_mgr = getattr(
- conn, "_sentry_sql_span_manager", None
- ) # type: ContextManager[Any]
-
- if ctx_mgr is not None:
- conn._sentry_sql_span_manager = None
- ctx_mgr.__exit__(None, None, None)
-
-
-def _handle_error(context, *args):
- # type: (Any, *Any) -> None
- conn = context.connection
- span = getattr(conn, "_sentry_sql_span", None) # type: Optional[Span]
-
- if span is not None:
- span.set_status("internal_error")
-
- # _after_cursor_execute does not get called for crashing SQL stmts. Judging
- # from SQLAlchemy codebase it does seem like any error coming into this
- # handler is going to be fatal.
- ctx_mgr = getattr(
- conn, "_sentry_sql_span_manager", None
- ) # type: ContextManager[Any]
-
- if ctx_mgr is not None:
- conn._sentry_sql_span_manager = None
- ctx_mgr.__exit__(None, None, None)
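
A minimal sketch of the SQLAlchemy integration above; the DSN and the in-memory SQLite engine are placeholders. With tracing enabled, each statement executed through the Engine is recorded as a query span via the before/after_cursor_execute listeners:

import sentry_sdk
from sqlalchemy import create_engine, text

from sentry_sdk.integrations.sqlalchemy import SqlalchemyIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[SqlalchemyIntegration()],
    traces_sample_rate=1.0,
)

engine = create_engine("sqlite://")

with sentry_sdk.start_transaction(op="task", name="db-demo"):
    with engine.connect() as conn:
        conn.execute(text("SELECT 1"))  # recorded as a SQL span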
diff --git a/sentry_sdk/integrations/stdlib.py b/sentry_sdk/integrations/stdlib.py
deleted file mode 100644
index adea742..0000000
--- a/sentry_sdk/integrations/stdlib.py
+++ /dev/null
@@ -1,238 +0,0 @@
-import os
-import subprocess
-import sys
-import platform
-
-from sentry_sdk.hub import Hub
-from sentry_sdk.integrations import Integration
-from sentry_sdk.scope import add_global_event_processor
-from sentry_sdk.tracing_utils import EnvironHeaders
-from sentry_sdk.utils import capture_internal_exceptions, logger, safe_repr
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import Dict
- from typing import Optional
- from typing import List
-
- from sentry_sdk._types import Event, Hint
-
-
-try:
- from httplib import HTTPConnection # type: ignore
-except ImportError:
- from http.client import HTTPConnection
-
-
-_RUNTIME_CONTEXT = {
- "name": platform.python_implementation(),
- "version": "%s.%s.%s" % (sys.version_info[:3]),
- "build": sys.version,
-}
-
-
-class StdlibIntegration(Integration):
- identifier = "stdlib"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- _install_httplib()
- _install_subprocess()
-
- @add_global_event_processor
- def add_python_runtime_context(event, hint):
- # type: (Event, Hint) -> Optional[Event]
- if Hub.current.get_integration(StdlibIntegration) is not None:
- contexts = event.setdefault("contexts", {})
- if isinstance(contexts, dict) and "runtime" not in contexts:
- contexts["runtime"] = _RUNTIME_CONTEXT
-
- return event
-
-
-def _install_httplib():
- # type: () -> None
- real_putrequest = HTTPConnection.putrequest
- real_getresponse = HTTPConnection.getresponse
-
- def putrequest(self, method, url, *args, **kwargs):
- # type: (HTTPConnection, str, str, *Any, **Any) -> Any
- hub = Hub.current
- if hub.get_integration(StdlibIntegration) is None:
- return real_putrequest(self, method, url, *args, **kwargs)
-
- host = self.host
- port = self.port
- default_port = self.default_port
-
- real_url = url
- if not real_url.startswith(("http://", "https://")):
- real_url = "%s://%s%s%s" % (
- default_port == 443 and "https" or "http",
- host,
- port != default_port and ":%s" % port or "",
- url,
- )
-
- span = hub.start_span(op="http", description="%s %s" % (method, real_url))
-
- span.set_data("method", method)
- span.set_data("url", real_url)
-
- rv = real_putrequest(self, method, url, *args, **kwargs)
-
- for key, value in hub.iter_trace_propagation_headers(span):
- logger.debug(
- "[Tracing] Adding `{key}` header {value} to outgoing request to {real_url}.".format(
- key=key, value=value, real_url=real_url
- )
- )
- self.putheader(key, value)
-
- self._sentrysdk_span = span
-
- return rv
-
- def getresponse(self, *args, **kwargs):
- # type: (HTTPConnection, *Any, **Any) -> Any
- span = getattr(self, "_sentrysdk_span", None)
-
- if span is None:
- return real_getresponse(self, *args, **kwargs)
-
- rv = real_getresponse(self, *args, **kwargs)
-
- span.set_data("status_code", rv.status)
- span.set_http_status(int(rv.status))
- span.set_data("reason", rv.reason)
- span.finish()
-
- return rv
-
- HTTPConnection.putrequest = putrequest
- HTTPConnection.getresponse = getresponse
-
-
-def _init_argument(args, kwargs, name, position, setdefault_callback=None):
- # type: (List[Any], Dict[Any, Any], str, int, Optional[Callable[[Any], Any]]) -> Any
- """
- given (*args, **kwargs) of a function call, retrieve (and optionally set a
- default for) an argument by either name or position.
-
- This is useful for wrapping functions with complex type signatures and
- extracting a few arguments without needing to redefine that function's
- entire type signature.
- """
-
- if name in kwargs:
- rv = kwargs[name]
- if setdefault_callback is not None:
- rv = setdefault_callback(rv)
- if rv is not None:
- kwargs[name] = rv
- elif position < len(args):
- rv = args[position]
- if setdefault_callback is not None:
- rv = setdefault_callback(rv)
- if rv is not None:
- args[position] = rv
- else:
- rv = setdefault_callback and setdefault_callback(None)
- if rv is not None:
- kwargs[name] = rv
-
- return rv
-
-
-def _install_subprocess():
- # type: () -> None
- old_popen_init = subprocess.Popen.__init__
-
- def sentry_patched_popen_init(self, *a, **kw):
- # type: (subprocess.Popen[Any], *Any, **Any) -> None
-
- hub = Hub.current
- if hub.get_integration(StdlibIntegration) is None:
- return old_popen_init(self, *a, **kw) # type: ignore
-
- # Convert from tuple to list to be able to set values.
- a = list(a)
-
- args = _init_argument(a, kw, "args", 0) or []
- cwd = _init_argument(a, kw, "cwd", 9)
-
- # if args is not a list or tuple (and e.g. some iterator instead),
- # let's not use it at all. There are too many things that can go wrong
- # when trying to collect an iterator into a list and setting that list
- # into `a` again.
- #
- # Also invocations where `args` is not a sequence are not actually
- # legal. They just happen to work under CPython.
- description = None
-
- if isinstance(args, (list, tuple)) and len(args) < 100:
- with capture_internal_exceptions():
- description = " ".join(map(str, args))
-
- if description is None:
- description = safe_repr(args)
-
- env = None
-
- with hub.start_span(op="subprocess", description=description) as span:
-
- for k, v in hub.iter_trace_propagation_headers(span):
- if env is None:
- env = _init_argument(
- a, kw, "env", 10, lambda x: dict(x or os.environ)
- )
- env["SUBPROCESS_" + k.upper().replace("-", "_")] = v
-
- if cwd:
- span.set_data("subprocess.cwd", cwd)
-
- rv = old_popen_init(self, *a, **kw) # type: ignore
-
- span.set_tag("subprocess.pid", self.pid)
- return rv
-
- subprocess.Popen.__init__ = sentry_patched_popen_init # type: ignore
-
- old_popen_wait = subprocess.Popen.wait
-
- def sentry_patched_popen_wait(self, *a, **kw):
- # type: (subprocess.Popen[Any], *Any, **Any) -> Any
- hub = Hub.current
-
- if hub.get_integration(StdlibIntegration) is None:
- return old_popen_wait(self, *a, **kw)
-
- with hub.start_span(op="subprocess.wait") as span:
- span.set_tag("subprocess.pid", self.pid)
- return old_popen_wait(self, *a, **kw)
-
- subprocess.Popen.wait = sentry_patched_popen_wait # type: ignore
-
- old_popen_communicate = subprocess.Popen.communicate
-
- def sentry_patched_popen_communicate(self, *a, **kw):
- # type: (subprocess.Popen[Any], *Any, **Any) -> Any
- hub = Hub.current
-
- if hub.get_integration(StdlibIntegration) is None:
- return old_popen_communicate(self, *a, **kw)
-
- with hub.start_span(op="subprocess.communicate") as span:
- span.set_tag("subprocess.pid", self.pid)
- return old_popen_communicate(self, *a, **kw)
-
- subprocess.Popen.communicate = sentry_patched_popen_communicate # type: ignore
-
-
-def get_subprocess_traceparent_headers():
- # type: () -> EnvironHeaders
- return EnvironHeaders(os.environ, prefix="SUBPROCESS_")
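
A minimal sketch of what the stdlib integration above instruments; the DSN is a placeholder. Assuming the integration is enabled (it normally ships as a default integration), outgoing `HTTPConnection` requests and `subprocess` calls inside a transaction become child spans, and trace headers are forwarded to child processes as SUBPROCESS_* environment variables:

import subprocess

import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,
)

with sentry_sdk.start_transaction(op="task", name="stdlib-demo"):
    subprocess.run(["echo", "hello"])  # creates a "subprocess" span and injects SUBPROCESS_* headers

# In the child process, the propagated headers can be read back with
# sentry_sdk.integrations.stdlib.get_subprocess_traceparent_headers().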
diff --git a/sentry_sdk/integrations/threading.py b/sentry_sdk/integrations/threading.py
deleted file mode 100644
index b750257..0000000
--- a/sentry_sdk/integrations/threading.py
+++ /dev/null
@@ -1,90 +0,0 @@
-from __future__ import absolute_import
-
-import sys
-from threading import Thread, current_thread
-
-from sentry_sdk import Hub
-from sentry_sdk._compat import reraise
-from sentry_sdk._types import MYPY
-from sentry_sdk.integrations import Integration
-from sentry_sdk.utils import event_from_exception, capture_internal_exceptions
-
-if MYPY:
- from typing import Any
- from typing import TypeVar
- from typing import Callable
- from typing import Optional
-
- from sentry_sdk._types import ExcInfo
-
- F = TypeVar("F", bound=Callable[..., Any])
-
-
-class ThreadingIntegration(Integration):
- identifier = "threading"
-
- def __init__(self, propagate_hub=False):
- # type: (bool) -> None
- self.propagate_hub = propagate_hub
-
- @staticmethod
- def setup_once():
- # type: () -> None
- old_start = Thread.start
-
- def sentry_start(self, *a, **kw):
- # type: (Thread, *Any, **Any) -> Any
- hub = Hub.current
- integration = hub.get_integration(ThreadingIntegration)
- if integration is not None:
- if not integration.propagate_hub:
- hub_ = None
- else:
- hub_ = Hub(hub)
- # Patching instance methods in `start()` creates a reference cycle if
- # done in a naive way. See
- # https://github.com/getsentry/sentry-python/pull/434
- #
-                # Instead, the wrapped run() looks up the thread instance via the
-                # threading module's current_thread() at call time, so no reference
-                # to it is held and the cycle is avoided.
- with capture_internal_exceptions():
- new_run = _wrap_run(hub_, getattr(self.run, "__func__", self.run))
- self.run = new_run # type: ignore
-
- return old_start(self, *a, **kw) # type: ignore
-
- Thread.start = sentry_start # type: ignore
-
-
-def _wrap_run(parent_hub, old_run_func):
- # type: (Optional[Hub], F) -> F
- def run(*a, **kw):
- # type: (*Any, **Any) -> Any
- hub = parent_hub or Hub.current
- with hub:
- try:
- self = current_thread()
- return old_run_func(self, *a, **kw)
- except Exception:
- reraise(*_capture_exception())
-
- return run # type: ignore
-
-
-def _capture_exception():
- # type: () -> ExcInfo
- hub = Hub.current
- exc_info = sys.exc_info()
-
- if hub.get_integration(ThreadingIntegration) is not None:
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- exc_info,
- client_options=client.options,
- mechanism={"type": "threading", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
- return exc_info
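
A minimal sketch of the threading integration above; the DSN is a placeholder. With `propagate_hub=True`, the spawning thread's hub (and its scope data) is reused inside the child thread, and unhandled exceptions in `run()` are captured:

import threading

import sentry_sdk
from sentry_sdk.integrations.threading import ThreadingIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[ThreadingIntegration(propagate_hub=True)],
)

def work():
    raise RuntimeError("boom")  # captured with mechanism {"type": "threading", "handled": False}

t = threading.Thread(target=work)
t.start()
t.join()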
diff --git a/sentry_sdk/integrations/tornado.py b/sentry_sdk/integrations/tornado.py
deleted file mode 100644
index f9796da..0000000
--- a/sentry_sdk/integrations/tornado.py
+++ /dev/null
@@ -1,219 +0,0 @@
-import weakref
-import contextlib
-from inspect import iscoroutinefunction
-
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.tracing import Transaction
-from sentry_sdk.utils import (
- HAS_REAL_CONTEXTVARS,
- CONTEXTVARS_ERROR_MESSAGE,
- event_from_exception,
- capture_internal_exceptions,
- transaction_from_function,
-)
-from sentry_sdk.integrations import Integration, DidNotEnable
-from sentry_sdk.integrations._wsgi_common import (
- RequestExtractor,
- _filter_headers,
- _is_json_content_type,
-)
-from sentry_sdk.integrations.logging import ignore_logger
-from sentry_sdk._compat import iteritems
-
-try:
- from tornado import version_info as TORNADO_VERSION # type: ignore
- from tornado.web import RequestHandler, HTTPError
- from tornado.gen import coroutine
-except ImportError:
- raise DidNotEnable("Tornado not installed")
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Optional
- from typing import Dict
- from typing import Callable
- from typing import Generator
-
- from sentry_sdk._types import EventProcessor
-
-
-class TornadoIntegration(Integration):
- identifier = "tornado"
-
- @staticmethod
- def setup_once():
- # type: () -> None
- if TORNADO_VERSION < (5, 0):
- raise DidNotEnable("Tornado 5+ required")
-
- if not HAS_REAL_CONTEXTVARS:
- # Tornado is async. We better have contextvars or we're going to leak
- # state between requests.
- raise DidNotEnable(
- "The tornado integration for Sentry requires Python 3.7+ or the aiocontextvars package"
- + CONTEXTVARS_ERROR_MESSAGE
- )
-
- ignore_logger("tornado.access")
-
- old_execute = RequestHandler._execute # type: ignore
-
- awaitable = iscoroutinefunction(old_execute)
-
- if awaitable:
-            # Starting with Tornado 6, the RequestHandler._execute method is a standard Python coroutine (async/await).
- # In that case our method should be a coroutine function too
- async def sentry_execute_request_handler(self, *args, **kwargs):
- # type: (RequestHandler, *Any, **Any) -> Any
- with _handle_request_impl(self):
- return await old_execute(self, *args, **kwargs)
-
- else:
-
- @coroutine # type: ignore
- def sentry_execute_request_handler(self, *args, **kwargs): # type: ignore
- # type: (RequestHandler, *Any, **Any) -> Any
- with _handle_request_impl(self):
- result = yield from old_execute(self, *args, **kwargs)
- return result
-
- RequestHandler._execute = sentry_execute_request_handler # type: ignore
-
- old_log_exception = RequestHandler.log_exception
-
- def sentry_log_exception(self, ty, value, tb, *args, **kwargs):
- # type: (Any, type, BaseException, Any, *Any, **Any) -> Optional[Any]
- _capture_exception(ty, value, tb)
- return old_log_exception(self, ty, value, tb, *args, **kwargs) # type: ignore
-
- RequestHandler.log_exception = sentry_log_exception # type: ignore
-
-
-@contextlib.contextmanager
-def _handle_request_impl(self):
- # type: (RequestHandler) -> Generator[None, None, None]
- hub = Hub.current
- integration = hub.get_integration(TornadoIntegration)
-
-    if integration is None:
-        yield
-        return
-
- weak_handler = weakref.ref(self)
-
- with Hub(hub) as hub:
- with hub.configure_scope() as scope:
- scope.clear_breadcrumbs()
- processor = _make_event_processor(weak_handler) # type: ignore
- scope.add_event_processor(processor)
-
- transaction = Transaction.continue_from_headers(
- self.request.headers,
- op="http.server",
- # Like with all other integrations, this is our
- # fallback transaction in case there is no route.
- # sentry_urldispatcher_resolve is responsible for
- # setting a transaction name later.
- name="generic Tornado request",
- )
-
- with hub.start_transaction(
- transaction, custom_sampling_context={"tornado_request": self.request}
- ):
- yield
-
-
-def _capture_exception(ty, value, tb):
- # type: (type, BaseException, Any) -> None
- hub = Hub.current
- if hub.get_integration(TornadoIntegration) is None:
- return
- if isinstance(value, HTTPError):
- return
-
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
-
- event, hint = event_from_exception(
- (ty, value, tb),
- client_options=client.options,
- mechanism={"type": "tornado", "handled": False},
- )
-
- hub.capture_event(event, hint=hint)
-
-
-def _make_event_processor(weak_handler):
- # type: (Callable[[], RequestHandler]) -> EventProcessor
- def tornado_processor(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- handler = weak_handler()
- if handler is None:
- return event
-
- request = handler.request
-
- with capture_internal_exceptions():
- method = getattr(handler, handler.request.method.lower()) # type: ignore
- event["transaction"] = transaction_from_function(method)
-
- with capture_internal_exceptions():
- extractor = TornadoRequestExtractor(request)
- extractor.extract_into_event(event)
-
- request_info = event["request"]
-
- request_info["url"] = "%s://%s%s" % (
- request.protocol,
- request.host,
- request.path,
- )
-
- request_info["query_string"] = request.query
- request_info["method"] = request.method
- request_info["env"] = {"REMOTE_ADDR": request.remote_ip}
- request_info["headers"] = _filter_headers(dict(request.headers))
-
- with capture_internal_exceptions():
- if handler.current_user and _should_send_default_pii():
- event.setdefault("user", {}).setdefault("is_authenticated", True)
-
- return event
-
- return tornado_processor
-
-
-class TornadoRequestExtractor(RequestExtractor):
- def content_length(self):
- # type: () -> int
- if self.request.body is None:
- return 0
- return len(self.request.body)
-
- def cookies(self):
- # type: () -> Dict[str, str]
- return {k: v.value for k, v in iteritems(self.request.cookies)}
-
- def raw_data(self):
- # type: () -> bytes
- return self.request.body
-
- def form(self):
- # type: () -> Dict[str, Any]
- return {
- k: [v.decode("latin1", "replace") for v in vs]
- for k, vs in iteritems(self.request.body_arguments)
- }
-
- def is_json(self):
- # type: () -> bool
- return _is_json_content_type(self.request.headers.get("content-type"))
-
- def files(self):
- # type: () -> Dict[str, Any]
- return {k: v[0] for k, v in iteritems(self.request.files) if v}
-
- def size_of_file(self, file):
- # type: (Any) -> int
- return len(file.body or ())
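
A minimal sketch of the Tornado integration above; the DSN, handler, and port are placeholders (Tornado 5+ and real contextvars are required, as checked in setup_once):

import tornado.ioloop
import tornado.web

import sentry_sdk
from sentry_sdk.integrations.tornado import TornadoIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[TornadoIntegration()],
)

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        # Non-HTTPError exceptions raised in handlers are captured via the patched log_exception.
        self.write("ok")

app = tornado.web.Application([(r"/", MainHandler)])
app.listen(8888)
tornado.ioloop.IOLoop.current().start()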
diff --git a/sentry_sdk/integrations/trytond.py b/sentry_sdk/integrations/trytond.py
deleted file mode 100644
index 062a756..0000000
--- a/sentry_sdk/integrations/trytond.py
+++ /dev/null
@@ -1,55 +0,0 @@
-import sentry_sdk.hub
-import sentry_sdk.utils
-import sentry_sdk.integrations
-import sentry_sdk.integrations.wsgi
-from sentry_sdk._types import MYPY
-
-from trytond.exceptions import TrytonException # type: ignore
-from trytond.wsgi import app # type: ignore
-
-if MYPY:
- from typing import Any
-
-
-# TODO: trytond-worker, trytond-cron and trytond-admin integrations
-
-
-class TrytondWSGIIntegration(sentry_sdk.integrations.Integration):
- identifier = "trytond_wsgi"
-
- def __init__(self): # type: () -> None
- pass
-
- @staticmethod
- def setup_once(): # type: () -> None
-
- app.wsgi_app = sentry_sdk.integrations.wsgi.SentryWsgiMiddleware(app.wsgi_app)
-
- def error_handler(e): # type: (Exception) -> None
- hub = sentry_sdk.hub.Hub.current
-
- if hub.get_integration(TrytondWSGIIntegration) is None:
- return
- elif isinstance(e, TrytonException):
- return
- else:
- # If an integration is there, a client has to be there.
- client = hub.client # type: Any
- event, hint = sentry_sdk.utils.event_from_exception(
- e,
- client_options=client.options,
- mechanism={"type": "trytond", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
- # Expected error handlers signature was changed
- # when the error_handler decorator was introduced
- # in Tryton-5.4
- if hasattr(app, "error_handler"):
-
- @app.error_handler
- def _(app, request, e): # type: ignore
- error_handler(e)
-
- else:
- app.error_handlers.append(error_handler)
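
A minimal sketch for the Trytond integration above; the DSN is a placeholder. Initializing the SDK is enough: setup_once wraps trytond's global WSGI `app` in SentryWsgiMiddleware and registers an error handler that ignores TrytonException but reports everything else:

import sentry_sdk
from sentry_sdk.integrations.trytond import TrytondWSGIIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    integrations=[TrytondWSGIIntegration()],
)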
diff --git a/sentry_sdk/integrations/wsgi.py b/sentry_sdk/integrations/wsgi.py
deleted file mode 100644
index 4f274fa..0000000
--- a/sentry_sdk/integrations/wsgi.py
+++ /dev/null
@@ -1,321 +0,0 @@
-import sys
-
-from sentry_sdk._functools import partial
-from sentry_sdk.hub import Hub, _should_send_default_pii
-from sentry_sdk.utils import (
- ContextVar,
- capture_internal_exceptions,
- event_from_exception,
-)
-from sentry_sdk._compat import PY2, reraise, iteritems
-from sentry_sdk.tracing import Transaction
-from sentry_sdk.sessions import auto_session_tracking
-from sentry_sdk.integrations._wsgi_common import _filter_headers
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Callable
- from typing import Dict
- from typing import Iterator
- from typing import Any
- from typing import Tuple
- from typing import Optional
- from typing import TypeVar
- from typing import Protocol
-
- from sentry_sdk.utils import ExcInfo
- from sentry_sdk._types import EventProcessor
-
- WsgiResponseIter = TypeVar("WsgiResponseIter")
- WsgiResponseHeaders = TypeVar("WsgiResponseHeaders")
- WsgiExcInfo = TypeVar("WsgiExcInfo")
-
- class StartResponse(Protocol):
- def __call__(self, status, response_headers, exc_info=None):
- # type: (str, WsgiResponseHeaders, Optional[WsgiExcInfo]) -> WsgiResponseIter
- pass
-
-
-_wsgi_middleware_applied = ContextVar("sentry_wsgi_middleware_applied")
-
-
-if PY2:
-
- def wsgi_decoding_dance(s, charset="utf-8", errors="replace"):
- # type: (str, str, str) -> str
- return s.decode(charset, errors)
-
-
-else:
-
- def wsgi_decoding_dance(s, charset="utf-8", errors="replace"):
- # type: (str, str, str) -> str
- return s.encode("latin1").decode(charset, errors)
-
-
-def get_host(environ, use_x_forwarded_for=False):
- # type: (Dict[str, str], bool) -> str
- """Return the host for the given WSGI environment. Yanked from Werkzeug."""
- if use_x_forwarded_for and "HTTP_X_FORWARDED_HOST" in environ:
- rv = environ["HTTP_X_FORWARDED_HOST"]
- if environ["wsgi.url_scheme"] == "http" and rv.endswith(":80"):
- rv = rv[:-3]
- elif environ["wsgi.url_scheme"] == "https" and rv.endswith(":443"):
- rv = rv[:-4]
- elif environ.get("HTTP_HOST"):
- rv = environ["HTTP_HOST"]
- if environ["wsgi.url_scheme"] == "http" and rv.endswith(":80"):
- rv = rv[:-3]
- elif environ["wsgi.url_scheme"] == "https" and rv.endswith(":443"):
- rv = rv[:-4]
- elif environ.get("SERVER_NAME"):
- rv = environ["SERVER_NAME"]
- if (environ["wsgi.url_scheme"], environ["SERVER_PORT"]) not in (
- ("https", "443"),
- ("http", "80"),
- ):
- rv += ":" + environ["SERVER_PORT"]
- else:
- # In spite of the WSGI spec, SERVER_NAME might not be present.
- rv = "unknown"
-
- return rv
-
-
-def get_request_url(environ, use_x_forwarded_for=False):
- # type: (Dict[str, str], bool) -> str
- """Return the absolute URL without query string for the given WSGI
- environment."""
- return "%s://%s/%s" % (
- environ.get("wsgi.url_scheme"),
- get_host(environ, use_x_forwarded_for),
- wsgi_decoding_dance(environ.get("PATH_INFO") or "").lstrip("/"),
- )
-
-
-class SentryWsgiMiddleware(object):
- __slots__ = ("app", "use_x_forwarded_for")
-
- def __init__(self, app, use_x_forwarded_for=False):
- # type: (Callable[[Dict[str, str], Callable[..., Any]], Any], bool) -> None
- self.app = app
- self.use_x_forwarded_for = use_x_forwarded_for
-
- def __call__(self, environ, start_response):
- # type: (Dict[str, str], Callable[..., Any]) -> _ScopedResponse
- if _wsgi_middleware_applied.get(False):
- return self.app(environ, start_response)
-
- _wsgi_middleware_applied.set(True)
- try:
- hub = Hub(Hub.current)
- with auto_session_tracking(hub, session_mode="request"):
- with hub:
- with capture_internal_exceptions():
- with hub.configure_scope() as scope:
- scope.clear_breadcrumbs()
- scope._name = "wsgi"
- scope.add_event_processor(
- _make_wsgi_event_processor(
- environ, self.use_x_forwarded_for
- )
- )
-
- transaction = Transaction.continue_from_environ(
- environ, op="http.server", name="generic WSGI request"
- )
-
- with hub.start_transaction(
- transaction, custom_sampling_context={"wsgi_environ": environ}
- ):
- try:
- rv = self.app(
- environ,
- partial(
- _sentry_start_response, start_response, transaction
- ),
- )
- except BaseException:
- reraise(*_capture_exception(hub))
- finally:
- _wsgi_middleware_applied.set(False)
-
- return _ScopedResponse(hub, rv)
-
-
-def _sentry_start_response(
- old_start_response, # type: StartResponse
- transaction, # type: Transaction
- status, # type: str
- response_headers, # type: WsgiResponseHeaders
- exc_info=None, # type: Optional[WsgiExcInfo]
-):
- # type: (...) -> WsgiResponseIter
- with capture_internal_exceptions():
- status_int = int(status.split(" ", 1)[0])
- transaction.set_http_status(status_int)
-
- if exc_info is None:
- # The Django Rest Framework WSGI test client, and likely other
- # (incorrect) implementations, cannot deal with the exc_info argument
- # if one is present. Avoid providing a third argument if not necessary.
- return old_start_response(status, response_headers)
- else:
- return old_start_response(status, response_headers, exc_info)
-
-
-def _get_environ(environ):
- # type: (Dict[str, str]) -> Iterator[Tuple[str, str]]
- """
- Returns our explicitly included environment variables we want to
- capture (server name, port and remote addr if pii is enabled).
- """
- keys = ["SERVER_NAME", "SERVER_PORT"]
- if _should_send_default_pii():
- # make debugging of proxy setup easier. Proxy headers are
- # in headers.
- keys += ["REMOTE_ADDR"]
-
- for key in keys:
- if key in environ:
- yield key, environ[key]
-
-
-# `get_headers` comes from `werkzeug.datastructures.EnvironHeaders`
-#
-# We need this function because Django does not give us a "pure" http header
-# dict. So we might as well use it for all WSGI integrations.
-def _get_headers(environ):
- # type: (Dict[str, str]) -> Iterator[Tuple[str, str]]
- """
- Returns only proper HTTP headers.
-
- """
- for key, value in iteritems(environ):
- key = str(key)
- if key.startswith("HTTP_") and key not in (
- "HTTP_CONTENT_TYPE",
- "HTTP_CONTENT_LENGTH",
- ):
- yield key[5:].replace("_", "-").title(), value
- elif key in ("CONTENT_TYPE", "CONTENT_LENGTH"):
- yield key.replace("_", "-").title(), value
-
-
-def get_client_ip(environ):
- # type: (Dict[str, str]) -> Optional[Any]
- """
- Infer the user IP address from various headers. This cannot be used in
- security sensitive situations since the value may be forged from a client,
- but it's good enough for the event payload.
- """
- try:
- return environ["HTTP_X_FORWARDED_FOR"].split(",")[0].strip()
- except (KeyError, IndexError):
- pass
-
- try:
- return environ["HTTP_X_REAL_IP"]
- except KeyError:
- pass
-
- return environ.get("REMOTE_ADDR")
-
-
-def _capture_exception(hub):
- # type: (Hub) -> ExcInfo
- exc_info = sys.exc_info()
-
- # Check client here as it might have been unset while streaming response
- if hub.client is not None:
- e = exc_info[1]
-
- # SystemExit(0) is the only uncaught exception that is expected behavior
- should_skip_capture = isinstance(e, SystemExit) and e.code in (0, None)
- if not should_skip_capture:
- event, hint = event_from_exception(
- exc_info,
- client_options=hub.client.options,
- mechanism={"type": "wsgi", "handled": False},
- )
- hub.capture_event(event, hint=hint)
-
- return exc_info
-
-
-class _ScopedResponse(object):
- __slots__ = ("_response", "_hub")
-
- def __init__(self, hub, response):
- # type: (Hub, Iterator[bytes]) -> None
- self._hub = hub
- self._response = response
-
- def __iter__(self):
- # type: () -> Iterator[bytes]
- iterator = iter(self._response)
-
- while True:
- with self._hub:
- try:
- chunk = next(iterator)
- except StopIteration:
- break
- except BaseException:
- reraise(*_capture_exception(self._hub))
-
- yield chunk
-
- def close(self):
- # type: () -> None
- with self._hub:
- try:
- self._response.close() # type: ignore
- except AttributeError:
- pass
- except BaseException:
- reraise(*_capture_exception(self._hub))
-
-
-def _make_wsgi_event_processor(environ, use_x_forwarded_for):
- # type: (Dict[str, str], bool) -> EventProcessor
- # It's a bit unfortunate that we have to extract and parse the request data
- # from the environ so eagerly, but there are a few good reasons for this.
- #
- # We might be in a situation where the scope/hub never gets torn down
- # properly. In that case we will have an unnecessary strong reference to
- # all objects in the environ (some of which may take a lot of memory) when
- # we're really just interested in a few of them.
- #
- # Keeping the environment around for longer than the request lifecycle is
- # also not necessarily something uWSGI can deal with:
- # https://github.com/unbit/uwsgi/issues/1950
-
- client_ip = get_client_ip(environ)
- request_url = get_request_url(environ, use_x_forwarded_for)
- query_string = environ.get("QUERY_STRING")
- method = environ.get("REQUEST_METHOD")
- env = dict(_get_environ(environ))
- headers = _filter_headers(dict(_get_headers(environ)))
-
- def event_processor(event, hint):
- # type: (Dict[str, Any], Dict[str, Any]) -> Dict[str, Any]
- with capture_internal_exceptions():
- # if the code below fails halfway through we at least have some data
- request_info = event.setdefault("request", {})
-
- if _should_send_default_pii():
- user_info = event.setdefault("user", {})
- if client_ip:
- user_info.setdefault("ip_address", client_ip)
-
- request_info["url"] = request_url
- request_info["query_string"] = query_string
- request_info["method"] = method
- request_info["env"] = env
- request_info["headers"] = headers
-
- return event
-
- return event_processor
diff --git a/sentry_sdk/py.typed b/sentry_sdk/py.typed
deleted file mode 100644
index e69de29..0000000
diff --git a/sentry_sdk/scope.py b/sentry_sdk/scope.py
deleted file mode 100644
index bcfbf5c..0000000
--- a/sentry_sdk/scope.py
+++ /dev/null
@@ -1,479 +0,0 @@
-from copy import copy
-from collections import deque
-from itertools import chain
-
-from sentry_sdk._functools import wraps
-from sentry_sdk._types import MYPY
-from sentry_sdk.utils import logger, capture_internal_exceptions
-from sentry_sdk.tracing import Transaction
-from sentry_sdk.attachments import Attachment
-
-if MYPY:
- from typing import Any
- from typing import Dict
- from typing import Optional
- from typing import Deque
- from typing import List
- from typing import Callable
- from typing import TypeVar
-
- from sentry_sdk._types import (
- Breadcrumb,
- Event,
- EventProcessor,
- ErrorProcessor,
- ExcInfo,
- Hint,
- Type,
- )
-
- from sentry_sdk.tracing import Span
- from sentry_sdk.session import Session
-
- F = TypeVar("F", bound=Callable[..., Any])
- T = TypeVar("T")
-
-
-global_event_processors = [] # type: List[EventProcessor]
-
-
-def add_global_event_processor(processor):
- # type: (EventProcessor) -> None
- global_event_processors.append(processor)
-
-
-def _attr_setter(fn):
- # type: (Any) -> Any
- return property(fset=fn, doc=fn.__doc__)
-
-
-def _disable_capture(fn):
- # type: (F) -> F
- @wraps(fn)
- def wrapper(self, *args, **kwargs):
- # type: (Any, *Dict[str, Any], **Any) -> Any
- if not self._should_capture:
- return
- try:
- self._should_capture = False
- return fn(self, *args, **kwargs)
- finally:
- self._should_capture = True
-
- return wrapper # type: ignore
-
-
-class Scope(object):
- """The scope holds extra information that should be sent with all
- events that belong to it.
- """
-
- # NOTE: Even though it should not happen, the scope needs to not crash when
- # accessed by multiple threads. It's fine if it's full of races, but those
- # races should never make the user application crash.
- #
- # The same needs to hold for any accesses of the scope the SDK makes.
-
- __slots__ = (
- "_level",
- "_name",
- "_fingerprint",
- # note that for legacy reasons, _transaction is the transaction *name*,
- # not a Transaction object (the object is stored in _span)
- "_transaction",
- "_user",
- "_tags",
- "_contexts",
- "_extras",
- "_breadcrumbs",
- "_event_processors",
- "_error_processors",
- "_should_capture",
- "_span",
- "_session",
- "_attachments",
- "_force_auto_session_tracking",
- )
-
- def __init__(self):
- # type: () -> None
- self._event_processors = [] # type: List[EventProcessor]
- self._error_processors = [] # type: List[ErrorProcessor]
-
- self._name = None # type: Optional[str]
- self.clear()
-
- def clear(self):
- # type: () -> None
- """Clears the entire scope."""
- self._level = None # type: Optional[str]
- self._fingerprint = None # type: Optional[List[str]]
- self._transaction = None # type: Optional[str]
- self._user = None # type: Optional[Dict[str, Any]]
-
- self._tags = {} # type: Dict[str, Any]
- self._contexts = {} # type: Dict[str, Dict[str, Any]]
- self._extras = {} # type: Dict[str, Any]
- self._attachments = [] # type: List[Attachment]
-
- self.clear_breadcrumbs()
- self._should_capture = True
-
- self._span = None # type: Optional[Span]
- self._session = None # type: Optional[Session]
- self._force_auto_session_tracking = None # type: Optional[bool]
-
- @_attr_setter
- def level(self, value):
- # type: (Optional[str]) -> None
- """When set this overrides the level. Deprecated in favor of set_level."""
- self._level = value
-
- def set_level(self, value):
- # type: (Optional[str]) -> None
- """Sets the level for the scope."""
- self._level = value
-
- @_attr_setter
- def fingerprint(self, value):
- # type: (Optional[List[str]]) -> None
- """When set this overrides the default fingerprint."""
- self._fingerprint = value
-
- @property
- def transaction(self):
- # type: () -> Any
- # would be type: () -> Optional[Transaction], see https://github.com/python/mypy/issues/3004
- """Return the transaction (root span) in the scope, if any."""
-
- # there is no span/transaction on the scope
- if self._span is None:
- return None
-
- # there is an orphan span on the scope
- if self._span.containing_transaction is None:
- return None
-
- # there is either a transaction (which is its own containing
- # transaction) or a non-orphan span on the scope
- return self._span.containing_transaction
-
- @transaction.setter
- def transaction(self, value):
- # type: (Any) -> None
- # would be type: (Optional[str]) -> None, see https://github.com/python/mypy/issues/3004
- """When set this forces a specific transaction name to be set."""
- # XXX: the docstring above is misleading. The implementation of
- # apply_to_event prefers an existing value of event.transaction over
- # anything set in the scope.
- # XXX: note that with the introduction of the Scope.transaction getter,
- # there is a semantic and type mismatch between getter and setter. The
- # getter returns a Transaction, the setter sets a transaction name.
- # Without breaking version compatibility, we could make the setter set a
- # transaction name or transaction (self._span) depending on the type of
- # the value argument.
- self._transaction = value
- if self._span and self._span.containing_transaction:
- self._span.containing_transaction.name = value
-
- @_attr_setter
- def user(self, value):
- # type: (Optional[Dict[str, Any]]) -> None
- """When set a specific user is bound to the scope. Deprecated in favor of set_user."""
- self.set_user(value)
-
- def set_user(self, value):
- # type: (Optional[Dict[str, Any]]) -> None
- """Sets a user for the scope."""
- self._user = value
- if self._session is not None:
- self._session.update(user=value)
-
- @property
- def span(self):
- # type: () -> Optional[Span]
- """Get/set current tracing span or transaction."""
- return self._span
-
- @span.setter
- def span(self, span):
- # type: (Optional[Span]) -> None
- self._span = span
- # XXX: this differs from the implementation in JS, there Scope.setSpan
- # does not set Scope._transactionName.
- if isinstance(span, Transaction):
- transaction = span
- if transaction.name:
- self._transaction = transaction.name
-
- def set_tag(
- self,
- key, # type: str
- value, # type: Any
- ):
- # type: (...) -> None
- """Sets a tag for a key to a specific value."""
- self._tags[key] = value
-
- def remove_tag(
- self, key # type: str
- ):
- # type: (...) -> None
- """Removes a specific tag."""
- self._tags.pop(key, None)
-
- def set_context(
- self,
- key, # type: str
- value, # type: Dict[str, Any]
- ):
- # type: (...) -> None
- """Binds a context at a certain key to a specific value."""
- self._contexts[key] = value
-
- def remove_context(
- self, key # type: str
- ):
- # type: (...) -> None
- """Removes a context."""
- self._contexts.pop(key, None)
-
- def set_extra(
- self,
- key, # type: str
- value, # type: Any
- ):
- # type: (...) -> None
- """Sets an extra key to a specific value."""
- self._extras[key] = value
-
- def remove_extra(
- self, key # type: str
- ):
- # type: (...) -> None
- """Removes a specific extra key."""
- self._extras.pop(key, None)
-
- def clear_breadcrumbs(self):
- # type: () -> None
- """Clears breadcrumb buffer."""
- self._breadcrumbs = deque() # type: Deque[Breadcrumb]
-
- def add_attachment(
- self,
- bytes=None, # type: Optional[bytes]
- filename=None, # type: Optional[str]
- path=None, # type: Optional[str]
- content_type=None, # type: Optional[str]
- add_to_transactions=False, # type: bool
- ):
- # type: (...) -> None
- """Adds an attachment to future events sent."""
- self._attachments.append(
- Attachment(
- bytes=bytes,
- path=path,
- filename=filename,
- content_type=content_type,
- add_to_transactions=add_to_transactions,
- )
- )
-
- def add_event_processor(
- self, func # type: EventProcessor
- ):
- # type: (...) -> None
- """Register a scope local event processor on the scope.
-
- :param func: This function behaves like `before_send.`
- """
- if len(self._event_processors) > 20:
- logger.warning(
- "Too many event processors on scope! Clearing list to free up some memory: %r",
- self._event_processors,
- )
- del self._event_processors[:]
-
- self._event_processors.append(func)
-
- def add_error_processor(
- self,
- func, # type: ErrorProcessor
- cls=None, # type: Optional[Type[BaseException]]
- ):
- # type: (...) -> None
- """Register a scope local error processor on the scope.
-
- :param func: A callback that works similar to an event processor but is invoked with the original exception info triple as second argument.
-
- :param cls: Optionally, only process exceptions of this type.
- """
- if cls is not None:
- cls_ = cls # For mypy.
- real_func = func
-
- def func(event, exc_info):
- # type: (Event, ExcInfo) -> Optional[Event]
- try:
- is_inst = isinstance(exc_info[1], cls_)
- except Exception:
- is_inst = False
- if is_inst:
- return real_func(event, exc_info)
- return event
-
- self._error_processors.append(func)
-
- @_disable_capture
- def apply_to_event(
- self,
- event, # type: Event
- hint, # type: Hint
- ):
- # type: (...) -> Optional[Event]
- """Applies the information contained on the scope to the given event."""
-
- def _drop(event, cause, ty):
- # type: (Dict[str, Any], Any, str) -> Optional[Any]
- logger.info("%s (%s) dropped event (%s)", ty, cause, event)
- return None
-
- is_transaction = event.get("type") == "transaction"
-
- # put all attachments into the hint. This lets callbacks play around
- # with attachments. We also later pull this out of the hint when we
- # create the envelope.
- attachments_to_send = hint.get("attachments") or []
- for attachment in self._attachments:
- if not is_transaction or attachment.add_to_transactions:
- attachments_to_send.append(attachment)
- hint["attachments"] = attachments_to_send
-
- if self._level is not None:
- event["level"] = self._level
-
- if not is_transaction:
- event.setdefault("breadcrumbs", {}).setdefault("values", []).extend(
- self._breadcrumbs
- )
-
- if event.get("user") is None and self._user is not None:
- event["user"] = self._user
-
- if event.get("transaction") is None and self._transaction is not None:
- event["transaction"] = self._transaction
-
- if event.get("fingerprint") is None and self._fingerprint is not None:
- event["fingerprint"] = self._fingerprint
-
- if self._extras:
- event.setdefault("extra", {}).update(self._extras)
-
- if self._tags:
- event.setdefault("tags", {}).update(self._tags)
-
- if self._contexts:
- event.setdefault("contexts", {}).update(self._contexts)
-
- if self._span is not None:
- contexts = event.setdefault("contexts", {})
- if not contexts.get("trace"):
- contexts["trace"] = self._span.get_trace_context()
-
- exc_info = hint.get("exc_info")
- if exc_info is not None:
- for error_processor in self._error_processors:
- new_event = error_processor(event, exc_info)
- if new_event is None:
- return _drop(event, error_processor, "error processor")
- event = new_event
-
- for event_processor in chain(global_event_processors, self._event_processors):
- new_event = event
- with capture_internal_exceptions():
- new_event = event_processor(event, hint)
- if new_event is None:
- return _drop(event, event_processor, "event processor")
- event = new_event
-
- return event
-
- def update_from_scope(self, scope):
- # type: (Scope) -> None
- if scope._level is not None:
- self._level = scope._level
- if scope._fingerprint is not None:
- self._fingerprint = scope._fingerprint
- if scope._transaction is not None:
- self._transaction = scope._transaction
- if scope._user is not None:
- self._user = scope._user
- if scope._tags:
- self._tags.update(scope._tags)
- if scope._contexts:
- self._contexts.update(scope._contexts)
- if scope._extras:
- self._extras.update(scope._extras)
- if scope._breadcrumbs:
- self._breadcrumbs.extend(scope._breadcrumbs)
- if scope._span:
- self._span = scope._span
- if scope._attachments:
- self._attachments.extend(scope._attachments)
-
- def update_from_kwargs(
- self,
- user=None, # type: Optional[Any]
- level=None, # type: Optional[str]
- extras=None, # type: Optional[Dict[str, Any]]
- contexts=None, # type: Optional[Dict[str, Any]]
- tags=None, # type: Optional[Dict[str, str]]
- fingerprint=None, # type: Optional[List[str]]
- ):
- # type: (...) -> None
- if level is not None:
- self._level = level
- if user is not None:
- self._user = user
- if extras is not None:
- self._extras.update(extras)
- if contexts is not None:
- self._contexts.update(contexts)
- if tags is not None:
- self._tags.update(tags)
- if fingerprint is not None:
- self._fingerprint = fingerprint
-
- def __copy__(self):
- # type: () -> Scope
- rv = object.__new__(self.__class__) # type: Scope
-
- rv._level = self._level
- rv._name = self._name
- rv._fingerprint = self._fingerprint
- rv._transaction = self._transaction
- rv._user = self._user
-
- rv._tags = dict(self._tags)
- rv._contexts = dict(self._contexts)
- rv._extras = dict(self._extras)
-
- rv._breadcrumbs = copy(self._breadcrumbs)
- rv._event_processors = list(self._event_processors)
- rv._error_processors = list(self._error_processors)
-
- rv._should_capture = self._should_capture
- rv._span = self._span
- rv._session = self._session
- rv._force_auto_session_tracking = self._force_auto_session_tracking
- rv._attachments = list(self._attachments)
-
- return rv
-
- def __repr__(self):
- # type: () -> str
- return "<%s id=%s name=%s>" % (
- self.__class__.__name__,
- hex(id(self)),
- self._name,
- )
diff --git a/sentry_sdk/serializer.py b/sentry_sdk/serializer.py
deleted file mode 100644
index 134528c..0000000
--- a/sentry_sdk/serializer.py
+++ /dev/null
@@ -1,468 +0,0 @@
-import sys
-import math
-
-from datetime import datetime
-
-from sentry_sdk.utils import (
- AnnotatedValue,
- capture_internal_exception,
- disable_capture_event,
- format_timestamp,
- json_dumps,
- safe_repr,
- strip_string,
-)
-
-import sentry_sdk.utils
-
-from sentry_sdk._compat import text_type, PY2, string_types, number_types, iteritems
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from datetime import timedelta
-
- from types import TracebackType
-
- from typing import Any
- from typing import Callable
- from typing import ContextManager
- from typing import Dict
- from typing import List
- from typing import Optional
- from typing import Tuple
- from typing import Type
- from typing import Union
-
- from sentry_sdk._types import NotImplementedType, Event
-
- Span = Dict[str, Any]
-
- ReprProcessor = Callable[[Any, Dict[str, Any]], Union[NotImplementedType, str]]
- Segment = Union[str, int]
-
-
-if PY2:
- # Importing ABCs from collections is deprecated, and will stop working in 3.8
- # https://github.com/python/cpython/blob/master/Lib/collections/__init__.py#L49
- from collections import Mapping, Sequence, Set
-
- serializable_str_types = string_types
-
-else:
- # New in 3.3
- # https://docs.python.org/3/library/collections.abc.html
- from collections.abc import Mapping, Sequence, Set
-
- # Bytes are technically not strings in Python 3, but we can serialize them
- serializable_str_types = (str, bytes)
-
-
-# Maximum length of JSON-serialized event payloads that can be safely sent
-# before the server may reject the event due to its size. This is not intended
-# to reflect actual values defined server-side, but rather only be an upper
-# bound for events sent by the SDK.
-#
-# Can be overwritten if wanting to send more bytes, e.g. with a custom server.
-# When changing this, keep in mind that events may be a little bit larger than
-# this value due to attached metadata, so keep the number conservative.
-MAX_EVENT_BYTES = 10 ** 6
-
-MAX_DATABAG_DEPTH = 5
-MAX_DATABAG_BREADTH = 10
-CYCLE_MARKER = u"<cyclic>"
-
-
-global_repr_processors = [] # type: List[ReprProcessor]
-
-
-def add_global_repr_processor(processor):
- # type: (ReprProcessor) -> None
- global_repr_processors.append(processor)
-
-
-class Memo(object):
- __slots__ = ("_ids", "_objs")
-
- def __init__(self):
- # type: () -> None
- self._ids = {} # type: Dict[int, Any]
- self._objs = [] # type: List[Any]
-
- def memoize(self, obj):
- # type: (Any) -> ContextManager[bool]
- self._objs.append(obj)
- return self
-
- def __enter__(self):
- # type: () -> bool
- obj = self._objs[-1]
- if id(obj) in self._ids:
- return True
- else:
- self._ids[id(obj)] = obj
- return False
-
- def __exit__(
- self,
- ty, # type: Optional[Type[BaseException]]
- value, # type: Optional[BaseException]
- tb, # type: Optional[TracebackType]
- ):
- # type: (...) -> None
- self._ids.pop(id(self._objs.pop()), None)
-
-
-def serialize(event, smart_transaction_trimming=False, **kwargs):
- # type: (Event, bool, **Any) -> Event
- memo = Memo()
- path = [] # type: List[Segment]
- meta_stack = [] # type: List[Dict[str, Any]]
- span_description_bytes = [] # type: List[int]
-
- def _annotate(**meta):
- # type: (**Any) -> None
- while len(meta_stack) <= len(path):
- try:
- segment = path[len(meta_stack) - 1]
- node = meta_stack[-1].setdefault(text_type(segment), {})
- except IndexError:
- node = {}
-
- meta_stack.append(node)
-
- meta_stack[-1].setdefault("", {}).update(meta)
-
- def _should_repr_strings():
- # type: () -> Optional[bool]
- """
- By default non-serializable objects are going through
- safe_repr(). For certain places in the event (local vars) we
- want to repr() even things that are JSON-serializable to
- make their type more apparent. For example, it's useful to
- see the difference between a unicode-string and a bytestring
- when viewing a stacktrace.
-
- For container-types we still don't do anything different.
- Generally we just try to make the Sentry UI present exactly
- what a pretty-printed repr would look like.
-
- :returns: `True` if we are somewhere in frame variables, and `False` if
- we are in a position where we will never encounter frame variables
- when recursing (for example, we're in `event.extra`). `None` if we
- are not (yet) in frame variables, but might encounter them when
- recursing (e.g. we're in `event.exception`)
- """
- try:
- p0 = path[0]
- if p0 == "stacktrace" and path[1] == "frames" and path[3] == "vars":
- return True
-
- if (
- p0 in ("threads", "exception")
- and path[1] == "values"
- and path[3] == "stacktrace"
- and path[4] == "frames"
- and path[6] == "vars"
- ):
- return True
- except IndexError:
- return None
-
- return False
-
- def _is_databag():
- # type: () -> Optional[bool]
- """
- A databag is any value that we need to trim.
-
- :returns: Works like `_should_repr_strings()`. `True` for "yes",
-        `False` for "no", `None` for "maybe soon".
- """
- try:
- rv = _should_repr_strings()
- if rv in (True, None):
- return rv
-
- p0 = path[0]
- if p0 == "request" and path[1] == "data":
- return True
-
- if p0 == "breadcrumbs" and path[1] == "values":
- path[2]
- return True
-
- if p0 == "extra":
- return True
-
- except IndexError:
- return None
-
- return False
-
- def _serialize_node(
- obj, # type: Any
- is_databag=None, # type: Optional[bool]
- should_repr_strings=None, # type: Optional[bool]
- segment=None, # type: Optional[Segment]
- remaining_breadth=None, # type: Optional[int]
- remaining_depth=None, # type: Optional[int]
- ):
- # type: (...) -> Any
- if segment is not None:
- path.append(segment)
-
- try:
- with memo.memoize(obj) as result:
- if result:
- return CYCLE_MARKER
-
- return _serialize_node_impl(
- obj,
- is_databag=is_databag,
- should_repr_strings=should_repr_strings,
- remaining_depth=remaining_depth,
- remaining_breadth=remaining_breadth,
- )
- except BaseException:
- capture_internal_exception(sys.exc_info())
-
- if is_databag:
-            return u"<broken repr>"
-
- return None
- finally:
- if segment is not None:
- path.pop()
- del meta_stack[len(path) + 1 :]
-
- def _flatten_annotated(obj):
- # type: (Any) -> Any
- if isinstance(obj, AnnotatedValue):
- _annotate(**obj.metadata)
- obj = obj.value
- return obj
-
- def _serialize_node_impl(
- obj, is_databag, should_repr_strings, remaining_depth, remaining_breadth
- ):
- # type: (Any, Optional[bool], Optional[bool], Optional[int], Optional[int]) -> Any
- if should_repr_strings is None:
- should_repr_strings = _should_repr_strings()
-
- if is_databag is None:
- is_databag = _is_databag()
-
- if is_databag and remaining_depth is None:
- remaining_depth = MAX_DATABAG_DEPTH
- if is_databag and remaining_breadth is None:
- remaining_breadth = MAX_DATABAG_BREADTH
-
- obj = _flatten_annotated(obj)
-
- if remaining_depth is not None and remaining_depth <= 0:
- _annotate(rem=[["!limit", "x"]])
- if is_databag:
- return _flatten_annotated(strip_string(safe_repr(obj)))
- return None
-
- if is_databag and global_repr_processors:
- hints = {"memo": memo, "remaining_depth": remaining_depth}
- for processor in global_repr_processors:
- result = processor(obj, hints)
- if result is not NotImplemented:
- return _flatten_annotated(result)
-
- sentry_repr = getattr(type(obj), "__sentry_repr__", None)
-
- if obj is None or isinstance(obj, (bool, number_types)):
- if should_repr_strings or (
- isinstance(obj, float) and (math.isinf(obj) or math.isnan(obj))
- ):
- return safe_repr(obj)
- else:
- return obj
-
- elif callable(sentry_repr):
- return sentry_repr(obj)
-
- elif isinstance(obj, datetime):
- return (
- text_type(format_timestamp(obj))
- if not should_repr_strings
- else safe_repr(obj)
- )
-
- elif isinstance(obj, Mapping):
- # Create temporary copy here to avoid calling too much code that
- # might mutate our dictionary while we're still iterating over it.
- obj = dict(iteritems(obj))
-
- rv_dict = {} # type: Dict[str, Any]
- i = 0
-
- for k, v in iteritems(obj):
- if remaining_breadth is not None and i >= remaining_breadth:
- _annotate(len=len(obj))
- break
-
- str_k = text_type(k)
- v = _serialize_node(
- v,
- segment=str_k,
- should_repr_strings=should_repr_strings,
- is_databag=is_databag,
- remaining_depth=remaining_depth - 1
- if remaining_depth is not None
- else None,
- remaining_breadth=remaining_breadth,
- )
- rv_dict[str_k] = v
- i += 1
-
- return rv_dict
-
- elif not isinstance(obj, serializable_str_types) and isinstance(
- obj, (Set, Sequence)
- ):
- rv_list = []
-
- for i, v in enumerate(obj):
- if remaining_breadth is not None and i >= remaining_breadth:
- _annotate(len=len(obj))
- break
-
- rv_list.append(
- _serialize_node(
- v,
- segment=i,
- should_repr_strings=should_repr_strings,
- is_databag=is_databag,
- remaining_depth=remaining_depth - 1
- if remaining_depth is not None
- else None,
- remaining_breadth=remaining_breadth,
- )
- )
-
- return rv_list
-
- if should_repr_strings:
- obj = safe_repr(obj)
- else:
- if isinstance(obj, bytes):
- obj = obj.decode("utf-8", "replace")
-
- if not isinstance(obj, string_types):
- obj = safe_repr(obj)
-
- # Allow span descriptions to be longer than other strings.
- #
- # For database auto-instrumented spans, the description contains
- # potentially long SQL queries that are most useful when not truncated.
- # Because arbitrarily large events may be discarded by the server as a
- # protection mechanism, we dynamically limit the description length
- # later in _truncate_span_descriptions.
- if (
- smart_transaction_trimming
- and len(path) == 3
- and path[0] == "spans"
- and path[-1] == "description"
- ):
- span_description_bytes.append(len(obj))
- return obj
- return _flatten_annotated(strip_string(obj))
-
- def _truncate_span_descriptions(serialized_event, event, excess_bytes):
- # type: (Event, Event, int) -> None
- """
- Modifies serialized_event in-place trying to remove excess_bytes from
- span descriptions. The original event is used read-only to access the
-        span timestamps (represented as RFC3339-formatted strings in
- serialized_event).
-
- It uses heuristics to prioritize preserving the description of spans
- that might be the most interesting ones in terms of understanding and
- optimizing performance.
- """
- # When truncating a description, preserve a small prefix.
- min_length = 10
-
- def shortest_duration_longest_description_first(args):
- # type: (Tuple[int, Span]) -> Tuple[timedelta, int]
- i, serialized_span = args
- span = event["spans"][i]
- now = datetime.utcnow()
- start = span.get("start_timestamp") or now
- end = span.get("timestamp") or now
- duration = end - start
- description = serialized_span.get("description") or ""
- return (duration, -len(description))
-
- # Note: for simplicity we sort spans by exact duration and description
- # length. If ever needed, we could have a more involved heuristic, e.g.
- # replacing exact durations with "buckets" and/or looking at other span
- # properties.
- path.append("spans")
- for i, span in sorted(
- enumerate(serialized_event.get("spans") or []),
- key=shortest_duration_longest_description_first,
- ):
- description = span.get("description") or ""
- if len(description) <= min_length:
- continue
- excess_bytes -= len(description) - min_length
- path.extend([i, "description"])
- # Note: the last time we call strip_string we could preserve a few
- # more bytes up to a total length of MAX_EVENT_BYTES. Since that's
- # not strictly required, we leave it out for now for simplicity.
- span["description"] = _flatten_annotated(
- strip_string(description, max_length=min_length)
- )
- del path[-2:]
- del meta_stack[len(path) + 1 :]
-
- if excess_bytes <= 0:
- break
- path.pop()
- del meta_stack[len(path) + 1 :]
-
- disable_capture_event.set(True)
- try:
- rv = _serialize_node(event, **kwargs)
- if meta_stack and isinstance(rv, dict):
- rv["_meta"] = meta_stack[0]
-
- sum_span_description_bytes = sum(span_description_bytes)
- if smart_transaction_trimming and sum_span_description_bytes > 0:
- span_count = len(event.get("spans") or [])
- # This is an upper bound of how many bytes all descriptions would
- # consume if the usual string truncation in _serialize_node_impl
- # would have taken place, not accounting for the metadata attached
- # as event["_meta"].
- descriptions_budget_bytes = span_count * sentry_sdk.utils.MAX_STRING_LENGTH
-
- # If by not truncating descriptions we ended up with more bytes than
- # per the usual string truncation, check if the event is too large
- # and we need to truncate some descriptions.
- #
- # This is guarded with an if statement to avoid JSON-encoding the
- # event unnecessarily.
- if sum_span_description_bytes > descriptions_budget_bytes:
- original_bytes = len(json_dumps(rv))
- excess_bytes = original_bytes - MAX_EVENT_BYTES
- if excess_bytes > 0:
- # Event is too large, will likely be discarded by the
- # server. Trim it down before sending.
- _truncate_span_descriptions(rv, event, excess_bytes)
-
- # Span descriptions truncated, set or reset _meta.
- #
- # We run the same code earlier because we want to account
- # for _meta when calculating original_bytes, the number of
- # bytes in the JSON-encoded event.
- if meta_stack and isinstance(rv, dict):
- rv["_meta"] = meta_stack[0]
- return rv
- finally:
- disable_capture_event.set(False)
diff --git a/sentry_sdk/session.py b/sentry_sdk/session.py
deleted file mode 100644
index 98a8c72..0000000
--- a/sentry_sdk/session.py
+++ /dev/null
@@ -1,174 +0,0 @@
-import uuid
-from datetime import datetime
-
-from sentry_sdk._types import MYPY
-from sentry_sdk.utils import format_timestamp
-
-if MYPY:
- from typing import Optional
- from typing import Union
- from typing import Any
- from typing import Dict
-
- from sentry_sdk._types import SessionStatus
-
-
-def _minute_trunc(ts):
- # type: (datetime) -> datetime
- return ts.replace(second=0, microsecond=0)
-
-
-def _make_uuid(
- val, # type: Union[str, uuid.UUID]
-):
- # type: (...) -> uuid.UUID
- if isinstance(val, uuid.UUID):
- return val
- return uuid.UUID(val)
-
-
-class Session(object):
- def __init__(
- self,
- sid=None, # type: Optional[Union[str, uuid.UUID]]
- did=None, # type: Optional[str]
- timestamp=None, # type: Optional[datetime]
- started=None, # type: Optional[datetime]
- duration=None, # type: Optional[float]
- status=None, # type: Optional[SessionStatus]
- release=None, # type: Optional[str]
- environment=None, # type: Optional[str]
- user_agent=None, # type: Optional[str]
- ip_address=None, # type: Optional[str]
- errors=None, # type: Optional[int]
- user=None, # type: Optional[Any]
- session_mode="application", # type: str
- ):
- # type: (...) -> None
- if sid is None:
- sid = uuid.uuid4()
- if started is None:
- started = datetime.utcnow()
- if status is None:
- status = "ok"
- self.status = status
- self.did = None # type: Optional[str]
- self.started = started
- self.release = None # type: Optional[str]
- self.environment = None # type: Optional[str]
- self.duration = None # type: Optional[float]
- self.user_agent = None # type: Optional[str]
- self.ip_address = None # type: Optional[str]
- self.session_mode = session_mode # type: str
- self.errors = 0
-
- self.update(
- sid=sid,
- did=did,
- timestamp=timestamp,
- duration=duration,
- release=release,
- environment=environment,
- user_agent=user_agent,
- ip_address=ip_address,
- errors=errors,
- user=user,
- )
-
- @property
- def truncated_started(self):
- # type: (...) -> datetime
- return _minute_trunc(self.started)
-
- def update(
- self,
- sid=None, # type: Optional[Union[str, uuid.UUID]]
- did=None, # type: Optional[str]
- timestamp=None, # type: Optional[datetime]
- started=None, # type: Optional[datetime]
- duration=None, # type: Optional[float]
- status=None, # type: Optional[SessionStatus]
- release=None, # type: Optional[str]
- environment=None, # type: Optional[str]
- user_agent=None, # type: Optional[str]
- ip_address=None, # type: Optional[str]
- errors=None, # type: Optional[int]
- user=None, # type: Optional[Any]
- ):
- # type: (...) -> None
-        # If a user is supplied, we pull some data from it
- if user:
- if ip_address is None:
- ip_address = user.get("ip_address")
- if did is None:
- did = user.get("id") or user.get("email") or user.get("username")
-
- if sid is not None:
- self.sid = _make_uuid(sid)
- if did is not None:
- self.did = str(did)
- if timestamp is None:
- timestamp = datetime.utcnow()
- self.timestamp = timestamp
- if started is not None:
- self.started = started
- if duration is not None:
- self.duration = duration
- if release is not None:
- self.release = release
- if environment is not None:
- self.environment = environment
- if ip_address is not None:
- self.ip_address = ip_address
- if user_agent is not None:
- self.user_agent = user_agent
- if errors is not None:
- self.errors = errors
-
- if status is not None:
- self.status = status
-
- def close(
- self, status=None # type: Optional[SessionStatus]
- ):
- # type: (...) -> Any
- if status is None and self.status == "ok":
- status = "exited"
- if status is not None:
- self.update(status=status)
-
- def get_json_attrs(
- self, with_user_info=True # type: Optional[bool]
- ):
- # type: (...) -> Any
- attrs = {}
- if self.release is not None:
- attrs["release"] = self.release
- if self.environment is not None:
- attrs["environment"] = self.environment
- if with_user_info:
- if self.ip_address is not None:
- attrs["ip_address"] = self.ip_address
- if self.user_agent is not None:
- attrs["user_agent"] = self.user_agent
- return attrs
-
- def to_json(self):
- # type: (...) -> Any
- rv = {
- "sid": str(self.sid),
- "init": True,
- "started": format_timestamp(self.started),
- "timestamp": format_timestamp(self.timestamp),
- "status": self.status,
- } # type: Dict[str, Any]
- if self.errors:
- rv["errors"] = self.errors
- if self.did is not None:
- rv["did"] = self.did
- if self.duration is not None:
- rv["duration"] = self.duration
- attrs = self.get_json_attrs()
- if attrs:
- rv["attrs"] = attrs
- return rv
diff --git a/sentry_sdk/sessions.py b/sentry_sdk/sessions.py
deleted file mode 100644
index 4e4d21b..0000000
--- a/sentry_sdk/sessions.py
+++ /dev/null
@@ -1,175 +0,0 @@
-import os
-import time
-from threading import Thread, Lock
-from contextlib import contextmanager
-
-import sentry_sdk
-from sentry_sdk.envelope import Envelope
-from sentry_sdk.session import Session
-from sentry_sdk._types import MYPY
-from sentry_sdk.utils import format_timestamp
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import Dict
- from typing import Generator
- from typing import List
- from typing import Optional
- from typing import Union
-
-
-def is_auto_session_tracking_enabled(hub=None):
- # type: (Optional[sentry_sdk.Hub]) -> Union[Any, bool, None]
- """Utility function to find out if session tracking is enabled."""
- if hub is None:
- hub = sentry_sdk.Hub.current
-
- should_track = hub.scope._force_auto_session_tracking
-
- if should_track is None:
- client_options = hub.client.options if hub.client else {}
- should_track = client_options.get("auto_session_tracking", False)
-
- return should_track
-
-
-@contextmanager
-def auto_session_tracking(hub=None, session_mode="application"):
- # type: (Optional[sentry_sdk.Hub], str) -> Generator[None, None, None]
- """Starts and stops a session automatically around a block."""
- if hub is None:
- hub = sentry_sdk.Hub.current
- should_track = is_auto_session_tracking_enabled(hub)
- if should_track:
- hub.start_session(session_mode=session_mode)
- try:
- yield
- finally:
- if should_track:
- hub.end_session()
-
-
-TERMINAL_SESSION_STATES = ("exited", "abnormal", "crashed")
-MAX_ENVELOPE_ITEMS = 100
-
-
-def make_aggregate_envelope(aggregate_states, attrs):
- # type: (Any, Any) -> Any
- return {"attrs": dict(attrs), "aggregates": list(aggregate_states.values())}
-
-
-class SessionFlusher(object):
- def __init__(
- self,
- capture_func, # type: Callable[[Envelope], None]
- flush_interval=60, # type: int
- ):
- # type: (...) -> None
- self.capture_func = capture_func
- self.flush_interval = flush_interval
- self.pending_sessions = [] # type: List[Any]
- self.pending_aggregates = {} # type: Dict[Any, Any]
- self._thread = None # type: Optional[Thread]
- self._thread_lock = Lock()
- self._aggregate_lock = Lock()
- self._thread_for_pid = None # type: Optional[int]
- self._running = True
-
- def flush(self):
- # type: (...) -> None
- pending_sessions = self.pending_sessions
- self.pending_sessions = []
-
- with self._aggregate_lock:
- pending_aggregates = self.pending_aggregates
- self.pending_aggregates = {}
-
- envelope = Envelope()
- for session in pending_sessions:
- if len(envelope.items) == MAX_ENVELOPE_ITEMS:
- self.capture_func(envelope)
- envelope = Envelope()
-
- envelope.add_session(session)
-
- for (attrs, states) in pending_aggregates.items():
- if len(envelope.items) == MAX_ENVELOPE_ITEMS:
- self.capture_func(envelope)
- envelope = Envelope()
-
- envelope.add_sessions(make_aggregate_envelope(states, attrs))
-
- if len(envelope.items) > 0:
- self.capture_func(envelope)
-
- def _ensure_running(self):
- # type: (...) -> None
- if self._thread_for_pid == os.getpid() and self._thread is not None:
- return None
- with self._thread_lock:
- if self._thread_for_pid == os.getpid() and self._thread is not None:
- return None
-
- def _thread():
- # type: (...) -> None
- while self._running:
- time.sleep(self.flush_interval)
- if self._running:
- self.flush()
-
- thread = Thread(target=_thread)
- thread.daemon = True
- thread.start()
- self._thread = thread
- self._thread_for_pid = os.getpid()
- return None
-
- def add_aggregate_session(
- self, session # type: Session
- ):
- # type: (...) -> None
- # NOTE on `session.did`:
- # the protocol can deal with buckets that have a distinct-id, however
- # in practice we expect the python SDK to have an extremely high cardinality
- # here, effectively making aggregation useless, therefore we do not
- # aggregate per-did.
-
- # For this part we can get away with using the global interpreter lock
- with self._aggregate_lock:
- attrs = session.get_json_attrs(with_user_info=False)
- primary_key = tuple(sorted(attrs.items()))
- secondary_key = session.truncated_started # (, session.did)
- states = self.pending_aggregates.setdefault(primary_key, {})
- state = states.setdefault(secondary_key, {})
-
- if "started" not in state:
- state["started"] = format_timestamp(session.truncated_started)
- # if session.did is not None:
- # state["did"] = session.did
- if session.status == "crashed":
- state["crashed"] = state.get("crashed", 0) + 1
- elif session.status == "abnormal":
- state["abnormal"] = state.get("abnormal", 0) + 1
- elif session.errors > 0:
- state["errored"] = state.get("errored", 0) + 1
- else:
- state["exited"] = state.get("exited", 0) + 1
-
- def add_session(
- self, session # type: Session
- ):
- # type: (...) -> None
- if session.session_mode == "request":
- self.add_aggregate_session(session)
- else:
- self.pending_sessions.append(session.to_json())
- self._ensure_running()
-
- def kill(self):
- # type: (...) -> None
- self._running = False
-
- def __del__(self):
- # type: (...) -> None
- self.kill()
diff --git a/sentry_sdk/tracing.py b/sentry_sdk/tracing.py
deleted file mode 100644
index 4805035..0000000
--- a/sentry_sdk/tracing.py
+++ /dev/null
@@ -1,724 +0,0 @@
-import uuid
-import random
-import time
-
-from datetime import datetime, timedelta
-
-import sentry_sdk
-
-from sentry_sdk.utils import logger
-from sentry_sdk._types import MYPY
-
-
-if MYPY:
- import typing
-
- from typing import Optional
- from typing import Any
- from typing import Dict
- from typing import List
- from typing import Tuple
- from typing import Iterator
-
- from sentry_sdk._types import SamplingContext
-
-
-class _SpanRecorder(object):
- """Limits the number of spans recorded in a transaction."""
-
- __slots__ = ("maxlen", "spans")
-
- def __init__(self, maxlen):
- # type: (int) -> None
- # FIXME: this is `maxlen - 1` only to preserve historical behavior
- # enforced by tests.
- # Either this should be changed to `maxlen` or the JS SDK implementation
- # should be changed to match a consistent interpretation of what maxlen
- # limits: either transaction+spans or only child spans.
- self.maxlen = maxlen - 1
- self.spans = [] # type: List[Span]
-
- def add(self, span):
- # type: (Span) -> None
- if len(self.spans) > self.maxlen:
- span._span_recorder = None
- else:
- self.spans.append(span)
-
-
-class Span(object):
- __slots__ = (
- "trace_id",
- "span_id",
- "parent_span_id",
- "same_process_as_parent",
- "sampled",
- "op",
- "description",
- "start_timestamp",
- "_start_timestamp_monotonic",
- "status",
- "timestamp",
- "_tags",
- "_data",
- "_span_recorder",
- "hub",
- "_context_manager_state",
- "_containing_transaction",
- )
-
- def __new__(cls, **kwargs):
- # type: (**Any) -> Any
- """
- Backwards-compatible implementation of Span and Transaction
- creation.
- """
-
- # TODO: consider removing this in a future release.
- # This is for backwards compatibility with releases before Transaction
- # existed, to allow for a smoother transition.
- if "transaction" in kwargs:
- return object.__new__(Transaction)
- return object.__new__(cls)
-
- def __init__(
- self,
- trace_id=None, # type: Optional[str]
- span_id=None, # type: Optional[str]
- parent_span_id=None, # type: Optional[str]
- same_process_as_parent=True, # type: bool
- sampled=None, # type: Optional[bool]
- op=None, # type: Optional[str]
- description=None, # type: Optional[str]
- hub=None, # type: Optional[sentry_sdk.Hub]
- status=None, # type: Optional[str]
- transaction=None, # type: Optional[str] # deprecated
- containing_transaction=None, # type: Optional[Transaction]
- ):
- # type: (...) -> None
- self.trace_id = trace_id or uuid.uuid4().hex
- self.span_id = span_id or uuid.uuid4().hex[16:]
- self.parent_span_id = parent_span_id
- self.same_process_as_parent = same_process_as_parent
- self.sampled = sampled
- self.op = op
- self.description = description
- self.status = status
- self.hub = hub
- self._tags = {} # type: Dict[str, str]
- self._data = {} # type: Dict[str, Any]
- self._containing_transaction = containing_transaction
- self.start_timestamp = datetime.utcnow()
- try:
- # TODO: For Python 3.7+, we could use a clock with ns resolution:
- # self._start_timestamp_monotonic = time.perf_counter_ns()
-
- # Python 3.3+
- self._start_timestamp_monotonic = time.perf_counter()
- except AttributeError:
- pass
-
- #: End timestamp of span
- self.timestamp = None # type: Optional[datetime]
-
- self._span_recorder = None # type: Optional[_SpanRecorder]
-
- # TODO this should really live on the Transaction class rather than the Span
- # class
- def init_span_recorder(self, maxlen):
- # type: (int) -> None
- if self._span_recorder is None:
- self._span_recorder = _SpanRecorder(maxlen)
-
- def __repr__(self):
- # type: () -> str
- return "<%s(op=%r, description:%r, trace_id=%r, span_id=%r, parent_span_id=%r, sampled=%r)>" % (
- self.__class__.__name__,
- self.op,
- self.description,
- self.trace_id,
- self.span_id,
- self.parent_span_id,
- self.sampled,
- )
-
- def __enter__(self):
- # type: () -> Span
- hub = self.hub or sentry_sdk.Hub.current
-
- _, scope = hub._stack[-1]
- old_span = scope.span
- scope.span = self
- self._context_manager_state = (hub, scope, old_span)
- return self
-
- def __exit__(self, ty, value, tb):
- # type: (Optional[Any], Optional[Any], Optional[Any]) -> None
- if value is not None:
- self.set_status("internal_error")
-
- hub, scope, old_span = self._context_manager_state
- del self._context_manager_state
-
- self.finish(hub)
- scope.span = old_span
-
- @property
- def containing_transaction(self):
- # type: () -> Optional[Transaction]
-
- # this is a getter rather than a regular attribute so that transactions
- # can return `self` here instead (as a way to prevent them circularly
- # referencing themselves)
- return self._containing_transaction
-
- def start_child(self, **kwargs):
- # type: (**Any) -> Span
- """
- Start a sub-span from the current span or transaction.
-
- Takes the same arguments as the initializer of :py:class:`Span`. The
- trace id, sampling decision, transaction pointer, and span recorder are
- inherited from the current span/transaction.
- """
- kwargs.setdefault("sampled", self.sampled)
-
- child = Span(
- trace_id=self.trace_id,
- parent_span_id=self.span_id,
- containing_transaction=self.containing_transaction,
- **kwargs
- )
-
- span_recorder = (
- self.containing_transaction and self.containing_transaction._span_recorder
- )
- if span_recorder:
- span_recorder.add(child)
- return child
-
- def new_span(self, **kwargs):
- # type: (**Any) -> Span
- """Deprecated: use start_child instead."""
- logger.warning("Deprecated: use Span.start_child instead of Span.new_span.")
- return self.start_child(**kwargs)
-
- @classmethod
- def continue_from_environ(
- cls,
- environ, # type: typing.Mapping[str, str]
- **kwargs # type: Any
- ):
- # type: (...) -> Transaction
- """
- Create a Transaction with the given params, then add in data pulled from
- the 'sentry-trace' and 'tracestate' headers from the environ (if any)
- before returning the Transaction.
-
- This is different from `continue_from_headers` in that it assumes header
- names in the form "HTTP_HEADER_NAME" - such as you would get from a wsgi
- environ - rather than the form "header-name".
- """
- if cls is Span:
- logger.warning(
- "Deprecated: use Transaction.continue_from_environ "
- "instead of Span.continue_from_environ."
- )
- return Transaction.continue_from_headers(EnvironHeaders(environ), **kwargs)
-
- @classmethod
- def continue_from_headers(
- cls,
- headers, # type: typing.Mapping[str, str]
- **kwargs # type: Any
- ):
- # type: (...) -> Transaction
- """
- Create a transaction with the given params (including any data pulled from
- the 'sentry-trace' and 'tracestate' headers).
- """
- # TODO move this to the Transaction class
- if cls is Span:
- logger.warning(
- "Deprecated: use Transaction.continue_from_headers "
- "instead of Span.continue_from_headers."
- )
-
- kwargs.update(extract_sentrytrace_data(headers.get("sentry-trace")))
- kwargs.update(extract_tracestate_data(headers.get("tracestate")))
-
- transaction = Transaction(**kwargs)
- transaction.same_process_as_parent = False
-
- return transaction
-
- def iter_headers(self):
- # type: () -> Iterator[Tuple[str, str]]
- """
- Creates a generator which returns the span's `sentry-trace` and
- `tracestate` headers.
-
- If the span's containing transaction doesn't yet have a
- `sentry_tracestate` value, this will cause one to be generated and
- stored.
- """
- yield "sentry-trace", self.to_traceparent()
-
- tracestate = self.to_tracestate() if has_tracestate_enabled(self) else None
- # `tracestate` will only be `None` if there's no client or no DSN
- # TODO (kmclb) the above will be true once the feature is no longer
- # behind a flag
- if tracestate:
- yield "tracestate", tracestate
-
- @classmethod
- def from_traceparent(
- cls,
- traceparent, # type: Optional[str]
- **kwargs # type: Any
- ):
- # type: (...) -> Optional[Transaction]
- """
- DEPRECATED: Use Transaction.continue_from_headers(headers, **kwargs)
-
- Create a Transaction with the given params, then add in data pulled from
- the given 'sentry-trace' header value before returning the Transaction.
-
- """
- logger.warning(
- "Deprecated: Use Transaction.continue_from_headers(headers, **kwargs) "
- "instead of from_traceparent(traceparent, **kwargs)"
- )
-
- if not traceparent:
- return None
-
- return cls.continue_from_headers({"sentry-trace": traceparent}, **kwargs)
-
- def to_traceparent(self):
- # type: () -> str
- sampled = ""
- if self.sampled is True:
- sampled = "1"
- if self.sampled is False:
- sampled = "0"
- return "%s-%s-%s" % (self.trace_id, self.span_id, sampled)
-
- def to_tracestate(self):
- # type: () -> Optional[str]
- """
- Computes the `tracestate` header value using data from the containing
- transaction.
-
- If the containing transaction doesn't yet have a `sentry_tracestate`
- value, this will cause one to be generated and stored.
-
- If there is no containing transaction, a value will be generated but not
- stored.
-
- Returns None if there's no client and/or no DSN.
- """
-
- sentry_tracestate = self.get_or_set_sentry_tracestate()
- third_party_tracestate = (
- self.containing_transaction._third_party_tracestate
- if self.containing_transaction
- else None
- )
-
- if not sentry_tracestate:
- return None
-
- header_value = sentry_tracestate
-
- if third_party_tracestate:
- header_value = header_value + "," + third_party_tracestate
-
- return header_value
-
- def get_or_set_sentry_tracestate(self):
- # type: (Span) -> Optional[str]
- """
- Read sentry tracestate off of the span's containing transaction.
-
- If the transaction doesn't yet have a `_sentry_tracestate` value,
- compute one and store it.
- """
- transaction = self.containing_transaction
-
- if transaction:
- if not transaction._sentry_tracestate:
- transaction._sentry_tracestate = compute_tracestate_entry(self)
-
- return transaction._sentry_tracestate
-
- # orphan span - nowhere to store the value, so just return it
- return compute_tracestate_entry(self)
-
- def set_tag(self, key, value):
- # type: (str, Any) -> None
- self._tags[key] = value
-
- def set_data(self, key, value):
- # type: (str, Any) -> None
- self._data[key] = value
-
- def set_status(self, value):
- # type: (str) -> None
- self.status = value
-
- def set_http_status(self, http_status):
- # type: (int) -> None
- self.set_tag("http.status_code", str(http_status))
-
- if http_status < 400:
- self.set_status("ok")
- elif 400 <= http_status < 500:
- if http_status == 403:
- self.set_status("permission_denied")
- elif http_status == 404:
- self.set_status("not_found")
- elif http_status == 429:
- self.set_status("resource_exhausted")
- elif http_status == 413:
- self.set_status("failed_precondition")
- elif http_status == 401:
- self.set_status("unauthenticated")
- elif http_status == 409:
- self.set_status("already_exists")
- else:
- self.set_status("invalid_argument")
- elif 500 <= http_status < 600:
- if http_status == 504:
- self.set_status("deadline_exceeded")
- elif http_status == 501:
- self.set_status("unimplemented")
- elif http_status == 503:
- self.set_status("unavailable")
- else:
- self.set_status("internal_error")
- else:
- self.set_status("unknown_error")
-
- def is_success(self):
- # type: () -> bool
- return self.status == "ok"
-
- def finish(self, hub=None):
- # type: (Optional[sentry_sdk.Hub]) -> Optional[str]
- # XXX: would be type: (Optional[sentry_sdk.Hub]) -> None, but that leads
- # to incompatible return types for Span.finish and Transaction.finish.
- if self.timestamp is not None:
- # This span is already finished, ignore.
- return None
-
- hub = hub or self.hub or sentry_sdk.Hub.current
-
- try:
- duration_seconds = time.perf_counter() - self._start_timestamp_monotonic
- self.timestamp = self.start_timestamp + timedelta(seconds=duration_seconds)
- except AttributeError:
- self.timestamp = datetime.utcnow()
-
- maybe_create_breadcrumbs_from_span(hub, self)
- return None
-
- def to_json(self):
- # type: () -> Dict[str, Any]
- rv = {
- "trace_id": self.trace_id,
- "span_id": self.span_id,
- "parent_span_id": self.parent_span_id,
- "same_process_as_parent": self.same_process_as_parent,
- "op": self.op,
- "description": self.description,
- "start_timestamp": self.start_timestamp,
- "timestamp": self.timestamp,
- } # type: Dict[str, Any]
-
- if self.status:
- self._tags["status"] = self.status
-
- tags = self._tags
- if tags:
- rv["tags"] = tags
-
- data = self._data
- if data:
- rv["data"] = data
-
- return rv
-
- def get_trace_context(self):
- # type: () -> Any
- rv = {
- "trace_id": self.trace_id,
- "span_id": self.span_id,
- "parent_span_id": self.parent_span_id,
- "op": self.op,
- "description": self.description,
- }
- if self.status:
- rv["status"] = self.status
-
- # if the transaction didn't inherit a tracestate value, and no outgoing
- # requests - whose need for headers would have caused a tracestate value
- # to be created - were made as part of the transaction, the transaction
- # still won't have a tracestate value, so compute one now
- sentry_tracestate = self.get_or_set_sentry_tracestate()
-
- if sentry_tracestate:
- rv["tracestate"] = sentry_tracestate
-
- return rv
-
-
-class Transaction(Span):
- __slots__ = (
- "name",
- "parent_sampled",
- # the sentry portion of the `tracestate` header used to transmit
- # correlation context for server-side dynamic sampling, of the form
- # `sentry=xxxxx`, where `xxxxx` is the base64-encoded json of the
- # correlation context data, with any trailing = stripped
- "_sentry_tracestate",
- # tracestate data from other vendors, of the form `dogs=yes,cats=maybe`
- "_third_party_tracestate",
- )
-
- def __init__(
- self,
- name="", # type: str
- parent_sampled=None, # type: Optional[bool]
- sentry_tracestate=None, # type: Optional[str]
- third_party_tracestate=None, # type: Optional[str]
- **kwargs # type: Any
- ):
- # type: (...) -> None
- # TODO: consider removing this in a future release.
- # This is for backwards compatibility with releases before Transaction
- # existed, to allow for a smoother transition.
- if not name and "transaction" in kwargs:
- logger.warning(
- "Deprecated: use Transaction(name=...) to create transactions "
- "instead of Span(transaction=...)."
- )
- name = kwargs.pop("transaction")
- Span.__init__(self, **kwargs)
- self.name = name
- self.parent_sampled = parent_sampled
- # if tracestate isn't inherited and set here, it will get set lazily,
- # either the first time an outgoing request needs it for a header or the
- # first time an event needs it for inclusion in the captured data
- self._sentry_tracestate = sentry_tracestate
- self._third_party_tracestate = third_party_tracestate
-
- def __repr__(self):
- # type: () -> str
- return "<%s(name=%r, op=%r, trace_id=%r, span_id=%r, parent_span_id=%r, sampled=%r)>" % (
- self.__class__.__name__,
- self.name,
- self.op,
- self.trace_id,
- self.span_id,
- self.parent_span_id,
- self.sampled,
- )
-
- @property
- def containing_transaction(self):
- # type: () -> Transaction
-
- # Transactions (as spans) belong to themselves (as transactions). This
- # is a getter rather than a regular attribute to avoid having a circular
- # reference.
- return self
-
- def finish(self, hub=None):
- # type: (Optional[sentry_sdk.Hub]) -> Optional[str]
- if self.timestamp is not None:
- # This transaction is already finished, ignore.
- return None
-
- hub = hub or self.hub or sentry_sdk.Hub.current
- client = hub.client
-
- if client is None:
- # We have no client and therefore nowhere to send this transaction.
- return None
-
- # This is a de facto proxy for checking if sampled = False
- if self._span_recorder is None:
- logger.debug("Discarding transaction because sampled = False")
-
- # This is not entirely accurate: discards here are not based
- # exclusively on the sample rate but may also come from the
- # traces sampler. We record both under the same reason here.
- if client.transport and has_tracing_enabled(client.options):
- client.transport.record_lost_event(
- "sample_rate", data_category="transaction"
- )
-
- return None
-
- if not self.name:
- logger.warning(
- "Transaction has no name, falling back to ``."
- )
- self.name = ""
-
- Span.finish(self, hub)
-
- if not self.sampled:
- # At this point a `sampled = None` should have already been resolved
- # to a concrete decision.
- if self.sampled is None:
- logger.warning("Discarding transaction without sampling decision.")
- return None
-
- finished_spans = [
- span.to_json()
- for span in self._span_recorder.spans
- if span.timestamp is not None
- ]
-
- # we do this to break the circular reference of transaction -> span
- # recorder -> span -> containing transaction (which is where we started)
- # before either the spans or the transaction goes out of scope and has
- # to be garbage collected
- self._span_recorder = None
-
- return hub.capture_event(
- {
- "type": "transaction",
- "transaction": self.name,
- "contexts": {"trace": self.get_trace_context()},
- "tags": self._tags,
- "timestamp": self.timestamp,
- "start_timestamp": self.start_timestamp,
- "spans": finished_spans,
- }
- )
-
- def to_json(self):
- # type: () -> Dict[str, Any]
- rv = super(Transaction, self).to_json()
-
- rv["name"] = self.name
- rv["sampled"] = self.sampled
-
- return rv
-
- def _set_initial_sampling_decision(self, sampling_context):
- # type: (SamplingContext) -> None
- """
- Sets the transaction's sampling decision, according to the following
- precedence rules:
-
- 1. If a sampling decision is passed to `start_transaction`
- (`start_transaction(name: "my transaction", sampled: True)`), that
- decision will be used, regardless of anything else
-
- 2. If `traces_sampler` is defined, its decision will be used. It can
- choose to keep or ignore any parent sampling decision, or use the
- sampling context data to make its own decision or to choose a sample
- rate for the transaction.
-
- 3. If `traces_sampler` is not defined, but there's a parent sampling
- decision, the parent sampling decision will be used.
-
- 4. If `traces_sampler` is not defined and there's no parent sampling
- decision, `traces_sample_rate` will be used.
- """
-
- hub = self.hub or sentry_sdk.Hub.current
- client = hub.client
- options = (client and client.options) or {}
- transaction_description = "{op}transaction <{name}>".format(
- op=("<" + self.op + "> " if self.op else ""), name=self.name
- )
-
- # nothing to do if there's no client or if tracing is disabled
- if not client or not has_tracing_enabled(options):
- self.sampled = False
- return
-
- # if the user has forced a sampling decision by passing a `sampled`
- # value when starting the transaction, go with that
- if self.sampled is not None:
- return
-
- # we would have bailed already if neither `traces_sampler` nor
- # `traces_sample_rate` were defined, so one of these should work; prefer
- # the hook if so
- sample_rate = (
- options["traces_sampler"](sampling_context)
- if callable(options.get("traces_sampler"))
- else (
- # default inheritance behavior
- sampling_context["parent_sampled"]
- if sampling_context["parent_sampled"] is not None
- else options["traces_sample_rate"]
- )
- )
-
- # Since this is coming from the user (or from a function provided by the
- # user), who knows what we might get. (The only valid values are
- # booleans or numbers between 0 and 1.)
- if not is_valid_sample_rate(sample_rate):
- logger.warning(
- "[Tracing] Discarding {transaction_description} because of invalid sample rate.".format(
- transaction_description=transaction_description,
- )
- )
- self.sampled = False
- return
-
- # if the function returned 0 (or false), or if `traces_sample_rate` is
- # 0, it's a sign the transaction should be dropped
- if not sample_rate:
- logger.debug(
- "[Tracing] Discarding {transaction_description} because {reason}".format(
- transaction_description=transaction_description,
- reason=(
- "traces_sampler returned 0 or False"
- if callable(options.get("traces_sampler"))
- else "traces_sample_rate is set to 0"
- ),
- )
- )
- self.sampled = False
- return
-
- # Now we roll the dice. random.random is inclusive of 0, but not of 1,
- # so strict < is safe here. In case sample_rate is a boolean, cast it
- # to a float (True becomes 1.0 and False becomes 0.0)
- self.sampled = random.random() < float(sample_rate)
-
- if self.sampled:
- logger.debug(
- "[Tracing] Starting {transaction_description}".format(
- transaction_description=transaction_description,
- )
- )
- else:
- logger.debug(
- "[Tracing] Discarding {transaction_description} because it's not included in the random sample (sampling rate = {sample_rate})".format(
- transaction_description=transaction_description,
- sample_rate=float(sample_rate),
- )
- )
-
-
-# Circular imports
-
-from sentry_sdk.tracing_utils import (
- EnvironHeaders,
- compute_tracestate_entry,
- extract_sentrytrace_data,
- extract_tracestate_data,
- has_tracestate_enabled,
- has_tracing_enabled,
- is_valid_sample_rate,
- maybe_create_breadcrumbs_from_span,
-)
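
A minimal sketch of how the sampling precedence documented in `Transaction._set_initial_sampling_decision` above plays out from user code; the DSN is a placeholder and the snippet assumes only the public `sentry_sdk.init` / `start_transaction` API:

import sentry_sdk

def my_traces_sampler(sampling_context):
    # Rule 2: a traces_sampler may honor or ignore the parent decision.
    if sampling_context.get("parent_sampled") is not None:
        return float(sampling_context["parent_sampled"])
    return 0.25  # sample 25% of transactions with no parent decision

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sampler=my_traces_sampler,  # takes precedence over traces_sample_rate
    traces_sample_rate=1.0,            # rule 4: only consulted when no sampler is set
)

# Rule 1: an explicit `sampled` argument overrides everything else.
with sentry_sdk.start_transaction(op="task", name="nightly-job", sampled=True):
    pass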
diff --git a/sentry_sdk/tracing_utils.py b/sentry_sdk/tracing_utils.py
deleted file mode 100644
index d754da4..0000000
--- a/sentry_sdk/tracing_utils.py
+++ /dev/null
@@ -1,420 +0,0 @@
-import re
-import json
-import math
-
-from numbers import Real
-
-import sentry_sdk
-
-from sentry_sdk.utils import (
- capture_internal_exceptions,
- Dsn,
- logger,
- safe_str,
- to_base64,
- to_string,
- from_base64,
-)
-from sentry_sdk._compat import PY2
-from sentry_sdk._types import MYPY
-
-if PY2:
- from collections import Mapping
-else:
- from collections.abc import Mapping
-
-if MYPY:
- import typing
-
- from typing import Generator
- from typing import Optional
- from typing import Any
- from typing import Dict
- from typing import Union
-
-
-SENTRY_TRACE_REGEX = re.compile(
- "^[ \t]*" # whitespace
- "([0-9a-f]{32})?" # trace_id
- "-?([0-9a-f]{16})?" # span_id
- "-?([01])?" # sampled
- "[ \t]*$" # whitespace
-)
-
-# This is a normal base64 regex, modified to reflect the fact that we strip the
-# trailing = or == off
-base64_stripped = (
- # any of the characters in the base64 "alphabet", in multiples of 4
- "([a-zA-Z0-9+/]{4})*"
- # either nothing or 2 or 3 base64-alphabet characters (see
- # https://en.wikipedia.org/wiki/Base64#Decoding_Base64_without_padding for
- # why there's never only 1 extra character)
- "([a-zA-Z0-9+/]{2,3})?"
-)
-
-# comma-delimited list of entries of the form `xxx=yyy`
-tracestate_entry = "[^=]+=[^=]+"
-TRACESTATE_ENTRIES_REGEX = re.compile(
- # one or more xxxxx=yyyy entries
- "^({te})+"
- # each entry except the last must be followed by a comma
- "(,|$)".format(te=tracestate_entry)
-)
-
-# this doesn't check that the value is valid, just that there's something there
-# of the form `sentry=xxxx`
-SENTRY_TRACESTATE_ENTRY_REGEX = re.compile(
- # either sentry is the first entry or there's stuff immediately before it,
- # ending in a comma (this prevents matching something like `coolsentry=xxx`)
- "(?:^|.+,)"
- # sentry's part, not including the potential comma
- "(sentry=[^,]*)"
- # either there's a comma and another vendor's entry or we end
- "(?:,.+|$)"
-)
-
-
-class EnvironHeaders(Mapping): # type: ignore
- def __init__(
- self,
- environ, # type: typing.Mapping[str, str]
- prefix="HTTP_", # type: str
- ):
- # type: (...) -> None
- self.environ = environ
- self.prefix = prefix
-
- def __getitem__(self, key):
- # type: (str) -> Optional[Any]
- return self.environ[self.prefix + key.replace("-", "_").upper()]
-
- def __len__(self):
- # type: () -> int
- return sum(1 for _ in iter(self))
-
- def __iter__(self):
- # type: () -> Generator[str, None, None]
- for k in self.environ:
- if not isinstance(k, str):
- continue
-
- k = k.replace("-", "_").upper()
- if not k.startswith(self.prefix):
- continue
-
- yield k[len(self.prefix) :]
-
-
-class RecordSqlQueries:
- def __init__(
- self,
- hub, # type: sentry_sdk.Hub
- cursor, # type: Any
- query, # type: Any
- params_list, # type: Any
- paramstyle, # type: Optional[str]
- executemany, # type: bool
- ):
- # type: (...) -> None
- # TODO: Bring back capturing of params by default
- self._hub = hub
- if self._hub.client and self._hub.client.options["_experiments"].get(
- "record_sql_params", False
- ):
- if not params_list or params_list == [None]:
- params_list = None
-
- if paramstyle == "pyformat":
- paramstyle = "format"
- else:
- params_list = None
- paramstyle = None
-
- self._query = _format_sql(cursor, query)
-
- self._data = {}
- if params_list is not None:
- self._data["db.params"] = params_list
- if paramstyle is not None:
- self._data["db.paramstyle"] = paramstyle
- if executemany:
- self._data["db.executemany"] = True
-
- def __enter__(self):
- # type: () -> Span
- with capture_internal_exceptions():
- self._hub.add_breadcrumb(
- message=self._query, category="query", data=self._data
- )
-
- with self._hub.start_span(op="db", description=self._query) as span:
- for k, v in self._data.items():
- span.set_data(k, v)
- return span
-
- def __exit__(self, exc_type, exc_val, exc_tb):
- # type: (Any, Any, Any) -> None
- pass
-
-
-def has_tracing_enabled(options):
- # type: (Dict[str, Any]) -> bool
- """
- Returns True if either traces_sample_rate or traces_sampler is
- defined, False otherwise.
- """
-
- return bool(
- options.get("traces_sample_rate") is not None
- or options.get("traces_sampler") is not None
- )
-
-
-def is_valid_sample_rate(rate):
- # type: (Any) -> bool
- """
- Checks the given sample rate to make sure it is valid type and value (a
- boolean or a number between 0 and 1, inclusive).
- """
-
- # both booleans and NaN are instances of Real, so a) checking for Real
- # checks for the possibility of a boolean also, and b) we have to check
- # separately for NaN
- if not isinstance(rate, Real) or math.isnan(rate):
- logger.warning(
- "[Tracing] Given sample rate is invalid. Sample rate must be a boolean or a number between 0 and 1. Got {rate} of type {type}.".format(
- rate=rate, type=type(rate)
- )
- )
- return False
-
- # in case rate is a boolean, it will get cast to 1 if it's True and 0 if it's False
- rate = float(rate)
- if rate < 0 or rate > 1:
- logger.warning(
- "[Tracing] Given sample rate is invalid. Sample rate must be between 0 and 1. Got {rate}.".format(
- rate=rate
- )
- )
- return False
-
- return True
-
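
A quick sketch of which inputs `is_valid_sample_rate` accepts and rejects (expected return values in the trailing comments):

from sentry_sdk.tracing_utils import is_valid_sample_rate

is_valid_sample_rate(0.25)          # True
is_valid_sample_rate(True)          # True  - booleans are treated as 1.0 / 0.0
is_valid_sample_rate(1.5)           # False - outside [0, 1], logs a warning
is_valid_sample_rate(float("nan"))  # False - NaN is rejected explicitly
is_valid_sample_rate("0.5")         # False - strings are not Real numbers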
-
-def maybe_create_breadcrumbs_from_span(hub, span):
- # type: (sentry_sdk.Hub, Span) -> None
- if span.op == "redis":
- hub.add_breadcrumb(
- message=span.description, type="redis", category="redis", data=span._tags
- )
- elif span.op == "http":
- hub.add_breadcrumb(type="http", category="httplib", data=span._data)
- elif span.op == "subprocess":
- hub.add_breadcrumb(
- type="subprocess",
- category="subprocess",
- message=span.description,
- data=span._data,
- )
-
-
-def extract_sentrytrace_data(header):
- # type: (Optional[str]) -> typing.Mapping[str, Union[str, bool, None]]
- """
- Given a `sentry-trace` header string, return a dictionary of data.
- """
- trace_id = parent_span_id = parent_sampled = None
-
- if header:
- if header.startswith("00-") and header.endswith("-00"):
- header = header[3:-3]
-
- match = SENTRY_TRACE_REGEX.match(header)
-
- if match:
- trace_id, parent_span_id, sampled_str = match.groups()
-
- if trace_id:
- trace_id = "{:032x}".format(int(trace_id, 16))
- if parent_span_id:
- parent_span_id = "{:016x}".format(int(parent_span_id, 16))
- if sampled_str:
- parent_sampled = sampled_str != "0"
-
- return {
- "trace_id": trace_id,
- "parent_span_id": parent_span_id,
- "parent_sampled": parent_sampled,
- }
-
-
-def extract_tracestate_data(header):
- # type: (Optional[str]) -> typing.Mapping[str, Optional[str]]
- """
- Extracts the sentry tracestate value and any third-party data from the given
- tracestate header, returning a dictionary of data.
- """
- sentry_entry = third_party_entry = None
- before = after = ""
-
- if header:
- # find sentry's entry, if any
- sentry_match = SENTRY_TRACESTATE_ENTRY_REGEX.search(header)
-
- if sentry_match:
- sentry_entry = sentry_match.group(1)
-
- # remove the commas after the split so we don't end up with
- # `xxx=yyy,,zzz=qqq` (double commas) when we put them back together
- before, after = map(lambda s: s.strip(","), header.split(sentry_entry))
-
- # extract sentry's value from its entry and test to make sure it's
- # valid; if it isn't, discard the entire entry so that a new one
- # will be created
- sentry_value = sentry_entry.replace("sentry=", "")
- if not re.search("^{b64}$".format(b64=base64_stripped), sentry_value):
- sentry_entry = None
- else:
- after = header
-
- # if either part is invalid or empty, remove it before gluing them together
- third_party_entry = (
- ",".join(filter(TRACESTATE_ENTRIES_REGEX.search, [before, after])) or None
- )
-
- return {
- "sentry_tracestate": sentry_entry,
- "third_party_tracestate": third_party_entry,
- }
-
-
-def compute_tracestate_value(data):
- # type: (typing.Mapping[str, str]) -> str
- """
- Computes a new tracestate value using the given data.
-
- Note: Returns just the base64-encoded data, NOT the full `sentry=...`
- tracestate entry.
- """
-
- tracestate_json = json.dumps(data, default=safe_str)
-
- # Base64-encoded strings always come out with a length which is a multiple
- # of 4. In order to achieve this, the end is padded with one or more `=`
- # signs. Because the tracestate standard calls for using `=` signs between
- # vendor name and value (`sentry=xxx,dogsaregreat=yyy`), to avoid confusion
- # we strip the `=`
- return (to_base64(tracestate_json) or "").rstrip("=")
-
-
-def compute_tracestate_entry(span):
- # type: (Span) -> Optional[str]
- """
- Computes a new sentry tracestate for the span. Includes the `sentry=`.
-
- Will return `None` if there's no client and/or no DSN.
- """
- data = {}
-
- hub = span.hub or sentry_sdk.Hub.current
-
- client = hub.client
- scope = hub.scope
-
- if client and client.options.get("dsn"):
- options = client.options
- user = scope._user
-
- data = {
- "trace_id": span.trace_id,
- "environment": options["environment"],
- "release": options.get("release"),
- "public_key": Dsn(options["dsn"]).public_key,
- }
-
- if user and (user.get("id") or user.get("segment")):
- user_data = {}
-
- if user.get("id"):
- user_data["id"] = user["id"]
-
- if user.get("segment"):
- user_data["segment"] = user["segment"]
-
- data["user"] = user_data
-
- if span.containing_transaction:
- data["transaction"] = span.containing_transaction.name
-
- return "sentry=" + compute_tracestate_value(data)
-
- return None
-
-
-def reinflate_tracestate(encoded_tracestate):
- # type: (str) -> typing.Optional[Mapping[str, str]]
- """
- Given a sentry tracestate value in its encoded form, translate it back into
- a dictionary of data.
- """
- inflated_tracestate = None
-
- if encoded_tracestate:
- # Base64-encoded strings always come out with a length which is a
- # multiple of 4. In order to achieve this, the end is padded with one or
- # more `=` signs. Because the tracestate standard calls for using `=`
- # signs between vendor name and value (`sentry=xxx,dogsaregreat=yyy`),
- # to avoid confusion we strip the `=` when the data is initially
- # encoded. Python's decoding function requires they be put back.
- # Fortunately, it doesn't complain if there are too many, so we just
- # attach two `=` on spec (there will never be more than 2, see
- # https://en.wikipedia.org/wiki/Base64#Decoding_Base64_without_padding).
- tracestate_json = from_base64(encoded_tracestate + "==")
-
- try:
- assert tracestate_json is not None
- inflated_tracestate = json.loads(tracestate_json)
- except Exception as err:
- logger.warning(
- (
- "Unable to attach tracestate data to envelope header: {err}"
- + "\nTracestate value is {encoded_tracestate}"
- ).format(err=err, encoded_tracestate=encoded_tracestate),
- )
-
- return inflated_tracestate
-
-
-def _format_sql(cursor, sql):
- # type: (Any, str) -> Optional[str]
-
- real_sql = None
-
- # If we're using psycopg2, it could be that we're
- # looking at a query that uses Composed objects. Use psycopg2's mogrify
- # function to format the query. We lose per-parameter trimming but gain
- # accuracy in formatting.
- try:
- if hasattr(cursor, "mogrify"):
- real_sql = cursor.mogrify(sql)
- if isinstance(real_sql, bytes):
- real_sql = real_sql.decode(cursor.connection.encoding)
- except Exception:
- real_sql = None
-
- return real_sql or to_string(sql)
-
-
-def has_tracestate_enabled(span=None):
- # type: (Optional[Span]) -> bool
-
- client = ((span and span.hub) or sentry_sdk.Hub.current).client
- options = client and client.options
-
- return bool(options and options["_experiments"].get("propagate_tracestate"))
-
-
-# Circular imports
-
-if MYPY:
- from sentry_sdk.tracing import Span
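
The header helpers above are easiest to follow with concrete values. A hedged usage sketch; every header value below is invented for illustration:

from sentry_sdk.tracing_utils import (
    extract_sentrytrace_data,
    extract_tracestate_data,
    reinflate_tracestate,
)

# sentry-trace header: "<trace_id>-<span_id>-<sampled>"
print(extract_sentrytrace_data("771a43a4192642f0b136d5159a501700-1234567890abcdef-1"))
# {'trace_id': '771a43a4192642f0b136d5159a501700',
#  'parent_span_id': '1234567890abcdef',
#  'parent_sampled': True}

# tracestate header: sentry's base64 entry plus a third-party vendor entry
parts = extract_tracestate_data("sentry=eyJ0cmFjZV9pZCI6ICJhYmMifQ,other=t61rcWkgMzE")
print(parts["third_party_tracestate"])  # 'other=t61rcWkgMzE'
print(reinflate_tracestate(parts["sentry_tracestate"].replace("sentry=", "")))
# {'trace_id': 'abc'}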
diff --git a/sentry_sdk/transport.py b/sentry_sdk/transport.py
deleted file mode 100644
index fca6fa8..0000000
--- a/sentry_sdk/transport.py
+++ /dev/null
@@ -1,531 +0,0 @@
-from __future__ import print_function
-
-import io
-import urllib3 # type: ignore
-import certifi
-import gzip
-import time
-
-from datetime import datetime, timedelta
-from collections import defaultdict
-
-from sentry_sdk.utils import Dsn, logger, capture_internal_exceptions, json_dumps
-from sentry_sdk.worker import BackgroundWorker
-from sentry_sdk.envelope import Envelope, Item, PayloadRef
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Callable
- from typing import Dict
- from typing import Iterable
- from typing import Optional
- from typing import Tuple
- from typing import Type
- from typing import Union
- from typing import DefaultDict
-
- from urllib3.poolmanager import PoolManager # type: ignore
- from urllib3.poolmanager import ProxyManager
-
- from sentry_sdk._types import Event, EndpointType
-
- DataCategory = Optional[str]
-
-try:
- from urllib.request import getproxies
-except ImportError:
- from urllib import getproxies # type: ignore
-
-
-class Transport(object):
- """Baseclass for all transports.
-
- A transport is used to send an event to sentry.
- """
-
- parsed_dsn = None # type: Optional[Dsn]
-
- def __init__(
- self, options=None # type: Optional[Dict[str, Any]]
- ):
- # type: (...) -> None
- self.options = options
- if options and options["dsn"] is not None and options["dsn"]:
- self.parsed_dsn = Dsn(options["dsn"])
- else:
- self.parsed_dsn = None
-
- def capture_event(
- self, event # type: Event
- ):
- # type: (...) -> None
- """
- This gets invoked with the event dictionary when an event should
- be sent to sentry.
- """
- raise NotImplementedError()
-
- def capture_envelope(
- self, envelope # type: Envelope
- ):
- # type: (...) -> None
- """
- Send an envelope to Sentry.
-
- Envelopes are a data container format that can hold any type of data
- submitted to Sentry. We use it for transactions and sessions, but
- regular "error" events should go through `capture_event` for backwards
- compat.
- """
- raise NotImplementedError()
-
- def flush(
- self,
- timeout, # type: float
- callback=None, # type: Optional[Any]
- ):
- # type: (...) -> None
- """Wait `timeout` seconds for the current events to be sent out."""
- pass
-
- def kill(self):
- # type: () -> None
- """Forcefully kills the transport."""
- pass
-
- def record_lost_event(
- self,
- reason, # type: str
- data_category=None, # type: Optional[str]
- item=None, # type: Optional[Item]
- ):
- # type: (...) -> None
- """This increments a counter for event loss by reason and
- data category.
- """
- return None
-
- def __del__(self):
- # type: () -> None
- try:
- self.kill()
- except Exception:
- pass
-
-
-def _parse_rate_limits(header, now=None):
- # type: (Any, Optional[datetime]) -> Iterable[Tuple[DataCategory, datetime]]
- if now is None:
- now = datetime.utcnow()
-
- for limit in header.split(","):
- try:
- retry_after, categories, _ = limit.strip().split(":", 2)
- retry_after = now + timedelta(seconds=int(retry_after))
- for category in categories and categories.split(";") or (None,):
- yield category, retry_after
- except (LookupError, ValueError):
- continue
-
-
-class HttpTransport(Transport):
- """The default HTTP transport."""
-
- def __init__(
- self, options # type: Dict[str, Any]
- ):
- # type: (...) -> None
- from sentry_sdk.consts import VERSION
-
- Transport.__init__(self, options)
- assert self.parsed_dsn is not None
- self.options = options # type: Dict[str, Any]
- self._worker = BackgroundWorker(queue_size=options["transport_queue_size"])
- self._auth = self.parsed_dsn.to_auth("sentry.python/%s" % VERSION)
- self._disabled_until = {} # type: Dict[DataCategory, datetime]
- self._retry = urllib3.util.Retry()
- self._discarded_events = defaultdict(
- int
- ) # type: DefaultDict[Tuple[str, str], int]
- self._last_client_report_sent = time.time()
-
- self._pool = self._make_pool(
- self.parsed_dsn,
- http_proxy=options["http_proxy"],
- https_proxy=options["https_proxy"],
- ca_certs=options["ca_certs"],
- )
-
- from sentry_sdk import Hub
-
- self.hub_cls = Hub
-
- def record_lost_event(
- self,
- reason, # type: str
- data_category=None, # type: Optional[str]
- item=None, # type: Optional[Item]
- ):
- # type: (...) -> None
- if not self.options["send_client_reports"]:
- return
-
- quantity = 1
- if item is not None:
- data_category = item.data_category
- if data_category == "attachment":
- # a quantity of 0 is reported as 1 because we do not
- # want empty attachments to count as nothing lost.
- quantity = len(item.get_bytes()) or 1
- elif data_category is None:
- raise TypeError("data category not provided")
-
- self._discarded_events[data_category, reason] += quantity
-
- def _update_rate_limits(self, response):
- # type: (urllib3.HTTPResponse) -> None
-
- # Newer Sentry servers send more detailed rate-limit information. We
- # honor this header regardless of the status code and update our
- # internal rate limits accordingly.
- header = response.headers.get("x-sentry-rate-limits")
- if header:
- logger.warning("Rate-limited via x-sentry-rate-limits")
- self._disabled_until.update(_parse_rate_limits(header))
-
- # old sentries only communicate global rate limit hits via the
- # retry-after header on 429. This header can also be emitted on new
- # sentries if a proxy in front wants to globally slow things down.
- elif response.status == 429:
- logger.warning("Rate-limited via 429")
- self._disabled_until[None] = datetime.utcnow() + timedelta(
- seconds=self._retry.get_retry_after(response) or 60
- )
-
- def _send_request(
- self,
- body, # type: bytes
- headers, # type: Dict[str, str]
- endpoint_type="store", # type: EndpointType
- envelope=None, # type: Optional[Envelope]
- ):
- # type: (...) -> None
-
- def record_loss(reason):
- # type: (str) -> None
- if envelope is None:
- self.record_lost_event(reason, data_category="error")
- else:
- for item in envelope.items:
- self.record_lost_event(reason, item=item)
-
- headers.update(
- {
- "User-Agent": str(self._auth.client),
- "X-Sentry-Auth": str(self._auth.to_header()),
- }
- )
- try:
- response = self._pool.request(
- "POST",
- str(self._auth.get_api_url(endpoint_type)),
- body=body,
- headers=headers,
- )
- except Exception:
- self.on_dropped_event("network")
- record_loss("network_error")
- raise
-
- try:
- self._update_rate_limits(response)
-
- if response.status == 429:
- # If we hit a 429, something was rate limited, but we already
- # acted on it in `self._update_rate_limits`. Note that we do
- # not want to record event loss here, as Relay will already
- # have recorded an outcome.
- self.on_dropped_event("status_429")
- pass
-
- elif response.status >= 300 or response.status < 200:
- logger.error(
- "Unexpected status code: %s (body: %s)",
- response.status,
- response.data,
- )
- self.on_dropped_event("status_{}".format(response.status))
- record_loss("network_error")
- finally:
- response.close()
-
- def on_dropped_event(self, reason):
- # type: (str) -> None
- return None
-
- def _fetch_pending_client_report(self, force=False, interval=60):
- # type: (bool, int) -> Optional[Item]
- if not self.options["send_client_reports"]:
- return None
-
- if not (force or self._last_client_report_sent < time.time() - interval):
- return None
-
- discarded_events = self._discarded_events
- self._discarded_events = defaultdict(int)
- self._last_client_report_sent = time.time()
-
- if not discarded_events:
- return None
-
- return Item(
- PayloadRef(
- json={
- "timestamp": time.time(),
- "discarded_events": [
- {"reason": reason, "category": category, "quantity": quantity}
- for (
- (category, reason),
- quantity,
- ) in discarded_events.items()
- ],
- }
- ),
- type="client_report",
- )
-
- def _flush_client_reports(self, force=False):
- # type: (bool) -> None
- client_report = self._fetch_pending_client_report(force=force, interval=60)
- if client_report is not None:
- self.capture_envelope(Envelope(items=[client_report]))
-
- def _check_disabled(self, category):
- # type: (str) -> bool
- def _disabled(bucket):
- # type: (Any) -> bool
- ts = self._disabled_until.get(bucket)
- return ts is not None and ts > datetime.utcnow()
-
- return _disabled(category) or _disabled(None)
-
- def _send_event(
- self, event # type: Event
- ):
- # type: (...) -> None
-
- if self._check_disabled("error"):
- self.on_dropped_event("self_rate_limits")
- self.record_lost_event("ratelimit_backoff", data_category="error")
- return None
-
- body = io.BytesIO()
- with gzip.GzipFile(fileobj=body, mode="w") as f:
- f.write(json_dumps(event))
-
- assert self.parsed_dsn is not None
- logger.debug(
- "Sending event, type:%s level:%s event_id:%s project:%s host:%s"
- % (
- event.get("type") or "null",
- event.get("level") or "null",
- event.get("event_id") or "null",
- self.parsed_dsn.project_id,
- self.parsed_dsn.host,
- )
- )
- self._send_request(
- body.getvalue(),
- headers={"Content-Type": "application/json", "Content-Encoding": "gzip"},
- )
- return None
-
- def _send_envelope(
- self, envelope # type: Envelope
- ):
- # type: (...) -> None
-
- # remove all items from the envelope which are over quota
- new_items = []
- for item in envelope.items:
- if self._check_disabled(item.data_category):
- if item.data_category in ("transaction", "error", "default"):
- self.on_dropped_event("self_rate_limits")
- self.record_lost_event("ratelimit_backoff", item=item)
- else:
- new_items.append(item)
-
- # Since we're modifying the envelope here make a copy so that others
- # that hold references do not see their envelope modified.
- envelope = Envelope(headers=envelope.headers, items=new_items)
-
- if not envelope.items:
- return None
-
- # Since we're already sending an envelope here, check whether a
- # client report is pending and, if so, attach it to the envelope
- # scheduled for sending. This will currently typically attach the
- # client report to the most recent session update.
- client_report_item = self._fetch_pending_client_report(interval=30)
- if client_report_item is not None:
- envelope.items.append(client_report_item)
-
- body = io.BytesIO()
- with gzip.GzipFile(fileobj=body, mode="w") as f:
- envelope.serialize_into(f)
-
- assert self.parsed_dsn is not None
- logger.debug(
- "Sending envelope [%s] project:%s host:%s",
- envelope.description,
- self.parsed_dsn.project_id,
- self.parsed_dsn.host,
- )
-
- self._send_request(
- body.getvalue(),
- headers={
- "Content-Type": "application/x-sentry-envelope",
- "Content-Encoding": "gzip",
- },
- endpoint_type="envelope",
- envelope=envelope,
- )
- return None
-
- def _get_pool_options(self, ca_certs):
- # type: (Optional[Any]) -> Dict[str, Any]
- return {
- "num_pools": 2,
- "cert_reqs": "CERT_REQUIRED",
- "ca_certs": ca_certs or certifi.where(),
- }
-
- def _in_no_proxy(self, parsed_dsn):
- # type: (Dsn) -> bool
- no_proxy = getproxies().get("no")
- if not no_proxy:
- return False
- for host in no_proxy.split(","):
- host = host.strip()
- if parsed_dsn.host.endswith(host) or parsed_dsn.netloc.endswith(host):
- return True
- return False
-
- def _make_pool(
- self,
- parsed_dsn, # type: Dsn
- http_proxy, # type: Optional[str]
- https_proxy, # type: Optional[str]
- ca_certs, # type: Optional[Any]
- ):
- # type: (...) -> Union[PoolManager, ProxyManager]
- proxy = None
- no_proxy = self._in_no_proxy(parsed_dsn)
-
- # try HTTPS first
- if parsed_dsn.scheme == "https" and (https_proxy != ""):
- proxy = https_proxy or (not no_proxy and getproxies().get("https"))
-
- # maybe fallback to HTTP proxy
- if not proxy and (http_proxy != ""):
- proxy = http_proxy or (not no_proxy and getproxies().get("http"))
-
- opts = self._get_pool_options(ca_certs)
-
- if proxy:
- return urllib3.ProxyManager(proxy, **opts)
- else:
- return urllib3.PoolManager(**opts)
-
- def capture_event(
- self, event # type: Event
- ):
- # type: (...) -> None
- hub = self.hub_cls.current
-
- def send_event_wrapper():
- # type: () -> None
- with hub:
- with capture_internal_exceptions():
- self._send_event(event)
- self._flush_client_reports()
-
- if not self._worker.submit(send_event_wrapper):
- self.on_dropped_event("full_queue")
- self.record_lost_event("queue_overflow", data_category="error")
-
- def capture_envelope(
- self, envelope # type: Envelope
- ):
- # type: (...) -> None
- hub = self.hub_cls.current
-
- def send_envelope_wrapper():
- # type: () -> None
- with hub:
- with capture_internal_exceptions():
- self._send_envelope(envelope)
- self._flush_client_reports()
-
- if not self._worker.submit(send_envelope_wrapper):
- self.on_dropped_event("full_queue")
- for item in envelope.items:
- self.record_lost_event("queue_overflow", item=item)
-
- def flush(
- self,
- timeout, # type: float
- callback=None, # type: Optional[Any]
- ):
- # type: (...) -> None
- logger.debug("Flushing HTTP transport")
-
- if timeout > 0:
- self._worker.submit(lambda: self._flush_client_reports(force=True))
- self._worker.flush(timeout, callback)
-
- def kill(self):
- # type: () -> None
- logger.debug("Killing HTTP transport")
- self._worker.kill()
-
-
-class _FunctionTransport(Transport):
- def __init__(
- self, func # type: Callable[[Event], None]
- ):
- # type: (...) -> None
- Transport.__init__(self)
- self._func = func
-
- def capture_event(
- self, event # type: Event
- ):
- # type: (...) -> None
- self._func(event)
- return None
-
-
-def make_transport(options):
- # type: (Dict[str, Any]) -> Optional[Transport]
- ref_transport = options["transport"]
-
- # If no transport is given, we use the http transport class
- if ref_transport is None:
- transport_cls = HttpTransport # type: Type[Transport]
- elif isinstance(ref_transport, Transport):
- return ref_transport
- elif isinstance(ref_transport, type) and issubclass(ref_transport, Transport):
- transport_cls = ref_transport
- elif callable(ref_transport):
- return _FunctionTransport(ref_transport) # type: ignore
-
- # if a transport class is given only instantiate it if the dsn is not
- # empty or None
- if options["dsn"]:
- return transport_cls(options)
-
- return None
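
A small sketch of what `_parse_rate_limits` yields for a typical `x-sentry-rate-limits` header; the header value is invented for illustration:

from datetime import datetime
from sentry_sdk.transport import _parse_rate_limits

now = datetime.utcnow()
limits = dict(_parse_rate_limits("60:transaction:key, 2700:default;error:org", now=now))
# {'transaction': now + 60s, 'default': now + 2700s, 'error': now + 2700s}
# HttpTransport._update_rate_limits stores this mapping in _disabled_until,
# which _check_disabled later consults per data category.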
diff --git a/sentry_sdk/utils.py b/sentry_sdk/utils.py
deleted file mode 100644
index a2bc528..0000000
--- a/sentry_sdk/utils.py
+++ /dev/null
@@ -1,1014 +0,0 @@
-import base64
-import json
-import linecache
-import logging
-import os
-import sys
-import threading
-import subprocess
-import re
-
-from datetime import datetime
-
-import sentry_sdk
-from sentry_sdk._compat import urlparse, text_type, implements_str, PY2
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from types import FrameType
- from types import TracebackType
- from typing import Any
- from typing import Callable
- from typing import Dict
- from typing import ContextManager
- from typing import Iterator
- from typing import List
- from typing import Optional
- from typing import Set
- from typing import Tuple
- from typing import Union
- from typing import Type
-
- from sentry_sdk._types import ExcInfo, EndpointType
-
-
-epoch = datetime(1970, 1, 1)
-
-
-# The logger is created here but initialized in the debug support module
-logger = logging.getLogger("sentry_sdk.errors")
-
-MAX_STRING_LENGTH = 512
-MAX_FORMAT_PARAM_LENGTH = 128
-BASE64_ALPHABET = re.compile(r"^[a-zA-Z0-9/+=]*$")
-
-
-def json_dumps(data):
- # type: (Any) -> bytes
- """Serialize data into a compact JSON representation encoded as UTF-8."""
- return json.dumps(data, allow_nan=False, separators=(",", ":")).encode("utf-8")
-
-
-def _get_debug_hub():
- # type: () -> Optional[sentry_sdk.Hub]
- # This function is replaced by debug.py
- pass
-
-
-def get_default_release():
- # type: () -> Optional[str]
- """Try to guess a default release."""
- release = os.environ.get("SENTRY_RELEASE")
- if release:
- return release
-
- with open(os.path.devnull, "w+") as null:
- try:
- release = (
- subprocess.Popen(
- ["git", "rev-parse", "HEAD"],
- stdout=subprocess.PIPE,
- stderr=null,
- stdin=null,
- )
- .communicate()[0]
- .strip()
- .decode("utf-8")
- )
- except (OSError, IOError):
- pass
-
- if release:
- return release
-
- for var in (
- "HEROKU_SLUG_COMMIT",
- "SOURCE_VERSION",
- "CODEBUILD_RESOLVED_SOURCE_VERSION",
- "CIRCLE_SHA1",
- "GAE_DEPLOYMENT_ID",
- ):
- release = os.environ.get(var)
- if release:
- return release
- return None
-
-
-class CaptureInternalException(object):
- __slots__ = ()
-
- def __enter__(self):
- # type: () -> ContextManager[Any]
- return self
-
- def __exit__(self, ty, value, tb):
- # type: (Optional[Type[BaseException]], Optional[BaseException], Optional[TracebackType]) -> bool
- if ty is not None and value is not None:
- capture_internal_exception((ty, value, tb))
-
- return True
-
-
-_CAPTURE_INTERNAL_EXCEPTION = CaptureInternalException()
-
-
-def capture_internal_exceptions():
- # type: () -> ContextManager[Any]
- return _CAPTURE_INTERNAL_EXCEPTION
-
-
-def capture_internal_exception(exc_info):
- # type: (ExcInfo) -> None
- hub = _get_debug_hub()
- if hub is not None:
- hub._capture_internal_exception(exc_info)
-
-
-def to_timestamp(value):
- # type: (datetime) -> float
- return (value - epoch).total_seconds()
-
-
-def format_timestamp(value):
- # type: (datetime) -> str
- return value.strftime("%Y-%m-%dT%H:%M:%S.%fZ")
-
-
-def event_hint_with_exc_info(exc_info=None):
- # type: (Optional[ExcInfo]) -> Dict[str, Optional[ExcInfo]]
- """Creates a hint with the exc info filled in."""
- if exc_info is None:
- exc_info = sys.exc_info()
- else:
- exc_info = exc_info_from_error(exc_info)
- if exc_info[0] is None:
- exc_info = None
- return {"exc_info": exc_info}
-
-
-class BadDsn(ValueError):
- """Raised on invalid DSNs."""
-
-
-@implements_str
-class Dsn(object):
- """Represents a DSN."""
-
- def __init__(self, value):
- # type: (Union[Dsn, str]) -> None
- if isinstance(value, Dsn):
- self.__dict__ = dict(value.__dict__)
- return
- parts = urlparse.urlsplit(text_type(value))
-
- if parts.scheme not in (u"http", u"https"):
- raise BadDsn("Unsupported scheme %r" % parts.scheme)
- self.scheme = parts.scheme
-
- if parts.hostname is None:
- raise BadDsn("Missing hostname")
-
- self.host = parts.hostname
-
- if parts.port is None:
- self.port = self.scheme == "https" and 443 or 80
- else:
- self.port = parts.port
-
- if not parts.username:
- raise BadDsn("Missing public key")
-
- self.public_key = parts.username
- self.secret_key = parts.password
-
- path = parts.path.rsplit("/", 1)
-
- try:
- self.project_id = text_type(int(path.pop()))
- except (ValueError, TypeError):
- raise BadDsn("Invalid project in DSN (%r)" % (parts.path or "")[1:])
-
- self.path = "/".join(path) + "/"
-
- @property
- def netloc(self):
- # type: () -> str
- """The netloc part of a DSN."""
- rv = self.host
- if (self.scheme, self.port) not in (("http", 80), ("https", 443)):
- rv = "%s:%s" % (rv, self.port)
- return rv
-
- def to_auth(self, client=None):
- # type: (Optional[Any]) -> Auth
- """Returns the auth info object for this dsn."""
- return Auth(
- scheme=self.scheme,
- host=self.netloc,
- path=self.path,
- project_id=self.project_id,
- public_key=self.public_key,
- secret_key=self.secret_key,
- client=client,
- )
-
- def __str__(self):
- # type: () -> str
- return "%s://%s%s@%s%s%s" % (
- self.scheme,
- self.public_key,
- self.secret_key and "@" + self.secret_key or "",
- self.netloc,
- self.path,
- self.project_id,
- )
-
-
-class Auth(object):
- """Helper object that represents the auth info."""
-
- def __init__(
- self,
- scheme,
- host,
- project_id,
- public_key,
- secret_key=None,
- version=7,
- client=None,
- path="/",
- ):
- # type: (str, str, str, str, Optional[str], int, Optional[Any], str) -> None
- self.scheme = scheme
- self.host = host
- self.path = path
- self.project_id = project_id
- self.public_key = public_key
- self.secret_key = secret_key
- self.version = version
- self.client = client
-
- @property
- def store_api_url(self):
- # type: () -> str
- """Returns the API url for storing events.
-
- Deprecated: use get_api_url instead.
- """
- return self.get_api_url(type="store")
-
- def get_api_url(
- self, type="store" # type: EndpointType
- ):
- # type: (...) -> str
- """Returns the API url for storing events."""
- return "%s://%s%sapi/%s/%s/" % (
- self.scheme,
- self.host,
- self.path,
- self.project_id,
- type,
- )
-
- def to_header(self, timestamp=None):
- # type: (Optional[datetime]) -> str
- """Returns the auth header a string."""
- rv = [("sentry_key", self.public_key), ("sentry_version", self.version)]
- if timestamp is not None:
- rv.append(("sentry_timestamp", str(to_timestamp(timestamp))))
- if self.client is not None:
- rv.append(("sentry_client", self.client))
- if self.secret_key is not None:
- rv.append(("sentry_secret", self.secret_key))
- return u"Sentry " + u", ".join("%s=%s" % (key, value) for key, value in rv)
-
-
-class AnnotatedValue(object):
- __slots__ = ("value", "metadata")
-
- def __init__(self, value, metadata):
- # type: (Optional[Any], Dict[str, Any]) -> None
- self.value = value
- self.metadata = metadata
-
-
-if MYPY:
- from typing import TypeVar
-
- T = TypeVar("T")
- Annotated = Union[AnnotatedValue, T]
-
-
-def get_type_name(cls):
- # type: (Optional[type]) -> Optional[str]
- return getattr(cls, "__qualname__", None) or getattr(cls, "__name__", None)
-
-
-def get_type_module(cls):
- # type: (Optional[type]) -> Optional[str]
- mod = getattr(cls, "__module__", None)
- if mod not in (None, "builtins", "__builtins__"):
- return mod
- return None
-
-
-def should_hide_frame(frame):
- # type: (FrameType) -> bool
- try:
- mod = frame.f_globals["__name__"]
- if mod.startswith("sentry_sdk."):
- return True
- except (AttributeError, KeyError):
- pass
-
- for flag_name in "__traceback_hide__", "__tracebackhide__":
- try:
- if frame.f_locals[flag_name]:
- return True
- except Exception:
- pass
-
- return False
-
-
-def iter_stacks(tb):
- # type: (Optional[TracebackType]) -> Iterator[TracebackType]
- tb_ = tb # type: Optional[TracebackType]
- while tb_ is not None:
- if not should_hide_frame(tb_.tb_frame):
- yield tb_
- tb_ = tb_.tb_next
-
-
-def get_lines_from_file(
- filename, # type: str
- lineno, # type: int
- loader=None, # type: Optional[Any]
- module=None, # type: Optional[str]
-):
- # type: (...) -> Tuple[List[Annotated[str]], Optional[Annotated[str]], List[Annotated[str]]]
- context_lines = 5
- source = None
- if loader is not None and hasattr(loader, "get_source"):
- try:
- source_str = loader.get_source(module) # type: Optional[str]
- except (ImportError, IOError):
- source_str = None
- if source_str is not None:
- source = source_str.splitlines()
-
- if source is None:
- try:
- source = linecache.getlines(filename)
- except (OSError, IOError):
- return [], None, []
-
- if not source:
- return [], None, []
-
- lower_bound = max(0, lineno - context_lines)
- upper_bound = min(lineno + 1 + context_lines, len(source))
-
- try:
- pre_context = [
- strip_string(line.strip("\r\n")) for line in source[lower_bound:lineno]
- ]
- context_line = strip_string(source[lineno].strip("\r\n"))
- post_context = [
- strip_string(line.strip("\r\n"))
- for line in source[(lineno + 1) : upper_bound]
- ]
- return pre_context, context_line, post_context
- except IndexError:
- # the file may have changed since it was loaded into memory
- return [], None, []
-
-
-def get_source_context(
- frame, # type: FrameType
- tb_lineno, # type: int
-):
- # type: (...) -> Tuple[List[Annotated[str]], Optional[Annotated[str]], List[Annotated[str]]]
- try:
- abs_path = frame.f_code.co_filename # type: Optional[str]
- except Exception:
- abs_path = None
- try:
- module = frame.f_globals["__name__"]
- except Exception:
- return [], None, []
- try:
- loader = frame.f_globals["__loader__"]
- except Exception:
- loader = None
- lineno = tb_lineno - 1
- if lineno is not None and abs_path:
- return get_lines_from_file(abs_path, lineno, loader, module)
- return [], None, []
-
-
-def safe_str(value):
- # type: (Any) -> str
- try:
- return text_type(value)
- except Exception:
- return safe_repr(value)
-
-
-if PY2:
-
- def safe_repr(value):
- # type: (Any) -> str
- try:
- rv = repr(value).decode("utf-8", "replace")
-
- # At this point `rv` contains a bunch of literal escape codes, like
- # this (exaggerated example):
- #
- # u"\\x2f"
- #
- # But we want to show this string as:
- #
- # u"/"
- try:
- # unicode-escape does this job, but can only decode latin1. So we
- # attempt to encode in latin1.
- return rv.encode("latin1").decode("unicode-escape")
- except Exception:
- # Since usually strings aren't latin1 this can break. In those
- # cases we just give up.
- return rv
- except Exception:
- # If e.g. the call to `repr` already fails
- return u""
-
-
-else:
-
- def safe_repr(value):
- # type: (Any) -> str
- try:
- return repr(value)
- except Exception:
- return ""
-
-
-def filename_for_module(module, abs_path):
- # type: (Optional[str], Optional[str]) -> Optional[str]
- if not abs_path or not module:
- return abs_path
-
- try:
- if abs_path.endswith(".pyc"):
- abs_path = abs_path[:-1]
-
- base_module = module.split(".", 1)[0]
- if base_module == module:
- return os.path.basename(abs_path)
-
- base_module_path = sys.modules[base_module].__file__
- return abs_path.split(base_module_path.rsplit(os.sep, 2)[0], 1)[-1].lstrip(
- os.sep
- )
- except Exception:
- return abs_path
-
-
-def serialize_frame(frame, tb_lineno=None, with_locals=True):
- # type: (FrameType, Optional[int], bool) -> Dict[str, Any]
- f_code = getattr(frame, "f_code", None)
- if not f_code:
- abs_path = None
- function = None
- else:
- abs_path = frame.f_code.co_filename
- function = frame.f_code.co_name
- try:
- module = frame.f_globals["__name__"]
- except Exception:
- module = None
-
- if tb_lineno is None:
- tb_lineno = frame.f_lineno
-
- pre_context, context_line, post_context = get_source_context(frame, tb_lineno)
-
- rv = {
- "filename": filename_for_module(module, abs_path) or None,
- "abs_path": os.path.abspath(abs_path) if abs_path else None,
- "function": function or "",
- "module": module,
- "lineno": tb_lineno,
- "pre_context": pre_context,
- "context_line": context_line,
- "post_context": post_context,
- } # type: Dict[str, Any]
- if with_locals:
- rv["vars"] = frame.f_locals
-
- return rv
-
-
-def current_stacktrace(with_locals=True):
- # type: (bool) -> Any
- __tracebackhide__ = True
- frames = []
-
- f = sys._getframe() # type: Optional[FrameType]
- while f is not None:
- if not should_hide_frame(f):
- frames.append(serialize_frame(f, with_locals=with_locals))
- f = f.f_back
-
- frames.reverse()
-
- return {"frames": frames}
-
-
-def get_errno(exc_value):
- # type: (BaseException) -> Optional[Any]
- return getattr(exc_value, "errno", None)
-
-
-def single_exception_from_error_tuple(
- exc_type, # type: Optional[type]
- exc_value, # type: Optional[BaseException]
- tb, # type: Optional[TracebackType]
- client_options=None, # type: Optional[Dict[str, Any]]
- mechanism=None, # type: Optional[Dict[str, Any]]
-):
- # type: (...) -> Dict[str, Any]
- if exc_value is not None:
- errno = get_errno(exc_value)
- else:
- errno = None
-
- if errno is not None:
- mechanism = mechanism or {"type": "generic"}
- mechanism.setdefault("meta", {}).setdefault("errno", {}).setdefault(
- "number", errno
- )
-
- if client_options is None:
- with_locals = True
- else:
- with_locals = client_options["with_locals"]
-
- frames = [
- serialize_frame(tb.tb_frame, tb_lineno=tb.tb_lineno, with_locals=with_locals)
- for tb in iter_stacks(tb)
- ]
-
- rv = {
- "module": get_type_module(exc_type),
- "type": get_type_name(exc_type),
- "value": safe_str(exc_value),
- "mechanism": mechanism,
- }
-
- if frames:
- rv["stacktrace"] = {"frames": frames}
-
- return rv
-
-
-HAS_CHAINED_EXCEPTIONS = hasattr(Exception, "__suppress_context__")
-
-if HAS_CHAINED_EXCEPTIONS:
-
- def walk_exception_chain(exc_info):
- # type: (ExcInfo) -> Iterator[ExcInfo]
- exc_type, exc_value, tb = exc_info
-
- seen_exceptions = []
- seen_exception_ids = set() # type: Set[int]
-
- while (
- exc_type is not None
- and exc_value is not None
- and id(exc_value) not in seen_exception_ids
- ):
- yield exc_type, exc_value, tb
-
- # Avoid hashing random types we don't know anything
- # about. Use the list to keep a ref so that the `id` is
- # not used for another object.
- seen_exceptions.append(exc_value)
- seen_exception_ids.add(id(exc_value))
-
- if exc_value.__suppress_context__:
- cause = exc_value.__cause__
- else:
- cause = exc_value.__context__
- if cause is None:
- break
- exc_type = type(cause)
- exc_value = cause
- tb = getattr(cause, "__traceback__", None)
-
-
-else:
-
- def walk_exception_chain(exc_info):
- # type: (ExcInfo) -> Iterator[ExcInfo]
- yield exc_info
-
-
-def exceptions_from_error_tuple(
- exc_info, # type: ExcInfo
- client_options=None, # type: Optional[Dict[str, Any]]
- mechanism=None, # type: Optional[Dict[str, Any]]
-):
- # type: (...) -> List[Dict[str, Any]]
- exc_type, exc_value, tb = exc_info
- rv = []
- for exc_type, exc_value, tb in walk_exception_chain(exc_info):
- rv.append(
- single_exception_from_error_tuple(
- exc_type, exc_value, tb, client_options, mechanism
- )
- )
-
- rv.reverse()
-
- return rv
-
-
-def to_string(value):
- # type: (str) -> str
- try:
- return text_type(value)
- except UnicodeDecodeError:
- return repr(value)[1:-1]
-
-
-def iter_event_stacktraces(event):
- # type: (Dict[str, Any]) -> Iterator[Dict[str, Any]]
- if "stacktrace" in event:
- yield event["stacktrace"]
- if "threads" in event:
- for thread in event["threads"].get("values") or ():
- if "stacktrace" in thread:
- yield thread["stacktrace"]
- if "exception" in event:
- for exception in event["exception"].get("values") or ():
- if "stacktrace" in exception:
- yield exception["stacktrace"]
-
-
-def iter_event_frames(event):
- # type: (Dict[str, Any]) -> Iterator[Dict[str, Any]]
- for stacktrace in iter_event_stacktraces(event):
- for frame in stacktrace.get("frames") or ():
- yield frame
-
-
-def handle_in_app(event, in_app_exclude=None, in_app_include=None):
- # type: (Dict[str, Any], Optional[List[str]], Optional[List[str]]) -> Dict[str, Any]
- for stacktrace in iter_event_stacktraces(event):
- handle_in_app_impl(
- stacktrace.get("frames"),
- in_app_exclude=in_app_exclude,
- in_app_include=in_app_include,
- )
-
- return event
-
-
-def handle_in_app_impl(frames, in_app_exclude, in_app_include):
- # type: (Any, Optional[List[str]], Optional[List[str]]) -> Optional[Any]
- if not frames:
- return None
-
- any_in_app = False
- for frame in frames:
- in_app = frame.get("in_app")
- if in_app is not None:
- if in_app:
- any_in_app = True
- continue
-
- module = frame.get("module")
- if not module:
- continue
- elif _module_in_set(module, in_app_include):
- frame["in_app"] = True
- any_in_app = True
- elif _module_in_set(module, in_app_exclude):
- frame["in_app"] = False
-
- if not any_in_app:
- for frame in frames:
- if frame.get("in_app") is None:
- frame["in_app"] = True
-
- return frames
-
-
-def exc_info_from_error(error):
- # type: (Union[BaseException, ExcInfo]) -> ExcInfo
- if isinstance(error, tuple) and len(error) == 3:
- exc_type, exc_value, tb = error
- elif isinstance(error, BaseException):
- tb = getattr(error, "__traceback__", None)
- if tb is not None:
- exc_type = type(error)
- exc_value = error
- else:
- exc_type, exc_value, tb = sys.exc_info()
- if exc_value is not error:
- tb = None
- exc_value = error
- exc_type = type(error)
-
- else:
- raise ValueError("Expected Exception object to report, got %s!" % type(error))
-
- return exc_type, exc_value, tb
-
-
-def event_from_exception(
- exc_info, # type: Union[BaseException, ExcInfo]
- client_options=None, # type: Optional[Dict[str, Any]]
- mechanism=None, # type: Optional[Dict[str, Any]]
-):
- # type: (...) -> Tuple[Dict[str, Any], Dict[str, Any]]
- exc_info = exc_info_from_error(exc_info)
- hint = event_hint_with_exc_info(exc_info)
- return (
- {
- "level": "error",
- "exception": {
- "values": exceptions_from_error_tuple(
- exc_info, client_options, mechanism
- )
- },
- },
- hint,
- )
-
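
A short sketch (Python 3 syntax) of how a chained exception ends up in the payload built by `event_from_exception`; the exceptions themselves are arbitrary examples:

import sys
from sentry_sdk.utils import event_from_exception

try:
    try:
        int("not a number")
    except ValueError as err:
        raise RuntimeError("conversion failed") from err
except RuntimeError:
    event, hint = event_from_exception(sys.exc_info())

values = event["exception"]["values"]
print([v["type"] for v in values])  # ['ValueError', 'RuntimeError'] - oldest first
print(hint["exc_info"][0])          # <class 'RuntimeError'>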
-
-def _module_in_set(name, set):
- # type: (str, Optional[List[str]]) -> bool
- if not set:
- return False
- for item in set or ():
- if item == name or name.startswith(item + "."):
- return True
- return False
-
-
-def strip_string(value, max_length=None):
- # type: (str, Optional[int]) -> Union[AnnotatedValue, str]
- # TODO: read max_length from config
- if not value:
- return value
-
- if max_length is None:
- # This is intentionally not just the default such that one can patch `MAX_STRING_LENGTH` and affect `strip_string`.
- max_length = MAX_STRING_LENGTH
-
- length = len(value)
-
- if length > max_length:
- return AnnotatedValue(
- value=value[: max_length - 3] + u"...",
- metadata={
- "len": length,
- "rem": [["!limit", "x", max_length - 3, max_length]],
- },
- )
- return value
-
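
A sketch of the truncation behavior of `strip_string` with the default `MAX_STRING_LENGTH` of 512:

from sentry_sdk.utils import AnnotatedValue, strip_string

assert strip_string("hello") == "hello"  # short values pass through unchanged

clipped = strip_string("x" * 600)        # long values come back annotated
assert isinstance(clipped, AnnotatedValue)
assert len(clipped.value) == 512 and clipped.value.endswith("...")
assert clipped.metadata == {"len": 600, "rem": [["!limit", "x", 509, 512]]}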
-
-def _is_contextvars_broken():
- # type: () -> bool
- """
- Returns whether gevent/eventlet have patched the stdlib in a way where thread locals are now more "correct" than contextvars.
- """
- try:
- import gevent # type: ignore
- from gevent.monkey import is_object_patched # type: ignore
-
- # Get the MAJOR and MINOR version numbers of Gevent
- version_tuple = tuple(
- [int(part) for part in re.split(r"a|b|rc|\.", gevent.__version__)[:2]]
- )
- if is_object_patched("threading", "local"):
- # Gevent 20.9.0 depends on Greenlet 0.4.17, which natively handles switching
- # context vars when greenlets are switched, so Gevent 20.9.0+ is fine.
- # Ref: https://github.com/gevent/gevent/blob/83c9e2ae5b0834b8f84233760aabe82c3ba065b4/src/gevent/monkey.py#L604-L609
- # Gevent 20.5, which doesn't depend on Greenlet 0.4.17 and its native
- # contextvars support, is able to patch both thread locals and contextvars;
- # in that case, check whether contextvars are effectively patched.
- if (
- # Gevent 20.9.0+
- (sys.version_info >= (3, 7) and version_tuple >= (20, 9))
- # Gevent 20.5.0+ or Python < 3.7
- or (is_object_patched("contextvars", "ContextVar"))
- ):
- return False
-
- return True
- except ImportError:
- pass
-
- try:
- from eventlet.patcher import is_monkey_patched # type: ignore
-
- if is_monkey_patched("thread"):
- return True
- except ImportError:
- pass
-
- return False
-
-
-def _make_threadlocal_contextvars(local):
- # type: (type) -> type
- class ContextVar(object):
- # Super-limited impl of ContextVar
-
- def __init__(self, name):
- # type: (str) -> None
- self._name = name
- self._local = local()
-
- def get(self, default):
- # type: (Any) -> Any
- return getattr(self._local, "value", default)
-
- def set(self, value):
- # type: (Any) -> None
- self._local.value = value
-
- return ContextVar
-
-
-def _get_contextvars():
- # type: () -> Tuple[bool, type]
- """
- Figure out the "right" contextvars installation to use. Returns a
- `contextvars.ContextVar`-like class with a limited API.
-
- See https://docs.sentry.io/platforms/python/contextvars/ for more information.
- """
- if not _is_contextvars_broken():
- # aiocontextvars is a PyPI package that ensures that the contextvars
- # backport (also a PyPI package) works with asyncio under Python 3.6
- #
- # Import it if available.
- if sys.version_info < (3, 7):
- # `aiocontextvars` is absolutely required for functional
- # contextvars on Python 3.6.
- try:
- from aiocontextvars import ContextVar # noqa
-
- return True, ContextVar
- except ImportError:
- pass
- else:
- # On Python 3.7 contextvars are functional.
- try:
- from contextvars import ContextVar
-
- return True, ContextVar
- except ImportError:
- pass
-
- # Fall back to basic thread-local usage.
-
- from threading import local
-
- return False, _make_threadlocal_contextvars(local)
-
-
-HAS_REAL_CONTEXTVARS, ContextVar = _get_contextvars()
-
-CONTEXTVARS_ERROR_MESSAGE = """
-
-With asyncio/ASGI applications, the Sentry SDK requires a functional
-installation of `contextvars` to avoid leaking scope/context data across
-requests.
-
-Please refer to https://docs.sentry.io/platforms/python/contextvars/ for more information.
-"""
-
-
-def transaction_from_function(func):
- # type: (Callable[..., Any]) -> Optional[str]
- # Methods in Python 2
- try:
- return "%s.%s.%s" % (
- func.im_class.__module__, # type: ignore
- func.im_class.__name__, # type: ignore
- func.__name__,
- )
- except Exception:
- pass
-
- func_qualname = (
- getattr(func, "__qualname__", None) or getattr(func, "__name__", None) or None
- ) # type: Optional[str]
-
- if not func_qualname:
- # No idea what it is
- return None
-
- # Methods in Python 3
- # Functions
- # Classes
- try:
- return "%s.%s" % (func.__module__, func_qualname)
- except Exception:
- pass
-
- # Possibly a lambda
- return func_qualname
-
-
-disable_capture_event = ContextVar("disable_capture_event")
-
-
-class ServerlessTimeoutWarning(Exception):
- """Raised when a serverless method is about to reach its timeout."""
-
- pass
-
-
-class TimeoutThread(threading.Thread):
- """Creates a Thread which runs (sleeps) for a time duration equal to
- waiting_time and raises a custom ServerlessTimeout exception.
- """
-
- def __init__(self, waiting_time, configured_timeout):
- # type: (float, int) -> None
- threading.Thread.__init__(self)
- self.waiting_time = waiting_time
- self.configured_timeout = configured_timeout
- self._stop_event = threading.Event()
-
- def stop(self):
- # type: () -> None
- self._stop_event.set()
-
- def run(self):
- # type: () -> None
-
- self._stop_event.wait(self.waiting_time)
-
- if self._stop_event.is_set():
- return
-
- integer_configured_timeout = int(self.configured_timeout)
-
- # Setting up the exact integer value of configured time(in seconds)
- if integer_configured_timeout < self.configured_timeout:
- integer_configured_timeout = integer_configured_timeout + 1
-
- # Raising Exception after timeout duration is reached
- raise ServerlessTimeoutWarning(
- "WARNING : Function is expected to get timed out. Configured timeout duration = {} seconds.".format(
- integer_configured_timeout
- )
- )
-
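A hypothetical usage sketch of the TimeoutThread watchdog above (it assumes the class is in scope; do_work stands in for the real handler body): the warning is raised inside the watchdog thread once waiting_time elapses, and stop() cancels it when the work finishes early.

import time

def do_work():
    time.sleep(1)                      # stand-in for the real handler body

watchdog = TimeoutThread(waiting_time=2.5, configured_timeout=3)
watchdog.start()
try:
    do_work()
finally:
    watchdog.stop()                    # finished in time, so no warning is raised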
-
-def to_base64(original):
- # type: (str) -> Optional[str]
- """
- Convert a string to base64, via UTF-8. Returns None on invalid input.
- """
- base64_string = None
-
- try:
- utf8_bytes = original.encode("UTF-8")
- base64_bytes = base64.b64encode(utf8_bytes)
- base64_string = base64_bytes.decode("UTF-8")
- except Exception as err:
- logger.warning("Unable to encode {orig} to base64:".format(orig=original), err)
-
- return base64_string
-
-
-def from_base64(base64_string):
- # type: (str) -> Optional[str]
- """
- Convert a string from base64, via UTF-8. Returns None on invalid input.
- """
- utf8_string = None
-
- try:
- only_valid_chars = BASE64_ALPHABET.match(base64_string)
- assert only_valid_chars
-
- base64_bytes = base64_string.encode("UTF-8")
- utf8_bytes = base64.b64decode(base64_bytes)
- utf8_string = utf8_bytes.decode("UTF-8")
- except Exception as err:
- logger.warning(
- "Unable to decode {b64} from base64:".format(b64=base64_string), err
- )
-
- return utf8_string
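The two helpers above are thin UTF-8/base64 round-trip wrappers that swallow errors and return None; the plain standard-library equivalent of the happy path, for comparison:

import base64

encoded = base64.b64encode(u"有道".encode("utf-8")).decode("ascii")
decoded = base64.b64decode(encoded.encode("ascii")).decode("utf-8")
assert decoded == u"有道"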
diff --git a/sentry_sdk/worker.py b/sentry_sdk/worker.py
deleted file mode 100644
index a06fb8f..0000000
--- a/sentry_sdk/worker.py
+++ /dev/null
@@ -1,133 +0,0 @@
-import os
-import threading
-
-from time import sleep, time
-from sentry_sdk._compat import check_thread_support
-from sentry_sdk._queue import Queue, Full
-from sentry_sdk.utils import logger
-from sentry_sdk.consts import DEFAULT_QUEUE_SIZE
-
-from sentry_sdk._types import MYPY
-
-if MYPY:
- from typing import Any
- from typing import Optional
- from typing import Callable
-
-
-_TERMINATOR = object()
-
-
-class BackgroundWorker(object):
- def __init__(self, queue_size=DEFAULT_QUEUE_SIZE):
- # type: (int) -> None
- check_thread_support()
- self._queue = Queue(queue_size) # type: Queue
- self._lock = threading.Lock()
- self._thread = None # type: Optional[threading.Thread]
- self._thread_for_pid = None # type: Optional[int]
-
- @property
- def is_alive(self):
- # type: () -> bool
- if self._thread_for_pid != os.getpid():
- return False
- if not self._thread:
- return False
- return self._thread.is_alive()
-
- def _ensure_thread(self):
- # type: () -> None
- if not self.is_alive:
- self.start()
-
- def _timed_queue_join(self, timeout):
- # type: (float) -> bool
- deadline = time() + timeout
- queue = self._queue
-
- queue.all_tasks_done.acquire()
-
- try:
- while queue.unfinished_tasks:
- delay = deadline - time()
- if delay <= 0:
- return False
- queue.all_tasks_done.wait(timeout=delay)
-
- return True
- finally:
- queue.all_tasks_done.release()
-
- def start(self):
- # type: () -> None
- with self._lock:
- if not self.is_alive:
- self._thread = threading.Thread(
- target=self._target, name="raven-sentry.BackgroundWorker"
- )
- self._thread.daemon = True
- self._thread.start()
- self._thread_for_pid = os.getpid()
-
- def kill(self):
- # type: () -> None
- """
- Kill worker thread. Returns immediately. Not useful for
- waiting on shutdown for events, use `flush` for that.
- """
- logger.debug("background worker got kill request")
- with self._lock:
- if self._thread:
- try:
- self._queue.put_nowait(_TERMINATOR)
- except Full:
- logger.debug("background worker queue full, kill failed")
-
- self._thread = None
- self._thread_for_pid = None
-
- def flush(self, timeout, callback=None):
- # type: (float, Optional[Any]) -> None
- logger.debug("background worker got flush request")
- with self._lock:
- if self.is_alive and timeout > 0.0:
- self._wait_flush(timeout, callback)
- logger.debug("background worker flushed")
-
- def _wait_flush(self, timeout, callback):
- # type: (float, Optional[Any]) -> None
- initial_timeout = min(0.1, timeout)
- if not self._timed_queue_join(initial_timeout):
- pending = self._queue.qsize() + 1
- logger.debug("%d event(s) pending on flush", pending)
- if callback is not None:
- callback(pending, timeout)
-
- if not self._timed_queue_join(timeout - initial_timeout):
- pending = self._queue.qsize() + 1
- logger.error("flush timed out, dropped %s events", pending)
-
- def submit(self, callback):
- # type: (Callable[[], None]) -> bool
- self._ensure_thread()
- try:
- self._queue.put_nowait(callback)
- return True
- except Full:
- return False
-
- def _target(self):
- # type: () -> None
- while True:
- callback = self._queue.get()
- try:
- if callback is _TERMINATOR:
- break
- try:
- callback()
- except Exception:
- logger.error("Failed processing job", exc_info=True)
- finally:
- self._queue.task_done()
- sleep(0)
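The core of the deleted worker is _timed_queue_join: instead of blocking forever, it waits on the queue's all-tasks-done condition until a deadline. A self-contained sketch of the same pattern against CPython's queue.Queue, whose all_tasks_done / unfinished_tasks internals mirror the vendored sentry_sdk._queue, might look like this (internals, so a sketch rather than a public API):

import time
import queue

def timed_join(q, timeout):
    # Returns True if every queued task was marked done before the deadline.
    deadline = time.time() + timeout
    with q.all_tasks_done:             # the Condition used by Queue.join()
        while q.unfinished_tasks:
            remaining = deadline - time.time()
            if remaining <= 0:
                return False
            q.all_tasks_done.wait(timeout=remaining)
        return True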
diff --git a/splitargs.py b/splitargs.py
index 90235c1..f6ea59f 100755
--- a/splitargs.py
+++ b/splitargs.py
@@ -23,20 +23,8 @@ def get_args(wf):
pronounce = arg_array[4]
operation = arg_array[5]
- # 是否有更新
- if operation == 'update_now':
- wf.start_update()
- return
- elif operation == 'update_with_url':
- import webbrowser
- url = "https://github.com/whyliam/whyliam.workflows.youdao/releases"
- webbrowser.open(url)
- return
- elif operation == 'update_next_time':
- return
-
# 是否有错误
- elif operation == 'error':
+ if operation == 'error':
import webbrowser
url = "https://blog.naaln.com/2017/04/alfred-youdao-intro/"
webbrowser.open(url)
@@ -96,8 +84,5 @@ def get_args(wf):
if __name__ == '__main__':
- wf = Workflow3(update_settings={
- 'github_slug': 'whyliam/whyliam.workflows.youdao',
- 'frequency': 0
- })
+ wf = Workflow3()
sys.exit(wf.run(get_args))
diff --git a/version b/version
index 56fea8a..a0cd9f0 100755
--- a/version
+++ b/version
@@ -1 +1 @@
-3.0.0
\ No newline at end of file
+3.1.0
\ No newline at end of file
diff --git a/whyliam.workflows.youdao.alfredworkflow b/whyliam.workflows.youdao.alfredworkflow
index 5048198..89e3955 100644
Binary files a/whyliam.workflows.youdao.alfredworkflow and b/whyliam.workflows.youdao.alfredworkflow differ
diff --git a/youdao.py b/youdao.py
index a4000e1..c150d02 100755
--- a/youdao.py
+++ b/youdao.py
@@ -1,7 +1,6 @@
# -*- coding: utf-8 -*-
from workflow import Workflow3
-import sentry_sdk
import os
import json
import uuid
@@ -10,47 +9,145 @@
import sys
import random
-YOUDAO_DEFAULT_KEYFROM = ('whyliam-wf-1', 'whyliam-wf-2', 'whyliam-wf-3',
- 'whyliam-wf-4', 'whyliam-wf-5', 'whyliam-wf-6',
- 'whyliam-wf-7', 'whyliam-wf-8', 'whyliam-wf-9',
- 'whyliam-wf-10', 'whyliam-wf-11')
-
-YOUDAO_DEFAULT_KEY = (2002493135, 2002493136, 2002493137,
- 2002493138, 2002493139, 2002493140,
- 2002493141, 2002493142, 2002493143,
- 1947745089, 1947745090)
-
ERRORCODE_DICT = {
"20": "要翻译的文本过长",
"30": "无法进行有效的翻译",
"40": "不支持的语言类型",
"50": "无效的key",
"60": "无词典结果,仅在获取词典结果生效",
- "101": "缺少必填的参数,出现这个情况还可能是et的值和实际加密方式不对应",
+ "101": "缺少必填的参数,首先确保必填参数齐全,然后确认参数书写是否正确。",
"102": "不支持的语言类型",
"103": "翻译文本过长",
"104": "不支持的API类型",
"105": "不支持的签名类型",
"106": "不支持的响应类型",
"107": "不支持的传输加密类型",
- "108": "appKey无效,注册账号, 登录后台创建应用和实例并完成绑定,\
- 可获得应用ID和密钥等信息,其中应用ID就是appKey(注意不是应用密钥)",
+ "108": "应用ID无效,注册账号,登录后台创建应用并完成绑定,可获得应用ID和应用密钥等信息",
"109": "batchLog格式不正确",
- "110": "无相关服务的有效实例",
+ "110": "无相关服务的有效应用,应用没有绑定服务应用,可以新建服务应用。注:某些服务的翻译结果发音需要tts服务,需要在控制台创建语音合成服务绑定应用后方能使用。",
"111": "开发者账号无效",
+ "112": "请求服务无效",
"113": "q不能为空",
+ "114": "不支持的图片传输方式",
+ "116": "strict字段取值无效,请参考文档填写正确参数值",
"201": "解密失败,可能为DES,BASE64,URLDecode的错误",
- "202": "签名检验失败",
+ "202": "签名检验失败,如果确认应用ID和应用密钥的正确性,仍返回202,一般是编码问题。请确保翻译文本 q 为UTF-8编码.",
"203": "访问IP地址不在可访问IP列表",
- "205": "请求的接口与应用的平台类型不一致,如有疑问请参考[入门指南]",
+ "205": "请求的接口与应用的平台类型不一致,确保接入方式(Android SDK、IOS SDK、API)与创建的应用平台类型一致。如有疑问请参考入门指南",
"206": "因为时间戳无效导致签名校验失败",
"207": "重放请求",
"301": "辞典查询失败",
"302": "翻译查询失败",
"303": "服务端的其它异常",
- "401": "账户已经欠费停",
+ "304": "会话闲置太久超时",
+ "308": "rejectFallback参数错误",
+ "309": "domain参数错误",
+ "310": "未开通领域翻译服务",
+ "401": "账户已经欠费,请进行账户充值",
+ "402": "offlinesdk不可用",
"411": "访问频率受限,请稍后访问",
"412": "长请求过于频繁,请稍后访问",
+ "1001": "无效的OCR类型",
+ "1002": "不支持的OCR image类型",
+ "1003": "不支持的OCR Language类型",
+ "1004": "识别图片过大",
+ "1201": "图片base64解密失败",
+ "1301": "OCR段落识别失败",
+ "1411": "访问频率受限",
+ "1412": "超过最大识别字节数",
+ "2003": "不支持的语言识别Language类型",
+ "2004": "合成字符过长",
+ "2005": "不支持的音频文件类型",
+ "2006": "不支持的发音类型",
+ "2201": "解密失败",
+ "2301": "服务的异常",
+ "2411": "访问频率受限,请稍后访问",
+ "2412": "超过最大请求字符数",
+ "3001": "不支持的语音格式",
+ "3002": "不支持的语音采样率",
+ "3003": "不支持的语音声道",
+ "3004": "不支持的语音上传类型",
+ "3005": "不支持的语言类型",
+ "3006": "不支持的识别类型",
+ "3007": "识别音频文件过大",
+ "3008": "识别音频时长过长",
+ "3009": "不支持的音频文件类型",
+ "3010": "不支持的发音类型",
+ "3201": "解密失败",
+ "3301": "语音识别失败",
+ "3302": "语音翻译失败",
+ "3303": "服务的异常",
+ "3411": "访问频率受限,请稍后访问",
+ "3412": "超过最大请求字符数",
+ "4001": "不支持的语音识别格式",
+ "4002": "不支持的语音识别采样率",
+ "4003": "不支持的语音识别声道",
+ "4004": "不支持的语音上传类型",
+ "4005": "不支持的语言类型",
+ "4006": "识别音频文件过大",
+ "4007": "识别音频时长过长",
+ "4201": "解密失败",
+ "4301": "语音识别失败",
+ "4303": "服务的异常",
+ "4411": "访问频率受限,请稍后访问",
+ "4412": "超过最大请求时长",
+ "5001": "无效的OCR类型",
+ "5002": "不支持的OCR image类型",
+ "5003": "不支持的语言类型",
+ "5004": "识别图片过大",
+ "5005": "不支持的图片类型",
+ "5006": "文件为空",
+ "5201": "解密错误,图片base64解密失败",
+ "5301": "OCR段落识别失败",
+ "5411": "访问频率受限",
+ "5412": "超过最大识别流量",
+ "9001": "不支持的语音格式",
+ "9002": "不支持的语音采样率",
+ "9003": "不支持的语音声道",
+ "9004": "不支持的语音上传类型",
+ "9005": "不支持的语音识别 Language类型",
+ "9301": "ASR识别失败",
+ "9303": "服务器内部错误",
+ "9411": "访问频率受限(超过最大调用次数)",
+ "9412": "超过最大处理语音长度",
+ "10001": "无效的OCR类型",
+ "10002": "不支持的OCR image类型",
+ "10004": "识别图片过大",
+ "10201": "图片base64解密失败",
+ "10301": "OCR段落识别失败",
+ "10411": "访问频率受限",
+ "10412": "超过最大识别流量",
+ "11001": "不支持的语音识别格式",
+ "11002": "不支持的语音识别采样率",
+ "11003": "不支持的语音识别声道",
+ "11004": "不支持的语音上传类型",
+ "11005": "不支持的语言类型",
+ "11006": "识别音频文件过大",
+ "11007": "识别音频时长过长,最大支持30s",
+ "11201": "解密失败",
+ "11301": "语音识别失败",
+ "11303": "服务的异常",
+ "11411": "访问频率受限,请稍后访问",
+ "11412": "超过最大请求时长",
+ "12001": "图片尺寸过大",
+ "12002": "图片base64解密失败",
+ "12003": "引擎服务器返回错误",
+ "12004": "图片为空",
+ "12005": "不支持的识别图片类型",
+ "12006": "图片无匹配结果",
+ "13001": "不支持的角度类型",
+ "13002": "不支持的文件类型",
+ "13003": "表格识别图片过大",
+ "13004": "文件为空",
+ "13301": "表格识别失败",
+ "15001": "需要图片",
+ "15002": "图片过大(1M)",
+ "15003": "服务调用失败",
+ "17001": "需要图片",
+ "17002": "图片过大(1M)",
+ "17003": "识别类型未找到",
+ "17004": "不支持的识别类型",
+ "17005": "服务调用失败",
"500": "有道翻译失败"
}
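The expanded table is only consumed further down in main(): the code returned by the API is looked up as a string and shown as an Alfred result row, while unknown codes fall through to the web-search fallback. A minimal sketch of that lookup (describe_error is illustrative, not a function in the workflow):

def describe_error(error_code):
    code = str(error_code)
    return ERRORCODE_DICT.get(code, u"未知错误码 " + code)

print(describe_error(108))    # 应用ID无效,注册账号,登录后台创建应用并完成绑定…
print(describe_error("999"))  # 未知错误码 999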
@@ -64,15 +161,7 @@
QUERY_LANGUAGE = 'EN2zh-CHS'
-def init_sentry():
- # 收集错误信息
- if os.getenv('sentry', 'False').strip():
- sentry_sdk.init(
- "https://4d5a5b1f2e68484da9edd9076b86e9b7@sentry.io/1500348")
- with sentry_sdk.configure_scope() as scope:
- user_id = get_user_id()
- scope.user = {"id": user_id}
- scope.set_tag("version", str(wf.version))
+
def get_user_id():
@@ -83,13 +172,6 @@ def get_user_id():
return user_id
-def sentry_message(errorCode, msg):
- if os.getenv('sentry', 'False').strip():
- with sentry_sdk.configure_scope() as scope:
- scope.set_tag("errorCode", errorCode)
- sentry_sdk.capture_message(msg)
-
-
def get_youdao_url(query):
# 构建有道翻译URL
zhiyun_id = os.getenv('zhiyun_id', '').strip()
@@ -97,25 +179,7 @@ def get_youdao_url(query):
if zhiyun_id and zhiyun_key:
url = get_youdao_new_url(query, zhiyun_id, zhiyun_key)
else:
- youdao_keyfrom = os.getenv('youdao_keyfrom', '').strip()
- youdao_key = os.getenv('youdao_key', '').strip()
- if not youdao_keyfrom or not youdao_key:
- i = random.randrange(0, 11, 1)
- youdao_keyfrom = YOUDAO_DEFAULT_KEYFROM[i]
- youdao_key = YOUDAO_DEFAULT_KEY[i]
- url = get_youdao_old_url(query, youdao_keyfrom, youdao_key)
- wf.logger.debug(url)
- return url
-
-
-def get_youdao_old_url(query, youdao_keyfrom, youdao_key):
- import urllib.parse
-
- query = urllib.parse.quote(str(query))
- url = 'http://fanyi.youdao.com/openapi.do?' + \
- 'keyfrom=' + str(youdao_keyfrom) + \
- '&key=' + str(youdao_key) + \
- '&type=data&doctype=json&version=1.1&q=' + query
+ url = ''
return url
@@ -160,6 +224,10 @@ def fetch_translation(query):
# 获取翻译数据
url = get_youdao_url(query)
+ if url == '':
+ rt = {}
+ rt['errorCode'] = "108"
+ return rt
try:
data = request.urlopen(url).read()
rt = json.loads(data)
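Taken together with the get_youdao_url change above, the path for a user who has not configured their own credentials is now: no zhiyun_id/zhiyun_key, so an empty URL, so a synthetic error 108, which surfaces the 「应用ID无效…」 row from ERRORCODE_DICT instead of silently falling back to the removed shared default keys. A rough sketch of that flow (placeholder endpoint, simplified names):

import os

def build_url(query):
    zhiyun_id = os.getenv('zhiyun_id', '').strip()
    zhiyun_key = os.getenv('zhiyun_key', '').strip()
    if not (zhiyun_id and zhiyun_key):
        return ''                      # same signal as get_youdao_url()
    return 'https://example.invalid/api?q=' + query   # placeholder, not the real signed URL

def fetch(query):
    url = build_url(query)
    if url == '':
        return {'errorCode': '108'}    # prompts the user to configure their own key
    return {'errorCode': '0'}          # real code calls request.urlopen(url) here

print(fetch('hello'))                  # {'errorCode': '108'} until keys are set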
@@ -192,31 +260,6 @@ def get_history_data():
except Exception as e:
pass
-
-def is_expired():
- # 检查更新,随机检测
- if random.random() < 0.01 and wf.update_available:
- arg = get_arg_str('', '', operation='update_now')
- wf.add_item(
- title='马上更新(点击后请打开 Alfred 的 Preference 完成更新)',
- subtitle='有新版本更新', arg=arg,
- valid=True, icon=ICON_UPDATE)
-
- arg = get_arg_str('', '', operation='update_with_url')
- wf.add_item(
- title='手动更新', subtitle='有新版本更新', arg=arg,
- valid=True, icon=ICON_ERROR)
-
- arg = get_arg_str('', '', operation='update_next_time')
- wf.add_item(
- title='暂不更新', subtitle='有新版本更新', arg=arg,
- valid=True, icon=ICON_ERROR)
-
- wf.send_feedback()
- return True
- return False
-
-
def get_query_language(query):
import re
global QUERY_LANGUAGE
@@ -315,9 +358,6 @@ def add_web_translation(query, rt):
def main(wf):
- if is_expired():
- return
-
query = wf.args[0].strip()
if query == "*":
@@ -328,9 +368,6 @@ def main(wf):
errorCode = str(rt.get("errorCode"))
if errorCode in ERRORCODE_DICT:
- if errorCode == "500":
- sentry_message(errorCode, ERRORCODE_DICT[errorCode])
-
arg = get_arg_str('', '', operation='error')
wf.add_item(
title=errorCode + " " + ERRORCODE_DICT[errorCode],
@@ -345,7 +382,6 @@ def main(wf):
add_web_translation(query, rt)
else:
- sentry_message(errorCode, '有道也翻译不出来了')
title = '有道也翻译不出来了'
subtitle = '尝试一下去网站搜索'
arg = get_arg_str(query, '')
@@ -356,9 +392,5 @@ def main(wf):
if __name__ == '__main__':
- wf = Workflow3(update_settings={
- 'github_slug': 'whyliam/whyliam.workflows.youdao',
- 'frequency': 7
- })
- init_sentry()
+ wf = Workflow3()
sys.exit(wf.run(main))