From aebf47f5b758bea5062af373cf1148955bac71e6 Mon Sep 17 00:00:00 2001
From: Timothy Pansino <11214426+TimPansino@users.noreply.github.com>
Date: Thu, 11 Jan 2024 16:14:43 -0800
Subject: [PATCH] Logging Attributes (#1033)

* Log Forwarding User Attributes (#682)

* Add context data setting

Co-authored-by: Uma Annamalai
Co-authored-by: Hannah Stepanek
Co-authored-by: Lalleh Rafeei

* Update record_log_event signature with attributes
* Logging attributes initial implementation
* Fix settings attribute error
* Update logging instrumentation with attributes
* Update log handler API
* Add loguru support for extra attrs
* Add more explicit messaging to validator
* Expanding testing for record_log_event
* Expand record log event testing
* Fix settings typo
* Remove missing loguru attributes from test
* Adjust safe log attr encoding
* Correct py2 issues
* Fix missing record attrs in logging.

Co-authored-by: Uma Annamalai
Co-authored-by: Hannah Stepanek
Co-authored-by: Lalleh Rafeei
Co-authored-by: Hannah Stepanek

* Log Attribute Filtering (#1008)

* Expand validator for log events
* Add settings for context data filtering
* Add attribute filtering for log events
* Linting
* Apply suggestions from code review
* Remove none check on attributes

* Squashed commit of the following:

commit 3962f54d91bef1980523f40eb4649ef354634396
Author: Uma Annamalai
Date:   Thu Jan 4 12:50:58 2024 -0800

    Remove case sensitive check in ASGIBrowserMiddleware check. (#1017)

    * Remove case sensitive check in should_insert_html.
    * [Mega-Linter] Apply linters fixes
    * Remove header decoding.

    ---------

    Co-authored-by: umaannamalai

commit c3314aeac97b21615252f18c98384840a47db06f
Author: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com>
Date:   Tue Jan 2 17:17:20 2024 -0800

    Temporarily pin hypercorn version in tests (#1021)

    * Temporarily pin hypercorn to <0.16
    * Temporarily pin hypercorn to <0.16
    * Add comment to tox.ini

    ---------

    Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

commit 13571451da8e48a8a2ea96d110e45b8e4ef537e3
Author: Uma Annamalai
Date:   Tue Jan 2 16:17:08 2024 -0800

    Drop py27 from memcache testing. (#1018)

commit 23f969fcfa9e1bf52e80766be0786152229cd43c
Author: Timothy Pansino <11214426+TimPansino@users.noreply.github.com>
Date:   Wed Dec 20 17:01:50 2023 -0800

    Nonced CSP Support (#998)

    * Add nonce to CSP in browser agent
    * Adjust nonce position
    * Add testing for browser timing nonces

commit 8bfd2b788b222534639523f294577d724a3be5bb
Author: Uma Annamalai
Date:   Mon Dec 18 13:58:10 2023 -0800

    Remove RPM config workflow. (#1007)

* Add Dictionary Log Message Support (#1014)

* Add tests for logging's json logging
* Upgrade record_log_event to handle dict logging
* Update logging to capture dict messages
* Add attributes for dict log messages
* Implementation of JSON message filtering
* Correct attributes only log behavior
* Testing for logging attributes
* Add logging context test for py2
* Logically separate attribute tests
* Clean out imports
* Fix failing tests
* Remove logging instrumentation changes for new PR
* Add test for record log event edge cases
* Update record_log_event for code review
* Fix truncation
* Move safe_json_encode back to api.log as it's unused elsewhere
* Black formatting
* Add missing import
* Fixup warning message

---------

Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

* Logging Attribute Instrumentation (#1015)

* Add tests for logging's json logging
* Upgrade record_log_event to handle dict logging
* Update logging to capture dict messages
* Add attributes for dict log messages
* Implementation of JSON message filtering
* Correct attributes only log behavior
* Testing for logging attributes
* Add logging context test for py2
* Logically separate attribute tests
* Clean out imports
* Fix failing tests
* Linting
* Ignore path hash
* Fix linter errors
* Fix linting issues
* Apply suggestions from code review

* StructLog Attribute Instrumentation (#1026)

* Add tests for logging's json logging
* Upgrade record_log_event to handle dict logging
* Update logging to capture dict messages
* Add attributes for dict log messages
* Implementation of JSON message filtering
* Correct attributes only log behavior
* Testing for logging attributes
* Add logging context test for py2
* Logically separate attribute tests
* Clean out imports
* Fix failing tests
* Structlog cleanup
* Attempting list instrumentation
* Structlog attributes support

Co-authored-by: Lalleh Rafeei
Co-authored-by: Uma Annamalai

* Remove other frameworks changes
* Bump tests
* Change cache to lru cache
* Linting
* Remove TODO
* Remove unnecessary check

---------

Co-authored-by: Lalleh Rafeei
Co-authored-by: Uma Annamalai

* Loguru Attribute Instrumentation (#1025)

* Add tests for logging's json logging
* Upgrade record_log_event to handle dict logging
* Update logging to capture dict messages
* Add attributes for dict log messages
* Implementation of JSON message filtering
* Correct attributes only log behavior
* Testing for logging attributes
* Add logging context test for py2
* Logically separate attribute tests
* Clean out imports
* Fix failing tests
* Structlog cleanup
* Attempting list instrumentation
* Structlog attributes support

Co-authored-by: Lalleh Rafeei
Co-authored-by: Uma Annamalai

* Loguru instrumentation refactor
* New attribute testing
* Move exception settings
* Clean up testing
* Remove unneeded option
* Remove other framework changes
* [Mega-Linter] Apply linters fixes
* Bump tests

---------

Co-authored-by: Lalleh Rafeei
Co-authored-by: Uma Annamalai
Co-authored-by: TimPansino
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>

* Temporarily pin starlette tests

* Update web_transaction.py

---------

Co-authored-by: Uma Annamalai
Co-authored-by: Hannah Stepanek
Co-authored-by: Lalleh Rafeei
Co-authored-by: Hannah Stepanek
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
Co-authored-by: TimPansino
---
 newrelic/api/application.py                   |   7 +-
 newrelic/api/log.py                           |  53 ++-
 newrelic/api/transaction.py                   |  65 +++-
 newrelic/common/encoding_utils.py             | 173 +++++----
 newrelic/config.py                            |   3 +
 newrelic/core/agent.py                        |   6 +-
 newrelic/core/application.py                  |  17 +-
 newrelic/core/attribute.py                    |  27 ++
 newrelic/core/attribute_filter.py             |  81 ++--
 newrelic/core/config.py                       |  15 +
 newrelic/core/stats_engine.py                 |  57 ++-
 newrelic/hooks/logger_logging.py              |  17 +-
 newrelic/hooks/logger_loguru.py               |  31 +-
 newrelic/hooks/logger_structlog.py            |  73 ++--
 setup.cfg                                     |   2 +-
 tests/agent_features/test_log_events.py       | 358 ++++++++++++++----
 tests/agent_features/test_logs_in_context.py  |  98 ++---
 tests/logger_logging/conftest.py              |  10 +-
 tests/logger_logging/test_attributes.py       |  90 +++++
 tests/logger_logging/test_local_decorating.py |  15 +-
 tests/logger_logging/test_log_forwarding.py   |  48 ++-
 tests/logger_logging/test_logging_handler.py  |  76 +++-
 tests/logger_loguru/conftest.py               |  16 +-
 tests/logger_loguru/test_attributes.py        |  70 ++++
 tests/logger_loguru/test_stack_inspection.py  |  56 ---
 tests/logger_structlog/conftest.py            |  13 +-
 .../test_attribute_forwarding.py              |  49 ---
 tests/logger_structlog/test_attributes.py     |  97 +++++
 .../validators/validate_log_event_count.py    |  10 +-
 ...ate_log_event_count_outside_transaction.py |  10 +-
 .../validators/validate_log_events.py         |  38 +-
 ...validate_log_events_outside_transaction.py |  26 +-
 tox.ini                                       |   4 +-
 33 files changed, 1208 insertions(+), 503 deletions(-)
 create mode 100644 tests/logger_logging/test_attributes.py
 create mode 100644 tests/logger_loguru/test_attributes.py
 delete mode 100644 tests/logger_loguru/test_stack_inspection.py
 delete mode 100644 tests/logger_structlog/test_attribute_forwarding.py
 create mode 100644 tests/logger_structlog/test_attributes.py

diff --git a/newrelic/api/application.py b/newrelic/api/application.py
index e2e7be139f..ebc8356a76 100644
--- a/newrelic/api/application.py
+++ b/newrelic/api/application.py
@@ -22,7 +22,6 @@


 class Application(object):
-
     _lock = threading.Lock()
     _instances = {}

@@ -162,9 +161,11 @@ def record_transaction(self, data):
         if self.active:
             self._agent.record_transaction(self._name, data)

-    def record_log_event(self, message, level=None, timestamp=None, priority=None):
+    def record_log_event(self, message, level=None, timestamp=None, attributes=None, priority=None):
         if self.active:
-            self._agent.record_log_event(self._name, message, level, timestamp, priority=priority)
+            self._agent.record_log_event(
+                self._name, message, level, timestamp, attributes=attributes, priority=priority
+            )

     def normalize_name(self, name, rule_type="url"):
         if self.active:
diff --git a/newrelic/api/log.py b/newrelic/api/log.py
index 846ef275ab..f74339f46b 100644
--- a/newrelic/api/log.py
+++ b/newrelic/api/log.py
@@ -21,9 +21,11 @@
 from newrelic.api.time_trace import get_linking_metadata
 from newrelic.api.transaction import current_transaction, record_log_event
 from newrelic.common import agent_http
+from newrelic.common.encoding_utils import json_encode
 from newrelic.common.object_names import parse_exc_info
 from newrelic.core.attribute import truncate
 from newrelic.core.config import global_settings, is_expected_error
+from newrelic.packages import six


 def format_exc_info(exc_info):
@@ -42,8 +44,30 @@ def format_exc_info(exc_info):
     return formatted


+def safe_json_encode(obj, ignore_string_types=False, **kwargs):
+    # Performs the same operation as json_encode but replaces unserializable objects with a string containing their class name.
+    # If ignore_string_types is True, do not encode string types further.
+    # Currently used for safely encoding logging attributes.
+
+    if ignore_string_types and isinstance(obj, (six.string_types, six.binary_type)):
+        return obj
+
+    # Attempt to run through JSON serialization
+    try:
+        return json_encode(obj, **kwargs)
+    except Exception:
+        pass
+
+    # If JSON serialization fails then return a repr
+    try:
+        return repr(obj)
+    except Exception:
+        # If repr fails then default to an unprintable object name
+        return "<unprintable %s object>" % type(obj).__name__
+
+
 class NewRelicContextFormatter(Formatter):
-    DEFAULT_LOG_RECORD_KEYS = frozenset(vars(LogRecord("", 0, "", 0, "", (), None)))
+    DEFAULT_LOG_RECORD_KEYS = frozenset(set(vars(LogRecord("", 0, "", 0, "", (), None))) | {"message"})

     def __init__(self, *args, **kwargs):
         super(NewRelicContextFormatter, self).__init__()
@@ -76,17 +100,12 @@ def log_record_to_dict(cls, record):
         return output

     def format(self, record):
-        def safe_str(object, *args, **kwargs):
-            """Convert object to str, catching any errors raised."""
-            try:
-                return str(object, *args, **kwargs)
-            except:
-                return "<unprintable %s object>" % type(object).__name__
-
-        return json.dumps(self.log_record_to_dict(record), default=safe_str, separators=(",", ":"))
+        return json.dumps(self.log_record_to_dict(record), default=safe_json_encode, separators=(",", ":"))


 class NewRelicLogForwardingHandler(logging.Handler):
+    DEFAULT_LOG_RECORD_KEYS = frozenset(set(vars(LogRecord("", 0, "", 0, "", (), None))) | {"message"})
+
     def emit(self, record):
         try:
             # Avoid getting local log decorated message
@@ -95,10 +114,20 @@ def emit(self, record):
             else:
                 message = record.getMessage()

-            record_log_event(message, record.levelname, int(record.created * 1000))
+            attrs = self.filter_record_attributes(record)
+            record_log_event(message, record.levelname, int(record.created * 1000), attributes=attrs)
         except Exception:
             self.handleError(record)

+    @classmethod
+    def filter_record_attributes(cls, record):
+        record_attrs = vars(record)
+        DEFAULT_LOG_RECORD_KEYS = cls.DEFAULT_LOG_RECORD_KEYS
+        if len(record_attrs) > len(DEFAULT_LOG_RECORD_KEYS):
+            return {k: v for k, v in six.iteritems(vars(record)) if k not in DEFAULT_LOG_RECORD_KEYS}
+        else:
+            return None
+

 class NewRelicLogHandler(logging.Handler):
     """
@@ -126,8 +155,8 @@ def __init__(
             "The contributed NewRelicLogHandler has been superseded by automatic instrumentation for "
             "logging in the standard lib. If for some reason you need to manually configure a handler, "
             "please use newrelic.api.log.NewRelicLogForwardingHandler to take advantage of all the "
-            "features included in application log forwarding such as proper batching.",
-            DeprecationWarning
+            "features included in application log forwarding such as proper batching.",
+            DeprecationWarning,
         )
         super(NewRelicLogHandler, self).__init__(level=level)
         self.license_key = license_key or self.settings.license_key
diff --git a/newrelic/api/transaction.py b/newrelic/api/transaction.py
index 1fcb453b4a..f581d1a519 100644
--- a/newrelic/api/transaction.py
+++ b/newrelic/api/transaction.py
@@ -54,6 +54,7 @@
     create_attributes,
     create_user_attributes,
     process_user_attribute,
+    resolve_logging_context_attributes,
     truncate,
 )
 from newrelic.core.attribute_filter import (
@@ -1524,7 +1525,7 @@ def set_transaction_name(self, name, group=None, priority=None):
         self._group = group
         self._name = name

-    def record_log_event(self, message, level=None, timestamp=None, priority=None):
+    def record_log_event(self, message, level=None, timestamp=None, attributes=None, priority=None):
         settings = self.settings
         if not (
             settings
@@ -1537,18 +1538,62 @@ def record_log_event(self, message, level=None, timestamp=None, priority=None):

         timestamp = timestamp if timestamp is not None else time.time()
         level = str(level) if level is not None else "UNKNOWN"
+        context_attributes = attributes  # Name reassigned for clarity

-        if not message or message.isspace():
-            _logger.debug("record_log_event called where message was missing. No log event will be sent.")
-            return
+        # Unpack message and attributes from dict inputs
+        if isinstance(message, dict):
+            message_attributes = {k: v for k, v in message.items() if k != "message"}
+            message = message.get("message", "")
+        else:
+            message_attributes = None
+
+        if message is not None:
+            # Coerce message into a string type
+            if not isinstance(message, six.string_types):
+                try:
+                    message = str(message)
+                except Exception:
+                    # Exit early for invalid message type after unpacking
+                    _logger.debug(
+                        "record_log_event called where message could not be converted to a string type. No log event will be sent."
+                    )
+                    return
+
+            # Truncate the now unpacked and string converted message
+            message = truncate(message, MAX_LOG_MESSAGE_LENGTH)
+
+        # Collect attributes from linking metadata, context data, and message attributes
+        collected_attributes = {}
+        if settings and settings.application_logging.forwarding.context_data.enabled:
+            if context_attributes:
+                context_attributes = resolve_logging_context_attributes(
+                    context_attributes, settings.attribute_filter, "context."
+                )
+                if context_attributes:
+                    collected_attributes.update(context_attributes)
+
+            if message_attributes:
+                message_attributes = resolve_logging_context_attributes(
+                    message_attributes, settings.attribute_filter, "message."
+                )
+                if message_attributes:
+                    collected_attributes.update(message_attributes)
+
+        # Exit early if no message or attributes found after filtering
+        if (not message or message.isspace()) and not context_attributes and not message_attributes:
+            _logger.debug(
+                "record_log_event called where no message and no attributes were found. No log event will be sent."
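Read in isolation, the dict-unpacking behavior added to `record_log_event` above can be sketched as a small standalone function. This is an illustrative model only, not the agent's code: `MAX_LOG_MESSAGE_LENGTH` and the simple slice-truncation are stand-ins for the agent's actual constants and `truncate` helper.

```python
# Simplified model of the dict-message unpacking added to record_log_event.
# Assumed limit for illustration; the agent defines its own constant.
MAX_LOG_MESSAGE_LENGTH = 32768


def unpack_log_message(message):
    """Split a dict log message into (message, message_attributes)."""
    if isinstance(message, dict):
        # All keys except "message" become message.* attributes
        message_attributes = {k: v for k, v in message.items() if k != "message"}
        message = message.get("message", "")
    else:
        message_attributes = None

    if message is not None:
        if not isinstance(message, str):
            message = str(message)  # coerce non-string messages
        message = message[:MAX_LOG_MESSAGE_LENGTH]  # truncate
    return message, message_attributes


# A dict message contributes both a message body and extra attributes
msg, attrs = unpack_log_message({"message": "login failed", "user_id": 42})
```

A plain string passes through unchanged with no attributes, which is why the patch can keep a single code path for both inputs.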
+            )
+            return

-        message = truncate(message, MAX_LOG_MESSAGE_LENGTH)
+        # Finally, add in linking attributes after checking that there is a valid message or at least 1 attribute
+        collected_attributes.update(get_linking_metadata())

         event = LogEventNode(
             timestamp=timestamp,
             level=level,
             message=message,
-            attributes=get_linking_metadata(),
+            attributes=collected_attributes,
         )

         self._log_events.add(event, priority=priority)
@@ -2062,7 +2107,7 @@ def record_ml_event(event_type, params, application=None):
         application.record_ml_event(event_type, params)


-def record_log_event(message, level=None, timestamp=None, application=None, priority=None):
+def record_log_event(message, level=None, timestamp=None, attributes=None, application=None, priority=None):
     """Record a log event.

     Args:
@@ -2073,12 +2118,12 @@ def record_log_event(message, level=None, timestamp=None, application=None, prio
     if application is None:
         transaction = current_transaction()
         if transaction:
-            transaction.record_log_event(message, level, timestamp)
+            transaction.record_log_event(message, level, timestamp, attributes=attributes)
         else:
             application = application_instance(activate=False)

             if application and application.enabled:
-                application.record_log_event(message, level, timestamp, priority=priority)
+                application.record_log_event(message, level, timestamp, attributes=attributes, priority=priority)
             else:
                 _logger.debug(
                     "record_log_event has been called but no transaction or application was running. As a result, "
@@ -2089,7 +2134,7 @@ def record_log_event(message, level=None, timestamp=None, application=None, prio
                     timestamp,
                 )
     elif application.enabled:
-        application.record_log_event(message, level, timestamp, priority=priority)
+        application.record_log_event(message, level, timestamp, attributes=attributes, priority=priority)


 def accept_distributed_trace_payload(payload, transport_type="HTTP"):
diff --git a/newrelic/common/encoding_utils.py b/newrelic/common/encoding_utils.py
index bee53df1a9..3b7a519a2b 100644
--- a/newrelic/common/encoding_utils.py
+++ b/newrelic/common/encoding_utils.py
@@ -31,14 +31,14 @@
 from newrelic.packages import six

-HEXDIGLC_RE = re.compile('^[0-9a-f]+$')
-DELIMITER_FORMAT_RE = re.compile('[ \t]*,[ \t]*')
+HEXDIGLC_RE = re.compile("^[0-9a-f]+$")
+DELIMITER_FORMAT_RE = re.compile("[ \t]*,[ \t]*")
 PARENT_TYPE = {
-    '0': 'App',
-    '1': 'Browser',
-    '2': 'Mobile',
+    "0": "App",
+    "1": "Browser",
+    "2": "Mobile",
 }
-BASE64_DECODE_STR = getattr(base64, 'decodestring', None)
+BASE64_DECODE_STR = getattr(base64, "decodestring", None)


 # Functions for encoding/decoding JSON. These wrappers are used in order
@@ -48,6 +48,7 @@
 # be supplied as key word arguments to allow the wrappers to supply
 # defaults.

+
 def json_encode(obj, **kwargs):
     _kwargs = {}

@@ -79,21 +80,21 @@ def json_encode(obj, **kwargs):
     # The third is eliminate white space after separators to trim the
     # size of the data being sent.
-    if type(b'') is type(''):  # NOQA
-        _kwargs['encoding'] = 'latin-1'
+    if type(b"") is type(""):  # noqa, pylint: disable=C0123
+        _kwargs["encoding"] = "latin-1"

     def _encode(o):
         if isinstance(o, bytes):
-            return o.decode('latin-1')
+            return o.decode("latin-1")
         elif isinstance(o, types.GeneratorType):
             return list(o)
-        elif hasattr(o, '__iter__'):
+        elif hasattr(o, "__iter__"):
             return list(iter(o))
-        raise TypeError(repr(o) + ' is not JSON serializable')
+        raise TypeError(repr(o) + " is not JSON serializable")

-    _kwargs['default'] = _encode
+    _kwargs["default"] = _encode

-    _kwargs['separators'] = (',', ':')
+    _kwargs["separators"] = (",", ":")

     # We still allow supplied arguments to override internal defaults if
     # necessary, but the caller must be sure they aren't dependent on
@@ -111,6 +112,7 @@ def json_decode(s, **kwargs):

     return json.loads(s, **kwargs)

+
 # Functions for obfuscating/deobfuscating text string based on an XOR
 # cipher.

@@ -124,7 +126,7 @@ def xor_cipher_genkey(key, length=None):

     """

-    return bytearray(key[:length], encoding='ascii')
+    return bytearray(key[:length], encoding="ascii")


 def xor_cipher_encrypt(text, key):
@@ -190,8 +192,8 @@ def xor_cipher_encrypt_base64(text, key):
     # isn't UTF-8 and so fail with a Unicode decoding error.

     if isinstance(text, bytes):
-        text = text.decode('latin-1')
-    text = text.encode('utf-8').decode('latin-1')
+        text = text.decode("latin-1")
+    text = text.encode("utf-8").decode("latin-1")

     result = base64.b64encode(bytes(xor_cipher_encrypt(text, key)))

@@ -202,7 +204,7 @@ def xor_cipher_encrypt_base64(text, key):
     # produces characters within that codeset.

     if six.PY3:
-        return result.decode('ascii')
+        return result.decode("ascii")

     return result

@@ -223,7 +225,7 @@ def xor_cipher_decrypt_base64(text, key):

     result = xor_cipher_decrypt(bytearray(base64.b64decode(text)), key)

-    return bytes(result).decode('utf-8')
+    return bytes(result).decode("utf-8")


 obfuscate = xor_cipher_encrypt_base64
@@ -240,13 +242,13 @@ def unpack_field(field):

     """

     if not isinstance(field, bytes):
-        field = field.encode('UTF-8')
+        field = field.encode("UTF-8")

-    data = getattr(base64, 'decodebytes', BASE64_DECODE_STR)(field)
+    data = getattr(base64, "decodebytes", BASE64_DECODE_STR)(field)
     data = zlib.decompress(data)

     if isinstance(data, bytes):
-        data = data.decode('Latin-1')
+        data = data.decode("Latin-1")

     data = json_decode(data)
     return data
@@ -260,13 +262,13 @@ def generate_path_hash(name, seed):

     """

-    rotated = ((seed << 1) | (seed >> 31)) & 0xffffffff
+    rotated = ((seed << 1) | (seed >> 31)) & 0xFFFFFFFF

     if not isinstance(name, bytes):
-        name = name.encode('UTF-8')
+        name = name.encode("UTF-8")

-    path_hash = (rotated ^ int(hashlib.md5(name).hexdigest()[-8:], base=16))
-    return '%08x' % path_hash
+    path_hash = rotated ^ int(hashlib.md5(name).hexdigest()[-8:], base=16)  # nosec
+    return "%08x" % path_hash


 def base64_encode(text):
@@ -291,11 +293,11 @@ def base64_encode(text):
     # and so fail with a Unicode decoding error.

     if isinstance(text, bytes):
-        text = text.decode('latin-1')
-    text = text.encode('utf-8').decode('latin-1')
+        text = text.decode("latin-1")
+    text = text.encode("utf-8").decode("latin-1")

     # Re-encode as utf-8 when passing to b64 encoder
-    result = base64.b64encode(text.encode('utf-8'))
+    result = base64.b64encode(text.encode("utf-8"))

     # The result from base64 encoding will be a byte string but since
     # dealing with byte strings in Python 2 and Python 3 is quite
@@ -304,7 +306,7 @@ def base64_encode(text):
     # produces characters within that codeset.

     if six.PY3:
-        return result.decode('ascii')
+        return result.decode("ascii")

     return result

@@ -314,7 +316,7 @@ def base64_decode(text):
     the decoded text is UTF-8 encoded.

     """

-    return base64.b64decode(text).decode('utf-8')
+    return base64.b64decode(text).decode("utf-8")


 def gzip_compress(text):
@@ -325,9 +327,9 @@ def gzip_compress(text):
     compressed_data = io.BytesIO()

     if six.PY3 and isinstance(text, str):
-        text = text.encode('utf-8')
+        text = text.encode("utf-8")

-    with gzip.GzipFile(fileobj=compressed_data, mode='wb') as f:
+    with gzip.GzipFile(fileobj=compressed_data, mode="wb") as f:
         f.write(text)

     return compressed_data.getvalue()
@@ -340,7 +342,7 @@ def gzip_decompress(payload):
     """
     data_bytes = io.BytesIO(payload)
     decoded_data = gzip.GzipFile(fileobj=data_bytes).read()
-    return decoded_data.decode('utf-8')
+    return decoded_data.decode("utf-8")


 def serverless_payload_encode(payload):
@@ -358,7 +360,7 @@ def serverless_payload_encode(payload):
 def ensure_str(s):
     if not isinstance(s, six.string_types):
         try:
-            s = s.decode('utf-8')
+            s = s.decode("utf-8")
         except Exception:
             return
     return s
@@ -370,8 +372,8 @@ def serverless_payload_decode(text):
     Python object.
""" - if hasattr(text, 'decode'): - text = text.decode('utf-8') + if hasattr(text, "decode"): + text = text.decode("utf-8") decoded_bytes = base64.b64decode(text) uncompressed_data = gzip_decompress(decoded_bytes) @@ -384,8 +386,7 @@ def decode_newrelic_header(encoded_header, encoding_key): decoded_header = None if encoded_header: try: - decoded_header = json_decode(deobfuscate( - encoded_header, encoding_key)) + decoded_header = json_decode(deobfuscate(encoded_header, encoding_key)) except Exception: pass @@ -402,7 +403,6 @@ def convert_to_cat_metadata_value(nr_headers): class DistributedTracePayload(dict): - version = (0, 1) def text(self): @@ -437,17 +437,16 @@ def decode(cls, payload): class W3CTraceParent(dict): - def text(self): - if 'id' in self: - guid = self['id'] + if "id" in self: + guid = self["id"] else: - guid = '{:016x}'.format(random.getrandbits(64)) + guid = "{:016x}".format(random.getrandbits(64)) - return '00-{}-{}-{:02x}'.format( - self['tr'].lower().zfill(32), + return "00-{}-{}-{:02x}".format( + self["tr"].lower().zfill(32), guid, - int(self.get('sa', 0)), + int(self.get("sa", 0)), ) @classmethod @@ -456,7 +455,7 @@ def decode(cls, payload): if len(payload) < 55: return None - fields = payload.split('-', 4) + fields = payload.split("-", 4) # Expect that there are at least 4 fields if len(fields) < 4: @@ -469,11 +468,11 @@ def decode(cls, payload): return None # Version 255 is invalid - if version == 'ff': + if version == "ff": return None # Expect exactly 4 fields if version 00 - if version == '00' and len(fields) != 4: + if version == "00" and len(fields) != 4: return None # Check field lengths and values @@ -483,18 +482,15 @@ def decode(cls, payload): # trace_id or parent_id of all 0's are invalid trace_id, parent_id = fields[1:3] - if parent_id == '0' * 16 or trace_id == '0' * 32: + if parent_id == "0" * 16 or trace_id == "0" * 32: return None return cls(tr=trace_id, id=parent_id) class W3CTraceState(OrderedDict): - def text(self, 
limit=32): - return ','.join( - '{}={}'.format(k, v) - for k, v in itertools.islice(self.items(), limit)) + return ",".join("{}={}".format(k, v) for k, v in itertools.islice(self.items(), limit)) @classmethod def decode(cls, tracestate): @@ -502,9 +498,8 @@ def decode(cls, tracestate): vendors = cls() for entry in entries: - vendor_value = entry.split('=', 2) - if (len(vendor_value) != 2 or - any(len(v) > 256 for v in vendor_value)): + vendor_value = entry.split("=", 2) + if len(vendor_value) != 2 or any(len(v) > 256 for v in vendor_value): continue vendor, value = vendor_value @@ -514,36 +509,38 @@ def decode(cls, tracestate): class NrTraceState(dict): - FIELDS = ('ty', 'ac', 'ap', 'id', 'tx', 'sa', 'pr') + FIELDS = ("ty", "ac", "ap", "id", "tx", "sa", "pr") def text(self): - pr = self.get('pr', '') + pr = self.get("pr", "") if pr: - pr = ('%.6f' % pr).rstrip('0').rstrip('.') - - payload = '-'.join(( - '0-0', - self['ac'], - self['ap'], - self.get('id', ''), - self.get('tx', ''), - '1' if self.get('sa') else '0', - pr, - str(self['ti']), - )) - return '{}@nr={}'.format( - self.get('tk', self['ac']), + pr = ("%.6f" % pr).rstrip("0").rstrip(".") + + payload = "-".join( + ( + "0-0", + self["ac"], + self["ap"], + self.get("id", ""), + self.get("tx", ""), + "1" if self.get("sa") else "0", + pr, + str(self["ti"]), + ) + ) + return "{}@nr={}".format( + self.get("tk", self["ac"]), payload, ) @classmethod def decode(cls, payload, tk): - fields = payload.split('-', 9) + fields = payload.split("-", 9) if len(fields) >= 9 and all(fields[:4]) and fields[8]: data = cls(tk=tk) try: - data['ti'] = int(fields[8]) + data["ti"] = int(fields[8]) except: return @@ -551,24 +548,24 @@ def decode(cls, payload, tk): if value: data[name] = value - if data['ty'] in PARENT_TYPE: - data['ty'] = PARENT_TYPE[data['ty']] + if data["ty"] in PARENT_TYPE: + data["ty"] = PARENT_TYPE[data["ty"]] else: return - if 'sa' in data: - if data['sa'] == '1': - data['sa'] = True - elif data['sa'] == '0': - 
data['sa'] = False + if "sa" in data: + if data["sa"] == "1": + data["sa"] = True + elif data["sa"] == "0": + data["sa"] = False else: - data['sa'] = None + data["sa"] = None - if 'pr' in data: + if "pr" in data: try: - data['pr'] = float(fields[7]) + data["pr"] = float(fields[7]) except: - data['pr'] = None + data["pr"] = None return data @@ -586,7 +583,7 @@ def capitalize(string): def camel_case(string, upper=False): """ Convert a string of snake case to camel case. - + Setting upper=True will capitalize the first letter. Defaults to False, where no change is made to the first letter. """ string = ensure_str(string) @@ -607,6 +604,8 @@ def camel_case(string, upper=False): _snake_case_re = re.compile(r"([A-Z]+[a-z]*)") + + def snake_case(string): """Convert a string of camel case to snake case. Assumes no repeated runs of capital letters.""" string = ensure_str(string) diff --git a/newrelic/config.py b/newrelic/config.py index d63381e44b..2528d84d35 100644 --- a/newrelic/config.py +++ b/newrelic/config.py @@ -552,6 +552,9 @@ def _process_configuration(section): _process_setting(section, "application_logging.enabled", "getboolean", None) _process_setting(section, "application_logging.forwarding.max_samples_stored", "getint", None) _process_setting(section, "application_logging.forwarding.enabled", "getboolean", None) + _process_setting(section, "application_logging.forwarding.context_data.enabled", "getboolean", None) + _process_setting(section, "application_logging.forwarding.context_data.include", "get", _map_inc_excl_attributes) + _process_setting(section, "application_logging.forwarding.context_data.exclude", "get", _map_inc_excl_attributes) _process_setting(section, "application_logging.metrics.enabled", "getboolean", None) _process_setting(section, "application_logging.local_decorating.enabled", "getboolean", None) diff --git a/newrelic/core/agent.py b/newrelic/core/agent.py index 9d9aadab16..31cef43e89 100644 --- a/newrelic/core/agent.py +++ 
b/newrelic/core/agent.py @@ -339,7 +339,6 @@ def activate_application(self, app_name, linked_applications=None, timeout=None, with self._lock: application = self._applications.get(app_name, None) if not application: - process_id = os.getpid() if process_id != self._process_id: @@ -449,7 +448,6 @@ def register_data_source(self, source, application=None, name=None, settings=Non instance.register_data_source(source, name, settings, **properties) def remove_thread_utilization(self): - _logger.debug("Removing thread utilization data source from all applications") source_name = thread_utilization_data_source.__name__ @@ -565,12 +563,12 @@ def record_ml_event(self, app_name, event_type, params): application.record_ml_event(event_type, params) - def record_log_event(self, app_name, message, level=None, timestamp=None, priority=None): + def record_log_event(self, app_name, message, level=None, timestamp=None, attributes=None, priority=None): application = self._applications.get(app_name, None) if application is None or not application.active: return - application.record_log_event(message, level, timestamp, priority=priority) + application.record_log_event(message, level, timestamp, attributes=attributes, priority=priority) def record_transaction(self, app_name, data): """Processes the raw transaction data, generating and recording diff --git a/newrelic/core/application.py b/newrelic/core/application.py index 82cdf8a9a0..42cff443ed 100644 --- a/newrelic/core/application.py +++ b/newrelic/core/application.py @@ -939,15 +939,16 @@ def record_ml_event(self, event_type, params): self._global_events_account += 1 self._stats_engine.record_ml_event(event) - def record_log_event(self, message, level=None, timestamp=None, priority=None): + def record_log_event(self, message, level=None, timestamp=None, attributes=None, priority=None): if not self._active_session: return - if message: - with self._stats_custom_lock: - event = self._stats_engine.record_log_event(message, level, 
timestamp, priority=priority) - if event: - self._global_events_account += 1 + with self._stats_custom_lock: + event = self._stats_engine.record_log_event( + message, level, timestamp, attributes=attributes, priority=priority + ) + if event: + self._global_events_account += 1 def record_transaction(self, data): """Record a single transaction against this application.""" @@ -1506,7 +1507,9 @@ def harvest(self, shutdown=False, flexible=False): # Send metrics self._active_session.send_metric_data(self._period_start, period_end, metric_data) if dimensional_metric_data: - self._active_session.send_dimensional_metric_data(self._period_start, period_end, dimensional_metric_data) + self._active_session.send_dimensional_metric_data( + self._period_start, period_end, dimensional_metric_data + ) _logger.debug("Done sending data for harvest of %r.", self._app_name) diff --git a/newrelic/core/attribute.py b/newrelic/core/attribute.py index 10ae8e4597..04dfdec345 100644 --- a/newrelic/core/attribute.py +++ b/newrelic/core/attribute.py @@ -18,6 +18,7 @@ from newrelic.core.attribute_filter import ( DST_ALL, DST_ERROR_COLLECTOR, + DST_LOG_EVENT_CONTEXT_DATA, DST_SPAN_EVENTS, DST_TRANSACTION_EVENTS, DST_TRANSACTION_SEGMENTS, @@ -174,6 +175,32 @@ def resolve_agent_attributes(attr_dict, attribute_filter, target_destination, at return a_attrs +def resolve_logging_context_attributes(attr_dict, attribute_filter, attr_prefix, attr_class=dict): + """ + Helper function for processing logging context attributes that require a prefix. Correctly filters attribute names + before applying the required prefix, and then applies the process_user_attribute after the prefix is applied to + correctly check length requirements. 
+ """ + c_attrs = attr_class() + + for attr_name, attr_value in attr_dict.items(): + dest = attribute_filter.apply(attr_name, DST_LOG_EVENT_CONTEXT_DATA) + + if dest & DST_LOG_EVENT_CONTEXT_DATA: + try: + attr_name, attr_value = process_user_attribute(attr_prefix + attr_name, attr_value) + if attr_name: + c_attrs[attr_name] = attr_value + except Exception: + _logger.debug( + "Log event context attribute failed to validate for unknown reason. Dropping context attribute: %s. Check traceback for clues.", + attr_name, + exc_info=True, + ) + + return c_attrs + + def create_user_attributes(attr_dict, attribute_filter): destinations = DST_ALL return create_attributes(attr_dict, destinations, attribute_filter) diff --git a/newrelic/core/attribute_filter.py b/newrelic/core/attribute_filter.py index 8d4a93843b..8cd26cb30f 100644 --- a/newrelic/core/attribute_filter.py +++ b/newrelic/core/attribute_filter.py @@ -15,16 +15,17 @@ # Attribute "destinations" represented as bitfields. DST_NONE = 0x0 -DST_ALL = 0x3F -DST_TRANSACTION_EVENTS = 1 << 0 -DST_TRANSACTION_TRACER = 1 << 1 -DST_ERROR_COLLECTOR = 1 << 2 -DST_BROWSER_MONITORING = 1 << 3 -DST_SPAN_EVENTS = 1 << 4 +DST_ALL = 0x7F +DST_TRANSACTION_EVENTS = 1 << 0 +DST_TRANSACTION_TRACER = 1 << 1 +DST_ERROR_COLLECTOR = 1 << 2 +DST_BROWSER_MONITORING = 1 << 3 +DST_SPAN_EVENTS = 1 << 4 DST_TRANSACTION_SEGMENTS = 1 << 5 +DST_LOG_EVENT_CONTEXT_DATA = 1 << 6 -class AttributeFilter(object): +class AttributeFilter(object): # Apply filtering rules to attributes. # # Upon initialization, an AttributeFilter object will take all attribute @@ -59,46 +60,45 @@ class AttributeFilter(object): # 4. Return the resulting bitfield after all rules have been applied. 
def __init__(self, flattened_settings): - self.enabled_destinations = self._set_enabled_destinations(flattened_settings) self.rules = self._build_rules(flattened_settings) self.cache = {} def __repr__(self): - return "<AttributeFilter: destinations: %s, rules: %s>" % ( - bin(self.enabled_destinations), self.rules) + return "<AttributeFilter: destinations: %s, rules: %s>" % (bin(self.enabled_destinations), self.rules) def _set_enabled_destinations(self, settings): - # Determines and returns bitfield representing attribute destinations enabled. enabled_destinations = DST_NONE - if settings.get('transaction_segments.attributes.enabled', None): + if settings.get("transaction_segments.attributes.enabled", None): enabled_destinations |= DST_TRANSACTION_SEGMENTS - if settings.get('span_events.attributes.enabled', None): + if settings.get("span_events.attributes.enabled", None): enabled_destinations |= DST_SPAN_EVENTS - if settings.get('transaction_tracer.attributes.enabled', None): + if settings.get("transaction_tracer.attributes.enabled", None): enabled_destinations |= DST_TRANSACTION_TRACER - if settings.get('transaction_events.attributes.enabled', None): + if settings.get("transaction_events.attributes.enabled", None): enabled_destinations |= DST_TRANSACTION_EVENTS - if settings.get('error_collector.attributes.enabled', None): + if settings.get("error_collector.attributes.enabled", None): enabled_destinations |= DST_ERROR_COLLECTOR - if settings.get('browser_monitoring.attributes.enabled', None): + if settings.get("browser_monitoring.attributes.enabled", None): enabled_destinations |= DST_BROWSER_MONITORING - if not settings.get('attributes.enabled', None): + if settings.get("application_logging.forwarding.context_data.enabled", None): + enabled_destinations |= DST_LOG_EVENT_CONTEXT_DATA + + if not settings.get("attributes.enabled", None): enabled_destinations = DST_NONE return enabled_destinations def _build_rules(self, settings): - # "Rule Templates" below are used for building AttributeFilterRules.
# # Each tuple includes: @@ -107,26 +107,27 @@ def _build_rules(self, settings): # 3. Boolean that represents whether the setting is an "include" or not. rule_templates = ( - ('attributes.include', DST_ALL, True), - ('attributes.exclude', DST_ALL, False), - ('transaction_events.attributes.include', DST_TRANSACTION_EVENTS, True), - ('transaction_events.attributes.exclude', DST_TRANSACTION_EVENTS, False), - ('transaction_tracer.attributes.include', DST_TRANSACTION_TRACER, True), - ('transaction_tracer.attributes.exclude', DST_TRANSACTION_TRACER, False), - ('error_collector.attributes.include', DST_ERROR_COLLECTOR, True), - ('error_collector.attributes.exclude', DST_ERROR_COLLECTOR, False), - ('browser_monitoring.attributes.include', DST_BROWSER_MONITORING, True), - ('browser_monitoring.attributes.exclude', DST_BROWSER_MONITORING, False), - ('span_events.attributes.include', DST_SPAN_EVENTS, True), - ('span_events.attributes.exclude', DST_SPAN_EVENTS, False), - ('transaction_segments.attributes.include', DST_TRANSACTION_SEGMENTS, True), - ('transaction_segments.attributes.exclude', DST_TRANSACTION_SEGMENTS, False), + ("attributes.include", DST_ALL, True), + ("attributes.exclude", DST_ALL, False), + ("transaction_events.attributes.include", DST_TRANSACTION_EVENTS, True), + ("transaction_events.attributes.exclude", DST_TRANSACTION_EVENTS, False), + ("transaction_tracer.attributes.include", DST_TRANSACTION_TRACER, True), + ("transaction_tracer.attributes.exclude", DST_TRANSACTION_TRACER, False), + ("error_collector.attributes.include", DST_ERROR_COLLECTOR, True), + ("error_collector.attributes.exclude", DST_ERROR_COLLECTOR, False), + ("browser_monitoring.attributes.include", DST_BROWSER_MONITORING, True), + ("browser_monitoring.attributes.exclude", DST_BROWSER_MONITORING, False), + ("span_events.attributes.include", DST_SPAN_EVENTS, True), + ("span_events.attributes.exclude", DST_SPAN_EVENTS, False), + ("transaction_segments.attributes.include", DST_TRANSACTION_SEGMENTS, 
True), + ("transaction_segments.attributes.exclude", DST_TRANSACTION_SEGMENTS, False), + ("application_logging.forwarding.context_data.include", DST_LOG_EVENT_CONTEXT_DATA, True), + ("application_logging.forwarding.context_data.exclude", DST_LOG_EVENT_CONTEXT_DATA, False), ) rules = [] - for (setting_name, destination, is_include) in rule_templates: - + for setting_name, destination, is_include in rule_templates: for setting in settings.get(setting_name) or (): rule = AttributeFilterRule(setting, destination, is_include) rules.append(rule) @@ -157,16 +158,15 @@ def apply(self, name, default_destinations): self.cache[cache_index] = destinations return destinations -class AttributeFilterRule(object): +class AttributeFilterRule(object): def __init__(self, name, destinations, is_include): - self.name = name.rstrip('*') + self.name = name.rstrip("*") self.destinations = destinations self.is_include = is_include - self.is_wildcard = name.endswith('*') + self.is_wildcard = name.endswith("*") def _as_sortable(self): - # Represent AttributeFilterRule as a tuple that will sort properly. 
# # Sorting rules: @@ -207,8 +207,7 @@ def __ge__(self, other): return self._as_sortable() >= other._as_sortable() def __repr__(self): - return '(%s, %s, %s, %s)' % (self.name, bin(self.destinations), - self.is_wildcard, self.is_include) + return "(%s, %s, %s, %s)" % (self.name, bin(self.destinations), self.is_wildcard, self.is_include) def name_match(self, name): if self.is_wildcard: diff --git a/newrelic/core/config.py b/newrelic/core/config.py index 9bb0ca7747..f627356c72 100644 --- a/newrelic/core/config.py +++ b/newrelic/core/config.py @@ -301,6 +301,10 @@ class ApplicationLoggingForwardingSettings(Settings): pass +class ApplicationLoggingForwardingContextDataSettings(Settings): + pass + + class ApplicationLoggingMetricsSettings(Settings): pass @@ -398,6 +402,7 @@ class EventHarvestConfigHarvestLimitSettings(Settings): _settings.agent_limits = AgentLimitsSettings() _settings.application_logging = ApplicationLoggingSettings() _settings.application_logging.forwarding = ApplicationLoggingForwardingSettings() +_settings.application_logging.forwarding.context_data = ApplicationLoggingForwardingContextDataSettings() _settings.application_logging.local_decorating = ApplicationLoggingLocalDecoratingSettings() _settings.application_logging.metrics = ApplicationLoggingMetricsSettings() _settings.machine_learning = MachineLearningSettings() @@ -893,6 +899,15 @@ def default_otlp_host(host): _settings.application_logging.forwarding.enabled = _environ_as_bool( "NEW_RELIC_APPLICATION_LOGGING_FORWARDING_ENABLED", default=True ) +_settings.application_logging.forwarding.context_data.enabled = _environ_as_bool( + "NEW_RELIC_APPLICATION_LOGGING_FORWARDING_CONTEXT_DATA_ENABLED", default=False +) +_settings.application_logging.forwarding.context_data.include = _environ_as_set( + "NEW_RELIC_APPLICATION_LOGGING_FORWARDING_CONTEXT_DATA_INCLUDE", default="" +)
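The new `DST_LOG_EVENT_CONTEXT_DATA` destination and the `context_data.include`/`exclude` settings above plug into the agent's usual attribute-filter semantics. A simplified standalone sketch (not the agent's actual `AttributeFilter`/`AttributeFilterRule` classes) of how trailing-`*` wildcard rules decide whether a context attribute is kept:

```python
def attribute_allowed(name, include=(), exclude=(), default=True):
    """Decide whether attribute `name` survives include/exclude rules.

    Rules may end in "*" for prefix matching. More specific (longer)
    rules win, an exact match outranks a wildcard, and on a full tie
    an exclude rule beats an include rule.
    """
    matches = []
    for rules, allowed in ((include, True), (exclude, False)):
        for rule in rules:
            is_wildcard = rule.endswith("*")
            prefix = rule[:-1] if is_wildcard else rule
            if (name.startswith(prefix) if is_wildcard else name == prefix):
                # Sort key: longer prefix, then exact-over-wildcard,
                # then exclude-over-include.
                matches.append(((len(prefix), not is_wildcard, not allowed), allowed))
    return max(matches)[1] if matches else default


# Mirrors a few rows of the parametrized filtering tests later in this patch:
assert attribute_allowed("A", exclude=["A"]) is False
assert attribute_allowed("AB", include=["A*"], exclude=["AB"]) is False
assert attribute_allowed("ABC", include=["A*"], exclude=["AB*"]) is False
assert attribute_allowed("ACB", include=["A*"], exclude=["AB*"]) is True
```

The `default=True` reflects that, once `context_data.enabled` is on, attributes not named by any rule are forwarded.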
+_settings.application_logging.forwarding.context_data.exclude = _environ_as_set( + "NEW_RELIC_APPLICATION_LOGGING_FORWARDING_CONTEXT_DATA_EXCLUDE", default="" +) _settings.application_logging.metrics.enabled = _environ_as_bool( "NEW_RELIC_APPLICATION_LOGGING_METRICS_ENABLED", default=True ) diff --git a/newrelic/core/stats_engine.py b/newrelic/core/stats_engine.py index ebebe7dbe1..a13450585e 100644 --- a/newrelic/core/stats_engine.py +++ b/newrelic/core/stats_engine.py @@ -43,6 +43,7 @@ create_agent_attributes, create_user_attributes, process_user_attribute, + resolve_logging_context_attributes, truncate, ) from newrelic.core.attribute_filter import DST_ERROR_COLLECTOR @@ -1218,7 +1219,7 @@ def record_transaction(self, transaction): ): self._log_events.merge(transaction.log_events, priority=transaction.priority) - def record_log_event(self, message, level=None, timestamp=None, priority=None): + def record_log_event(self, message, level=None, timestamp=None, attributes=None, priority=None): settings = self.__settings if not ( settings @@ -1231,18 +1232,62 @@ def record_log_event(self, message, level=None, timestamp=None, priority=None): timestamp = timestamp if timestamp is not None else time.time() level = str(level) if level is not None else "UNKNOWN" + context_attributes = attributes # Name reassigned for clarity - if not message or message.isspace(): - _logger.debug("record_log_event called where message was missing. 
No log event will be sent.") - return + # Unpack message and attributes from dict inputs + if isinstance(message, dict): + message_attributes = {k: v for k, v in message.items() if k != "message"} + message = message.get("message", "") + else: + message_attributes = None + + if message is not None: + # Coerce message into a string type + if not isinstance(message, six.string_types): + try: + message = str(message) + except Exception: + # Exit early for invalid message type after unpacking + _logger.debug( + "record_log_event called where message could not be converted to a string type. No log event will be sent." + ) + return + + # Truncate the now unpacked and string converted message + message = truncate(message, MAX_LOG_MESSAGE_LENGTH) + + # Collect attributes from linking metadata, context data, and message attributes + collected_attributes = {} + if settings and settings.application_logging.forwarding.context_data.enabled: + if context_attributes: + context_attributes = resolve_logging_context_attributes( + context_attributes, settings.attribute_filter, "context." + ) + if context_attributes: + collected_attributes.update(context_attributes) + + if message_attributes: + message_attributes = resolve_logging_context_attributes( + message_attributes, settings.attribute_filter, "message." + ) + if message_attributes: + collected_attributes.update(message_attributes) + + # Exit early if no message or attributes found after filtering + if (not message or message.isspace()) and not context_attributes and not message_attributes: + _logger.debug( + "record_log_event called where no message and no attributes were found. No log event will be sent." 
+ ) + return - message = truncate(message, MAX_LOG_MESSAGE_LENGTH) + # Finally, add in linking attributes after checking that there is a valid message or at least 1 attribute + collected_attributes.update(get_linking_metadata()) event = LogEventNode( timestamp=timestamp, level=level, message=message, - attributes=get_linking_metadata(), + attributes=collected_attributes, ) if priority is None: diff --git a/newrelic/hooks/logger_logging.py b/newrelic/hooks/logger_logging.py index 67fb46525c..7b320cd911 100644 --- a/newrelic/hooks/logger_logging.py +++ b/newrelic/hooks/logger_logging.py @@ -24,6 +24,9 @@ from urllib.parse import quote +IGNORED_LOG_RECORD_KEYS = set(["message", "msg"]) + + def add_nr_linking_metadata(message): available_metadata = get_linking_metadata() entity_name = quote(available_metadata.get("entity.name", "")) @@ -74,8 +77,18 @@ def wrap_callHandlers(wrapped, instance, args, kwargs): if settings.application_logging.forwarding and settings.application_logging.forwarding.enabled: try: - message = record.getMessage() - record_log_event(message, level_name, int(record.created * 1000)) + message = record.msg + if not isinstance(message, dict): + # Allow python to convert the message to a string and template it with args. 
+ message = record.getMessage() + + # Grab and filter context attributes from log record + record_attrs = vars(record) + context_attrs = {k: record_attrs[k] for k in record_attrs if k not in IGNORED_LOG_RECORD_KEYS} + + record_log_event( + message=message, level=level_name, timestamp=int(record.created * 1000), attributes=context_attrs + ) except Exception: pass diff --git a/newrelic/hooks/logger_loguru.py b/newrelic/hooks/logger_loguru.py index dc9843b204..2676859072 100644 --- a/newrelic/hooks/logger_loguru.py +++ b/newrelic/hooks/logger_loguru.py @@ -18,19 +18,24 @@ from newrelic.api.application import application_instance from newrelic.api.transaction import current_transaction, record_log_event from newrelic.common.object_wrapper import wrap_function_wrapper +from newrelic.common.package_version_utils import get_package_version_tuple from newrelic.common.signature import bind_args from newrelic.core.config import global_settings from newrelic.hooks.logger_logging import add_nr_linking_metadata -from newrelic.packages import six _logger = logging.getLogger(__name__) -is_pypy = hasattr(sys, "pypy_version_info") +IS_PYPY = hasattr(sys, "pypy_version_info") +LOGURU_VERSION = get_package_version_tuple("loguru") +LOGURU_FILTERED_RECORD_ATTRS = {"extra", "message", "time", "level", "_nr_original_message", "record"} +ALLOWED_LOGURU_OPTIONS_LENGTHS = frozenset((8, 9)) -def loguru_version(): - from loguru import __version__ - return tuple(int(x) for x in __version__.split(".")) +def _filter_record_attributes(record): + attrs = {k: v for k, v in record.items() if k not in LOGURU_FILTERED_RECORD_ATTRS} + extra_attrs = dict(record.get("extra", {})) + attrs.update({"extra.%s" % k: v for k, v in extra_attrs.items()}) + return attrs def _nr_log_forwarder(message_instance): @@ -59,15 +64,17 @@ def _nr_log_forwarder(message_instance): application.record_custom_metric("Logging/lines/%s" % level_name, {"count": 1}) if settings.application_logging.forwarding and 
settings.application_logging.forwarding.enabled: + attrs = _filter_record_attributes(record) + try: - record_log_event(message, level_name, int(record["time"].timestamp())) + time = record.get("time", None) + if time: + time = int(time.timestamp()) + record_log_event(message, level_name, time, attributes=attrs) except Exception: pass -ALLOWED_LOGURU_OPTIONS_LENGTHS = frozenset((8, 9)) - - def wrap_log(wrapped, instance, args, kwargs): try: bound_args = bind_args(wrapped, args, kwargs) @@ -78,7 +85,7 @@ # Loguru looks into the stack trace to find the caller's module and function names. # options[1] tells loguru how far up to look in the stack trace to find the caller. # Because wrap_log is an extra call in the stack trace, loguru needs to look 1 level higher. - if not is_pypy: + if not IS_PYPY: options[1] += 1 else: # PyPy inspection requires an additional frame of offset, as the wrapt internals seem to @@ -109,7 +116,7 @@ def _nr_log_patcher(record): record["_nr_original_message"] = message = record["message"] record["message"] = add_nr_linking_metadata(message) - if loguru_version() > (0, 6, 0): + if LOGURU_VERSION > (0, 6, 0): if original_patcher is not None: patchers = [p for p in original_patcher] # Consume iterable into a list so we can modify # Wipe out reference so patchers aren't called twice, as the framework will handle calling other patchers.
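The loguru hook's `_filter_record_attributes` helper above flattens `record["extra"]` (the dict populated by `logger.bind()` and `logger.contextualize()`) under an `extra.` prefix while dropping record keys the agent already captures elsewhere. A minimal standalone sketch of that step, using a plain dict in place of a real loguru record:

```python
# Keys dropped before forwarding; mirrors LOGURU_FILTERED_RECORD_ATTRS above.
FILTERED_KEYS = {"extra", "message", "time", "level", "_nr_original_message", "record"}

def filter_record_attributes(record):
    # Copy every top-level record key except the filtered ones...
    attrs = {k: v for k, v in record.items() if k not in FILTERED_KEYS}
    # ...then re-add bound context under an "extra." prefix.
    attrs.update({"extra.%s" % k: v for k, v in record.get("extra", {}).items()})
    return attrs

record = {"message": "hi", "level": "INFO", "function": "main", "extra": {"user_id": 42}}
print(filter_record_attributes(record))  # {'function': 'main', 'extra.user_id': 42}
```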
@@ -135,7 +142,7 @@ def patch_loguru_logger(logger): logger.add(_nr_log_forwarder, format="{message}") logger._core._nr_instrumented = True elif not hasattr(logger, "_nr_instrumented"): # pragma: no cover - for _, handler in six.iteritems(logger._handlers): + for _, handler in logger._handlers.items(): if handler._writer is _nr_log_forwarder: logger._nr_instrumented = True return diff --git a/newrelic/hooks/logger_structlog.py b/newrelic/hooks/logger_structlog.py index 06d13aa4aa..abb1f44cb9 100644 --- a/newrelic/hooks/logger_structlog.py +++ b/newrelic/hooks/logger_structlog.py @@ -12,19 +12,23 @@ # See the License for the specific language governing permissions and # limitations under the License. -from newrelic.common.object_wrapper import wrap_function_wrapper +import functools + +from newrelic.api.application import application_instance from newrelic.api.transaction import current_transaction, record_log_event +from newrelic.common.object_wrapper import wrap_function_wrapper +from newrelic.common.signature import bind_args from newrelic.core.config import global_settings -from newrelic.api.application import application_instance from newrelic.hooks.logger_logging import add_nr_linking_metadata -from newrelic.common.signature import bind_args +@functools.lru_cache(maxsize=None) def normalize_level_name(method_name): # Look up level number for method name, using result to look up level name for that level number. # Convert result to upper case, and default to UNKNOWN in case of errors or missing values. 
try: from structlog._log_levels import _LEVEL_TO_NAME, _NAME_TO_LEVEL + return _LEVEL_TO_NAME[_NAME_TO_LEVEL[method_name]].upper() except Exception: return "UNKNOWN" @@ -34,14 +38,7 @@ def bind_process_event(method_name, event, event_kw): return method_name, event, event_kw -def wrap__process_event(wrapped, instance, args, kwargs): - try: - method_name, event, event_kw = bind_process_event(*args, **kwargs) - except TypeError: - return wrapped(*args, **kwargs) - - original_message = event # Save original undecorated message - +def new_relic_event_consumer(logger, level, event): transaction = current_transaction() if transaction: @@ -50,16 +47,27 @@ def wrap__process_event(wrapped, instance, args, kwargs): settings = global_settings() # Return early if application logging not enabled - if settings and settings.application_logging and settings.application_logging.enabled: - if settings.application_logging.local_decorating and settings.application_logging.local_decorating.enabled: - event = add_nr_linking_metadata(event) + if settings and settings.application_logging.enabled: + if isinstance(event, (str, bytes, bytearray)): + message = original_message = event + event_attrs = {} + elif isinstance(event, dict): + message = original_message = event.get("event", "") + event_attrs = {k: v for k, v in event.items() if k != "event"} + else: + # Unclear how to proceed, ignore log. Avoid logging an error message or we may incur an infinite loop. + return event + + if settings.application_logging.local_decorating.enabled: + message = add_nr_linking_metadata(message) + if isinstance(event, (str, bytes, bytearray)): + event = message + elif isinstance(event, dict) and "event" in event: + event["event"] = message - # Send log to processors for filtering, allowing any DropEvent exceptions that occur to prevent instrumentation from recording the log event. 
- result = wrapped(method_name, event, event_kw) - - level_name = normalize_level_name(method_name) + level_name = normalize_level_name(level) - if settings.application_logging.metrics and settings.application_logging.metrics.enabled: + if settings.application_logging.metrics.enabled: if transaction: transaction.record_custom_metric("Logging/lines", {"count": 1}) transaction.record_custom_metric("Logging/lines/%s" % level_name, {"count": 1}) @@ -69,15 +77,34 @@ def wrap__process_event(wrapped, instance, args, kwargs): application.record_custom_metric("Logging/lines", {"count": 1}) application.record_custom_metric("Logging/lines/%s" % level_name, {"count": 1}) - if settings.application_logging.forwarding and settings.application_logging.forwarding.enabled: + if settings.application_logging.forwarding.enabled: try: - record_log_event(original_message, level_name) + record_log_event(original_message, level_name, attributes=event_attrs) except Exception: pass - # Return the result from wrapped after we've recorded the resulting log event. 
- return result + return event + + +def wrap__process_event(wrapped, instance, args, kwargs): + transaction = current_transaction() + + if transaction: + settings = transaction.settings + else: + settings = global_settings() + + # Return early if application logging not enabled + if settings and settings.application_logging.enabled: + processors = instance._processors + if not processors: + instance._processors = [new_relic_event_consumer] + elif processors[-1] != new_relic_event_consumer: + # Remove our processor if it exists and add it to the end + if new_relic_event_consumer in processors: + processors.remove(new_relic_event_consumer) + processors.append(new_relic_event_consumer) return wrapped(*args, **kwargs) diff --git a/setup.cfg b/setup.cfg index 006265c364..8a41f1534d 100644 --- a/setup.cfg +++ b/setup.cfg @@ -5,4 +5,4 @@ license_files = [flake8] max-line-length = 120 -extend-ignore = E122,E126,E127,E128,E203,E501,E722,F841,W504,E731 +extend-ignore = E122,E126,E127,E128,E203,E501,E722,F841,W504,E731,F811 diff --git a/tests/agent_features/test_log_events.py b/tests/agent_features/test_log_events.py index bb173d6c4e..fb9991a823 100644 --- a/tests/agent_features/test_log_events.py +++ b/tests/agent_features/test_log_events.py @@ -12,14 +12,48 @@ # See the License for the specific language governing permissions and # limitations under the License. 
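The structlog `wrap__process_event` above re-registers `new_relic_event_consumer` at the end of the processor chain on each call, so the consumer always sees the event after all user-configured processors have run. The registration logic can be sketched in isolation (hypothetical helper name, plain strings standing in for processor callables):

```python
def ensure_processor_last(processors, processor):
    # Keep `processor` as the final entry in the chain; if users have
    # appended their own processors since we last ran, move ours back
    # to the end rather than registering it twice.
    if not processors:
        processors.append(processor)
    elif processors[-1] != processor:
        if processor in processors:
            processors.remove(processor)
        processors.append(processor)
    return processors


chain = ["drop_debug", "add_timestamp"]
ensure_processor_last(chain, "nr_consumer")
ensure_processor_last(chain, "nr_consumer")  # idempotent
print(chain)  # ['drop_debug', 'add_timestamp', 'nr_consumer']
```

Re-checking on every `_process_event` call (rather than once at configure time) is what keeps the consumer last even when the application mutates its processor list at runtime.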
-from newrelic.api.background_task import background_task -from newrelic.api.time_trace import current_trace -from newrelic.api.transaction import current_transaction, record_log_event, ignore_transaction -from testing_support.fixtures import override_application_settings, reset_core_stats_engine +import pytest +from testing_support.fixtures import ( + override_application_settings, + reset_core_stats_engine, +) from testing_support.validators.validate_log_event_count import validate_log_event_count -from testing_support.validators.validate_log_event_count_outside_transaction import validate_log_event_count_outside_transaction +from testing_support.validators.validate_log_event_count_outside_transaction import ( + validate_log_event_count_outside_transaction, +) from testing_support.validators.validate_log_events import validate_log_events -from testing_support.validators.validate_log_events_outside_transaction import validate_log_events_outside_transaction +from testing_support.validators.validate_log_events_outside_transaction import ( + validate_log_events_outside_transaction, +) + +from newrelic.api.background_task import background_task +from newrelic.api.time_trace import current_trace +from newrelic.api.transaction import ( + current_transaction, + ignore_transaction, + record_log_event, +) +from newrelic.core.config import _parse_attributes + + +class NonPrintableObject(object): + def __str__(self): + raise RuntimeError("Unable to print object.") + + __repr__ = __str__ + + +class NonSerializableObject(object): + def __str__(self): + return "<%s object>" % self.__class__.__name__ + + __repr__ = __str__ + + +def combine_dicts(defaults, overrides): + combined = defaults.copy() + combined.update(overrides) + return combined def set_trace_ids(): @@ -31,155 +65,333 @@ def set_trace_ids(): trace.guid = "abcdefgh" -def exercise_record_log_event(message="A"): +def exercise_record_log_event(): set_trace_ids() - record_log_event(message, "ERROR") - 
-enable_log_forwarding = override_application_settings({"application_logging.forwarding.enabled": True}) + record_log_event("no_other_arguments") + record_log_event("keyword_arguments", timestamp=1234, level="ERROR", attributes={"key": "value"}) + record_log_event("positional_arguments", "WARNING", 2345, {"key": "value"}) + record_log_event("serialized_attributes", attributes=_serialized_attributes) + record_log_event(None, attributes={"attributes_only": "value"}) + record_log_event({"attributes_only": "value"}) + record_log_event({"message": "dict_message"}) + record_log_event({"message": 123}) + + # Unsent due to message content missing + record_log_event("") + record_log_event(" ") + record_log_event(NonPrintableObject()) + record_log_event({"message": ""}) + record_log_event({"message": NonPrintableObject()}) + record_log_event({"filtered_attribute": "should_be_removed"}) + record_log_event(None) + + +enable_log_forwarding = override_application_settings( + { + "application_logging.forwarding.enabled": True, + "application_logging.forwarding.context_data.enabled": True, + "application_logging.forwarding.context_data.exclude": ["filtered_attribute"], + } +) disable_log_forwarding = override_application_settings({"application_logging.forwarding.enabled": False}) -_common_attributes_service_linking = {"timestamp": None, "hostname": None, "entity.name": "Python Agent Test (agent_features)", "entity.guid": None} +disable_log_attributes = override_application_settings( + {"application_logging.forwarding.enabled": True, "application_logging.forwarding.context_data.enabled": False} +) + +_common_attributes_service_linking = { + "timestamp": None, + "hostname": None, + "entity.name": "Python Agent Test (agent_features)", + "entity.guid": None, +} _common_attributes_trace_linking = {"span.id": "abcdefgh", "trace.id": "abcdefgh12345678"} _common_attributes_trace_linking.update(_common_attributes_service_linking) -_test_record_log_event_inside_transaction_events = 
[{"message": "A", "level": "ERROR"}] -_test_record_log_event_inside_transaction_events[0].update(_common_attributes_trace_linking) + +_serialized_attributes = { + "str_attr": "Value", + "bytes_attr": b"value", + "int_attr": 1, + "dict_attr": {"key": "value"}, + "non_serializable_attr": NonSerializableObject(), + "non_printable_attr": NonPrintableObject(), + "attr_value_too_long": "*" * 256, + "attr_name_too_long_" + ("*" * 237): "value", + "attr_name_with_prefix_too_long_" + ("*" * 220): "value", +} + +_exercise_record_log_event_events = [ + {"message": "no_other_arguments", "level": "UNKNOWN"}, + {"message": "keyword_arguments", "level": "ERROR", "timestamp": 1234, "context.key": "value"}, + {"message": "positional_arguments", "level": "WARNING", "timestamp": 2345, "context.key": "value"}, + { + "message": "serialized_attributes", + "context.str_attr": "Value", + "context.bytes_attr": b"value", + "context.int_attr": 1, + "context.dict_attr": "{'key': 'value'}", + "context.non_serializable_attr": "", + "context.attr_value_too_long": "*" * 255, + }, + {"context.attributes_only": "value"}, + {"message.attributes_only": "value"}, + {"message": "dict_message"}, + {"message": "123"}, +] +_exercise_record_log_event_inside_transaction_events = [ + combine_dicts(_common_attributes_trace_linking, log) for log in _exercise_record_log_event_events +] +_exercise_record_log_event_outside_transaction_events = [ + combine_dicts(_common_attributes_service_linking, log) for log in _exercise_record_log_event_events +] +_exercise_record_log_event_forgone_attrs = [ + "context.non_printable_attr", + "attr_name_too_long_", + "attr_name_with_prefix_too_long_", +] + + +# Test Log Forwarding + @enable_log_forwarding def test_record_log_event_inside_transaction(): - @validate_log_events(_test_record_log_event_inside_transaction_events) - @validate_log_event_count(1) + @validate_log_events( + _exercise_record_log_event_inside_transaction_events, 
forgone_attrs=_exercise_record_log_event_forgone_attrs
+    )
+    @validate_log_event_count(len(_exercise_record_log_event_inside_transaction_events))
     @background_task()
     def test():
         exercise_record_log_event()
-
-    test()
+    test()


-_test_record_log_event_outside_transaction_events = [{"message": "A", "level": "ERROR"}]
-_test_record_log_event_outside_transaction_events[0].update(_common_attributes_service_linking)


 @enable_log_forwarding
 @reset_core_stats_engine()
 def test_record_log_event_outside_transaction():
-    @validate_log_events_outside_transaction(_test_record_log_event_outside_transaction_events)
-    @validate_log_event_count_outside_transaction(1)
+    @validate_log_events_outside_transaction(
+        _exercise_record_log_event_outside_transaction_events, forgone_attrs=_exercise_record_log_event_forgone_attrs
+    )
+    @validate_log_event_count_outside_transaction(len(_exercise_record_log_event_outside_transaction_events))
     def test():
         exercise_record_log_event()

     test()


-_test_record_log_event_unknown_level_inside_transaction_events = [{"message": "A", "level": "UNKNOWN"}]
-_test_record_log_event_unknown_level_inside_transaction_events[0].update(_common_attributes_trace_linking)
+@enable_log_forwarding
+def test_ignored_transaction_logs_not_forwarded():
+    @validate_log_event_count(0)
+    @background_task()
+    def test():
+        ignore_transaction()
+        exercise_record_log_event()
+
+    test()
+
+
+# Test Message Truncation
+
+_test_log_event_truncation_events = [{"message": "A" * 32768}]
+

 @enable_log_forwarding
-def test_record_log_event_unknown_level_inside_transaction():
-    @validate_log_events(_test_record_log_event_unknown_level_inside_transaction_events)
+def test_log_event_truncation_inside_transaction():
+    @validate_log_events(_test_log_event_truncation_events)
     @validate_log_event_count(1)
     @background_task()
     def test():
-        set_trace_ids()
-        record_log_event("A")
-
-    test()
+        record_log_event("A" * 33000)
+    test()


-_test_record_log_event_unknown_level_outside_transaction_events = [{"message": "A", "level": "UNKNOWN"}]
-_test_record_log_event_unknown_level_outside_transaction_events[0].update(_common_attributes_service_linking)


 @enable_log_forwarding
 @reset_core_stats_engine()
-def test_record_log_event_unknown_level_outside_transaction():
-    @validate_log_events_outside_transaction(_test_record_log_event_unknown_level_outside_transaction_events)
+def test_log_event_truncation_outside_transaction():
+    @validate_log_events_outside_transaction(_test_log_event_truncation_events)
     @validate_log_event_count_outside_transaction(1)
     def test():
-        set_trace_ids()
-        record_log_event("A")
+        record_log_event("A" * 33000)

     test()


-@enable_log_forwarding
-def test_record_log_event_empty_message_inside_transaction():
+# Test Log Forwarding Settings
+
+
+@disable_log_forwarding
+def test_disabled_record_log_event_inside_transaction():
     @validate_log_event_count(0)
     @background_task()
     def test():
-        exercise_record_log_event("")
-
+        exercise_record_log_event()
+
     test()


-@enable_log_forwarding
+@disable_log_forwarding
 @reset_core_stats_engine()
-def test_record_log_event_empty_message_outside_transaction():
+def test_disabled_record_log_event_outside_transaction():
     @validate_log_event_count_outside_transaction(0)
     def test():
-        exercise_record_log_event("")
+        exercise_record_log_event()

     test()


-@enable_log_forwarding
-def test_record_log_event_whitespace_inside_transaction():
-    @validate_log_event_count(0)
+# Test Log Attribute Settings
+
+
+@disable_log_attributes
+def test_attributes_disabled_inside_transaction():
+    @validate_log_events([{"message": "A"}], forgone_attrs=["context.key"])
+    @validate_log_event_count(1)
     @background_task()
     def test():
-        exercise_record_log_event(" ")
+        record_log_event("A", attributes={"key": "value"})

     test()


-@enable_log_forwarding
+@disable_log_attributes
 @reset_core_stats_engine()
-def test_record_log_event_whitespace_outside_transaction():
-    @validate_log_event_count_outside_transaction(0)
+def test_attributes_disabled_outside_transaction():
+    @validate_log_events_outside_transaction([{"message": "A"}], forgone_attrs=["context.key"])
+    @validate_log_event_count_outside_transaction(1)
     def test():
-        exercise_record_log_event(" ")
+        record_log_event("A", attributes={"key": "value"})

     test()


-@enable_log_forwarding
-def test_ignored_transaction_logs_not_forwarded():
-    @validate_log_event_count(0)
+_test_record_log_event_context_attribute_filtering_params = [
+    ("", "", "A", True),
+    ("", "A", "A", False),
+    ("", "A", "B", True),
+    ("A B", "*", "A", True),
+    ("A B", "*", "B", True),
+    ("A B", "*", "C", False),
+    ("A B", "C", "A", True),
+    ("A B", "C", "C", False),
+    ("A B", "B", "A", True),
+    ("A B", "B", "B", False),
+    ("A", "A *", "A", False),
+    ("A", "A *", "B", False),
+    ("A*", "", "A", True),
+    ("A*", "", "AB", True),
+    ("", "A*", "A", False),
+    ("", "A*", "B", True),
+    ("A*", "AB", "AC", True),
+    ("A*", "AB", "AB", False),
+    ("AB", "A*", "AB", True),
+    ("A*", "AB*", "ACB", True),
+    ("A*", "AB*", "ABC", False),
+]
+
+
+@pytest.mark.parametrize("prefix", ("context", "message"))
+@pytest.mark.parametrize("include,exclude,attr,expected", _test_record_log_event_context_attribute_filtering_params)
+def test_record_log_event_context_attribute_filtering_inside_transaction(include, exclude, attr, expected, prefix):
+    if expected:
+        expected_event = {"required_attrs": [".".join((prefix, attr))]}
+    else:
+        expected_event = {"forgone_attrs": [".".join((prefix, attr))]}
+
+    @override_application_settings(
+        {
+            "application_logging.forwarding.enabled": True,
+            "application_logging.forwarding.context_data.enabled": True,
+            "application_logging.forwarding.context_data.include": _parse_attributes(include),
+            "application_logging.forwarding.context_data.exclude": _parse_attributes(exclude),
+        }
+    )
+    @validate_log_events(**expected_event)
+    @validate_log_event_count(1)
     @background_task()
     def test():
-        ignore_transaction()
-        exercise_record_log_event()
+        if prefix == "context":
+            record_log_event("A", attributes={attr: 1})
+        else:
+            record_log_event({"message": "A", attr: 1})

     test()


-_test_log_event_truncation_events = [{"message": "A" * 32768, "level": "ERROR"}]
-_test_log_event_truncation_events[0].update(_common_attributes_trace_linking)
-
-@enable_log_forwarding
-def test_log_event_truncation():
-    @validate_log_events(_test_log_event_truncation_events)
-    @validate_log_event_count(1)
-    @background_task()
+@pytest.mark.parametrize("prefix", ("context", "message"))
+@pytest.mark.parametrize("include,exclude,attr,expected", _test_record_log_event_context_attribute_filtering_params)
+@reset_core_stats_engine()
+def test_record_log_event_context_attribute_filtering_outside_transaction(include, exclude, attr, expected, prefix):
+    if expected:
+        expected_event = {"required_attrs": [".".join((prefix, attr))]}
+    else:
+        expected_event = {"forgone_attrs": [".".join((prefix, attr))]}
+
+    @override_application_settings(
+        {
+            "application_logging.forwarding.enabled": True,
+            "application_logging.forwarding.context_data.enabled": True,
+            "application_logging.forwarding.context_data.include": _parse_attributes(include),
+            "application_logging.forwarding.context_data.exclude": _parse_attributes(exclude),
+        }
+    )
+    @validate_log_events_outside_transaction(**expected_event)
+    @validate_log_event_count_outside_transaction(1)
     def test():
-        exercise_record_log_event("A" * 33000)
+        if prefix == "context":
+            record_log_event("A", attributes={attr: 1})
+        else:
+            record_log_event({"message": "A", attr: 1})

     test()


-@disable_log_forwarding
-def test_record_log_event_inside_transaction():
-    @validate_log_event_count(0)
+_test_record_log_event_linking_attribute_no_filtering_params = [
+    ("", ""),
+    ("", "entity.name"),
+    ("", "*"),
+]
+
+
+@pytest.mark.parametrize("include,exclude", _test_record_log_event_linking_attribute_no_filtering_params)
+def test_record_log_event_linking_attribute_no_filtering_inside_transaction(include, exclude):
+    attr = "entity.name"
+
+    @override_application_settings(
+        {
+            "application_logging.forwarding.enabled": True,
+            "application_logging.forwarding.context_data.enabled": True,
+            "application_logging.forwarding.context_data.include": _parse_attributes(include),
+            "application_logging.forwarding.context_data.exclude": _parse_attributes(exclude),
+        }
+    )
+    @validate_log_events(required_attrs=[attr])
+    @validate_log_event_count(1)
     @background_task()
     def test():
-        exercise_record_log_event()
-
+        record_log_event("A")
+
     test()


-@disable_log_forwarding
+@pytest.mark.parametrize("include,exclude", _test_record_log_event_linking_attribute_no_filtering_params)
 @reset_core_stats_engine()
-def test_record_log_event_outside_transaction():
-    @validate_log_event_count_outside_transaction(0)
+def test_record_log_event_linking_attribute_filtering_outside_transaction(include, exclude):
+    attr = "entity.name"
+
+    @override_application_settings(
+        {
+            "application_logging.forwarding.enabled": True,
+            "application_logging.forwarding.context_data.enabled": True,
+            "application_logging.forwarding.context_data.include": _parse_attributes(include),
+            "application_logging.forwarding.context_data.exclude": _parse_attributes(exclude),
+        }
+    )
+    @validate_log_events_outside_transaction(required_attrs=[attr])
+    @validate_log_event_count_outside_transaction(1)
     def test():
-        exercise_record_log_event()
+        record_log_event("A")

     test()
diff --git a/tests/agent_features/test_logs_in_context.py b/tests/agent_features/test_logs_in_context.py
index 90b6c92672..8693c0f083 100644
--- a/tests/agent_features/test_logs_in_context.py
+++ b/tests/agent_features/test_logs_in_context.py
@@ -51,8 +51,14 @@ class NonPrintableObject(object):
     def __str__(self):
         raise RuntimeError("Unable to print object.")

-    def __repr__(self):
-        raise RuntimeError("Unable to print object.")
+    __repr__ = __str__
+
+
+class NonSerializableObject(object):
+    def __str__(self):
+        return "<%s object>" % self.__class__.__name__
+
+    __repr__ = __str__


 def test_newrelic_logger_no_error(log_buffer):
@@ -63,14 +69,15 @@ def test_newrelic_logger_no_error(log_buffer):
         "null": None,
         "array": [1, 2, 3],
         "bool": True,
-        "non_serializable": {"set"},
+        "set": {"set"},
+        "non_serializable": NonSerializableObject(),
         "non_printable": NonPrintableObject(),
         "object": {
             "first": "bar",
             "second": "baz",
         },
     }
-    _logger.info(u"Hello %s", u"World", extra=extra)
+    _logger.info("Hello %s", "World", extra=extra)

     log_buffer.seek(0)
     message = json.load(log_buffer)
@@ -88,24 +95,25 @@ def test_newrelic_logger_no_error(log_buffer):
     assert isinstance(line_number, int)

     expected = {
-        u"entity.name": u"Python Agent Test (agent_features)",
-        u"entity.type": u"SERVICE",
-        u"message": u"Hello World",
-        u"log.level": u"INFO",
-        u"logger.name": u"test_logs_in_context",
-        u"thread.name": u"MainThread",
-        u"process.name": u"MainProcess",
-        u"extra.string": u"foo",
-        u"extra.integer": 1,
-        u"extra.float": 1.23,
-        u"extra.null": None,
-        u"extra.array": [1, 2, 3],
-        u"extra.bool": True,
-        u"extra.non_serializable": u"set(['set'])" if six.PY2 else u"{'set'}",
-        u"extra.non_printable": u"<unprintable NonPrintableObject object>",
-        u"extra.object": {
-            u"first": u"bar",
-            u"second": u"baz",
+        "entity.name": "Python Agent Test (agent_features)",
+        "entity.type": "SERVICE",
+        "message": "Hello World",
+        "log.level": "INFO",
+        "logger.name": "test_logs_in_context",
+        "thread.name": "MainThread",
+        "process.name": "MainProcess",
+        "extra.string": "foo",
+        "extra.integer": 1,
+        "extra.float": 1.23,
+        "extra.null": None,
+        "extra.array": [1, 2, 3],
+        "extra.bool": True,
+        "extra.set": '["set"]',
+        "extra.non_serializable": "<NonSerializableObject object>",
+        "extra.non_printable": "<unprintable NonPrintableObject object>",
+        "extra.object": {
+            "first": "bar",
+            "second": "baz",
         },
     }
     expected_extra_txn_keys = (
@@ -119,7 +127,6 @@ def test_newrelic_logger_no_error(log_buffer):

     assert set(message.keys()) == set(expected_extra_txn_keys)


-
 class ExceptionForTest(ValueError):
     pass
@@ -129,7 +136,7 @@ def test_newrelic_logger_error_inside_transaction(log_buffer):
     try:
         raise ExceptionForTest
     except ExceptionForTest:
-        _logger.exception(u"oops")
+        _logger.exception("oops")

     log_buffer.seek(0)
     message = json.load(log_buffer)
@@ -147,16 +154,16 @@ def test_newrelic_logger_error_inside_transaction(log_buffer):
     assert isinstance(line_number, int)

     expected = {
-        u"entity.name": u"Python Agent Test (agent_features)",
-        u"entity.type": u"SERVICE",
-        u"message": u"oops",
-        u"log.level": u"ERROR",
-        u"logger.name": u"test_logs_in_context",
-        u"thread.name": u"MainThread",
-        u"process.name": u"MainProcess",
-        u"error.class": u"test_logs_in_context:ExceptionForTest",
-        u"error.message": u"",
-        u"error.expected": False,
+        "entity.name": "Python Agent Test (agent_features)",
+        "entity.type": "SERVICE",
+        "message": "oops",
+        "log.level": "ERROR",
+        "logger.name": "test_logs_in_context",
+        "thread.name": "MainThread",
+        "process.name": "MainProcess",
+        "error.class": "test_logs_in_context:ExceptionForTest",
+        "error.message": "",
+        "error.expected": False,
     }
     expected_extra_txn_keys = (
         "trace.id",
@@ -175,7 +182,7 @@ def test_newrelic_logger_error_outside_transaction(log_buffer):
     try:
         raise ExceptionForTest
     except ExceptionForTest:
-        _logger.exception(u"oops")
+        _logger.exception("oops")

     log_buffer.seek(0)
     message = json.load(log_buffer)
@@ -193,15 +200,15 @@ def test_newrelic_logger_error_outside_transaction(log_buffer):
     assert isinstance(line_number, int)

     expected = {
-        u"entity.name": u"Python Agent Test (agent_features)",
-        u"entity.type": u"SERVICE",
-        u"message": u"oops",
-        u"log.level": u"ERROR",
-        u"logger.name": u"test_logs_in_context",
-        u"thread.name": u"MainThread",
-        u"process.name": u"MainProcess",
-        u"error.class": u"test_logs_in_context:ExceptionForTest",
-        u"error.message": u"",
+        "entity.name": "Python Agent Test (agent_features)",
+        "entity.type": "SERVICE",
+        "message": "oops",
+        "log.level": "ERROR",
+        "logger.name": "test_logs_in_context",
+        "thread.name": "MainThread",
+        "process.name": "MainProcess",
+        "error.class": "test_logs_in_context:ExceptionForTest",
+        "error.message": "",
     }
     expected_extra_txn_keys = (
         "entity.guid",
@@ -214,14 +221,13 @@ def test_newrelic_logger_error_outside_transaction(log_buffer):

     assert set(message.keys()) == set(expected_extra_txn_keys)


-
 EXPECTED_KEYS_TXN = (
     "trace.id",
     "span.id",
     "entity.name",
     "entity.type",
     "entity.guid",
-    "hostname", 
+    "hostname",
 )

 EXPECTED_KEYS_NO_TXN = EXPECTED_KEYS_TRACE_ENDED = (
diff --git a/tests/logger_logging/conftest.py b/tests/logger_logging/conftest.py
index 46e8f4ec3a..4e308eedfc 100644
--- a/tests/logger_logging/conftest.py
+++ b/tests/logger_logging/conftest.py
@@ -13,10 +13,12 @@
 # limitations under the License.

 import logging
-import pytest
-
-from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

+import pytest
+from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
+    collector_agent_registration_fixture,
+    collector_available_fixture,
+)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
@@ -26,6 +28,7 @@
     "debug.record_transaction_failure": True,
     "application_logging.enabled": True,
     "application_logging.forwarding.enabled": True,
+    "application_logging.forwarding.context_data.enabled": True,
     "application_logging.metrics.enabled": True,
     "application_logging.local_decorating.enabled": True,
     "event_harvest_config.harvest_limits.log_event_data": 100000,
@@ -42,6 +45,7 @@ class CaplogHandler(logging.StreamHandler):
     To prevent possible issues with pytest's monkey patching
     use a custom Caplog handler to capture all records
     """
+
     def __init__(self, *args, **kwargs):
         self.records = []
         super(CaplogHandler, self).__init__(*args, **kwargs)
diff --git a/tests/logger_logging/test_attributes.py b/tests/logger_logging/test_attributes.py
new file mode 100644
index 0000000000..2a8240e120
--- /dev/null
+++ b/tests/logger_logging/test_attributes.py
@@ -0,0 +1,90 @@
+# Copyright 2010 New Relic, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import pytest
+from testing_support.validators.validate_log_event_count import validate_log_event_count
+from testing_support.validators.validate_log_events import validate_log_events
+
+from newrelic.api.background_task import background_task
+from newrelic.packages import six
+
+
+@validate_log_events(
+    [
+        {  # Fixed attributes
+            "message": "context_attrs: arg1",
+            "context.args": "('arg1',)",
+            "context.filename": "test_attributes.py",
+            "context.funcName": "test_logging_default_context_attributes",
+            "context.levelname": "ERROR",
+            "context.levelno": 40,
+            "context.module": "test_attributes",
+            "context.name": "my_app",
+            "context.pathname": str(__file__),
+            "context.processName": "MainProcess",
+            "context.threadName": "MainThread",
+        }
+    ],
+    required_attrs=[  # Variable attributes
+        "context.created",
+        "context.lineno",
+        "context.msecs",
+        "context.process",
+        "context.relativeCreated",
+        "context.thread",
+    ],
+)
+@validate_log_event_count(1)
+@background_task()
+def test_logging_default_context_attributes(logger):
+    logger.error("context_attrs: %s", "arg1")
+
+
+@validate_log_events([{"message": "extras", "context.extra_attr": 1}])
+@validate_log_event_count(1)
+@background_task()
+def test_logging_extra_attributes(logger):
+    logger.error("extras", extra={"extra_attr": 1})
+
+
+@validate_log_events([{"message": "exc_info"}], required_attrs=["context.exc_info", "context.exc_text"])
+@validate_log_event_count(1)
+@background_task()
+def test_logging_exc_info_context_attributes(logger):
+    try:
+        raise RuntimeError("Oops")
+    except Exception:
+        logger.error("exc_info", exc_info=True)
+
+
+@pytest.mark.skipif(six.PY2, reason="stack_info on log messages not available in Python 2.")
+@validate_log_events([{"message": "stack_info"}], required_attrs=["context.stack_info"])
+@validate_log_event_count(1)
+@background_task()
+def test_logging_stack_info_context_attributes(logger):
+    logger.error("stack_info", stack_info=True)
+
+
+@validate_log_events([{"message": "A", "message.attr": 1}])
+@validate_log_event_count(1)
+@background_task()
+def test_logging_dict_message_and_attributes(logger):
+    logger.error({"message": "A", "attr": 1})
+
+
+@validate_log_events([{"message.attr": 1}])
+@validate_log_event_count(1)
+@background_task()
+def test_logging_dict_attributes_only(logger):
+    logger.error({"attr": 1})
diff --git a/tests/logger_logging/test_local_decorating.py b/tests/logger_logging/test_local_decorating.py
index d4917eff59..7b420375cb 100644
--- a/tests/logger_logging/test_local_decorating.py
+++ b/tests/logger_logging/test_local_decorating.py
@@ -35,10 +35,10 @@ def set_trace_ids():
         trace.guid = "abcdefgh"


-def exercise_logging(logger):
+def exercise_logging(logger, message="C"):
     set_trace_ids()

-    logger.warning("C")
+    logger.warning(message)


 def get_metadata_string(log_message, is_txn):
@@ -82,3 +82,14 @@ def test():
         assert logger.caplog.records[0] == get_metadata_string("C", False)

     test()
+
+
+@reset_core_stats_engine()
+def test_local_log_decoration_dict_message(logger):
+    @validate_log_event_count(1)
+    @background_task()
+    def test():
+        exercise_logging(logger, {"message": "dict_message"})
+        assert logger.caplog.records[0] == get_metadata_string("{'message': 'dict_message'}", True)
+
+    test()
diff --git a/tests/logger_logging/test_log_forwarding.py b/tests/logger_logging/test_log_forwarding.py
index ee6c689b76..e9effdee46 100644
--- a/tests/logger_logging/test_log_forwarding.py
+++ b/tests/logger_logging/test_log_forwarding.py
@@ -14,14 +14,19 @@

 import logging

-from newrelic.api.background_task import background_task
-from newrelic.api.time_trace import current_trace
-from newrelic.api.transaction import current_transaction
 from testing_support.fixtures import reset_core_stats_engine
 from testing_support.validators.validate_log_event_count import validate_log_event_count
-from testing_support.validators.validate_log_event_count_outside_transaction import validate_log_event_count_outside_transaction
+from testing_support.validators.validate_log_event_count_outside_transaction import (
+    validate_log_event_count_outside_transaction,
+)
 from testing_support.validators.validate_log_events import validate_log_events
-from testing_support.validators.validate_log_events_outside_transaction import validate_log_events_outside_transaction
+from testing_support.validators.validate_log_events_outside_transaction import (
+    validate_log_events_outside_transaction,
+)
+
+from newrelic.api.background_task import background_task
+from newrelic.api.time_trace import current_trace
+from newrelic.api.transaction import current_transaction


 def set_trace_ids():
@@ -32,6 +37,7 @@ def set_trace_ids():
     if trace:
         trace.guid = "abcdefgh"

+
 def exercise_logging(logger):
     set_trace_ids()

@@ -39,16 +45,22 @@ def exercise_logging(logger):
     logger.info("B")
     logger.warning("C")
     logger.error("D")
-    logger.critical("E")
-
+    logger.critical({"message": "E"})
+
     assert len(logger.caplog.records) == 3

+
 def update_all(events, attrs):
     for event in events:
         event.update(attrs)


-_common_attributes_service_linking = {"timestamp": None, "hostname": None, "entity.name": "Python Agent Test (logger_logging)", "entity.guid": None}
+_common_attributes_service_linking = {
+    "timestamp": None,
+    "hostname": None,
+    "entity.name": "Python Agent Test (logger_logging)",
+    "entity.guid": None,
+}
 _common_attributes_trace_linking = {"span.id": "abcdefgh", "trace.id": "abcdefgh12345678"}
 _common_attributes_trace_linking.update(_common_attributes_service_linking)
@@ -60,14 +72,11 @@ def update_all(events, attrs):
 update_all(_test_logging_inside_transaction_events, _common_attributes_trace_linking)


+@validate_log_events(_test_logging_inside_transaction_events)
+@validate_log_event_count(3)
+@background_task()
 def test_logging_inside_transaction(logger):
-    @validate_log_events(_test_logging_inside_transaction_events)
-    @validate_log_event_count(3)
-    @background_task()
-    def test():
-        exercise_logging(logger)
-
-    test()
+    exercise_logging(logger)


 _test_logging_outside_transaction_events = [
@@ -79,13 +88,10 @@ def test():


 @reset_core_stats_engine()
+@validate_log_events_outside_transaction(_test_logging_outside_transaction_events)
+@validate_log_event_count_outside_transaction(3)
 def test_logging_outside_transaction(logger):
-    @validate_log_events_outside_transaction(_test_logging_outside_transaction_events)
-    @validate_log_event_count_outside_transaction(3)
-    def test():
-        exercise_logging(logger)
-
-    test()
+    exercise_logging(logger)


 @reset_core_stats_engine()
diff --git a/tests/logger_logging/test_logging_handler.py b/tests/logger_logging/test_logging_handler.py
index 9bf488582e..a9028459fb 100644
--- a/tests/logger_logging/test_logging_handler.py
+++ b/tests/logger_logging/test_logging_handler.py
@@ -12,25 +12,35 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from conftest import logger as conf_logger
 import logging
+
 import pytest
+from conftest import logger as conf_logger  # noqa, pylint: disable=W0611
+from testing_support.fixtures import (
+    override_application_settings,
+    reset_core_stats_engine,
+)
+from testing_support.validators.validate_function_called import validate_function_called
+from testing_support.validators.validate_log_event_count import validate_log_event_count
+from testing_support.validators.validate_log_event_count_outside_transaction import (
+    validate_log_event_count_outside_transaction,
+)
+from testing_support.validators.validate_log_events import validate_log_events
+from testing_support.validators.validate_log_events_outside_transaction import (
+    validate_log_events_outside_transaction,
+)

 from newrelic.api.background_task import background_task
 from newrelic.api.log import NewRelicLogForwardingHandler
-from testing_support.fixtures import reset_core_stats_engine
-from testing_support.validators.validate_log_event_count import validate_log_event_count
-from testing_support.validators.validate_log_event_count_outside_transaction import validate_log_event_count_outside_transaction
-from testing_support.validators.validate_function_called import validate_function_called
-
-
+from newrelic.api.time_trace import current_trace
+from newrelic.api.transaction import current_transaction


 @pytest.fixture(scope="function")
 def uninstrument_logging():
     instrumented = logging.Logger.callHandlers
     while hasattr(logging.Logger.callHandlers, "__wrapped__"):
-        logging.Logger.callHandlers = logging.Logger.callHandlers.__wrapped__
+        logging.Logger.callHandlers = logging.Logger.callHandlers.__wrapped__  # noqa, pylint: disable=E1101
     yield
     logging.Logger.callHandlers = instrumented
@@ -43,12 +53,42 @@ def logger(conf_logger, uninstrument_logging):
     conf_logger.removeHandler(handler)


+def set_trace_ids():
+    txn = current_transaction()
+    if txn:
+        txn._trace_id = "abcdefgh12345678"
+    trace = current_trace()
+    if trace:
+        trace.guid = "abcdefgh"
+
+
 def exercise_logging(logger):
-    logger.warning("C")
+    set_trace_ids()
+    logger.warning("C", extra={"key": "value"})

     assert len(logger.caplog.records) == 1


+@override_application_settings(
+    {
+        "application_logging.forwarding.context_data.enabled": True,
+    }
+)
 def test_handler_inside_transaction(logger):
+    @validate_log_events(
+        [
+            {
+                "message": "C",
+                "level": "WARNING",
+                "timestamp": None,
+                "hostname": None,
+                "entity.name": "Python Agent Test (logger_logging)",
+                "entity.guid": None,
+                "span.id": "abcdefgh",
+                "trace.id": "abcdefgh12345678",
+                "context.key": "value",
+            }
+        ]
+    )
     @validate_log_event_count(1)
     @validate_function_called("newrelic.api.log", "NewRelicLogForwardingHandler.emit")
     @background_task()
     def test():
@@ -59,7 +99,25 @@ def test():


 @reset_core_stats_engine()
+@override_application_settings(
+    {
+        "application_logging.forwarding.context_data.enabled": True,
+    }
+)
 def test_handler_outside_transaction(logger):
+    @validate_log_events_outside_transaction(
+        [
+            {
+                "message": "C",
+                "level": "WARNING",
+                "timestamp": None,
+                "hostname": None,
+                "entity.name": "Python Agent Test (logger_logging)",
+                "entity.guid": None,
+                "context.key": "value",
+            }
+        ]
+    )
     @validate_log_event_count_outside_transaction(1)
     @validate_function_called("newrelic.api.log", "NewRelicLogForwardingHandler.emit")
     def test():
diff --git a/tests/logger_loguru/conftest.py b/tests/logger_loguru/conftest.py
index 65eaf4ab89..d87f7060d0 100644
--- a/tests/logger_loguru/conftest.py
+++ b/tests/logger_loguru/conftest.py
@@ -13,10 +13,12 @@
 # limitations under the License.
 import logging
-import pytest
-
-from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

+import pytest
+from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
+    collector_agent_registration_fixture,
+    collector_available_fixture,
+)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
@@ -28,6 +30,7 @@
     "application_logging.forwarding.enabled": True,
     "application_logging.metrics.enabled": True,
     "application_logging.local_decorating.enabled": True,
+    "application_logging.forwarding.context_data.enabled": True,
     "event_harvest_config.harvest_limits.log_event_data": 100000,
 }
@@ -42,6 +45,7 @@ class CaplogHandler(logging.StreamHandler):
     To prevent possible issues with pytest's monkey patching
     use a custom Caplog handler to capture all records
     """
+
     def __init__(self, *args, **kwargs):
         self.records = []
         super(CaplogHandler, self).__init__(*args, **kwargs)
@@ -53,10 +57,16 @@ def emit(self, record):
 @pytest.fixture(scope="function")
 def logger():
     import loguru
+
     _logger = loguru.logger
+    _logger.configure(extra={"global_extra": 3})
+    _logger = _logger.opt(record=True)
+
     caplog = CaplogHandler()
     handler_id = _logger.add(caplog, level="WARNING", format="{message}")
     _logger.caplog = caplog
+
     yield _logger
+
     del caplog.records[:]
     _logger.remove(handler_id)
diff --git a/tests/logger_loguru/test_attributes.py b/tests/logger_loguru/test_attributes.py
new file mode 100644
index 0000000000..ff8151e0bd
--- /dev/null
+++ b/tests/logger_loguru/test_attributes.py
@@ -0,0 +1,70 @@
+# Copyright 2010 New Relic, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from testing_support.validators.validate_log_event_count import validate_log_event_count
+from testing_support.validators.validate_log_events import validate_log_events
+
+from newrelic.api.background_task import background_task
+
+
+@validate_log_events(
+    [
+        {  # Fixed attributes
+            "message": "context_attrs: arg1",
+            "context.file": "(name='%s', path='%s')" % ("test_attributes.py", str(__file__)),
+            "context.function": "test_loguru_default_context_attributes",
+            "context.extra.bound_attr": 1,
+            "context.extra.contextual_attr": 2,
+            "context.extra.global_extra": 3,
+            "context.extra.kwarg_attr": 4,
+            "context.patched_attr": 5,
+            "context.module": "test_attributes",
+            "context.name": "test_attributes",
+        }
+    ],
+    required_attrs=[  # Variable attributes
+        "context.elapsed",
+        "context.line",
+        "context.process",
+        "context.thread",
+    ],
+)
+@validate_log_event_count(1)
+@background_task()
+def test_loguru_default_context_attributes(logger):
+    def _patcher(d):
+        d["patched_attr"] = 5
+        return d
+
+    bound_logger = logger.bind(bound_attr=1)
+    bound_logger = bound_logger.patch(_patcher)
+    with bound_logger.contextualize(contextual_attr=2):
+        bound_logger.error("context_attrs: {}", "arg1", kwarg_attr=4)
+
+
+@validate_log_events([{"message": "exc_info"}], required_attrs=["context.exception"])
+@validate_log_event_count(1)
+@background_task()
+def test_loguru_exception_context_attributes(logger):
+    try:
+        raise RuntimeError("Oops")
+    except Exception:
+        logger.error("exc_info")
+
+
+@validate_log_events([{"context.extra.attr": 1}])
+@validate_log_event_count(1)
+@background_task()
+def test_loguru_attributes_only(logger):
+    logger.error("", attr=1)
diff --git a/tests/logger_loguru/test_stack_inspection.py b/tests/logger_loguru/test_stack_inspection.py
deleted file mode 100644
index fb2738ac2a..0000000000
--- a/tests/logger_loguru/test_stack_inspection.py
+++ /dev/null
@@ -1,56 +0,0 @@
-# Copyright 2010 New Relic, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#      http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import pytest
-
-from conftest import CaplogHandler
-
-from newrelic.api.background_task import background_task
-from testing_support.fixtures import reset_core_stats_engine
-from testing_support.validators.validate_log_event_count import validate_log_event_count
-from testing_support.validators.validate_log_events import validate_log_events
-from testing_support.fixtures import override_application_settings
-
-
-
-@pytest.fixture(scope="function")
-def filepath_logger():
-    import loguru
-    _logger = loguru.logger
-    caplog = CaplogHandler()
-    handler_id = _logger.add(caplog, level="WARNING", format="{file}:{function} - {message}")
-    _logger.caplog = caplog
-    yield _logger
-    del caplog.records[:]
-    _logger.remove(handler_id)
-
-
-@override_application_settings({
-    "application_logging.local_decorating.enabled": False,
-})
-@reset_core_stats_engine()
-def test_filepath_inspection(filepath_logger):
-    # Test for regression in stack inspection that caused log messages.
-    # See https://github.com/newrelic/newrelic-python-agent/issues/603
-
-    @validate_log_events([{"message": "A", "level": "ERROR"}])
-    @validate_log_event_count(1)
-    @background_task()
-    def test():
-        filepath_logger.error("A")
-        assert len(filepath_logger.caplog.records) == 1
-        record = filepath_logger.caplog.records[0]
-        assert record == "test_stack_inspection.py:test - A", record
-
-    test()
diff --git a/tests/logger_structlog/conftest.py b/tests/logger_structlog/conftest.py
index 191d687c9c..4ead726e14 100644
--- a/tests/logger_structlog/conftest.py
+++ b/tests/logger_structlog/conftest.py
@@ -12,15 +12,15 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import logging
 import pytest
-from newrelic.api.time_trace import current_trace
-from newrelic.api.transaction import current_transaction
-from testing_support.fixtures import (
+from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
     collector_agent_registration_fixture,
     collector_available_fixture,
 )

+from newrelic.api.time_trace import current_trace
+from newrelic.api.transaction import current_transaction
+
 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
     "transaction_tracer.transaction_threshold": 0.0,
@@ -31,6 +31,7 @@
     "application_logging.forwarding.enabled": True,
     "application_logging.metrics.enabled": True,
     "application_logging.local_decorating.enabled": True,
+    "application_logging.forwarding.context_data.enabled": True,
     "event_harvest_config.harvest_limits.log_event_data": 100000,
 }
@@ -103,9 +104,7 @@ def logger(structlog_caplog):
 def filtering_logger(structlog_caplog):
     import structlog

-    structlog.configure(
-        processors=[drop_event_processor], logger_factory=lambda *args, **kwargs: structlog_caplog
-    )
+    structlog.configure(processors=[drop_event_processor], logger_factory=lambda *args, **kwargs: structlog_caplog)
     _filtering_logger = structlog.get_logger()

     return _filtering_logger
diff --git a/tests/logger_structlog/test_attribute_forwarding.py b/tests/logger_structlog/test_attribute_forwarding.py
deleted file mode 100644
index eb555cca14..0000000000
--- a/tests/logger_structlog/test_attribute_forwarding.py
+++ /dev/null
@@ -1,49 +0,0 @@
-# Copyright 2010 New Relic, Inc.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#      http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from newrelic.api.background_task import background_task
-from testing_support.fixtures import override_application_settings, reset_core_stats_engine
-from testing_support.validators.validate_log_event_count import validate_log_event_count
-from testing_support.validators.validate_log_event_count_outside_transaction import validate_log_event_count_outside_transaction
-from testing_support.validators.validate_log_events import validate_log_events
-from testing_support.validators.validate_log_events_outside_transaction import validate_log_events_outside_transaction
-
-
-_event_attributes = {"message": "A"}
-
-
-@override_application_settings({
-    "application_logging.forwarding.context_data.enabled": True,
-})
-def test_attributes_inside_transaction(exercise_logging_single_line):
-    @validate_log_events([_event_attributes])
-    @validate_log_event_count(1)
-    @background_task()
-    def test():
-        exercise_logging_single_line()
-
-    test()
-
-
-@reset_core_stats_engine()
-@override_application_settings({
-    "application_logging.forwarding.context_data.enabled": True,
-})
-def test_attributes_outside_transaction(exercise_logging_single_line):
-    @validate_log_events_outside_transaction([_event_attributes])
-    @validate_log_event_count_outside_transaction(1)
-    def test():
-        exercise_logging_single_line()
-
-    test()
diff --git a/tests/logger_structlog/test_attributes.py b/tests/logger_structlog/test_attributes.py
new file mode 100644
index 0000000000..67d0d06b92
--- /dev/null
+++ b/tests/logger_structlog/test_attributes.py
@@ -0,0 +1,97 @@
+# Copyright 2010 New Relic, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import pytest
+from testing_support.validators.validate_log_event_count import validate_log_event_count
+from testing_support.validators.validate_log_events import validate_log_events
+
+from newrelic.api.background_task import background_task
+
+
+@pytest.fixture(scope="function")
+def logger(structlog_caplog):
+    import structlog
+
+    structlog.configure(
+        processors=[
+            structlog.contextvars.merge_contextvars,
+            structlog.processors.format_exc_info,
+            structlog.processors.StackInfoRenderer(),
+            structlog.processors.CallsiteParameterAdder(),
+        ],
+        logger_factory=lambda *args, **kwargs: structlog_caplog,
+    )
+
+    _callsite_logger = structlog.get_logger(logger_attr=2)
+    structlog.contextvars.bind_contextvars(context_attr=3)
+    yield _callsite_logger
+    structlog.contextvars.clear_contextvars()
+
+
+@validate_log_events(
+    [
+        {  # Fixed attributes
+            "message": "context_attrs: arg1",
+            "context.kwarg_attr": 1,
+            "context.logger_attr": 2,
+            "context.context_attr": 3,
+            "context.filename": "test_attributes.py",
+            "context.func_name": "test_structlog_default_context_attributes",
+            "context.module": "test_attributes",
+            "context.pathname": str(__file__),
+            "context.process_name": "MainProcess",
+            "context.thread_name": "MainThread",
+        }
+    ],
+    required_attrs=[  # Variable attributes
+        "context.lineno",
+        "context.process",
+        "context.thread",
+    ],
+)
+@validate_log_event_count(1)
+@background_task()
+def test_structlog_default_context_attributes(logger):
+    logger.error("context_attrs: %s", "arg1", kwarg_attr=1)
+
+
+@validate_log_events([{"message": "exc_info"}], required_attrs=["context.exception"])
+@validate_log_event_count(1)
+@background_task()
+def test_structlog_exc_info_context_attributes(logger):
+    try:
+        raise RuntimeError("Oops")
+    except Exception:
+        logger.error("exc_info", exc_info=True)
+
+
+@validate_log_events([{"message": "stack_info"}], required_attrs=["context.stack"])
+@validate_log_event_count(1)
+@background_task()
+def test_structlog_stack_info_context_attributes(logger):
+    logger.error("stack_info", stack_info=True)
+
+
+@validate_log_events([{"message": "A", "message.attr": 1}])
+@validate_log_event_count(1)
+@background_task()
+def test_structlog_message_attributes(logger):
+    logger.error({"message": "A", "attr": 1})
+
+
+@validate_log_events([{"message.attr": 1}])
+@validate_log_event_count(1)
+@background_task()
+def test_structlog_attributes_only(logger):
+    logger.error({"attr": 1})
diff --git a/tests/testing_support/validators/validate_log_event_count.py b/tests/testing_support/validators/validate_log_event_count.py
index 243364d2d6..13f6c4e478 100644
--- a/tests/testing_support/validators/validate_log_event_count.py
+++ b/tests/testing_support/validators/validate_log_event_count.py
@@ -14,14 +14,14 @@

 import copy

-from newrelic.common.object_wrapper import (transient_function_wrapper,
-    function_wrapper)
 from testing_support.fixtures import catch_background_exceptions

+from newrelic.common.object_wrapper import
function_wrapper, transient_function_wrapper + + def validate_log_event_count(count=1): @function_wrapper def _validate_wrapper(wrapped, instance, args, kwargs): - record_called = [] recorded_logs = [] @@ -43,11 +43,11 @@ def _validate_log_event_count(wrapped, instance, args, kwargs): if count: assert record_called logs = copy.copy(recorded_logs) - + record_called[:] = [] recorded_logs[:] = [] - assert count == len(logs), len(logs) + assert count == len(logs), "Expected %d, Got %d" % (count, len(logs)) return val diff --git a/tests/testing_support/validators/validate_log_event_count_outside_transaction.py b/tests/testing_support/validators/validate_log_event_count_outside_transaction.py index 8f1337df81..e04c9df85a 100644 --- a/tests/testing_support/validators/validate_log_event_count_outside_transaction.py +++ b/tests/testing_support/validators/validate_log_event_count_outside_transaction.py @@ -14,14 +14,14 @@ import copy -from newrelic.common.object_wrapper import (transient_function_wrapper, - function_wrapper) from testing_support.fixtures import catch_background_exceptions +from newrelic.common.object_wrapper import function_wrapper, transient_function_wrapper + + def validate_log_event_count_outside_transaction(count=1): @function_wrapper def _validate_wrapper(wrapped, instance, args, kwargs): - record_called = [] recorded_logs = [] @@ -44,11 +44,11 @@ def _validate_log_event_count_outside_transaction(wrapped, instance, args, kwarg if count: assert record_called logs = copy.copy(recorded_logs) - + record_called[:] = [] recorded_logs[:] = [] - assert count == len(logs), len(logs) + assert count == len(logs), "Expected %d, Got %d" % (count, len(logs)) return val diff --git a/tests/testing_support/validators/validate_log_events.py b/tests/testing_support/validators/validate_log_events.py index aa6159512e..4008d0e52d 100644 --- a/tests/testing_support/validators/validate_log_events.py +++ b/tests/testing_support/validators/validate_log_events.py @@ -14,16 +14,21 
@@ import copy +from testing_support.fixtures import catch_background_exceptions + +from newrelic.common.object_wrapper import function_wrapper, transient_function_wrapper from newrelic.packages import six -from newrelic.common.object_wrapper import (transient_function_wrapper, - function_wrapper) -from testing_support.fixtures import catch_background_exceptions -def validate_log_events(events): +def validate_log_events(events=None, required_attrs=None, forgone_attrs=None): + events = events or [ + {} + ] # Empty event allows assertions based on only required or forgone attrs to still run and validate + required_attrs = required_attrs or [] + forgone_attrs = forgone_attrs or [] + @function_wrapper def _validate_wrapper(wrapped, instance, args, kwargs): - record_called = [] recorded_logs = [] @@ -41,12 +46,11 @@ def _validate_log_events(wrapped, instance, args, kwargs): return result - _new_wrapper = _validate_log_events(wrapped) val = _new_wrapper(*args, **kwargs) assert record_called logs = copy.copy(recorded_logs) - + record_called[:] = [] recorded_logs[:] = [] @@ -54,14 +58,13 @@ def _validate_log_events(wrapped, instance, args, kwargs): matching_log_events = 0 mismatches = [] for captured in logs: - if _check_log_attributes(expected, captured, mismatches): + if _check_log_attributes(expected, required_attrs, forgone_attrs, captured, mismatches): matching_log_events += 1 assert matching_log_events == 1, _log_details(matching_log_events, logs, mismatches) return val - - def _check_log_attributes(expected, captured, mismatches): + def _check_log_attributes(expected, required_attrs, forgone_attrs, captured, mismatches): for key, value in six.iteritems(expected): if hasattr(captured, key): captured_value = getattr(captured, key, None) @@ -76,6 +79,21 @@ def _check_log_attributes(expected, captured, mismatches): mismatches.append("key: %s, value:<%s><%s>" % (key, value, captured_value)) return False + for key in required_attrs: + if not hasattr(captured, key) and key 
not in captured.attributes: + mismatches.append("required_key: %s" % key) + return False + + for key in forgone_attrs: + if hasattr(captured, key) or key in captured.attributes: + if hasattr(captured, key): + captured_value = getattr(captured, key, None) + elif key in captured.attributes: + captured_value = captured.attributes[key] + + mismatches.append("forgone_key: %s, value:<%s>" % (key, captured_value)) + return False + return True def _log_details(matching_log_events, captured, mismatches): diff --git a/tests/testing_support/validators/validate_log_events_outside_transaction.py b/tests/testing_support/validators/validate_log_events_outside_transaction.py index 4bc9419654..6c17d089b5 100644 --- a/tests/testing_support/validators/validate_log_events_outside_transaction.py +++ b/tests/testing_support/validators/validate_log_events_outside_transaction.py @@ -20,10 +20,13 @@ from newrelic.packages import six -def validate_log_events_outside_transaction(events): +def validate_log_events_outside_transaction(events=None, required_attrs=None, forgone_attrs=None): + events = events or [{}] # Empty event allows assertions based on only forgone attrs to still run and validate + required_attrs = required_attrs or [] + forgone_attrs = forgone_attrs or [] + @function_wrapper def _validate_wrapper(wrapped, instance, args, kwargs): - record_called = [] recorded_logs = [] @@ -52,13 +55,13 @@ def _validate_log_events_outside_transaction(wrapped, instance, args, kwargs): matching_log_events = 0 mismatches = [] for captured in logs: - if _check_log_attributes(expected, captured, mismatches): + if _check_log_attributes(expected, required_attrs, forgone_attrs, captured, mismatches): matching_log_events += 1 assert matching_log_events == 1, _log_details(matching_log_events, logs, mismatches) return val - def _check_log_attributes(expected, captured, mismatches): + def _check_log_attributes(expected, required_attrs, forgone_attrs, captured, mismatches): for key, value in 
six.iteritems(expected): if hasattr(captured, key): captured_value = getattr(captured, key, None) @@ -73,6 +76,21 @@ def _check_log_attributes(expected, captured, mismatches): mismatches.append("key: %s, value:<%s><%s>" % (key, value, captured_value)) return False + for key in required_attrs: + if not hasattr(captured, key) and key not in captured.attributes: + mismatches.append("required_key: %s" % key) + return False + + for key in forgone_attrs: + if hasattr(captured, key) or key in captured.attributes: + if hasattr(captured, key): + captured_value = getattr(captured, key, None) + elif key in captured.attributes: + captured_value = captured.attributes[key] + + mismatches.append("forgone_key: %s, value:<%s>" % (key, captured_value)) + return False + return True def _log_details(matching_log_events, captured, mismatches): diff --git a/tox.ini b/tox.ini index b25b9e7adc..1497c6babb 100644 --- a/tox.ini +++ b/tox.ini @@ -328,7 +328,7 @@ deps = framework_starlette-starlette0019: starlette<0.20 framework_starlette-starlette002001: starlette==0.20.1 framework_starlette-starlette0028: starlette<0.29 - framework_starlette-starlettelatest: starlette + framework_starlette-starlettelatest: starlette<0.35 framework_strawberry: starlette framework_strawberry-strawberrylatest: strawberry-graphql framework_tornado: pycurl @@ -482,4 +482,4 @@ source = directory = ${TOX_ENV_DIR-.}/htmlcov [coverage:xml] -output = ${TOX_ENV_DIR-.}/coverage.xml \ No newline at end of file +output = ${TOX_ENV_DIR-.}/coverage.xml
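For reference, the matching semantics that the new `required_attrs`/`forgone_attrs` arguments add to the `validate_log_events` validators can be distilled into a standalone sketch. This is not the patched validator itself: `LogEvent` is a hypothetical stand-in for the agent's captured log event object (which exposes some fields as object attributes and the rest in an `attributes` dict), and the decorator/wrapper machinery is omitted.

```python
class LogEvent(object):
    """Hypothetical stand-in for a captured log event: top-level fields
    as object attributes plus a dict of context attributes."""

    def __init__(self, message, attributes=None):
        self.message = message
        self.attributes = attributes or {}


def check_log_attributes(expected, required_attrs, forgone_attrs, captured, mismatches):
    # Expected key/value pairs may live on the event object itself or
    # inside its attributes dict; both value mismatches and missing keys
    # are recorded in `mismatches` for the assertion message.
    for key, value in expected.items():
        if hasattr(captured, key):
            captured_value = getattr(captured, key, None)
        elif key in captured.attributes:
            captured_value = captured.attributes[key]
        else:
            mismatches.append("key: %s, value:<%s><%s>" % (key, value, None))
            return False
        if value != captured_value:
            mismatches.append("key: %s, value:<%s><%s>" % (key, value, captured_value))
            return False

    # Required attributes must merely be present; their values (e.g. line
    # numbers, thread ids) vary between runs and are not asserted.
    for key in required_attrs:
        if not hasattr(captured, key) and key not in captured.attributes:
            mismatches.append("required_key: %s" % key)
            return False

    # Forgone attributes must be entirely absent from the event.
    for key in forgone_attrs:
        if hasattr(captured, key) or key in captured.attributes:
            captured_value = getattr(captured, key, None) if hasattr(captured, key) else captured.attributes[key]
            mismatches.append("forgone_key: %s, value:<%s>" % (key, captured_value))
            return False

    return True
```

This three-way split is what lets a test like `test_structlog_default_context_attributes` pin stable attributes exactly while only asserting the existence of run-dependent ones such as `context.lineno`, and it is why `events` now defaults to `[{}]`: an empty expected event still lets a purely required/forgone assertion run.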