Note: The default branch has been renamed from master to main. Please update your local clone by running:
git branch -m master main
git fetch origin
git branch -u origin/main main
git remote set-head origin -a
This is a collection of support libraries for Java applications (Java 8 and above) that serves three main purposes:
- Provide means to emit structured application log messages
- Instrument parts of your application stack to collect request metrics
- Allow production of custom metrics
The libraries started out supporting applications running on Cloud Foundry, but this integration has become optional: the library can be used in any runtime environment, such as Kubernetes or Kyma.
When we say structured, we actually mean in JSON format. In that sense, it shares ideas with logstash-logback-encoder, but takes a simpler approach as we want to ensure that these structured messages adhere to standardized formats. With such standardized formats in place, it becomes much easier to ingest, process and search such messages in log analysis stacks such as ELK.
If you're interested in the specifications of these standardized formats, you may want to have a closer look at the fields.yml files in the beats folder.
While logstash-logback-encoder is tied to logback, we've tried to stay implementation-neutral: the core functionality is implemented on top of slf4j, with implementations provided for both logback and log4j2 (and we're open to contributions that would support other implementations).
The instrumentation part is currently focusing on providing request filters for Java Servlets, but again, we're open to contributions for other APIs and frameworks.
The custom metrics instrumentation allows users to easily define and emit custom metrics. The different modules configure all necessary components and make it possible to define custom metrics with minimal code changes. Once collected, custom metrics are sent as special log messages.
Lastly, there is a related project for Node.js logging support.
As you can see from the structure of this repository, we're not providing one uber JAR that contains everything, but provide each feature separately. We also try to stay away from wiring up too many dependencies by tagging almost all of them as provided. As a consequence, it's your task to get all runtime dependencies resolved in your application POM file.
All in all, you should do the following:
- Make up your mind which features you actually need.
- Adjust your Maven dependencies accordingly.
- Pick your favorite logging implementation, and
- Adjust your logging configuration accordingly.
Let's say you want to make use of the servlet filter feature. Then you need to add the following dependency to your POM, with the property cf-logging-version referring to the latest nexus version (currently 3.8.4):
<properties>
  <cf-logging-version>3.8.4</cf-logging-version>
</properties>
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-java-logging-support-servlet</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
This feature only depends on the servlet API, which you have included in your POM anyhow. You can find more information about the servlet filter feature (e.g. how to adjust the web.xml) in the Wiki; see also the registration sketch after the list below. Note that we provide two different servlet instrumentations:
- cf-java-logging-support-servlet linked against javax.servlet
- cf-java-logging-support-servlet-jakarta linked against jakarta.servlet

Both modules build on the same code but use the respective API.
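As a minimal sketch of how the filter could be wired up programmatically (assuming the javax.servlet variant and the RequestLoggingFilter class shipped with the servlet module; the listener class name here is purely illustrative, and the Wiki remains the authoritative reference for the web.xml-based setup):

import java.util.EnumSet;

import javax.servlet.DispatcherType;
import javax.servlet.FilterRegistration;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

// assumption: RequestLoggingFilter is the request filter provided by cf-java-logging-support-servlet
import com.sap.hcp.cf.logging.servlet.filter.RequestLoggingFilter;

// Registers the request logging filter for all URLs when the servlet context starts.
@WebListener
public class RequestLoggingConfiguration implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        FilterRegistration.Dynamic filter =
                sce.getServletContext().addFilter("request-logging", new RequestLoggingFilter());
        filter.addMappingForUrlPatterns(EnumSet.of(DispatcherType.REQUEST), false, "/*");
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // nothing to clean up
    }
}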
If you want to use custom metrics, just define one of the following dependencies:
- Spring Boot Support:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-spring-boot</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
- Plain Java Support:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-java</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
The core feature (on which all other features rely) just uses the org.slf4j API, but to actually get logs written, you need to pick an implementation feature. As stated above, we have two implementations: cf-java-logging-support-logback based on logback, and cf-java-logging-support-log4j2 based on log4j2.
Again, we don't include dependencies on those implementation backends ourselves, so you need to provide the corresponding dependencies in your POM file:
Using logback:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-java-logging-support-logback</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.2.11</version>
</dependency>
Using log4j2:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-java-logging-support-log4j2</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>2.20.0</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.20.0</version>
</dependency>
As they slightly differ in configuration, you again will need to do that yourself. But we hope that we've found an easy way to accomplish that: the one thing you have to do is pick our encoder in your logback.xml if you're using logback, or our layout in your log4j2.xml if you're using log4j2.
Here are the minimal configurations you'd need:
logback.xml:
<configuration debug="false" scan="false">
  <appender name="STDOUT-JSON" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="com.sap.hcp.cf.logback.encoder.JsonEncoder"/>
  </appender>
  <!-- for local development, you may want to switch to a more human-readable layout -->
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date %-5level [%thread] - [%logger] [%mdc] - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="${LOG_ROOT_LEVEL:-WARN}">
    <!-- Use 'STDOUT' instead for human-readable output -->
    <appender-ref ref="STDOUT-JSON" />
  </root>
  <!-- request metrics are reported using INFO level, so make sure the instrumentation loggers are set to that level -->
  <logger name="com.sap.hcp.cf" level="INFO" />
</configuration>
log4j2.xml:
<Configuration status="warn" strict="true"
               packages="com.sap.hcp.cf.log4j2.converter,com.sap.hcp.cf.log4j2.layout">
  <Appenders>
    <Console name="STDOUT-JSON" target="SYSTEM_OUT" follow="true">
      <JsonPatternLayout charset="utf-8"/>
    </Console>
    <Console name="STDOUT" target="SYSTEM_OUT" follow="true">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} [%mdc] - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="${LOG_ROOT_LEVEL:-WARN}">
      <!-- Use 'STDOUT' instead for human-readable output -->
      <AppenderRef ref="STDOUT-JSON" />
    </Root>
    <!-- request metrics are reported using INFO level, so make sure the instrumentation loggers are set to that level -->
    <Logger name="com.sap.hcp.cf" level="INFO"/>
  </Loggers>
</Configuration>
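With either of these minimal configurations in place, application code keeps logging through the plain slf4j API; the encoder (logback) or layout (log4j2) takes care of rendering each event as a JSON message. A minimal sketch (the class name and message below are arbitrary examples, not part of the library):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class PaymentService {

    private static final Logger LOGGER = LoggerFactory.getLogger(PaymentService.class);

    public void processPayment(String orderId) {
        // With the JSON encoder/layout configured above, this ends up as a
        // structured JSON message on stdout rather than a plain text line.
        LOGGER.info("processing payment for order {}", orderId);
    }
}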
With the custom metrics feature you can send metrics defined inside your code. Metrics are emitted as log messages when your application is bound to a service called application-logs. This is done because of the special format supported by the SAP BTP Application Logging Service and for compatibility with the prior approach. To use the feature you'd need:
- Instrumenting Spring Boot 2 applications:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-spring-boot</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
The Spring Boot instrumentation uses Spring Boot Actuator, which allows you to read predefined metrics and write custom metrics. The Actuator supports Micrometer, which comes as part of the Actuator's dependencies. In your code you work directly with Micrometer. Define your custom metrics and interact with them:
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.LongTaskTimer;
import io.micrometer.core.instrument.Metrics;
import io.micrometer.core.instrument.Tag;

@RestController
public class DemoController {

    private static final Logger LOGGER = LoggerFactory.getLogger(DemoController.class);

    private Counter counter;
    private AtomicInteger concurrentHttpRequests;
    private LongTaskTimer longTimer;

    DemoController() {
        // Counter: total number of requests served
        this.counter = Metrics.counter("demo.controller.number.of.requests", "unit", "requests");
        // Gauge: number of requests currently being served
        List<Tag> tags = new ArrayList<Tag>(Arrays.asList(Tag.of("parallel", "clients")));
        this.concurrentHttpRequests = Metrics.gauge("demo.controller.number.of.clients.being.served", tags,
                new AtomicInteger(0));
        // LongTaskTimer: time spent serving clients
        this.longTimer = Metrics.more().longTaskTimer("demo.controller.time.spends.in.serving.clients");
    }

    @RequestMapping("/")
    public String index() {
        longTimer.record(() -> {
            this.counter.increment();
            concurrentHttpRequests.addAndGet(1);
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                LOGGER.error("Sleep interrupted", e);
            } finally {
                concurrentHttpRequests.addAndGet(-1);
            }
        });
        return "Greetings from Custom Metrics!";
    }
}
In the example above, three custom metrics are defined and used: a Counter, a LongTaskTimer and a Gauge.
- Instrumenting plain Java applications:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-java</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
The Java instrumentation uses Dropwizard, which allows you to define all kinds of metrics supported by Dropwizard. The following metric types are available: com.codahale.metrics.Gauge, com.codahale.metrics.Counter, com.codahale.metrics.Histogram, com.codahale.metrics.Meter and com.codahale.metrics.Timer. More information about the metric types and their usage can be found in the Dropwizard documentation. Define your custom metrics and interact with them:
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.codahale.metrics.Counter;
import com.codahale.metrics.Meter;
import com.sap.cloud.cf.monitoring.java.CustomMetricRegistry;

public class CustomMetricsServlet extends HttpServlet {

    // Metrics are obtained from the CustomMetricRegistry, which takes care of reporting them.
    private static Counter counter = CustomMetricRegistry.get().counter("custom.metric.request.count");
    private static Meter meter = CustomMetricRegistry.get().meter("custom.metric.request.meter");

    @Override
    public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        counter.inc(3);
        meter.mark();
        response.getWriter().println("<p>Greetings from Custom Metrics</p>");
    }
}
This library supports the following configurations regarding sending custom metrics:
- interval: the interval for sending metrics, in milliseconds. Default value: 60000
- enabled: enables or disables the sending of metrics. Default value: true
- metrics: array of whitelisted metric names. Only the listed metrics are processed and sent; if the array is empty, all metrics are sent. Default value: []
- metricQuantiles: enables or disables the sending of metric quantiles such as the median, 95th percentile and 99th percentile. Spring Boot does not support this configuration. Default value: false

Configurations are read from an environment variable named CUSTOM_METRICS. To change the default values, override the environment variable with your custom values. Example:
{
  "interval": 30000,
  "enabled": true,
  "metrics": [
    "my.whitelist.metric.1",
    "my.whitelist.metric.2"
  ],
  "metricQuantiles": true
}
This library provides the possibility to change the log-level threshold for a single thread by adding a token in the header of a request. A detailed description of how to apply this feature can be found here.
Stacktraces can be logged within one log message. Further details can be found here.
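For example, logging a caught exception through the usual slf4j call is enough; the JSON encoder/layout then embeds the stacktrace in that single message (a small sketch with purely illustrative names):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class StacktraceExample {

    private static final Logger LOGGER = LoggerFactory.getLogger(StacktraceExample.class);

    public void run() {
        try {
            throw new IllegalStateException("something went wrong");
        } catch (IllegalStateException e) {
            // The exception, including its stacktrace, is attached to this one log message.
            LOGGER.error("operation failed", e);
        }
    }
}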
In order to illustrate how the different features are used, this repository includes two sample applications:
- a Jersey implementation in the ./sample folder
- a Spring Boot implementation in the ./sample-spring-boot folder
More info on the actual implementation can be found in the Wiki.
Please see our LICENSE for copyright and license information. Detailed information including third-party components and their licensing/copyright information is available via the REUSE tool.