Java Logs are Boring? Powerful and Simple Techniques to Make Them Valuable!

What can you do with your Java logs? First, Java offers a lot of logging libraries, from Java Util Logging to more sophisticated ones such as SLF4J and Log4j2. Then, to make things even more complex, each open source library comes with its own logging standards, which adds a layer of complexity to configuring and interpreting your Java logs properly.

From our own experience, Java logs can be organized quickly and become an extremely powerful tool for the Devs and Ops who take care of apps in production.

In this blogpost, we are going to cover:

  • Why and how to log in JSON format from your Java applications with Log4j2 or SLF4J (Logback)
  • How to speed up your troubleshooting
  • How to attach valuable attributes to your log events thanks to the key-value parser or Java’s MDC (Mapped Diagnostic Contexts)

If you are lost with your Java logs and you don’t really know what to do with them, this article is for you!

Logging in JSON is a really good idea for your Java logs!

Using the JSON format will help you deal correctly with your Java logs. Indeed, since Java logs (and their stack traces in particular) are split into multiple lines, it is difficult for parsers to associate them with the original log event:

// 4 events generated when only one is expected!
Exception in thread "main" java.lang.NullPointerException
        at com.example.myproject.Book.getTitle(Book.java:16)
        at com.example.myproject.Author.getBookTitles(Author.java:25)
        at com.example.myproject.Bootstrap.main(Bootstrap.java:14)

Whether it is ELK combined with Logstash, or Logmatic.io, any log management system needs to ingest events, and multi-line logs are very inefficient in that sense.
Java logs are thus quite complex for any log management system to handle, but the JSON format helps you deal properly with this issue. This is especially true if you are dealing with complex infrastructures and numerous services.

Logging in JSON also provides additional advantages, such as the following (a sample event is shown after the list):

  • Ensuring that a stack_trace is properly wrapped into a single log event
  • Ensuring that all the attributes of a log event are properly extracted (severity, logger name, thread name, etc.)
  • Accessing the MDC, which provides you with attributes you can attach to any log event
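To make this concrete, here is a simplified sketch of what a single JSON log event could look like once a JSON layout or encoder is in place. Field names vary slightly between layouts and versions, so treat it as illustrative rather than exact:

{
  "@timestamp": "2016-03-21T14:05:12.345Z",
  "level": "ERROR",
  "logger_name": "com.example.myproject.Bootstrap",
  "thread_name": "main",
  "message": "Unable to load the book title",
  "stack_trace": "java.lang.NullPointerException\n\tat com.example.myproject.Book.getTitle(Book.java:16)\n\tat com.example.myproject.Author.getBookTitles(Author.java:25)\n\tat com.example.myproject.Bootstrap.main(Bootstrap.java:14)"
}

The whole exception, stack trace included, now travels as one event that any log management system can ingest without multi-line gymnastics.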

We’re going to show you how to log in JSON with Log4j2 and Logback, the two logger solutions we find the most efficient and simplest to use.

I. JSON logging with Log4j2

1) Log in JSON

The best open source JSON library we’ve found for Log4j2 is log4j2-logstash-jsonevent-layout. This library has several strengths:

  • Similarity to the default JSON Layout of Log4j2
  • Compatibility with Logstash v1 format
  • Capability to easily add context and meta information

However, it is not an official Maven artifact, so if you use Maven you’ll have to either include it in your own repository or build it directly from GitHub as follows:

git clone https://github.com/majikthys/log4j2-logstash-jsonevent-layout 
cd log4j2-logstash-jsonevent-layout 
./gradlew clean build

The generated library should be here:
build/libs/log4j2-logstash-jsonevent-layout-x.y.z-SNAPSHOT.jar.
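If you would rather keep resolving it through Maven, one option is to install the built jar into your local repository with install:install-file. The coordinates below are only a guess based on the project name, so check the project’s own build files for the real groupId, artifactId and version:

mvn install:install-file \
  -Dfile=build/libs/log4j2-logstash-jsonevent-layout-x.y.z-SNAPSHOT.jar \
  -DgroupId=net.logstash.log4j2 \
  -DartifactId=log4j2-logstash-jsonevent-layout \
  -Dversion=x.y.z-SNAPSHOT \
  -Dpackaging=jar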

Once the library is in your classpath, you can attach the following layout to any appender, and your logs are ready to send to your favorite log management solution:

<LogStashJSONLayout>
  <PatternLayout pattern="%msg%throwable{none}"/>
</LogStashJSONLayout>
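For reference, here is a rough sketch of how this layout could sit in a complete log4j2.xml, attached to a console appender. The appender name and log levels are arbitrary choices, not prescriptions:

<Configuration status="warn">
  <Appenders>
    <Console name="ConsoleJSON" target="SYSTEM_OUT">
      <LogStashJSONLayout>
        <PatternLayout pattern="%msg%throwable{none}"/>
      </LogStashJSONLayout>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="ConsoleJSON"/>
    </Root>
  </Loggers>
</Configuration>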

2) Add as much context as you can

log4j2-logstash-jsonevent-layout allows developers to specify context. If adding contextual information to your Java logs is not yet a reflex for you, it should definitely become one!
Adding fields can dramatically speed up your investigation and shorten your time to root cause. So reopen your log configuration file and start adding the pieces of information that are specifically useful to your app.

<LogStashJSONLayout>
  <PatternLayout pattern="%msg%throwable{none}"/>

  <!-- Example of what you might do to add fields -->
  <KeyValuePair key="application_name" value="${sys:application.name}"/>
  <KeyValuePair key="application_version" value="${sys:application.version}"/>

  <!-- Example of using environment property substitution -->
  <KeyValuePair key="environment_user" value="${env:USER}"/>
</LogStashJSONLayout>

The library uses the Log4j2 Lookup API, so you can enrich your events with many useful fields and mechanisms.
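As a reminder of where the ${sys:...} lookups above get their values, the application (or its launch script) simply has to define the matching system properties before Log4j2 loads its configuration, either with -D flags on the command line or programmatically. A minimal sketch, with made-up property values:

// Hypothetical bootstrap code: define the system properties referenced by
// the ${sys:application.name} and ${sys:application.version} lookups.
// They must be set before the first logger is created so the configuration sees them.
public class Bootstrap {
    public static void main(String[] args) {
        System.setProperty("application.name", "my-app");
        System.setProperty("application.version", "1.2.3");
        // ... initialize the rest of the application, then log as usual
    }
}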

3) Send your smart Java logs to your favorite log manager

Finally, keeping your logs in a file is surely not the best way to make good use of them. To ship your Java logs straight to your log management solution, you can:

  1. Use your favorite log shipper with a “tail” module; rsyslog or fluentd, for instance, handle this well
  2. Use a TCP connection to an endpoint such as Logstash or Logmatic.io

The best way to send these Java logs is probably to write them into a local file and ship that file: all the information related to the machine and the process then comes in a standardized format (a file-appender sketch for this first option is shown after the Logstash example). But if you go with the second option, just wrap the log4j2-logstash-jsonevent-layout as follows:

<Socket name="TCP" host="endpoint.example.foo" port="10514" reconnectionDelay="5000">
  <LogStashJSONLayout>
    <!-- your configuration -->
  </LogStashJSONLayout>
</Socket>

And finally create an endpoint, with Logstash for instance:

input {
  tcp {
    codec => json_line { charset => "UTF-8" }
    # 4560 is the default log4j socket appender port
    port => 10514
  }
}
output {
  stdout { codec => rubydebug }
}
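If you prefer the first option and let a shipper tail a local file instead, a minimal sketch of a Log4j2 rolling file appender writing JSON could look like the fragment below (to be placed in the Appenders section; the file paths and rollover policy are placeholders to adapt):

<RollingFile name="FileJSON" fileName="logs/app.json"
             filePattern="logs/app-%d{yyyy-MM-dd}.json">
  <LogStashJSONLayout>
    <PatternLayout pattern="%msg%throwable{none}"/>
  </LogStashJSONLayout>
  <Policies>
    <TimeBasedTriggeringPolicy/>
  </Policies>
</RollingFile>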

II. JSON logging with SLF4J / Logback

We usually use the logstash-logback-encoder library, as it comes ready to use. And on the plus side, it is available in the main Maven repository.

To add it to your classpath, simply add the following dependency (version 3.3 in the example):

<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <version>3.3</version>
</dependency>

The last step is to configure the appenders with the specific Logback wrapper. If you want to log to the console or straight over TCP, have a look at the examples below:

<appender name="JSON" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
<customFields>{"logmaticKey":"<your_api_key>"}</customFields>
</encoder>
</appender>
 
<appender name="JSON_TCP" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
<remoteHost>api.logmatic.io</remoteHost>
<port>10514</port>
<keepAliveDuration>1 minute</keepAliveDuration>
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
<customFields>{"logmaticKey":"<your_api_key>"}</customFields>
</encoder>
</appender>
<root level="debug">
<appender-ref ref="JSON_TCP" />
<appender-ref ref="JSON" />
</root>
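Once the appenders are in place, nothing changes on the application side: you keep logging through the plain SLF4J API and the encoder takes care of the JSON serialization. A minimal usage sketch, with a made-up class and messages:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class OrderService {
    private static final Logger logger = LoggerFactory.getLogger(OrderService.class);

    public void processOrder(String orderId) {
        logger.info("Processing order {}", orderId);
        try {
            // ... business logic
        } catch (Exception e) {
            // The encoder turns the exception into a stack_trace field on the JSON event
            logger.error("Failed to process order {}", orderId, e);
        }
    }
}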

Let’s add some steroids and take a deep dive into your Java logs

I. Build your severity overview

The first piece of information you need is the ratio between debug, info, warning and error events. You can quickly ensure that everything looks fine with a share-of-voice graph split by severity level, for example:

Splitting severity levels by class is even more useful for dev troubleshooting purposes. Select logger_name as the first field and level as the second field within your tool and you will get something similar to the illustration below. You can now keep an eye on the health of each class over one or multiple running servers.

II. Keep on enriching your Java log events with valuable attributes!

We find MDC (Mapped Diagnostic Contexts) to be a usually underestimated feature for attaching information to your logs in general, and Java logs in particular. It is available for both Logback and Log4j.

This feature allows you to contextualize your events:

...
MDC.put("platform-id", "prod30");
MDC.put("quantity", "1001");
MDC.put("duration", "93.0");

logger.info("Emitted 1001 messages during the last 93 seconds");
...
MDC.clear();
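Because MDC values are stored per thread, a common precaution (not specific to any library) is to clear the context in a finally block, so that values do not leak into unrelated log events when an exception is thrown, especially on pooled threads. A minimal sketch using the SLF4J MDC API:

import org.slf4j.MDC;

MDC.put("platform-id", "prod30");
try {
    logger.info("Emitted 1001 messages during the last 93 seconds");
} finally {
    // Clear the per-thread context so the values do not leak into later,
    // unrelated log events on the same (possibly pooled) thread.
    MDC.clear();
}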

Wrapping up

I hope you learnt some valuable things here! If you want to go further, a good next step is to standardize your open source libraries’ log streams into a single, configurable SLF4J stream. I leave you with this and hope you’ll implement some of these Java logging techniques soon!
