I have recently been working on shipping application logs to an Elasticsearch database, mainly so that HTTP requests traversing multiple services can be tracked down by their requestId.
I am using a Flume agent to read the application log line by line, modify a few JSON elements, and publish the events to Elasticsearch.
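For context, here is a minimal sketch of the kind of Flume agent configuration this describes. The log path, host, cluster name, and index name below are assumptions for illustration, not my actual setup:

```properties
agent.sources = tailSrc
agent.channels = memCh
agent.sinks = esSink

# Tail the application log file (assumed path)
agent.sources.tailSrc.type = exec
agent.sources.tailSrc.command = tail -F /var/log/app/supply_chain_rolled.log
agent.sources.tailSrc.channels = memCh

agent.channels.memCh.type = memory
agent.channels.memCh.capacity = 10000

# Flume's ElasticSearchSink (ships with Flume 1.x)
agent.sinks.esSink.type = org.apache.flume.sink.elasticsearch.ElasticSearchSink
agent.sinks.esSink.hostNames = 127.0.0.1:9300
agent.sinks.esSink.clusterName = elasticsearch
agent.sinks.esSink.indexName = app_logs
agent.sinks.esSink.batchSize = 100
agent.sinks.esSink.channel = memCh
```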
I first updated my application log to be in JSON format, since Flume easily recognizes JSON events. I am using log4j2 for that.
BUT, it turned out Flume cannot read a JSON object that spans multiple lines:
{
  "timeMillis" : 1474611652491,
  "thread" : "main",
  "level" : "DEBUG",
  "loggerName" : "suppliesLogger",
  "message" : "I'm Hunter Thomson and I'm alive.",
  "endOfBatch" : false,
  "loggerFqcn" : "org.apache.logging.log4j.spi.AbstractLogger",
  "threadId" : 1,
  "threadPriority" : 5
}
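To see why this breaks a line-oriented reader, here is a small sketch (not Flume itself) using a hypothetical sample that mirrors the pretty-printed event above. Parsing each physical line on its own fails for every fragment, even though the whole event is valid JSON:

```python
import json

# Hypothetical pretty-printed event, mirroring the multi-line log output above.
pretty_event = """{
  "timeMillis": 1474611652491,
  "thread": "main",
  "level": "DEBUG",
  "message": "I'm Hunter Thomson and I'm alive."
}"""

# The full text parses fine as one JSON document...
whole = json.loads(pretty_event)

# ...but a line-by-line reader sees only fragments, none of which parse.
parsed_ok = []
for line in pretty_event.splitlines():
    try:
        json.loads(line)
        parsed_ok.append(True)
    except json.JSONDecodeError:
        parsed_ok.append(False)

print(parsed_ok)  # every fragment fails to parse on its own
```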
So, instead of writing awkward multiline-JSON parsing logic on the Flume side, I updated my log4j2 config so that the application itself writes each log event on a single line.
The config is as below:
{
  "configuration": {
    "name": "logggg",
    "packages": "org.apache.logging",
    "appenders": {
      "RollingFile": {
        "name": "rollingStone",
        "fileName": "supply_chain_rolled.log",
        "filePattern": "%d{MM-dd-yy-HH-mm-ss}-%i.log.gz",
        "JSONLayout": {
          "complete": false,
          "compact": true,
          "eventEol": true
        },
        "Policies": {
          "SizeBasedTriggeringPolicy": {
            "size": "10 MB"
          }
        },
        "DefaultRolloverStrategy": {
          "max": "10"
        }
      }
    },
    "loggers": {
      "root": {
        "level": "debug",
        "appender-ref": {
          "ref": "rollingStone"
        }
      }
    }
  }
}
I basically needed to make each JSON object compact with compact: true. But that alone writes all the events on one single line.
So I also had to add an EOL after each event with eventEol: true.
The application log, after compacting with EOL, looks like this:
{"timeMillis":1478588550167,"thread":"main","level":"DEBUG","loggerName":"org.apache.logging.SupplyChainLogger","message":"I'm Hunter Thomson","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5}
{"timeMillis":1478588550569,"thread":"main","level":"DEBUG","loggerName":"org.apache.logging.SupplyChainLogger","message":"artist=porcupine tree,address=UK","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5}
{"timeMillis":1478588550571,"thread":"main","level":"DEBUG","loggerName":"org.apache.logging.SupplyChainLogger","message":"Exception occured ","thrown":{"commonElementCount":0,"localizedMessage":"some exception","message":"some exception","name":"java.lang.Exception","extendedStackTrace":[{"class":"org.apache.logging.SupplyChainLogger","method":"main","file":"SupplyChainLogger.java","line":17,"exact":true,"location":"classes/","version":"?"},{"class":"sun.reflect.NativeMethodAccessorImpl","method":"invoke0","file":"NativeMethodAccessorImpl.java","line":-2,"exact":false,"location":"?","version":"1.8.0_101"},{"class":"sun.reflect.NativeMethodAccessorImpl","method":"invoke","file":"NativeMethodAccessorImpl.java","line":62,"exact":false,"location":"?","version":"1.8.0_101"},{"class":"sun.reflect.DelegatingMethodAccessorImpl","method":"invoke","file":"DelegatingMethodAccessorImpl.java","line":43,"exact":false,"location":"?","version":"1.8.0_101"},{"class":"java.lang.reflect.Method","method":"invoke","file":"Method.java","line":498,"exact":false,"location":"?","version":"1.8.0_101"},{"class":"com.intellij.rt.execution.application.AppMain","method":"main","file":"AppMain.java","line":147,"exact":true,"location":"idea_rt.jar","version":"?"}]},"endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5}
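With one compact event per line, a downstream consumer only needs one JSON parse per line, with no multiline buffering. A small sketch, using trimmed copies of the first two sample events above:

```python
import json

# Two compact, EOL-terminated events, as written by JSONLayout with
# compact=true and eventEol=true (trimmed from the log output above).
log_text = (
    '{"timeMillis":1478588550167,"thread":"main","level":"DEBUG",'
    '"message":"I\'m Hunter Thomson","endOfBatch":false,"threadId":1}\n'
    '{"timeMillis":1478588550569,"thread":"main","level":"DEBUG",'
    '"message":"artist=porcupine tree,address=UK","endOfBatch":false,"threadId":1}\n'
)

# One json.loads call per line is all the agent-side logic needs now.
events = [json.loads(line) for line in log_text.splitlines()]
for e in events:
    print(e["timeMillis"], e["level"], e["message"])
```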
Resource
-----------------------
http://logging.apache.org/log4j/2.0/log4j-core/apidocs/org/apache/logging/log4j/core/layout/JsonLayout.html