Importing slf4j-logback logs into ELK in JSON format

(Compiled by a colleague; shared here.) logback, log4j2 and the other slf4j implementations can all emit logs in JSON; logback is used here. You could also write plain text lines and parse them in Logstash with grok, but emitting JSON directly is a bit more efficient on the Logstash side.

Logback: writing JSON log files

To have logback write JSON log files, you need to add the required dependency (the target file was elided in the original; presumably pom.xml) and configure a JSON file appender in the logback configuration. The original dependency and appender snippets did not survive extraction; only the fragment class="...gFileAppender" remains, which points at a rolling file appender. A hedged reconstruction is sketched below.
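A minimal sketch, assuming the commonly used logstash-logback-encoder library and a RollingFileAppender. The artifact version, appender name, file paths and customFields values below are illustrative assumptions, not the original author's exact configuration; note also that LogstashEncoder's default field names (level, logger_name, thread_name, ...) differ from the names in the table below, so the original most likely customized the field names or used its own JSON layout.

    <!-- pom.xml: JSON encoder for logback (version is illustrative) -->
    <dependency>
        <groupId>net.logstash.logback</groupId>
        <artifactId>logstash-logback-encoder</artifactId>
        <version>4.11</version>
    </dependency>

    <!-- logback.xml: rolling file appender that writes one JSON object per line -->
    <appender name="JSON_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>/eyebiz/logs/eyebiz-service/elk/app.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>/eyebiz/logs/eyebiz-service/elk/app.%d{yyyy-MM-dd}.log</fileNamePattern>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <!-- extra fields such as project/tags can be injected as custom fields -->
            <customFields>{"project":"eyebiz-service","tags":"biz"}</customFields>
        </encoder>
    </appender>

    <root level="INFO">
        <appender-ref ref="JSON_FILE"/>
    </root>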
JSON field descriptions:

Field          Description                                                         Remark
tags           Which category of log this entry belongs to
timestamp      Time the log entry was recorded
project        System name, i.e. which system the log comes from
log_level      Log level
thread         Name of the thread that produced the log entry
class_name     Fully qualified name of the caller issuing the logging request
line_number    Line number of the logging request
message        The message supplied by the application
stack_trace    Exception stack trace
req_id         Request ID, used to trace a request                                 requires aop-logging
elapsed_time   Execution time of the method, in milliseconds                       requires aop-logging

Note: %X{key} means the value is taken from the SLF4J MDC and requires aop-logging.
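For orientation, a single log event carrying these fields might look like the following. The values are made up; the exact set of fields in your output depends on the encoder configuration, and stack_trace only appears when an exception is logged.

    {
      "tags": "biz",
      "timestamp": "2017-08-01T12:30:45,123+0800",
      "project": "eyebiz-service",
      "log_level": "INFO",
      "thread": "http-nio-8080-exec-3",
      "class_name": "com.example.demo.OrderService",
      "line_number": 87,
      "message": "create order ok",
      "req_id": "c0a8010a5a2f4d1e9a7e0001",
      "elapsed_time": 12
    }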
For web applications, add the ReqIdFilter to the servlet filter configuration (the file name was elided in the original; for a traditional deployment this would be web.xml). The filter puts a reqId entry into the MDC.
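A minimal web.xml registration sketch; the fully qualified class name below is a placeholder, since the original does not say which package ReqIdFilter lives in:

    <filter>
        <filter-name>reqIdFilter</filter-name>
        <!-- placeholder package: adjust to wherever ReqIdFilter actually lives -->
        <filter-class>com.yourcompany.logging.ReqIdFilter</filter-class>
    </filter>
    <filter-mapping>
        <filter-name>reqIdFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>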
Alternatively, register it in Spring Boot like this:
    @Bean
    public FilterRegistrationBean getDemoFilter() {
        ReqIdFilter reqIdFilter = new ReqIdFilter();
        FilterRegistrationBean registrationBean = new FilterRegistrationBean();
        registrationBean.setFilter(reqIdFilter);
        // the URL pattern list was lost in the original; "/*" is a reasonable default
        List<String> urlPatterns = new ArrayList<String>();
        urlPatterns.add("/*");
        registrationBean.setUrlPatterns(urlPatterns);
        return registrationBean;
    }
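ReqIdFilter itself ships with the aop-logging dependency and its source is not shown in the original post. Conceptually it just generates an ID per request and exposes it through the MDC so that %X{reqId} (mapped to the req_id field) resolves in every log line of that request. A rough sketch of what such a filter typically does, assuming the MDC key is "reqId" (not the library's actual source):

    import java.io.IOException;
    import java.util.UUID;
    import javax.servlet.*;
    import org.slf4j.MDC;

    public class ReqIdFilter implements Filter {

        @Override
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            // put a per-request ID into the MDC so every log line of this request carries it
            MDC.put("reqId", UUID.randomUUID().toString().replace("-", ""));
            try {
                chain.doFilter(request, response);
            } finally {
                // always clean up, otherwise the ID leaks to the next request handled by this thread
                MDC.remove("reqId");
            }
        }

        @Override
        public void init(FilterConfig filterConfig) {}

        @Override
        public void destroy() {}
    }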
If you also want to record the method execution time (elapsed_time), add the following annotations on the class or the method:
    // The import statements were truncated in the original post; they pull in the
    // aop-logging annotations (@LogInfo, @LogException, @Exc) used below.

    @LogInfo    // emitted when the logger is set to level=INFO
    @LogException(value = {@Exc(value = Exception.class, stacktrace = false)}, warn = {@Exc({})})
                // emitted when the logger is set to level=ERROR
                // (the exception class in the first @Exc was lost in the original; Exception.class is a placeholder)
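For illustration, this is how the annotations would sit on a Spring bean. OrderService and createOrder are made-up names, and exactly what gets logged (arguments, return value, elapsed time) depends on how aop-logging is configured:

    import org.springframework.stereotype.Service;
    // plus the aop-logging annotation imports noted above

    @Service
    public class OrderService {

        @LogInfo
        @LogException(value = {@Exc(value = Exception.class, stacktrace = false)}, warn = {@Exc({})})
        public String createOrder(String orderNo) {
            // entry/exit, arguments and elapsed time are logged around this call by the aop-logging aspect
            return "created:" + orderNo;
        }
    }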
Logging for Dubbo consumers: a Dubbo consumer is a dynamic type generated with javassist, so to capture a Dubbo interface's input parameters, return value and call time you need to pull in aop-logging and put the annotations above on the corresponding classes or methods of the interfaces in the eye-rpc package. Dubbo consumer logging is then enabled with a logger configuration along the following lines:
level="INFO" additivity="false">
Elasticsearch template setup

curl -XPUT localhost:9200/_template/log -d '
{
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "_meta": { "version": "5.1.1" },
      "dynamic_templates": [
        {
          "strings_as_keyword": {
            "mapping": { "ignore_above": 1024, "type": "keyword" },
            "match_mapping_type": "string"
          }
        }
      ],
      "properties": {
        "@timestamp": { "type": "date" },
        "beat": {
          "properties": {
            "hostname": { "ignore_above": 1024, "type": "keyword" },
            "name": { "ignore_above": 1024, "type": "keyword" },
            "version": { "ignore_above": 1024, "type": "keyword" }
          }
        },
        "input_type": { "ignore_above": 1024, "type": "keyword" },
        "message": { "norms": false, "type": "text" },
        "offset": { "type": "long" },
        "source": { "ignore_above": 1024, "type": "keyword" },
        "tags": { "ignore_above": 1024, "type": "keyword" },
        "type": { "ignore_above": 1024, "type": "keyword" }
      }
    }
  },
  "order": 0,
  "settings": { "refresh_interval": "5s" },
  "template": "log-*"
}'
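As a quick check (standard Elasticsearch API, not part of the original post), a stored template can be read back with:

    curl -XGET 'localhost:9200/_template/log?pretty'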
curl -XPUT localhost:9200/_template/log-java -d '
{
  "mappings": {
    "_default_": {
      "properties": {
        "log_level": { "ignore_above": 1024, "type": "keyword" },
        "project": { "ignore_above": 1024, "type": "keyword" },
        "thread": { "ignore_above": 1024, "type": "keyword" },
        "req_id": { "ignore_above": 1024, "type": "keyword" },
        "class_name": { "ignore_above": 1024, "type": "keyword" },
        "line_number": { "type": "long" },
        "exception_class": { "ignore_above": 1024, "type": "keyword" },
        "elapsed_time": { "type": "long" },
        "stack_trace": { "type": "keyword" }
      }
    }
  },
  "order": 1,
  "settings": { "refresh_interval": "5s" },
  "template": "log-java-*"
}'

Logstash setup

The following conditional goes in the filter section of the Logstash pipeline (the original names the file logstash-java-log). It parses the JSON message, normalizes the timestamp, and derives an exception_class field from the stack trace:

    if [fields][logType] == "java" {
      json {
        source       => "message"
        remove_field => ["offset"]
      }
      date {
        match        => ["timestamp", "yyyy-MM-dd'T'HH:mm:ss,SSSZ"]
        remove_field => ["timestamp"]
      }
      if [stack_trace] {
        mutate {
          add_field => { "exception_class" => "%{stack_trace}" }
        }
      }
      if [exception_class] {
        mutate {
          gsub => [
            "exception_class", "\n", "",
            "exception_class", ":.*", ""
          ]
        }
      }
    }

Filebeat setup

    filebeat.prospectors:
    - input_type: log
      paths:
        - /eyebiz/logs/eyebiz-service/elk/*.log    # eyebiz-service logs
        - /eyebiz/logs/eyebiz-web/elk/*.log        # eyebiz-web logs
      fields:
        logType: "java"
        docType: "log-java-dev"
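The original post shows only the filter part of the Logstash pipeline. For orientation, the surrounding input/output wiring could look roughly like the sketch below; the beats port, Elasticsearch host and index naming scheme are assumptions (chosen so that daily indices such as log-java-dev-2017.08.01 match the log-java-* template), none of it is from the original:

    input {
      beats {
        port => 5044
      }
    }

    output {
      if [fields][logType] == "java" {
        elasticsearch {
          hosts => ["localhost:9200"]
          # docType is set in Filebeat, e.g. "log-java-dev", so indices fall under the log-java-* template
          index => "%{[fields][docType]}-%{+YYYY.MM.dd}"
        }
      }
    }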