Log Analysis: Practice and Application
阿新 · Published 2019-02-08
The scenario is this: to preserve processing capacity, the logging system normally runs at the INFO or ERROR level. When a problem appears, we need to switch the level to DEBUG dynamically, without stopping the service, so we can see the execution details. The steps for this dynamic change are listed below; they rely on code, a scheduled task, and a configuration-center service.
In a microservice environment, logs are scattered across the nodes of each service cluster and are inconvenient to inspect, so we also need to collect them centrally in one place for storage, viewing, and analysis.
Logging configuration in the application
logback.xml
- 1. In <configuration>, use <include> to pull in the base configurations defaults.xml, console-appender.xml and file-appender.xml, so their variables, default settings and policies can be reused.
- 2. Define a stash appender, specifying the target host, port and encoder. This configuration ships the logs to a central log-management platform for further analysis and storage.
- Note: logback-spring.xml is generally recommended over logback.xml (Spring Boot prefers the -spring variant because a plain logback.xml is loaded before Spring can control the initialization). In testing, plain logback.xml still worked and picked up changes automatically, so it was kept here; just do not set the scan attribute of <configuration> to true, and let Spring handle the scanning.
The full configuration is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>
    <include resource="org/springframework/boot/logging/logback/console-appender.xml"/>
    <include resource="org/springframework/boot/logging/logback/file-appender.xml"/>

    <!-- Console appender -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <!-- Log format -->
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} [%level] - %m%n</pattern>
        </encoder>
    </appender>

    <!-- File appender -->
    <appender name="ERROR-OUT" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <!-- Path of the log file -->
        <file>d:/logs/error.log</file>
        <!-- Log format -->
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss} [%class:%line] - %m%n</pattern>
        </encoder>
        <!-- Level filter -->
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <!-- Level to match -->
            <level>ERROR</level>
            <!-- On match: accept (record) -->
            <onMatch>ACCEPT</onMatch>
            <!-- On mismatch: deny (do not record) -->
            <onMismatch>DENY</onMismatch>
        </filter>
        <!-- Rolling policy: create log files based on time -->
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Log file name pattern -->
            <fileNamePattern>error.%d{yyyy-MM-dd}.log</fileNamePattern>
            <!-- Keep at most 30 days of history -->
            <maxHistory>30</maxHistory>
        </rollingPolicy>
    </appender>

    <!-- Ship logs to Logstash (option 1) -->
    <appender name="STASH-OUT" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:4560</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <!-- Ship logs to Fluentd (option 2) -->
    <appender name="FLUENT" class="ch.qos.logback.more.appenders.DataFluentAppender">
        <remoteHost>${fluentHost}</remoteHost>
    </appender>

    <!-- Root logger at debug. How the console or the file treats a given level still depends on the
         filter configured on the corresponding appender; if an appender has no filter, the root
         configuration applies. -->
    <root level="debug">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="ERROR-OUT"/>
        <appender-ref ref="STASH-OUT"/>
    </root>
</configuration>
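For reference, here is a minimal sketch of how application code emits logs against this configuration through SLF4J. The STASH-OUT appender additionally requires the logstash-logback-encoder library on the classpath. The class name, messages and exception below are purely illustrative.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingDemo {

    // With Logback as the SLF4J binding, this logger is driven by the logback.xml above
    private static final Logger log = LoggerFactory.getLogger(LoggingDemo.class);

    public static void main(String[] args) {
        // Root level is debug, so both events go to the console (STDOUT)
        // and are shipped to Logstash (STASH-OUT)
        log.debug("looking up order 42");
        log.info("order 42 loaded");

        // Only this event passes ERROR-OUT's LevelFilter and lands in d:/logs/error.log
        log.error("order 42 failed", new IllegalStateException("demo failure"));
    }
}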
Scheduled log-level refresh code
/**
 * Refresh the log level on a schedule, every 2 minutes.
 */
@Scheduled(fixedRate = 1000 * 60 * 2)
public void refresh() {
    String moduleKey = "com.ftsafe";
    // This approach only applies when the logging implementation is Logback
    if (log instanceof ch.qos.logback.classic.Logger) {
        // Read the target level from the configuration center
        // (Config/ConfigService are the config-center client's API, e.g. Apollo's)
        Config applicationConfig = ConfigService.getAppConfig();
        String levelConfig = applicationConfig.getProperty("logger.level." + moduleKey, null);
        ch.qos.logback.classic.Logger classicLog = (ch.qos.logback.classic.Logger) log;
        ch.qos.logback.classic.Logger logger = classicLog.getLoggerContext().getLogger(moduleKey);
        logger.setLevel(Level.toLevel(levelConfig));
        log.debug("logger modify level {}", levelConfig);
        log.info("logger modify level {}", levelConfig);
    }
    log.info("logger refresh invoked!");
    log.debug("logger refresh invoked!");
}
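Polling every two minutes is simple, but the same config-center client can also push changes. Below is a minimal sketch of a push-based variant, assuming the Apollo client (the apparent source of ConfigService/Config above); the logger.level.* key convention is taken from the refresh code, while the class name and wiring are illustrative only.

import com.ctrip.framework.apollo.Config;
import com.ctrip.framework.apollo.ConfigService;
import com.ctrip.framework.apollo.model.ConfigChange;

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.LoggerContext;
import org.slf4j.LoggerFactory;

/**
 * Hypothetical push-based variant: instead of polling on a schedule,
 * react as soon as the config-center value changes.
 */
public class LogLevelListener {

    private static final String PREFIX = "logger.level.";

    public void register() {
        Config config = ConfigService.getAppConfig();
        config.addChangeListener(changeEvent -> {
            LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
            for (String key : changeEvent.changedKeys()) {
                if (!key.startsWith(PREFIX)) {
                    continue; // only react to logger.level.* properties
                }
                ConfigChange change = changeEvent.getChange(key);
                String loggerName = key.substring(PREFIX.length()); // e.g. com.ftsafe
                // Apply the new level to the matching Logback logger
                context.getLogger(loggerName).setLevel(Level.toLevel(change.getNewValue()));
            }
        });
    }
}

With either variant, setting logger.level.com.ftsafe to DEBUG in the config center turns on detailed logging without a restart; the scheduled version above picks the change up within two minutes at most.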
Centralized log management (Windows environment)
Elasticsearch
A search and analytics engine.
Unzip
https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.0.0.zip
Start
d:/baiduYun/java/elasticsearch-6.0.0/bin/elasticsearch.bat
Kibana
Lets you visualize the data stored in Elasticsearch with charts and graphs.
Unzip
https://artifacts.elastic.co/downloads/kibana/kibana-6.0.0-windows-x86_64.zip
Start
d:/baiduYun/java/kibana-6.0.0-windows-x86_64/bin/kibana.bat
Logstash
A server-side data-processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" such as Elasticsearch.
Unzip
https://artifacts.elastic.co/downloads/logstash/Logstash-6.0.0.zip
Create a logstash.conf file with the following content:
input {
  tcp {
    # must match the <destination> of the STASH-OUT appender (localhost:4560)
    port => 4560
    host => "localhost"
    # optional: codec => json_lines would parse the JSON produced by LogstashEncoder into fields
  }
}
output {
  # index events into the local Elasticsearch instance
  elasticsearch { hosts => ["localhost:9200"] }
  # echo each event to the console for debugging
  stdout { codec => rubydebug }
}
Start
d:/baiduYun/java/logstash-6.0.0/bin/logstash.bat -f d:\baiduYun\java\logstash-6.0.0\bin\logstash.conf
Browsing the logs
Open http://localhost:5601 to reach the Kibana UI; on the Discover page, enter search criteria to find the log entries you are looking for.
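A quick end-to-end check: the scheduled refresh above logs "logger refresh invoked!" every two minutes, so once the application and the Elasticsearch/Logstash/Kibana stack are running, searching Discover for the text "logger refresh" should show those events flowing through Logstash into Elasticsearch.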
Appendix:
Spring Boot logging
https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-logging.html
defaults.xml
<?xml version="1.0" encoding="UTF-8"?>
<!--
Default logback configuration provided for import, equivalent to the programmatic
initialization performed by Boot
-->
<included>
<conversionRule conversionWord="clr" converterClass="org.springframework.boot.logging.logback.ColorConverter" />
<conversionRule conversionWord="wex" converterClass="org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter" />
<conversionRule conversionWord="wEx" converterClass="org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter" />
<property name="CONSOLE_LOG_PATTERN" value="${CONSOLE_LOG_PATTERN:-%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}"/>
<property name="FILE_LOG_PATTERN" value="${FILE_LOG_PATTERN:-%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}} ${LOG_LEVEL_PATTERN:-%5p} ${PID:- } --- [%t] %-40.40logger{39} : %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}}"/>
<logger name="org.apache.catalina.startup.DigesterFactory" level="ERROR"/>
<logger name="org.apache.catalina.util.LifecycleBase" level="ERROR"/>
<logger name="org.apache.coyote.http11.Http11NioProtocol" level="WARN"/>
<logger name="org.apache.sshd.common.util.SecurityUtils" level="WARN"/>
<logger name="org.apache.tomcat.util.net.NioSelectorPool" level="WARN"/>
<logger name="org.eclipse.jetty.util.component.AbstractLifeCycle" level="ERROR"/>
<logger name="org.hibernate.validator.internal.util.Version" level="WARN"/>
</included>
console-appender.xml
<?xml version="1.0" encoding="UTF-8"?>
<!--
Console appender logback configuration provided for import, equivalent to the programmatic
initialization performed by Boot
-->
<included>
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>${CONSOLE_LOG_PATTERN}</pattern>
</encoder>
</appender>
</included>
file-appender.xml
<?xml version="1.0" encoding="UTF-8"?>
<!--
File appender logback configuration provided for import, equivalent to the programmatic
initialization performed by Boot
-->
<included>
<appender name="FILE"
class="ch.qos.logback.core.rolling.RollingFileAppender">
<encoder>
<pattern>${FILE_LOG_PATTERN}</pattern>
</encoder>
<file>${LOG_FILE}</file>
<rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
<fileNamePattern>${LOG_FILE}.%d{yyyy-MM-dd}.%i.gz</fileNamePattern>
<maxFileSize>${LOG_FILE_MAX_SIZE:-10MB}</maxFileSize>
<maxHistory>${LOG_FILE_MAX_HISTORY:-0}</maxHistory>
</rollingPolicy>
</appender>
</included>
References
- https://www.jianshu.com/p/c9d9fe37256a
- 《51CTO下載-SpringCloud微服務架構開發實戰.pdf》