
Configuring python logging from a JSON file

logconfig.json

{
  "version": 1,
  "disable_existing_loggers": false,
  "formatters": {
    "simple": {
      "format": "[%(asctime)s - %(levelname)s - line(%(lineno)d) - %(filename)s]: %(message)s",
      "datefmt": "%Y-%m-%d %H:%M:%S"
    }
  },
  "handlers": {
    "console": {
      "class": "logging.StreamHandler",
      "level": "DEBUG",
      "formatter": "simple",
      "stream": "ext://sys.stdout"
    },
    "info_file_handler": {
      "class": "logging.handlers.TimedRotatingFileHandler",
      "level": "INFO",
      "filename": "../log/info.log",
      "when": "H",
      "interval": 1,
      "backupCount": 50,
      "encoding": "utf8"
    },
    "error_file_handler": {
      "class": "logging.handlers.TimedRotatingFileHandler",
      "level": "ERROR",
      "filename": "../log/errors.log",
      "encoding": "utf8"
    }
  },
  "loggers": {
    "my_module": {
      "level": "ERROR",
      "handlers": ["info_file_handler"],
      "propagate": false
    }
  },
  "root": {
    "level": "INFO",
    "handlers": ["console", "info_file_handler", "error_file_handler"]
  }
}

log_utility.py

import os
import json
import logging
import logging.config


def setup_logging(default_path="logconfig.json", default_level=logging.DEBUG):
    """Load the logging configuration from a JSON file.

    Falls back to logging.basicConfig if the config file is missing.
    """
    path = default_path
    if os.path.exists(path):
        with open(path, "r") as f:
            config = json.load(f)
        # build formatters, handlers and loggers from the JSON dict
        logging.config.dictConfig(config)
    else:
        logging.basicConfig(level=default_level)

Usage

import sys
import log_utility

config_path = sys.path[0] + '/logconfig.json'
log_utility.setup_logging(config_path)
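
One caveat worth noting (this is my addition, not from the original post): the two file handlers in logconfig.json write to ../log/ relative to the working directory, and dictConfig raises an error while building them if that directory does not exist. A minimal sketch of a full call site, which also shows how the loggers and root sections route records:

import logging
import os
import sys

import log_utility

# ../log/ must exist before dictConfig builds the TimedRotatingFileHandlers
os.makedirs("../log", exist_ok=True)

config_path = sys.path[0] + '/logconfig.json'
log_utility.setup_logging(config_path)

# an ordinary logger propagates to root: it reaches the console and info.log,
# while errors.log only records ERROR and above
logging.getLogger(__name__).info("application started")

# "my_module" is configured with level ERROR, info_file_handler only and
# propagate false, so this record lands in info.log and nowhere else
logging.getLogger("my_module").error("something went wrong in my_module")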

Supplementary note: producing logstash-style JSON logs with python logging

I have been tinkering with log collection for a while and it is finally wrapping up, so here is a write-up on optimizing logstash from the Python side.

As most people know, having logstash parse logs with grok costs CPU, since it relies on regular-expression matching.

Following the usual logstash tuning advice, we can generate the JSON up front. My services are mostly Python, so how do we pull that off?

There are two ways to do it. The first is to generate the JSON and push it straight to a logstash port. The second is to write the JSON to a file and let logstash tail it, so each line of log data can be loaded directly as JSON.

For logging in Python we normally use the logging module, and switching its output to JSON is an easy change. Quite a few people have run into the same need already and built the python-logstash module for it.
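
Before getting to python-logstash, here is a rough, standard-library-only sketch of the second, file-based approach; the JsonLineFormatter class, the field names, and the app.json.log path are my own illustration, not part of the original post:

import json
import logging


class JsonLineFormatter(logging.Formatter):
    """Render each record as one JSON object per line for logstash to tail."""

    def format(self, record):
        payload = {
            "@timestamp": self.formatTime(record, "%Y-%m-%dT%H:%M:%S"),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            "path": record.pathname,
            "line": record.lineno,
        }
        return json.dumps(payload, ensure_ascii=False)


handler = logging.FileHandler("app.json.log", encoding="utf8")
handler.setFormatter(JsonLineFormatter())

logger = logging.getLogger("json-file-logger")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("one JSON document per line, ready for logstash to tail")

On the logstash side, the file input can then parse each line with its json codec. Back to the first approach, python-logstash pushes records straight to a logstash port: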

import logging
import logstash
import sys

host = 'localhost'

test_logger = logging.getLogger('python-logstash-logger')
test_logger.setLevel(logging.INFO)
test_logger.addHandler(logstash.LogstashHandler(host, 5959, version=1))
# test_logger.addHandler(logstash.TCPLogstashHandler(host, version=1))

test_logger.error('python-logstash: test logstash error message.')
test_logger.info('python-logstash: test logstash info message.')
test_logger.warning('python-logstash: test logstash warning message.')

# add extra field to logstash message
extra = {
    'test_string': 'python version: ' + repr(sys.version_info),
    'test_boolean': True,
    'test_dict': {'a': 1, 'b': 'c'},
    'test_float': 1.23,
    'test_integer': 123,
    'test_list': [1, 2, '3'],
}

test_logger.info('python-logstash: test extra fields', extra=extra)

python-logstash also ships with an AMQP option:

import logging
import logstash

# AMQP parameters
host = 'localhost'
username = 'guest'
password = 'guest'
exchange = 'logstash.py'

# get a logger and set logging level
test_logger = logging.getLogger('python-logstash-logger')
test_logger.setLevel(logging.INFO)

# add the handler
test_logger.addHandler(logstash.AMQPLogstashHandler(version=1,
                                                    host=host,
                                                    durable=True,
                                                    username=username,
                                                    password=password,
                                                    exchange=exchange))

# log
test_logger.error('python-logstash: test logstash error message.')
test_logger.info('python-logstash: test logstash info message.')
test_logger.warning('python-logstash: test logstash warning message.')

try:
    1 / 0
except Exception:
    test_logger.exception('python-logstash: test logstash exception with stack trace')

Either way, as long as the final output ends up looking like this, you are set:

{
  "@source" => "unknown",
  "@type" => "nginx",
  "@tags" => [],
  "@fields" => {
    "remote_addr" => "192.168.0.1",
    "remote_user" => "-",
    "body_bytes_sent" => "13988",
    "request_time" => "0.122",
    "status" => "200",
    "request" => "GET /some/url HTTP/1.1",
    "request_method" => "GET",
    "http_referrer" => "http://www.example.org/some/url",
    "http_user_agent" => "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.79 Safari/537.1"
  },
  "@timestamp" => "2012-08-23T10:49:14+02:00"
}

One quick remark: I was not entirely happy with this module. I serialized my logs into JSON strings in Python, expecting that, as with grok, each log would show up in ES as a structured set of fields; instead the entire log landed inside the message field. You can see what I mean: that makes searching in Kibana much harder.

When searching in Kibana I often run queries like source:xxx AND level:INFO, but as described above, the whole log line sits inside @message.
