Python log collection and analysis: sharing a small Python script for real-time log analysis


Preface

As every web ops engineer knows, you need to keep an eye on real-time metrics for each domain: 2xx/s, 4xx/s, 5xx/s, response time, bandwidth and so on. Our logs used to be rotated every five minutes, so a simple awk one-liner was enough. Now that the logs are being pushed to ELK, rotating every five minutes causes problems, so rotation was changed to once a day. With one file per day, Shell is clearly no longer a good fit, so I wrote the following in Python.

Method

The script is built around the file seek() and tell() functions. The idea is:

1. Add the script to crontab and run it every 5 minutes.
2. Each run only analyses the log lines between the position where the previous run stopped reading and the current end of the file, and writes out the result.

The result can be pushed to the Zabbix server with zabbix_sender, or the output file can be read directly by the Zabbix agent; either way you can then build graphs and alerts in Zabbix (a minimal zabbix_sender sketch is included at the end of this post). The code:

#!/usr/bin/env python
#coding: utf-8

from __future__ import division
import os

LOG_FILE = '/data0/logs/nginx/xxxx-access_log'
POSITION_FILE = '/tmp/position.log'  # stores the last read offsets (file name assumed)
STATUS_FILE = '/tmp/http_status'
# crontab interval in seconds
CRON_TIME = 300

def get_position():
    # first run: POSITION_FILE does not exist yet
    if not os.path.exists(POSITION_FILE):
        start_position = str(0)
        end_position = str(os.path.getsize(LOG_FILE))
        fh = open(POSITION_FILE, 'w')
        fh.write('start_position: %s\n' % start_position)
        fh.write('end_position: %s\n' % end_position)
        fh.close()
        os._exit(1)
    else:
        fh = open(POSITION_FILE)
        se = fh.readlines()
        fh.close()
        # unexpected cases where POSITION_FILE does not contain exactly two lines
        if len(se) != 2:
            os.remove(POSITION_FILE)
            os._exit(1)
        last_start_position, last_end_position = [item.split(':')[1].strip() for item in se]
        start_position = last_end_position
        end_position = str(os.path.getsize(LOG_FILE))
        # log rotation makes start_position > end_position
        #print start_position, end_position
        if start_position > end_position:
            start_position = 0
        # the log has stopped growing
        elif start_position == end_position:
            os._exit(1)
        #print start_position, end_position
        fh = open(POSITION_FILE, 'w')
        fh.write('start_position: %s\n' % start_position)
        fh.write('end_position: %s\n' % end_position)
        fh.close()
    return map(int, [start_position, end_position])

def write_status(content):
    fh = open(STATUS_FILE, 'w')
    fh.write(content)
    fh.close()

def handle_log(start_position, end_position):
    log = open(LOG_FILE)
    log.seek(start_position, 0)
    status_2xx, status_403, status_404, status_500, status_502, status_503, status_504, status_all, rt, bandwidth = 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
    while True:
        current_position = log.tell()
        if current_position >= end_position:
            break
        line = log.readline()
        line = line.split(' ')
        host, request_time, time_local, status, bytes_sent = line[1], line[3], line[5], line[10], line[11]
        #print host, request_time, time_local, status, bytes_sent
        status_all += 1
        try:
            rt += float(request_time.strip('s'))
            bandwidth += int(bytes_sent)
        except:
            pass
        if status == '200' or status == '206':
            status_2xx += 1
        elif status == '403':
            status_403 += 1
        elif status == '404':
            status_404 += 1
        elif status == '500':
            status_500 += 1
        elif status == '502':
            status_502 += 1
        elif status == '503':
            status_503 += 1
        elif status == '504':
            status_504 += 1
    log.close()
    # counters as per-second rates; rt averaged per request, bandwidth in bytes/s
    write_status("status_2xx: %s\nstatus_403: %s\nstatus_404: %s\nstatus_500: %s\n"
                 "status_502: %s\nstatus_503: %s\nstatus_504: %s\nstatus_all: %s\n"
                 "rt: %s\nbandwidth: %s\n"
                 % (status_2xx/CRON_TIME, status_403/CRON_TIME, status_404/CRON_TIME,
                    status_500/CRON_TIME, status_502/CRON_TIME, status_503/CRON_TIME,
                    status_504/CRON_TIME, status_all/CRON_TIME, rt/status_all,
                    bandwidth/CRON_TIME))

if __name__ == '__main__':
    start_position, end_position = get_position()
    handle_log(start_position, end_position)

The analysis result looks like this:

cat /tmp/http_status
status_2xx: 17.3333333333
status_403: 0.0
status_404: 1.0
status_500: 0.0
status_502: 0.0
status_503: 0.0
status_504: 0.0
status_all: 20.0
rt: 0.3
bandwidth: 204032.0

Later I noticed a problem: start_position and end_position are compared as strings, which gives wrong results, for example:

In [5]: '99772400' > '100227572'
Out[5]: True

In [6]: int('99772400') > int('100227572')
Out[6]: False

So the comparison was corrected to:

# log rotation makes start_position > end_position
#print start_position, end_position
if int(start_position) > int(end_position):
    start_position = 0
# the log has stopped growing
elif int(start_position) == int(end_position):
    os._exit(1)

Summary

That is all for this article. I hope it is of some help with your study or work. If you have any questions, feel free to leave a comment. Thank you for your support.
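As mentioned above, one way to feed the metrics into Zabbix is zabbix_sender. Below is a minimal sketch (not from the original article) that reads /tmp/http_status and pushes each metric as a trapper value. The Zabbix server address, the monitored host name and the nginx.status[...] item keys are assumptions for illustration and need to match your own setup.

#!/usr/bin/env python
#coding: utf-8
# Minimal sketch: push the metrics written by the script above to Zabbix
# via zabbix_sender. Server address, host name and item keys are assumed.
import subprocess

STATUS_FILE = '/tmp/http_status'
ZABBIX_SERVER = '10.0.0.10'   # assumed Zabbix server address
MONITORED_HOST = 'web01'      # assumed host name as registered in Zabbix

def push_status():
    with open(STATUS_FILE) as fh:
        for line in fh:
            if ':' not in line:
                continue
            key, value = [part.strip() for part in line.split(':', 1)]
            # assumed item key naming scheme: nginx.status[status_2xx], ...
            subprocess.call([
                'zabbix_sender',
                '-z', ZABBIX_SERVER,
                '-s', MONITORED_HOST,
                '-k', 'nginx.status[%s]' % key,
                '-o', value,
            ])

if __name__ == '__main__':
    push_status()

On the Zabbix side each metric would need a matching item of type "Zabbix trapper"; alternatively, zabbix_sender's -i option can send all values from a prepared input file in a single call instead of one process per metric.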
