I have a script that reads log files and parses the data to insert it into a MySQL table.
My script looks like this:
    while read -r x; do
        var=$(echo "${x}" | cut -d+ -f1)
        var2=$(echo "${x}" | cut -d_ -f3)
        ...
        echo "$var,$var2,.." >> mysql.infile
    done < logfile
The problem is that the log files are thousands of lines long, so the script takes hours to run. I read that awk is faster; I tried it, but I don't know the syntax to parse the variables out.
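From what I've read, the whole loop above should collapse into a single awk pass, something like this untested sketch (the "+" and "_" delimiters and the var/var2 fields are taken from my cut calls above; a and b are just array names I picked):

    awk '{
        split($0, a, "+"); var  = a[1]   # same field as: cut -d+ -f1
        split($0, b, "_"); var2 = b[3]   # same field as: cut -d_ -f3
        print var "," var2               # one awk process for the whole file,
    }' logfile >> mysql.infile           # instead of several subshells per line

But I'm not sure how to scale that to all of my fields.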
EDIT:
The inputs are structured firewall logs, so they are pretty large files that look like this:
    @timestamp $HOST reason="idle Timeout" source-address="x.x.x.x"
    source-port="19219" destination-address="x.x.x.x"
    destination-port="53" service-name="dns-udp" application="DNS"....
So I'm using a lot of grep calls, one per field, for ~60 variables, e.g.:
    sourceaddress=$(echo "${x}" | grep -P -o 'source-address=".{0,50}' | cut -d\" -f2)
If you think Perl would be better, I'm open to suggestions, and maybe a hint at how to script it...
See Question&Answers more detail:
os 与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…