2017-07-04

How to parse multiline logs in ELK based on timestamp

My logs are:

2017-07-04 10:19:52,896 - [INFO] - from application in ForkJoinPool-3-worker-1
Resolving database...
2017-07-04 10:19:52,897 - [INFO] - from application in ForkJoinPool-3-worker-1
Resolving database...
2017-07-04 10:19:52,897 - [DEBUG] - from application in ForkJoinPool-3-worker-1
Json Body : {"took":2,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":0,"max_score":null,"hits":[]},"aggregations":{"fp":{"doc_count_error_upper_bound":0,"sum_other_doc_count":0,"buckets":[]}}}
2017-07-04 10:19:52,898 - [DEBUG] - from application in application-akka.actor.default-dispatcher-53
Successfully updated the transaction.
2017-07-04 10:19:52,899 - [INFO] - from application in ForkJoinPool-3-worker-1
Resolving database...
2017-07-04 10:19:52,901 - [DEBUG] - from application in application-akka.actor.default-dispatcher-54
Successfully updated the transaction.

I want to group all the log lines between two timestamps into a single event and match them with GREEDYDATA. I am using Filebeat with ELK.


I solved it with the following Filebeat configuration:

-
  paths:
    - /var/www/aspserver/logs/application.log
  document_type: asp
  input_type: log
  multiline:
    pattern: '^[0-9]'
    negate: true
    match: after

With negate: true and match: after, every line that does not start with a digit is appended to the preceding line that does, so each multiline event is merged into a single log entry.
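If you also want to split the merged event into fields with GREEDYDATA, a grok pattern along these lines should work against the sample logs above (the field names timestamp, level, source, thread, and log_message are illustrative, not part of the original setup):

```
filter {
  grok {
    # TIMESTAMP_ISO8601 accepts the comma-separated milliseconds (10:19:52,896);
    # GREEDYDATA captures the rest of the merged multiline body.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} - \[%{LOGLEVEL:level}\] - from %{DATA:source} in %{NOTSPACE:thread} %{GREEDYDATA:log_message}" }
  }
}
```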

Logstash filter:

filter {
  if [type] == "asp" {
    grok {
      patterns_dir => "/etc/logstash/conf.d/patterns"
      match => { "message" => "%{JAVASTACKTRACEPART}" }
    }
  }
}

This ingests all the logs.
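To have Kibana index events by the time written in the log line rather than the ingest time, a date filter can be added as well. This sketch assumes a grok stage has already extracted the leading timestamp into a field named timestamp (an assumed name, not from the original config):

```
filter {
  date {
    # yyyy-MM-dd HH:mm:ss,SSS matches e.g. "2017-07-04 10:19:52,896"
    match  => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"
  }
}
```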