
The FileBeat service does not start because of the yml configuration

This is my filebeat.yml file... I get error 1053 every time I start the Filebeat service. Maybe there is a mistake I am making in this file; please correct me where I am going wrong.

###################### Filebeat Configuration Example ######################### 

# This file is an example configuration file highlighting only the most common 
# options. The filebeat.full.yml file from the same directory contains all the 
# supported options with more comments. You can use it as a reference. 
# 
# You can find the full configuration reference here: 
# https://www.elastic.co/guide/en/beats/filebeat/index.html 

#=========================== Filebeat prospectors ============================= 

filebeat.prospectors: 

# Each - is a prospector. Most options can be set at the prospector level, so 
# you can use different prospectors for various configurations. 
# Below are the prospector specific configurations. 



    # Paths that should be crawled and fetched. Glob based paths. 
paths: 
- E:\ELK-STACK\logstash-tutorial-dataset.log 
input_type: log 
document_type: apachelogs 
    # document_type: apachelogs 



    #paths: 
    # - E:\ELK-STACK\mylogs.log 
    #fields: {log_type: mypersonal-logs} 
     #- C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170810 
    # - C:\ECLIPSE WORKSPACE\jcgA1\jcgA1\logs-logstash.* 
    # Exclude lines. A list of regular expressions to match. It drops the lines that are 
    # matching any regular expression from the list. 
    #exclude_lines: ["^DBG"] 

    # Include lines. A list of regular expressions to match. It exports the lines that are 
    # matching any regular expression from the list. 
    #include_lines: ["^ERR", "^WARN"] 

    # Exclude files. A list of regular expressions to match. Filebeat drops the files that 
    # are matching any regular expression from the list. By default, no files are dropped. 
    #exclude_files: [".gz$"] 

    # Optional additional fields. These fields can be freely picked 
    # to add additional information to the crawled log files for filtering 
    #fields: 
    # level: debug 
    # review: 1 

    ### Multiline options 

    # Multiline can be used for log messages spanning multiple lines. This is common 
    # for Java Stack Traces or C-Line Continuation 

    # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [ 
    #multiline.pattern: ^\[ 

    # Defines if the pattern set under pattern should be negated or not. Default is false. 
    #multiline.negate: false 

    # Match can be set to "after" or "before". It is used to define if lines should be append to a pattern 
    # that was (not) matched before or after or as long as a pattern is not matched based on negate. 
    # Note: After is the equivalent to previous and before is the equivalent to next in Logstash 
    #multiline.match: after 


#================================ General ===================================== 

# The name of the shipper that publishes the network data. It can be used to group 
# all the transactions sent by a single shipper in the web interface. 
#name: 

# The tags of the shipper are included in their own field with each 
# transaction published. 
#tags: ["service-X", "web-tier"] 

# Optional fields that you can specify to add additional information to the 
# output. 
#fields: 
# env: staging 

#================================ Outputs ===================================== 

# Configure what outputs to use when sending the data collected by the beat. 
# Multiple outputs may be used. 

#-------------------------- Elasticsearch output ------------------------------ 
#output.elasticsearch: 
    # Array of hosts to connect to. 
# hosts: ["localhost:9200"] 

    # Optional protocol and basic auth credentials. 
    #protocol: "https" 
    #username: "elastic" 
    #password: "changeme" 

#----------------------------- Logstash output -------------------------------- 
output.logstash: 
    # The Logstash hosts 
    hosts: ["localhost:5043"] 

    # Optional SSL. By default is off. 
    # List of root certificates for HTTPS server verifications 
    #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"] 

    # Certificate for SSL client authentication 
    #ssl.certificate: "/etc/pki/client/cert.pem" 

    # Client Certificate Key 
    #ssl.key: "/etc/pki/client/cert.key" 

#================================ Logging ===================================== 

# Sets log level. The default log level is info. 
# Available log levels are: critical, error, warning, info, debug 
#logging.level: debug 

# At debug level, you can selectively enable logging only for some components. 
# To enable all selectors use ["*"]. Examples of other selectors are "beat", 
# "publish", "service". 
#logging.selectors: ["*"] 

What I am actually trying to do is use several logs by specifying "document_type". If I remove "document_type" it works, but why does "document_type" (which I see is deprecated in filebeat 5.5) or "fields" not work?

Please help.


Make sure your YML is properly formatted: http://yaml-online-parser.appspot.com/ – Val


@Val yes, I made some changes and now it is properly formatted, yet it still shows an error when starting the service. –


Output: { "output.logstash": { "hosts": [ "localhost:5043" ] }, "filebeat.prospectors": { "-input_type": "log", "paths": [ "E:\\ELK-STACK\\logstash-tutorial-dataset.log" ] }, "document_type": "apachelogs" } –

Answer


You have a syntax error in your configuration file. The filebeat.prospectors key expects an array value, but you are passing it a hash instead.
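To see the difference (a minimal illustration, not your full file): in YAML, keys nested directly under a setting form a hash, while a leading - starts an array element, so each prospector must begin with its own -.

    filebeat.prospectors: 
        input_type: log          # hash: this is what your file currently parses to 

    filebeat.prospectors: 
    - 
        input_type: log          # array: this is what Filebeat expects 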

You also have indentation problems.

Here is a corrected version of your configuration file (comments omitted for brevity):

filebeat.prospectors: 
- 
    paths: 
    - E:\ELK-STACK\logstash-tutorial-dataset.log 
    input_type: log 
    document_type: apachelogs 
output.logstash: 
    hosts: ["localhost:5043"] 
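
Since the goal is to ship several different logs, a sketch of a multi-prospector variant may help (the second path and the log_type values are reused from the commented-out entries in the question and are purely illustrative; because document_type is deprecated since Filebeat 5.5, a custom field is used instead):

filebeat.prospectors: 
- 
    paths: 
    - E:\ELK-STACK\logstash-tutorial-dataset.log 
    input_type: log 
    fields: 
        log_type: apachelogs 
- 
    paths: 
    - E:\ELK-STACK\mylogs.log 
    input_type: log 
    fields: 
        log_type: mypersonal-logs 
output.logstash: 
    hosts: ["localhost:5043"] 

Custom fields arrive in Logstash nested under [fields] (for example [fields][log_type]) unless fields_under_root: true is set.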

Thanks!! It worked. Now I have another question: I have unstructured logs from my actual project... how can I apply a grok pattern to them, since they do not all follow the same pattern? –
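
For that follow-up, a hedged sketch of a Logstash filter (assuming the fields.log_type values from the configuration above; the second pattern is a placeholder and has to be adapted to the real line format):

filter { 
    if [fields][log_type] == "apachelogs" { 
        grok { 
            # standard Apache access-log pattern that ships with Logstash 
            match => { "message" => "%{COMBINEDAPACHELOG}" } 
        } 
    } else if [fields][log_type] == "mypersonal-logs" { 
        grok { 
            # placeholder: timestamp, level, then free-form message 
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" } 
        } 
    } 
} 

Lines that match no pattern are tagged _grokparsefailure, which makes it easy to spot formats that still need their own pattern.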