"Failed to execute action" error in Windows PowerShell when pushing data into Elasticsearch with a Logstash conf file

Asked: 2019-12-24 07:38:02

Tags: sql-server elasticsearch logstash

I am following a blog to import data into Kibana, and I hit an error in Logstash when following its steps.

Here is my config file:

    input { 
      file { 
       path => "C:/SalesJan2009/SalesJan2009.csv"  
       type => "csv"
       start_position => "beginning" 
       sincedb_path => "C:/SalesJan2009/sinceDb" }  
     } 
     filter { 
      csv { 
         separator => "," 
         columns => ["Transaction_date","Product","Price","Payment_Type","Name","City","State","Country","Account_Created","Last_Login","Latitude","Longitude"] 
         skip_empty_columns => "true"
         } 
         mutate { 
            convert => [ "Product" => "string" ]
            convert => [ "Price" => "float" ]
            convert => [ "Payment_Type" => "string" ]
            convert => [ "Name" => "string" ]
            convert => [ "City" => "string" ]
            convert => [ "State" => "string" ]
            convert => [ "Country" => "string" ]
            convert => [ "Longitude" => "float" ] 
            convert => [ "Latitude" => "float" ] 
         }
         date 
            {
                match => ["Transaction_date", "dd-MM-yyyyHH:mm:ss"]
                match => ["Account_Created", "dd-MM-yyyyHH:mm:ss"]
                match => ["Last_Login", "dd-MM-yyyyHH:mm:ss"]   
            }    
       }
     output {
       elasticsearch {
         hosts => ["http://localhost:9200"]
         index => "salestansactions2009"
       }
       stdout {
         codec => dots
       }
     }

The error is:

PS A:\elk\logstash\logstash-7.4.2\bin> ./logstash -f C:\SalesJan2009\testdata.conf
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to A:/elk/logstash/logstash-7.4.2/logs which is now configured via log4j2.properties
[2019-12-24T12:54:26,310][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-12-24T12:54:26,354][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.4.2"}
[2019-12-24T12:54:29,991][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ,, ] at line 15, column 32 (byte 465) after filter { \r\n  csv { \r\n\t separator => \",\" \r\n     columns => [\"Transaction_date\",\"Product\",\"Price\",\"Payment_Type\",\"Name\",\"City\",\"State\",\"Country\",\"Account_Created\",\"Last_Login\",\"Latitude\",\"Longitude\"] \r\n\t skip_empty_columns => \"true\"\r\n\t } \r\n     mutate { \r\n       \tconvert => [ \"Product\" ", :backtrace=>["A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2584:in `map'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:153:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:26:in `initialize'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "A:/elk/logstash/logstash-7.4.2/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}
[2019-12-24T12:54:30,882][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-12-24T12:54:35,502][INFO ][logstash.runner          ] Logstash shut down.

2 Answers:

Answer 0 (score: 0)

The syntax of mutate-convert is wrong. The error message says:

    Expected one of #, {, ,, ] at line 15, column 32 (byte 465) after filter { \r\n  csv { \r\n\t separator => "," \r\n     columns => ["Transaction_date","Product","Price","Payment_Type","Name","City","State","Country","Account_Created","Last_Login","Latitude","Longitude"] \r\n\t skip_empty_columns => "true"\r\n\t } \r\n     mutate { \r\n       \tconvert => [ "Product" ...

So the error occurs right after convert => [ "Product".

Take a look at the documentation.

The value type of convert is hash, so your use of square brackets ([]) is wrong. It should be:

input { 
  file { 
    path => "C:/SalesJan2009/SalesJan2009.csv"  
    type => "csv"
    start_position => "beginning" 
    sincedb_path => "C:/SalesJan2009/sinceDb" 
  }  
}

filter { 
  csv { 
    separator => "," 
    columns => ["Transaction_date","Product","Price","Payment_Type","Name","City","State","Country","Account_Created","Last_Login","Latitude","Longitude"] 
    skip_empty_columns => "true"
  }

  mutate { 
    convert => {
      "Product" => "string"
      "Price" => "float"
      "Payment_Type" => "string"
      "Name" => "string"
      "City" => "string"
      "State" => "string"
      "Country" => "string"
      "Longitude" => "float"
      "Latitude" => "float"
    }
  }

  date {
    match => ["Transaction_date", "dd-MM-yyyyHH:mm:ss"]
    match => ["Account_Created", "dd-MM-yyyyHH:mm:ss"]
    match => ["Last_Login", "dd-MM-yyyyHH:mm:ss"]   
  }    
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "salestansactions2009"
  }   
  stdout {
    codec => dots
  }
}
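As a side note, before re-running Logstash it can help to sanity-check the column mapping locally. Below is a hypothetical Python sketch: the sample row is made up to match the column list in the csv filter, not taken from the actual SalesJan2009.csv.

```python
import csv
import io

# Column list copied from the csv filter above
columns = ["Transaction_date", "Product", "Price", "Payment_Type", "Name",
           "City", "State", "Country", "Account_Created", "Last_Login",
           "Latitude", "Longitude"]

# Hypothetical row shaped like the data the config expects
sample = ('1/2/09 6:17,Product1,1200,Mastercard,carolina,Basildon,'
          'England,United Kingdom,1/2/09 6:00,1/2/09 6:08,51.5,-1.1\n')

reader = csv.DictReader(io.StringIO(sample), fieldnames=columns)
row = next(reader)

# Mirror the mutate/convert step: Price, Latitude, Longitude become floats
print(row["Product"], float(row["Price"]), float(row["Latitude"]))
```

If the printed values land in the wrong columns, the CSV layout does not match the `columns` list and the Logstash output will be misaligned in the same way.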

Answer 1 (score: 0)

Instead of this,

date {
    match => ["Transaction_date", "dd-MM-yyyyHH:mm:ss"]
    match => ["Account_Created", "dd-MM-yyyyHH:mm:ss"]
    match => ["Last_Login", "dd-MM-yyyyHH:mm:ss"]   
  }   

try configuring it as follows,

    date {
      match => ["Transaction_date", "dd-MM-yyyyHH:mm:ss"]
    }
    date {
      match => ["Account_Created", "dd-MM-yyyyHH:mm:ss"]
    }
    date {
      match => ["Last_Login", "dd-MM-yyyyHH:mm:ss"]
    }
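One caveat worth adding (not stated in the answer above): by default the date filter writes its parsed result to @timestamp, so three consecutive date filters would each overwrite it in turn. A hedged sketch using the optional target setting so only Transaction_date drives @timestamp:

    date {
      match  => ["Transaction_date", "dd-MM-yyyyHH:mm:ss"]
      # no target: Transaction_date becomes the event's @timestamp
    }
    date {
      match  => ["Account_Created", "dd-MM-yyyyHH:mm:ss"]
      target => "Account_Created"   # parse in place instead of @timestamp
    }
    date {
      match  => ["Last_Login", "dd-MM-yyyyHH:mm:ss"]
      target => "Last_Login"
    }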