Logstash: Multiple plugins in logstash input












I'm currently using Logstash and VulnWhisperer (to extract OpenVAS reports as JSON into a directory). That integration went well.



Right now I'm having problems with the Logstash configuration file.
Initially it only received input from the directory, but I also need to parse information that I can obtain by querying Elasticsearch. So I'm trying to use two plugins in the input section of the configuration file.



As you can see below, Logstash is not working properly; it keeps starting and shutting down due to an error in the configuration file.



Below you can see both the Logstash service status and the logs. I'm new to Logstash, so I really appreciate the help. Thank you!



The IPs were masked with "X" for this post.



Logstash configuration file:



# Author: Austin Taylor and Justin Henderson
# Email: austin@hasecuritysolutions.com
# Last Update: 03/04/2018
# Version 0.3
# Description: Takes in Qualys web scan reports from VulnWhisperer and pumps them into Logstash

input {
file {
path => "/opt/VulnWhisperer/data/openvas/*.json"
type => json
codec => json
start_position => "beginning"
tags => [ "openvas_scan", "openvas" ]
}
elasticsearch {
hosts => "http://XX.XXX.XXX.XXX:9200" (http://XX.XXX.XXX.XXX:9200')
index => "metricbeat-*"
query => { "query": { "match": {"host.name" : "%{asset}" } } }
size => 1
docinfo => false
sort => "sort": [ { "@timestamp": { "order": "desc"} } ]
}
}

filter {
if "openvas_scan" in [tags] {
mutate {
replace => [ "message", "%{message}" ]
gsub => [
"message", "|||", " ",
"message", "tt", " ",
"message", " ", " ",
"message", " ", " ",
"message", " ", " ",
"message", "nan", " ",
"message",'n',''
]
}

grok {
match => { "path" => "openvas_scan_%{DATA:scan_id}_%{INT:last_updated}.json$" }
tag_on_failure =>
}

mutate {
add_field => { "risk_score" => "%{cvss}" }
}

if [risk] == "1" {
mutate { add_field => { "risk_number" => 0 }}
mutate { replace => { "risk" => "info" }}
}
if [risk] == "2" {
mutate { add_field => { "risk_number" => 1 }}
mutate { replace => { "risk" => "low" }}
}
if [risk] == "3" {
mutate { add_field => { "risk_number" => 2 }}
mutate { replace => { "risk" => "medium" }}
}
if [risk] == "4" {
mutate { add_field => { "risk_number" => 3 }}
mutate { replace => { "risk" => "high" }}
}
if [risk] == "5" {
mutate { add_field => { "risk_number" => 4 }}
mutate { replace => { "risk" => "critical" }}
}

mutate {
remove_field => "message"
}

if [first_time_detected] {
date {
match => [ "first_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_detected"
}
}
if [first_time_tested] {
date {
match => [ "first_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_tested"
}
}
if [last_time_detected] {
date {
match => [ "last_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_detected"
}
}
if [last_time_tested] {
date {
match => [ "last_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_tested"
}
}
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => "last_updated"
}
mutate {
convert => { "plugin_id" => "integer"}
convert => { "id" => "integer"}
convert => { "risk_number" => "integer"}
convert => { "risk_score" => "float"}
convert => { "total_times_detected" => "integer"}
convert => { "cvss_temporal" => "float"}
convert => { "cvss" => "float"}
}
if [risk_score] == 0 {
mutate {
add_field => { "risk_score_name" => "info" }
}
}
if [risk_score] > 0 and [risk_score] < 3 {
mutate {
add_field => { "risk_score_name" => "low" }
}
}
if [risk_score] >= 3 and [risk_score] < 6 {
mutate {
add_field => { "risk_score_name" => "medium" }
}
}
if [risk_score] >=6 and [risk_score] < 9 {
mutate {
add_field => { "risk_score_name" => "high" }
}
}
if [risk_score] >= 9 {
mutate {
add_field => { "risk_score_name" => "critical" }
}
}
# Add your critical assets by subnet or by hostname. Comment this field out if you don't want to tag any, but the asset panel will break.
if [asset] =~ "^10.0.100." {
mutate {
add_tag => [ "critical_asset" ]
}
}
}
}
output {
if "openvas" in [tags] {
stdout { codec => rubydebug }
elasticsearch {
hosts => [ "XX.XXX.XXX.XXX:XXXX" ]
index => "logstash-vulnwhisperer-%{+YYYY.MM}"
}
}
}


Service logstash status:



root@logstash:/etc/logstash/conf.d# service logstash status
● logstash.service - logstash
Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: enabled)
Active: active (running) since Fri 2018-11-23 12:17:29 WET; 9s ago
Main PID: 7041 (java)
Tasks: 17 (limit: 4915)
CGroup: /system.slice/logstash.service
└─7041 /usr/bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedyna

Nov 23 12:17:29 logstash systemd[1]: logstash.service: Service hold-off time over, scheduling restart.
Nov 23 12:17:29 logstash systemd[1]: Stopped logstash.
Nov 23 12:17:29 logstash systemd[1]: Started logstash.


Logstash Log:



[2018-11-23T16:16:57,156][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:27,133][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:28,380][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {n file {n path => "/opt/VulnWhisperer/data/openvas/*.json"n type => jsonn codec => jsonn start_position => "beginning"n tags => [ "openvas_scan", "openvas" ]n }n elasticsearch {n hosts => "http://XX.XXX.XXX.XXX:9200" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:17:28,801][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:58,602][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:59,808][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {n file {n path => "/opt/VulnWhisperer/data/openvas/*.json"n type => jsonn codec => jsonn start_position => "beginning"n tags => [ "openvas_scan", "openvas" ]n }n elasticsearch {n hosts => "http://XX.XXX.XXX.XXX:XXXX" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:18:00,174][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}









asked Nov 23 '18 at 16:21 by slandtzyes

  • Why do you have a value in parentheses on the hosts line in your elasticsearch input?

    – Val
    Nov 23 '18 at 16:57











  • In the example I found they were using it like that, but I'm not really sure about the elasticsearch input syntax or the way I'm using two plugins.

    – slandtzyes
    Nov 26 '18 at 9:06











  • You need to remove that and only have hosts => "http://XX.XXX.XXX.XXX:9200"; it's going to work much better

    – Val
    Nov 26 '18 at 9:19
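
    Building on that comment, a corrected elasticsearch input block might look like the sketch below. This is untested and based on the logstash-input-elasticsearch documentation: the query option takes a single JSON string, and the sort clause belongs inside that JSON rather than in a separate sort option.

        elasticsearch {
          hosts => [ "http://XX.XXX.XXX.XXX:9200" ]
          index => "metricbeat-*"
          query => '{ "query": { "match": { "host.name": "%{asset}" } }, "sort": [ { "@timestamp": { "order": "desc" } } ] }'
          size => 1
          docinfo => false
        }

    Two caveats: %{asset} will not be substituted here, since input plugins run before any event exists, so a per-event lookup like this may be better served by an elasticsearch filter instead of a second input; and the file can be validated without restarting the service via bin/logstash --config.test_and_exit -f <config-file>.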











  • I have made those changes and unfortunately I'm receiving the same error in the status and logs

    – slandtzyes
    Nov 26 '18 at 11:27











  • Can you share the new error you get? There's probably another issue somewhere

    – Val
    Nov 26 '18 at 11:59


















1















I'm currently using logstash and vulnwhisperer ( to extract openvas reports in json to a directory). This integrating went well.



Right now i'm having problems with the configuration file in logstash.
Initialy it only recieved inputs from the folder directory but i need to parse information that i can obtain by querying the elasticsearch. So i'm trying to use two plugins in the logstash input of the configuration file.



As you can see below, the logstash is not working properly, he keeps starting and shutting down due to an error in the configuration file.



Below you can see both the logstash status and logs. I'm new to logstash so a really appreciate the help. Thank you!



The ip's where marked as "X" just for this purpose



Logstash configuration file:



# Author: Austin Taylor and Justin Henderson
# Email: austin@hasecuritysolutions.com
# Last Update: 03/04/2018
# Version 0.3
# Description: Take in qualys web scan reports from vulnWhisperer and pumps into logstash

input {
file {
path => "/opt/VulnWhisperer/data/openvas/*.json"
type => json
codec => json
start_position => "beginning"
tags => [ "openvas_scan", "openvas" ]
}
elasticsearch {
hosts => "http://XX.XXX.XXX.XXX:9200" (http://XX.XXX.XXX.XXX:9200')
index => "metricbeat-*"
query => { "query": { "match": {"host.name" : "%{asset}" } } }
size => 1
docinfo => false
sort => "sort": [ { "@timestamp": { "order": "desc"} } ]
}
}

filter {
if "openvas_scan" in [tags] {
mutate {
replace => [ "message", "%{message}" ]
gsub => [
"message", "|||", " ",
"message", "tt", " ",
"message", " ", " ",
"message", " ", " ",
"message", " ", " ",
"message", "nan", " ",
"message",'n',''
]
}

grok {
match => { "path" => "openvas_scan_%{DATA:scan_id}_%{INT:last_updated}.json$" }
tag_on_failure =>
}

mutate {
add_field => { "risk_score" => "%{cvss}" }
}

if [risk] == "1" {
mutate { add_field => { "risk_number" => 0 }}
mutate { replace => { "risk" => "info" }}
}
if [risk] == "2" {
mutate { add_field => { "risk_number" => 1 }}
mutate { replace => { "risk" => "low" }}
}
if [risk] == "3" {
mutate { add_field => { "risk_number" => 2 }}
mutate { replace => { "risk" => "medium" }}
}
if [risk] == "4" {
mutate { add_field => { "risk_number" => 3 }}
mutate { replace => { "risk" => "high" }}
}
if [risk] == "5" {
mutate { add_field => { "risk_number" => 4 }}
mutate { replace => { "risk" => "critical" }}
}

mutate {
remove_field => "message"
}

if [first_time_detected] {
date {
match => [ "first_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_detected"
}
}
if [first_time_tested] {
date {
match => [ "first_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_tested"
}
}
if [last_time_detected] {
date {
match => [ "last_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_detected"
}
}
if [last_time_tested] {
date {
match => [ "last_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_tested"
}
}
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => "last_updated"
}
mutate {
convert => { "plugin_id" => "integer"}
convert => { "id" => "integer"}
convert => { "risk_number" => "integer"}
convert => { "risk_score" => "float"}
convert => { "total_times_detected" => "integer"}
convert => { "cvss_temporal" => "float"}
convert => { "cvss" => "float"}
}
if [risk_score] == 0 {
mutate {
add_field => { "risk_score_name" => "info" }
}
}
if [risk_score] > 0 and [risk_score] < 3 {
mutate {
add_field => { "risk_score_name" => "low" }
}
}
if [risk_score] >= 3 and [risk_score] < 6 {
mutate {
add_field => { "risk_score_name" => "medium" }
}
}
if [risk_score] >=6 and [risk_score] < 9 {
mutate {
add_field => { "risk_score_name" => "high" }
}
}
if [risk_score] >= 9 {
mutate {
add_field => { "risk_score_name" => "critical" }
}
}
# Add your critical assets by subnet or by hostname. Comment this field out if you don't want to tag any, but the asset panel will break.
if [asset] =~ "^10.0.100." {
mutate {
add_tag => [ "critical_asset" ]
}
}
}
}
output {
if "openvas" in [tags] {
stdout { codec => rubydebug }
elasticsearch {
hosts => [ "XX.XXX.XXX.XXX:XXXX" ]
index => "logstash-vulnwhisperer-%{+YYYY.MM}"
}
}
}


Service logstash status:



root@logstash:/etc/logstash/conf.d# service logstash status
● logstash.service - logstash
Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: enabled)
Active: active (running) since Fri 2018-11-23 12:17:29 WET; 9s ago
Main PID: 7041 (java)
Tasks: 17 (limit: 4915)
CGroup: /system.slice/logstash.service
└─7041 /usr/bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedyna

Nov 23 12:17:29 logstash systemd[1]: logstash.service: Service hold-off time over, scheduling restart.
Nov 23 12:17:29 logstash systemd[1]: Stopped logstash.
Nov 23 12:17:29 logstash systemd[1]: Started logstash.


Logstash Log:



[2018-11-23T16:16:57,156][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:27,133][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:28,380][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {n file {n path => "/opt/VulnWhisperer/data/openvas/*.json"n type => jsonn codec => jsonn start_position => "beginning"n tags => [ "openvas_scan", "openvas" ]n }n elasticsearch {n hosts => "http://XX.XXX.XXX.XXX:9200" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:17:28,801][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:58,602][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:59,808][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {n file {n path => "/opt/VulnWhisperer/data/openvas/*.json"n type => jsonn codec => jsonn start_position => "beginning"n tags => [ "openvas_scan", "openvas" ]n }n elasticsearch {n hosts => "http://XX.XXX.XXX.XXX:XXXX" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:18:00,174][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}









share|improve this question























  • Why do you have a value in parenthesis on the hosts line in your elasticsearch input ?

    – Val
    Nov 23 '18 at 16:57











  • In the example i found they where using it like that, but i'm not really sure about the elasticsearch input sintax or the way i'm using two plugins.

    – slandtzyes
    Nov 26 '18 at 9:06











  • You need to remove that and only have hosts => "http://XX.XXX.XXX.XXX:9200", it's gonna work much better

    – Val
    Nov 26 '18 at 9:19











  • I have made those changes and unfortunately i'm receiving the same error in the status and logs

    – slandtzyes
    Nov 26 '18 at 11:27











  • Can you share the new error you get, there's probably another issue somewhere

    – Val
    Nov 26 '18 at 11:59
















1












1








1


0






I'm currently using logstash and vulnwhisperer ( to extract openvas reports in json to a directory). This integrating went well.



Right now i'm having problems with the configuration file in logstash.
Initialy it only recieved inputs from the folder directory but i need to parse information that i can obtain by querying the elasticsearch. So i'm trying to use two plugins in the logstash input of the configuration file.



As you can see below, the logstash is not working properly, he keeps starting and shutting down due to an error in the configuration file.



Below you can see both the logstash status and logs. I'm new to logstash so a really appreciate the help. Thank you!



The ip's where marked as "X" just for this purpose



Logstash configuration file:



# Author: Austin Taylor and Justin Henderson
# Email: austin@hasecuritysolutions.com
# Last Update: 03/04/2018
# Version 0.3
# Description: Take in qualys web scan reports from vulnWhisperer and pumps into logstash

input {
file {
path => "/opt/VulnWhisperer/data/openvas/*.json"
type => json
codec => json
start_position => "beginning"
tags => [ "openvas_scan", "openvas" ]
}
elasticsearch {
hosts => "http://XX.XXX.XXX.XXX:9200" (http://XX.XXX.XXX.XXX:9200')
index => "metricbeat-*"
query => { "query": { "match": {"host.name" : "%{asset}" } } }
size => 1
docinfo => false
sort => "sort": [ { "@timestamp": { "order": "desc"} } ]
}
}

filter {
if "openvas_scan" in [tags] {
mutate {
replace => [ "message", "%{message}" ]
gsub => [
"message", "|||", " ",
"message", "tt", " ",
"message", " ", " ",
"message", " ", " ",
"message", " ", " ",
"message", "nan", " ",
"message",'n',''
]
}

grok {
match => { "path" => "openvas_scan_%{DATA:scan_id}_%{INT:last_updated}.json$" }
tag_on_failure =>
}

mutate {
add_field => { "risk_score" => "%{cvss}" }
}

if [risk] == "1" {
mutate { add_field => { "risk_number" => 0 }}
mutate { replace => { "risk" => "info" }}
}
if [risk] == "2" {
mutate { add_field => { "risk_number" => 1 }}
mutate { replace => { "risk" => "low" }}
}
if [risk] == "3" {
mutate { add_field => { "risk_number" => 2 }}
mutate { replace => { "risk" => "medium" }}
}
if [risk] == "4" {
mutate { add_field => { "risk_number" => 3 }}
mutate { replace => { "risk" => "high" }}
}
if [risk] == "5" {
mutate { add_field => { "risk_number" => 4 }}
mutate { replace => { "risk" => "critical" }}
}

mutate {
remove_field => "message"
}

if [first_time_detected] {
date {
match => [ "first_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_detected"
}
}
if [first_time_tested] {
date {
match => [ "first_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_tested"
}
}
if [last_time_detected] {
date {
match => [ "last_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_detected"
}
}
if [last_time_tested] {
date {
match => [ "last_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_tested"
}
}
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => "last_updated"
}
mutate {
convert => { "plugin_id" => "integer"}
convert => { "id" => "integer"}
convert => { "risk_number" => "integer"}
convert => { "risk_score" => "float"}
convert => { "total_times_detected" => "integer"}
convert => { "cvss_temporal" => "float"}
convert => { "cvss" => "float"}
}
if [risk_score] == 0 {
mutate {
add_field => { "risk_score_name" => "info" }
}
}
if [risk_score] > 0 and [risk_score] < 3 {
mutate {
add_field => { "risk_score_name" => "low" }
}
}
if [risk_score] >= 3 and [risk_score] < 6 {
mutate {
add_field => { "risk_score_name" => "medium" }
}
}
if [risk_score] >=6 and [risk_score] < 9 {
mutate {
add_field => { "risk_score_name" => "high" }
}
}
if [risk_score] >= 9 {
mutate {
add_field => { "risk_score_name" => "critical" }
}
}
# Add your critical assets by subnet or by hostname. Comment this field out if you don't want to tag any, but the asset panel will break.
if [asset] =~ "^10.0.100." {
mutate {
add_tag => [ "critical_asset" ]
}
}
}
}
output {
if "openvas" in [tags] {
stdout { codec => rubydebug }
elasticsearch {
hosts => [ "XX.XXX.XXX.XXX:XXXX" ]
index => "logstash-vulnwhisperer-%{+YYYY.MM}"
}
}
}


Service logstash status:



root@logstash:/etc/logstash/conf.d# service logstash status
● logstash.service - logstash
Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: enabled)
Active: active (running) since Fri 2018-11-23 12:17:29 WET; 9s ago
Main PID: 7041 (java)
Tasks: 17 (limit: 4915)
CGroup: /system.slice/logstash.service
└─7041 /usr/bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedyna

Nov 23 12:17:29 logstash systemd[1]: logstash.service: Service hold-off time over, scheduling restart.
Nov 23 12:17:29 logstash systemd[1]: Stopped logstash.
Nov 23 12:17:29 logstash systemd[1]: Started logstash.


Logstash Log:



[2018-11-23T16:16:57,156][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:27,133][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:28,380][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {n file {n path => "/opt/VulnWhisperer/data/openvas/*.json"n type => jsonn codec => jsonn start_position => "beginning"n tags => [ "openvas_scan", "openvas" ]n }n elasticsearch {n hosts => "http://XX.XXX.XXX.XXX:9200" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:17:28,801][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:58,602][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:59,808][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {n file {n path => "/opt/VulnWhisperer/data/openvas/*.json"n type => jsonn codec => jsonn start_position => "beginning"n tags => [ "openvas_scan", "openvas" ]n }n elasticsearch {n hosts => "http://XX.XXX.XXX.XXX:XXXX" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:18:00,174][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}









share|improve this question














I'm currently using logstash and vulnwhisperer ( to extract openvas reports in json to a directory). This integrating went well.



Right now i'm having problems with the configuration file in logstash.
Initialy it only recieved inputs from the folder directory but i need to parse information that i can obtain by querying the elasticsearch. So i'm trying to use two plugins in the logstash input of the configuration file.



As you can see below, the logstash is not working properly, he keeps starting and shutting down due to an error in the configuration file.



Below you can see both the logstash status and logs. I'm new to logstash so a really appreciate the help. Thank you!



The ip's where marked as "X" just for this purpose



Logstash configuration file:



# Author: Austin Taylor and Justin Henderson
# Email: austin@hasecuritysolutions.com
# Last Update: 03/04/2018
# Version 0.3
# Description: Take in qualys web scan reports from vulnWhisperer and pumps into logstash

input {
file {
path => "/opt/VulnWhisperer/data/openvas/*.json"
type => json
codec => json
start_position => "beginning"
tags => [ "openvas_scan", "openvas" ]
}
elasticsearch {
hosts => "http://XX.XXX.XXX.XXX:9200" (http://XX.XXX.XXX.XXX:9200')
index => "metricbeat-*"
query => { "query": { "match": {"host.name" : "%{asset}" } } }
size => 1
docinfo => false
sort => "sort": [ { "@timestamp": { "order": "desc"} } ]
}
}

filter {
if "openvas_scan" in [tags] {
mutate {
replace => [ "message", "%{message}" ]
gsub => [
"message", "|||", " ",
"message", "tt", " ",
"message", " ", " ",
"message", " ", " ",
"message", " ", " ",
"message", "nan", " ",
"message",'n',''
]
}

grok {
match => { "path" => "openvas_scan_%{DATA:scan_id}_%{INT:last_updated}.json$" }
tag_on_failure =>
}

mutate {
add_field => { "risk_score" => "%{cvss}" }
}

if [risk] == "1" {
mutate { add_field => { "risk_number" => 0 }}
mutate { replace => { "risk" => "info" }}
}
if [risk] == "2" {
mutate { add_field => { "risk_number" => 1 }}
mutate { replace => { "risk" => "low" }}
}
if [risk] == "3" {
mutate { add_field => { "risk_number" => 2 }}
mutate { replace => { "risk" => "medium" }}
}
if [risk] == "4" {
mutate { add_field => { "risk_number" => 3 }}
mutate { replace => { "risk" => "high" }}
}
if [risk] == "5" {
mutate { add_field => { "risk_number" => 4 }}
mutate { replace => { "risk" => "critical" }}
}

mutate {
remove_field => "message"
}

if [first_time_detected] {
date {
match => [ "first_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_detected"
}
}
if [first_time_tested] {
date {
match => [ "first_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_tested"
}
}
if [last_time_detected] {
date {
match => [ "last_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_detected"
}
}
if [last_time_tested] {
date {
match => [ "last_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_tested"
}
}
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => "last_updated"
}
mutate {
convert => { "plugin_id" => "integer"}
convert => { "id" => "integer"}
convert => { "risk_number" => "integer"}
convert => { "risk_score" => "float"}
convert => { "total_times_detected" => "integer"}
convert => { "cvss_temporal" => "float"}
convert => { "cvss" => "float"}
}
if [risk_score] == 0 {
mutate {
add_field => { "risk_score_name" => "info" }
}
}
if [risk_score] > 0 and [risk_score] < 3 {
mutate {
add_field => { "risk_score_name" => "low" }
}
}
if [risk_score] >= 3 and [risk_score] < 6 {
mutate {
add_field => { "risk_score_name" => "medium" }
}
}
if [risk_score] >=6 and [risk_score] < 9 {
mutate {
add_field => { "risk_score_name" => "high" }
}
}
if [risk_score] >= 9 {
mutate {
add_field => { "risk_score_name" => "critical" }
}
}
# Add your critical assets by subnet or by hostname. Comment this field out if you don't want to tag any, but the asset panel will break.
if [asset] =~ "^10.0.100." {
mutate {
add_tag => [ "critical_asset" ]
}
}
}
}
output {
if "openvas" in [tags] {
stdout { codec => rubydebug }
elasticsearch {
hosts => [ "XX.XXX.XXX.XXX:XXXX" ]
index => "logstash-vulnwhisperer-%{+YYYY.MM}"
}
}
}


Service logstash status:



root@logstash:/etc/logstash/conf.d# service logstash status
● logstash.service - logstash
Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: enabled)
Active: active (running) since Fri 2018-11-23 12:17:29 WET; 9s ago
Main PID: 7041 (java)
Tasks: 17 (limit: 4915)
CGroup: /system.slice/logstash.service
└─7041 /usr/bin/java -Xms1g -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedyna

Nov 23 12:17:29 logstash systemd[1]: logstash.service: Service hold-off time over, scheduling restart.
Nov 23 12:17:29 logstash systemd[1]: Stopped logstash.
Nov 23 12:17:29 logstash systemd[1]: Started logstash.


Logstash Log:



[2018-11-23T16:16:57,156][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:27,133][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:28,380][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {n file {n path => "/opt/VulnWhisperer/data/openvas/*.json"n type => jsonn codec => jsonn start_position => "beginning"n tags => [ "openvas_scan", "openvas" ]n }n elasticsearch {n hosts => "http://XX.XXX.XXX.XXX:9200" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:17:28,801][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-23T16:17:58,602][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.3"}
[2018-11-23T16:17:59,808][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ", ', } at line 31, column 43 (byte 643) after input {\n file {\n path => "/opt/VulnWhisperer/data/openvas/*.json"\n type => json\n codec => json\n start_position => "beginning"\n tags => [ "openvas_scan", "openvas" ]\n }\n elasticsearch {\n hosts => "http://XX.XXX.XXX.XXX:XXXX" ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:22:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-11-23T16:18:00,174][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}






security elasticsearch logstash






asked Nov 23 '18 at 16:21









slandtzyes

61
  • Why do you have a value in parenthesis on the hosts line in your elasticsearch input ?

    – Val
    Nov 23 '18 at 16:57











  • In the example I found, they were using it like that, but I'm not really sure about the elasticsearch input syntax or the way I'm using two plugins.

    – slandtzyes
    Nov 26 '18 at 9:06











  • You need to remove that and only have hosts => "http://XX.XXX.XXX.XXX:9200", it's gonna work much better

    – Val
    Nov 26 '18 at 9:19











  • I have made those changes and unfortunately I'm receiving the same error in the status and logs

    – slandtzyes
    Nov 26 '18 at 11:27











  • Can you share the new error you get, there's probably another issue somewhere

    – Val
    Nov 26 '18 at 11:59






















1 Answer


















0














Please change the setting as shown below:



elasticsearch {
hosts => "localhost"
index => "metricbeat-*"
query => '{ "query": { "match": {"host.name" : "%{asset}" } } }'
size => 1
docinfo => false
#sort => "sort": [ { "@timestamp": { "order": "desc"} } ]
}
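Putting the pieces together, a corrected `input` section might look like the sketch below. It is based on the question's own config: the parenthesized value after the `hosts` URL (which triggered the parse error at line 31) is removed, and the `query` is a single-quoted JSON string rather than a bare hash. The sort clause is folded into the query body, since the standalone `sort => "sort": [...]` form is not valid config syntax. Note that the `%{asset}` sprintf reference is kept from the original, but an input plugin runs before any event exists, so it may not be interpolated the way a filter would interpolate it.

```conf
input {
  file {
    path => "/opt/VulnWhisperer/data/openvas/*.json"
    type => json
    codec => json
    start_position => "beginning"
    tags => [ "openvas_scan", "openvas" ]
  }
  elasticsearch {
    # plain URL only -- no trailing parenthesized value after the string
    hosts => "http://XX.XXX.XXX.XXX:9200"
    index => "metricbeat-*"
    # the query must be one quoted JSON string; sort is expressed inside it
    query => '{ "query": { "match": { "host.name": "%{asset}" } }, "sort": [ { "@timestamp": { "order": "desc" } } ] }'
    size => 1
    docinfo => false
  }
}
```

After editing, the config can be checked without restarting the service, e.g. `/usr/share/logstash/bin/logstash --config.test_and_exit -f <path-to-config>`.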





        answered Dec 18 '18 at 9:30









Angel

11































