asmodaiOK, to restate my question from yesterday in a different way: so 6.x deprecated document_type - trying to be ahead of the curve I removed it, but this causes _type to be "doc" (as documented), whereas my [@metadata][type] is apache. Which causes mapping problems due to multiple types. Should I still keep using document_type => "%{[@metadata][type]}" or is there another solution I have not found yet?
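One common workaround for the 6.x single-type restriction (a sketch, not necessarily the only answer): instead of relying on _type, fold the old type name into the index name, so each former type gets its own index and mappings never collide. Names here are illustrative:

```
output {
  elasticsearch {
    # Route each former "type" into its own daily index instead of _type,
    # e.g. "apache-2018.02.07". [@metadata][type] must be set upstream.
    index => "%{[@metadata][type]}-%{+YYYY.MM.dd}"
  }
}
```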
_val_Hi everyone. I asked the same question in #elasticsearch, so apologies for cross-posting. Can someone explain what <189>Feb ... means in this case? What is <189>, and how do I put it in a separate field?
IndrekI'm sending logs with Filebeat to Logstash
Indreklog line "2018-02-07 13:06:55.584 [http-nio-8887-exec-6] INFO e.t.c.c.c.a.d.UserAuthenticationDao - User not found asdf, User not found!"
Indrekand Logstash logs "[2018-02-07T15:12:39,665][WARN ][logstash.filters.json ] Error parsing json {:source=>"message"
Indrekwhere can I tell Logstash that this is not JSON?
bjorn_In the beats input
Indrekinput {
Indrek beats {
Indrek port => 5050
Indrek codec => "plain"
Indrek }
IndrekI have
Indrekit says the codec is plain
Indrekor should I change some other setting?
bjorn_Do you parse it as json in a filter?
Indrekthis is my input + filter
bjorn_Indrek: That is your *active* configuration?
Indrekit's my second day on elastic
Indrek+ output to elasticsearch
Indrekand I'm trying to send multiline logs with Filebeat
Indrekis it possible, that if i have 2 configuration files in logstash conf.d directory then one conf filter might affect another conf ?
bjorn_Logstash will read *.conf
Indrekbut it doesn't separate them per file?
bjorn_Logstash merges them.
Indrekthat's my problem then
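Since Logstash concatenates every *.conf in conf.d into one pipeline, filters from one file do apply to events from all the others. A common pattern (a sketch with hypothetical filenames and tag names) is to tag events at the input and guard each filter block with a conditional:

```
# 01-input.conf
input {
  beats {
    port => 5050
    tags => ["app-logs"]   # hypothetical tag marking this source
  }
}

# 10-filter-app.conf
filter {
  if "app-logs" in [tags] {
    # Filters here only touch events tagged "app-logs";
    # events from other inputs pass through untouched.
  }
}
```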
asmodai_val_: Does the src logfile have these characters as well in them?
ktosiekCan I send raw, unfiltered events to a separate output? I want this mostly for archiving
_val_asmodai: problem solved
_val_the sending device was including <189> in the message
_val_Probably some id, I don't know. I did a packet inspection using tcpdump.
_val_Thanks! Solved it with gsub => [ "message", "<.*>", "" ]
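One caveat with that gsub (assuming Ruby regex semantics, which Logstash's mutate filter uses): `<.*>` is greedy, so if the message ever contains another ">" later on, everything up to the last one gets stripped. Anchoring to a leading numeric prefix is safer:

```
filter {
  mutate {
    # Strip only a leading "<NNN>" prefix; \A anchors at the start
    # of the string, so later <...> content in the message survives.
    gsub => [ "message", "\A<\d+>", "" ]
  }
}
```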
asmodai_val_: Ah ok, thought it might've been some escape sequence.
_val_asmodai: Hmm in the logstash.conf the message field doesn't have that afaik
BlackMariais there a way to convert Kibana dashboard files to more recent versions? There are tons of Kibana examples out there that won't run because the JSON isn't in the correct format... or is there a special way to install them? I always get 'Saved Objects: Saved objects file format is invalid and cannot be imported.'
_val_asmodai: so I have the "message" field which holds "<189>Feb 7 ...."
_val_How can I add a tag which contains <189> from the "message" field?
SpixxQuestion: if one needs Logstash to use a specific temporary folder instead of /tmp (which we mount noexec/nodev/nosuid), where should I look?
darkmoonvt_val: you can use a regex (grok), but that looks like it might be syslog message?
_val_darkmoonvt: yes.
_val_I am trying to figure it out but failing
darkmoonvtSpixx: logstash.yml, or the docs for same, maybe?
darkmoonvtIf the default syslog parser works for you (built in to the syslog input), you can use that. Else it's not hard to write one (groks).
darkmoonvt(meeting, biab)
_val_tried a lot
darkmoonvt_val_: here's what we use:
_val_darkmoonvt: hmm too complicated :/
darkmoonvtI'm sorry. We accept documents from most of campus, so there are a lot of variants. We've also got a lot of metadata for tracking, which you could drop.
_val_darkmoonvt: the Idea is this:
_val_The "message" returns something like "<189>Feb 7 16:38:12.... " Now, I am trying to somehow cut / extract 189 from <..> tags and put it in a different tag/field
darkmoonvtAdd (<%{NONNEGINT:[syslog][facility]}>)? to the beginning of your pattern.
darkmoonvtSorry, for you, probably syslog_facility
_val_ match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" } ? You mean here?>
darkmoonvtYes. Before the timestamp. If you also put a leading "^", it will force matches to start at the beginning, which will be a bit more efficient.
darkmoonvt"message" => "^(<%{NONNEGINT:syslog_facility}>)?%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
darkmoonvt Says this pattern works for your data: "(<%{NONNEGINT:syslog_facility}>)?%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?:? %{GREEDYDATA:syslog_message}"
darkmoonvtI had to make the colon optional, and it's putting the date in the program field. (your data isn't quite following the syslog standard.)
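For reference, that leading `<189>` is the syslog PRI value (facility * 8 + severity; 189 = 23 * 8 + 5, i.e. local7.notice), not the facility alone. A sketch combining darkmoonvt's pattern with Logstash's stock syslog_pri filter to decode it (field names here are illustrative):

```
filter {
  grok {
    match => {
      "message" => "^(<%{NONNEGINT:syslog_pri}>)?%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?:? %{GREEDYDATA:syslog_message}"
    }
  }
  # Decodes the captured PRI into syslog facility/severity fields
  syslog_pri {
    syslog_pri_field_name => "syslog_pri"
  }
}
```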
_val_darkmoonvt: hmm.. doesn't seem to strip that off.
darkmoonvtIt puts it in its own field.
darkmoonvtDid you try the grok debugger?
darkmoonvtI get this as the results:
_val_This is what the filter looks like
tronyx_I have a machine which is pushing events into Redis using the filebeat Redis output. On the logstash side, I have the output index setting as, index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
tronyx_The logstash consumers complain with
_val_darkmoonvt: seems to work! Thanks
darkmoonvtnp, glad to help, _val_
darkmoonvttronyx_ - your metadata fields aren't making it through, or are empty strings.
tronyx_darkmoonvt: i was sort of assuming that, but wasn't sure how to check. would this have anything to do with a failed index template load?
_val_darkmoonvt: can I have multiple elasticsearch hosts in output filter?
_val_I mean not as in elasticsearch => { hosts => [host1, host2, host3] }.... but...
_val_two separate elasticsearch blocks
darkmoonvtYou mean output to two different elastic clusters?
darkmoonvt(or one cluster twice)
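Yes: an output section may contain multiple elasticsearch blocks, and every event is sent to each of them. A minimal sketch with hypothetical host addresses:

```
output {
  elasticsearch {
    hosts => ["http://cluster-a:9200"]   # hypothetical first cluster
  }
  elasticsearch {
    hosts => ["http://cluster-b:9200"]   # hypothetical second cluster
  }
}
```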
Indrekdoes anyone know what to do with the error "Failed to create monitoring event {:message=>"For path: events. Map keys: [:pipelines, :reloads]", :error=>"LogStash::Instrument::MetricStore::MetricNotFound"}"?
timvisheris it possible to filter based on a dynamic date range?
timvisheri'd like to essentially drop events that fall outside of now-7d
darkmoonvtYes, but you need to use ruby to do the date math.
timvisherdarkmoonvt: thanks. :) ended up figuring it out
timvisherit was less than easy :)
timvisherLogStash::Timestamp ≠ Time
darkmoonvtIndeed. It's a union, kinda.
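A sketch of the now-7d drop using a ruby filter; converting both sides to epoch seconds with to_f sidesteps the LogStash::Timestamp vs. Time mismatch (details assumed, not taken from timvisher's actual solution):

```
filter {
  ruby {
    # Drop events whose @timestamp is older than 7 days.
    # to_f works on both LogStash::Timestamp and Ruby Time,
    # so the comparison happens in plain epoch seconds.
    code => "
      cutoff = Time.now.to_f - (7 * 86400)
      event.cancel if event.get('@timestamp').to_f < cutoff
    "
  }
}
```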