_fatalisIs the date filter also updating the @timestamp value even if your extracted timestamp goes to another field?
bjorn__fatalis: Unless you set "target", I think it does.
_fatalisthanks bjorn
_fatalisso at first @timestamp has the time the event was captured by beats, right?
bjorn_IIRC, the @timestamp field is set when Logstash first sees the event.
bjorn_Independent of how the event arrives in Logstash.
_fataliswait though
_fatalisIf you check the beats log, you will see an @timestamp which is slightly after your message's timestamp
_fataliswhich is good as it needs to be processed
_fatalisdoes this timestamp then become the timestamp when logstash receives the event, or are we talking about the same time?
_fatalisi guess the second
_fatalissince when the event is sent by beats
_fatalislogstash should have it
_fatalisOk indeed you're right
_fatalisBeats says Publish : { data } so this should be the time logstash sees the event
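A minimal sketch of the distinction above, assuming the extracted time lives in a field called syslog_timestamp (the field name and format are only illustrative): without "target" the date filter overwrites @timestamp with the parsed value; with "target" pointing at another field, @timestamp keeps the time Logstash first saw the event.

    filter {
      date {
        match  => ["syslog_timestamp", "ISO8601"]  # assumed field name and format
        target => "event_time"                     # parsed time lands here; @timestamp is left untouched
      }
    }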
negevhi, trying to debug a throughput issue: I'm writing a ton of messages to a log really quickly, and filebeat keeps up for a short while, then starts falling behind with delays of up to around an hour. The box running filebeat isn't overly loaded, but I can see it dispatching blocks of 20MB or so continuously every few seconds, long after the writes have finished
negevpresumably that means the bottleneck is somewhere in the logstash pipelines
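One way to confirm whether the bottleneck really is in the Logstash pipeline rather than in Filebeat is the node stats API; a rough sketch, assuming the monitoring API is listening on its default port 9600:

    curl -s http://localhost:9600/_node/stats/pipelines?pretty
    # Compare events.in vs events.out, and look at duration_in_millis per filter/output
    # plugin to see where events spend their time.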
jascbuhello all
jascbuI've used the logstash-filter-json plugin to pull out JSON from a message and put it into a target field name
jascbuso I now have a set of fields that looks something like [host][name], [host][ip], [host][location], etc
jascbuwhere "host" was the target in the json extraction
jascbuwhat I would like to do is remove all fields where the value == "-"
jascbuI want to do this without having to statically list each field name, so I'd like to do something similar to
jascbuif [host][location] == "-"
jascbu^^^ sorry ignore that, that would be the static version I don't want to do
jascbuI would like to do it dynamically similar to
jascbuif [host][*] == "-"
jascbu remove_field { [host][*] }
jascbu
jascbuBut after reading through all the plugins that I think might be relevant, and reading a lot of Elastic and other help chats, I can't see how to do it
jascbuThe only approach discussed that seemed potentially useful was writing Ruby to do it
jascbuCan someone give me their wisdom please?
jascbuIs there a plugin that can iterate through the fields dynamically?
jascbuOr if it comes to Ruby, can you advise me because I haven't written any Ruby in logstash conf before
jascbuThanks
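There is no conditional or mutate syntax for a wildcard like [host][*], so the usual answer is a ruby filter that walks the subfields; a minimal sketch, assuming [host] is a flat object of scalar values:

    filter {
      ruby {
        code => '
          fields = event.get("host")
          if fields.is_a?(Hash)
            # Snapshot the keys first, then drop any subfield whose value is "-"
            fields.keys.each do |key|
              event.remove("[host][#{key}]") if fields[key] == "-"
            end
          end
        '
      }
    }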
Vapez#join #grok
VapezI have a problem, I deleted all Elasticsearch entries and now I have issues with modules
_fatalisany idea why UNIX_MS, when forwarded to @timestamp, gives the wrong conversion? Especially in the ms part?
torrancew_fatalis: I'm not clear on what you mean
torrancewcan you show your config + maybe some sample logs before + after parsing?
jascbu_fatalis, I have a field, "msec" => "1509557385.330"
jascbuI then change the @timestamp by
jascbu date {
jascbu timezone => "UTC"
jascbu match => ["msec", "UNIX"]
jascbu target => "@timestamp"
jascbu }
jascbuwhich works for milliseconds if the original field is formatted with a dot separator
jascbuAny good for you?
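The usual cause of a wrong conversion here is the pattern choice: UNIX parses seconds since the epoch and accepts a fractional part (so "1509557385.330" keeps its milliseconds), while UNIX_MS expects whole milliseconds (e.g. "1509557385330"). A sketch of both, where epoch_ms is a hypothetical field name:

    filter {
      date {
        match  => ["msec", "UNIX"]   # "1509557385.330" -> seconds with a fractional millisecond part
        target => "@timestamp"
      }
      # Only if the field held whole milliseconds, e.g. "1509557385330":
      # date {
      #   match  => ["epoch_ms", "UNIX_MS"]
      #   target => "@timestamp"
      # }
    }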
asimzaidiwhat do the workers do… and can I increase them from 2 to a higher value?
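Pipeline workers are the threads that take batches off the queue and run the filter and output stages; the default is the number of CPU cores, and it can be raised in logstash.yml or with -w on the command line. A sketch with an assumed value of 4:

    # logstash.yml
    pipeline.workers: 4       # hypothetical value; default is the number of CPU cores
    pipeline.batch.size: 125  # events per worker batch (125 is the default)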