@timestamp field not overwritten when using filter { json { ... } }
Logstash information:
- Logstash version: 8.17.0, 8.17.9, 9.1.1
- Logstash installation source: docker
- How is Logstash being run: docker service, docker cli
Plugins installed: (bin/logstash-plugin list --verbose)
- no extra plugin installed logstash-codec-avro (3.4.1) logstash-codec-cef (6.2.8) logstash-codec-collectd (3.1.0) logstash-codec-dots (3.0.6) logstash-codec-edn (3.1.0) logstash-codec-edn_lines (3.1.0) logstash-codec-es_bulk (3.1.0) logstash-codec-fluent (3.4.3) logstash-codec-graphite (3.0.6) logstash-codec-json (3.1.1) logstash-codec-json_lines (3.2.2) logstash-codec-line (3.1.1) logstash-codec-msgpack (3.1.0) logstash-codec-multiline (3.1.2) logstash-codec-netflow (4.3.2) logstash-codec-plain (3.1.0) logstash-codec-rubydebug (3.1.0) logstash-filter-aggregate (2.10.0) logstash-filter-anonymize (3.0.7) logstash-filter-cidr (3.1.3) logstash-filter-clone (4.2.0) logstash-filter-csv (3.1.1) logstash-filter-date (3.1.15) logstash-filter-de_dot (1.1.0) logstash-filter-dissect (1.2.5) logstash-filter-dns (3.2.0) logstash-filter-drop (3.0.5) logstash-filter-elasticsearch (4.2.0) logstash-filter-fingerprint (3.4.4) logstash-filter-geoip (7.3.1) logstash-filter-grok (4.4.3) logstash-filter-http (2.0.0) logstash-filter-json (3.2.1) logstash-filter-kv (4.7.0) logstash-filter-memcached (1.2.0) logstash-filter-metrics (4.0.7) logstash-filter-mutate (3.5.8) logstash-filter-prune (3.0.4) logstash-filter-ruby (3.1.8) logstash-filter-sleep (3.0.7) logstash-filter-split (3.1.8) logstash-filter-syslog_pri (3.2.1) logstash-filter-throttle (4.0.4) logstash-filter-translate (3.4.3) logstash-filter-truncate (1.0.6) logstash-filter-urldecode (3.0.6) logstash-filter-useragent (3.3.5) logstash-filter-uuid (3.0.5) logstash-filter-xml (4.3.2) logstash-input-azure_event_hubs (1.5.2) logstash-input-beats (7.0.2) └── logstash-input-elastic_agent (alias) logstash-input-couchdb_changes (3.1.6) logstash-input-dead_letter_queue (2.0.1) logstash-input-elastic_serverless_forwarder (2.0.0) logstash-input-elasticsearch (5.2.0) logstash-input-exec (3.6.0) logstash-input-file (4.4.6) logstash-input-ganglia (3.1.4) logstash-input-gelf (3.3.2) logstash-input-generator (3.1.0) logstash-input-graphite 
(3.0.6) logstash-input-heartbeat (3.1.1) logstash-input-http (4.1.2) logstash-input-http_poller (6.0.0) logstash-input-jms (3.3.0) logstash-input-pipe (3.1.0) logstash-input-redis (3.7.1) logstash-input-stdin (3.4.0) logstash-input-syslog (3.7.1) logstash-input-tcp (7.0.2) logstash-input-twitter (4.1.1) logstash-input-udp (3.5.0) logstash-input-unix (3.1.2) logstash-integration-aws (7.2.1) ├── logstash-codec-cloudfront ├── logstash-codec-cloudtrail ├── logstash-input-cloudwatch ├── logstash-input-s3 ├── logstash-input-sqs ├── logstash-output-cloudwatch ├── logstash-output-s3 ├── logstash-output-sns └── logstash-output-sqs logstash-integration-jdbc (5.6.0) ├── logstash-input-jdbc ├── logstash-filter-jdbc_streaming └── logstash-filter-jdbc_static logstash-integration-kafka (11.6.3) ├── logstash-input-kafka └── logstash-output-kafka logstash-integration-logstash (1.0.4) ├── logstash-input-logstash └── logstash-output-logstash logstash-integration-rabbitmq (7.4.0) ├── logstash-input-rabbitmq └── logstash-output-rabbitmq logstash-integration-snmp (4.0.7) ├── logstash-input-snmp └── logstash-input-snmptrap logstash-output-csv (3.0.10) logstash-output-elasticsearch (12.0.6) logstash-output-email (4.1.3) logstash-output-file (4.3.0) logstash-output-graphite (3.1.6) logstash-output-http (6.0.0) logstash-output-lumberjack (3.1.9) logstash-output-nagios (3.0.6) logstash-output-null (3.0.5) logstash-output-pipe (3.0.6) logstash-output-redis (5.2.0) logstash-output-stdout (3.1.4) logstash-output-tcp (7.0.1) logstash-output-udp (3.2.0) logstash-output-webhdfs (3.1.0) logstash-patterns-core (4.3.4)
JVM (e.g. java -version):
Bundled JDK
OS version (uname -a if on a Unix-like system):
Linux 8bc32913ef31 6.10.14-linuxkit #1 SMP Tue Apr 15 16:00:54 UTC 2025 aarch64 aarch64 aarch64 GNU/Linux
Description of the problem including expected versus actual behavior:
When ingesting a JSON message that contains a valid ISO8601 @timestamp field, we expect Logstash to preserve the provided timestamp as the event timestamp.
- ✅ This works as expected when using `codec => json` in the input plugin.
- ❌ But when parsing the JSON with `filter { json { source => "message" } }`, the `@timestamp` value is ignored and the event timestamp is set to the current system time instead.
This leads to inconsistent behavior depending on whether the JSON is parsed at input or via a filter.
Steps to reproduce:
1. Input a message containing a valid `@timestamp` field:
   {"@timestamp":"2024-12-24T15:34:00Z","message":"Test event"}
2. Use the following Logstash pipeline:
   input { stdin { } }
   filter { json { source => "message" } }
   output { stdout { codec => rubydebug } }
3. Observe that the resulting event's `@timestamp` is set to the current time instead of "2024-12-24T15:34:00Z".
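As a possible workaround until the filter coerces the field, the parsed JSON can be placed under a target and the date filter used to set the event timestamp. This is a sketch, not verified against the affected versions; the `parsed` target name is arbitrary:

```
filter {
  json {
    source => "message"
    target => "parsed"    # arbitrary target; keeps the JSON's @timestamp out of the event root
  }
  date {
    match => ["[parsed][@timestamp]", "ISO8601"]    # coerces the string into the event @timestamp
  }
}
```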
✅ Expected:
The @timestamp value in the parsed JSON should be used as the event's timestamp.
❌ Actual:
The event's @timestamp is set to the current time, and the original timestamp is treated as a regular field value rather than the event timestamp.
Provide logs (if relevant):
Example output with rubydebug:
{
"@timestamp" => 2025-08-08T09:21:53.328Z,
"message" => "{\"@timestamp\":\"2024-12-24T15:34:00Z\",\"message\":\"Test event\"}",
...
}
Suggested resolution:
Update the json filter plugin to recognize and coerce @timestamp fields (via LogStash::Timestamp.coerce) just like the json codec does.
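For illustration, a minimal standalone sketch of that coercion. `Time.iso8601` stands in for `LogStash::Timestamp.coerce` so the sketch runs outside Logstash, and the `_@timestamp` fallback mirrors what the codec is believed to do on parse failure; both are assumptions about the intended behavior, not the plugin's actual code:

```ruby
require "time"

# Hypothetical sketch: coerce a parsed "@timestamp" string into a time object,
# the way the json codec does. Time.iso8601 stands in for
# LogStash::Timestamp.coerce; on parse failure the raw value is preserved
# under "_@timestamp" (assumed fallback) instead of raising.
def coerce_timestamp(parsed)
  raw = parsed["@timestamp"]
  return parsed unless raw.is_a?(String)
  begin
    parsed["@timestamp"] = Time.iso8601(raw)
  rescue ArgumentError
    parsed["_@timestamp"] = parsed.delete("@timestamp")
  end
  parsed
end
```

With this, `coerce_timestamp({"@timestamp" => "2024-12-24T15:34:00Z"})` yields a proper time object, while an unparsable string is moved aside rather than aborting the event.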
Just to confirm, this is not observed in modern versions of Logstash, right?
/tmp/logstash-9.1.1 took 10s
❯ cat cfg
input {
stdin { }
}
filter {
json {
source => "message"
}
}
output {
stdout { codec => rubydebug }
}
/tmp/logstash-9.1.1
❯ bin/logstash -f cfg --log.level=error
Using bundled JDK: /tmp/logstash-9.1.1/jdk.app/Contents/Home
Sending Logstash logs to /tmp/logstash-9.1.1/logs which is now configured via log4j2.properties
The stdin plugin is now waiting for input:
{"@timestamp":"2024-12-24T15:34:00Z","message":"Test event"}
{
"message" => "Test event",
"@version" => "1",
"@timestamp" => 2024-12-24T15:34:00.000Z,
"event" => {
"original" => "{\"@timestamp\":\"2024-12-24T15:34:00Z\",\"message\":\"Test event\"}"
},
"host" => {
"hostname" => "Joao’s-Black-MacBook-Pro"
}
}
The only way right now for @timestamp to receive a non-timestamp object is through the Java setField:
/tmp/logstash-9.1.1 took 1m13s
❯ bin/logstash -i irb
3.1.0 :001 > e = LogStash::Event.new
=> #<LogStash::Event:0x7b28ce3c>
3.1.0 :002 > e.set("@timestamp", "hello")
(irb):2:in `evaluate': wrong argument type String (expected LogStash::Timestamp) (TypeError)
3.1.0 :003 > e.set("@timestamp", "2024-12-24T15:34:00.000Z")
(irb):3:in `evaluate': wrong argument type String (expected LogStash::Timestamp) (TypeError)
3.1.0 :004 > e.to_java.setField("@timestamp", "2024-12-24T15:34:00.000Z")
=> nil
3.1.0 :005 > e.to_hash
=> {"@version"=>"1", "@timestamp"=>"2024-12-24T15:34:00.000Z"}
Typically, plugins will not use this API. Maybe you hit this issue with a different plugin?
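For reference, the supported way for a pipeline to perform this coercion itself is a ruby filter that goes through LogStash::Timestamp. This is a sketch; `[some_field]` is a placeholder for wherever the timestamp string actually lives:

```
filter {
  ruby {
    # Coerce the string into a LogStash::Timestamp before calling set;
    # setting a plain String on @timestamp raises the TypeError shown above.
    code => 'event.set("@timestamp", LogStash::Timestamp.coerce(event.get("[some_field]")))'
  }
}
```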