Parse timestamp/valid_at as TimeWithZone in Mapper
**Is your feature request related to a problem? Please describe.**
Yes. I would like to be able to use the RES Event Browser, but it fails because it expects timestamp to be a Time (or TimeWithZone) object when it calls #iso8601 on it. I would also generally prefer to interact with the event timestamp as an object rather than a string.
**Describe the solution you'd like**
I think it would make sense to parse the JSON values for both timestamp and valid_at with Time.zone.parse in the Mapper class. I'm not sure whether that would cause backward compatibility issues, though. It seems like anyone would prefer a Ruby Time/TimeWithZone object to an ISO date string.
This is already the case in the default YAML serializer.
**Additional context**
I'm happy to create a PR if you think that would be a good step forward.
> This is already the case in the default YAML serializer.
As far as I can see, a serializer (we use JSON by default) is used to encode the data and metadata fields, not timestamp and valid_at.
Or does RES use data in the browser?
Frankly speaking, I'm kind of lost in all the mappers and serializers stuff in v2 🙂
> This is already the case in the default YAML serializer.

Maybe that could be a way out: use the YAML serializer?
Also, regarding Mapper updates: I think we should first migrate to the Pipeline mapper and maybe even use some of the built-in transformers (they have something like an event class mapper).
Thanks @palkan. You are correct about the serializer. What I should have said is that the YAML serializer accepts a Time object and deserializes it back to a Time object, whereas the JSON serializer accepts a Time object but deserializes it to a string; it doesn't cast it back to the original object. So with YAML you get date deserialization for free. At least that's how I interpret it.
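The round-trip difference is visible in plain Ruby (a minimal sketch; the Time value is made up):

```ruby
require "json"
require "time"
require "yaml"

t = Time.utc(2021, 10, 5, 14, 4, 51)

# YAML round-trips a Time back into a Time object
# (YAML.unsafe_load on Psych 4+; plain YAML.load on older Rubies)
from_yaml =
  YAML.respond_to?(:unsafe_load) ? YAML.unsafe_load(YAML.dump(t)) : YAML.load(YAML.dump(t))

# JSON round-trips the same value as an ISO 8601 String
from_json = JSON.parse(JSON.generate("timestamp" => t.iso8601))["timestamp"]

puts from_yaml.class # Time
puts from_json.class # String
```

This is why the Browser's call to #iso8601 works with the YAML serializer but raises NoMethodError with JSON.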
I came across the issue when trying to use the RES Browser as it would throw the error:
```
10:04:57 web.1 | 2021-10-05 10:04:57 - NoMethodError - undefined method `iso8601' for "2021-10-05T14:04:51.169Z":String:
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/json_api_event.rb:32:in `block in metadata'
10:04:57 web.1 | <internal:kernel>:90:in `tap'
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/json_api_event.rb:31:in `metadata'
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/json_api_event.rb:18:in `to_h'
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/get_events_from_stream.rb:18:in `block in as_json'
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/get_events_from_stream.rb:18:in `map'
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/get_events_from_stream.rb:18:in `as_json'
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/app.rb:99:in `json'
10:04:57 web.1 | /Users/ryanwood/.rbenv/versions/3.0.1/lib/ruby/gems/3.0.0/gems/ruby_event_store-browser-2.3.0/lib/ruby_event_store/browser/app.rb:54:in `block in <class:App>'
```
It's clearly expecting the timestamp to quack like a Time object. I've temporarily fixed it in my app by subclassing the ActiveEventStore::Mapper with this:
```ruby
class CustomMapper::Mapper < ActiveEventStore::Mapper
  def record_to_event(record)
    super.tap do |event|
      %i[timestamp valid_at].each do |dt|
        value = event.metadata[dt]
        # Only parse when the value is present and still a string
        # (valid_at may be absent, and a Time should pass through untouched)
        event.metadata[dt] = Time.zone.parse(value) if value.is_a?(String)
      end
    end
  end
end
```
and using it with:

```ruby
ActiveEventStore.config.store_options = {
  mapper: CustomMapper::Mapper.new(mapping: ActiveEventStore.mapping)
}
```
I assume this would break backward compatibility with this gem. I could create a PR to roll that into the current mapper if you think that makes sense, or I can just keep my customized mapper.
I'm with you on not really "getting" all the new v2 serializers and mappers. Just trying to sort out my issues.
I should also say, though I don't think it has any effect on this issue, that I'm using JSONB fields in Postgres for the data and metadata, so I've set ActiveEventStore.config.serializer = RubyEventStore::NULL in the initializer per the RES instructions.
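For context, a JSONB setup of that kind looks roughly like this (a sketch from memory, not a verified config; class and option names should be checked against the RES docs for your version):

```ruby
# Sketch of an RES 2.x client wired for JSONB columns: the NULL serializer
# hands Ruby hashes straight to ActiveRecord instead of pre-serialized
# strings, preserving JSONB indexing. RailsEventStoreActiveRecord was the
# repository namespace in this era of the gem.
event_store = RailsEventStore::Client.new(
  repository: RailsEventStoreActiveRecord::EventRepository.new(
    serializer: RubyEventStore::NULL
  )
)
```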
> I don't think it has any effect on this issue, that I'm using JSONB fields in Postgres for the data and metadata
That doesn't allow us to switch to YAML :)
> I assume this would break backward compatibility with this gem
Yeah. I suggest adding your custom mapper to the docs as an example of dealing with the browser issue.
And maybe @pawelpacana could help us find a proper solution :) Shouldn't the Browser work with RubyEventStore::NULL, or are we missing something here?
> And maybe @pawelpacana could help us to find a proper solution :)
A repository with a reproduction of the issue would be ideal. The RES version and its config would be sufficient to tell more.
I'm not up to date with AES internals, sorry.
In general, in current RES (2.x):
- mappers are for transforming data (symbols/strings, compression, encryption, upcasting, etc.)
- serialization is pushed to the edge, into the event repository (event storage) and into the scheduler (feeding background jobs, a kind of storage too: Redis for Sidekiq); there can be a different serializer for each of them, but as a safe default we use the same one as in RES 1.x, and that is YAML
- ActiveRecord-on-JSONB prefers not to receive JSON strings, thus the NULL serializer on the event repository to give it Ruby hashes; it will take a JSON string without any issue, but all the benefits of indexing will be lost (you seem to already know this)
- it's the event `data` and `metadata` that are serialized for storage; however, we take out `created_at` and `valid_at` from metadata to store them separately as columns, for the purpose of querying and also for the benefit of always having the same way to parse them regardless of your choice of serializer
- here's the special handling of timestamps: https://github.com/RailsEventStore/rails_event_store/blob/24d2b69dd3ba731b1c0d8f76d1009c0e382fbc00/ruby_event_store/lib/ruby_event_store/mappers/transformation/domain_event.rb#L9-L10
- here's serializing those timestamps: https://github.com/RailsEventStore/rails_event_store/blob/24d2b69dd3ba731b1c0d8f76d1009c0e382fbc00/ruby_event_store/lib/ruby_event_store/record.rb#L61-L62
- here's deserializing them: https://github.com/RailsEventStore/rails_event_store/blob/24d2b69dd3ba731b1c0d8f76d1009c0e382fbc00/ruby_event_store/lib/ruby_event_store/serialized_record.rb#L59-L60
- the event repository ultimately decides whether it prefers the serialized or non-serialized version of a timestamp for storage: https://github.com/RailsEventStore/rails_event_store/blob/81c23fd0c2509abe5aa63092fa32308184626090/rails_event_store_active_record/lib/rails_event_store_active_record/event_repository.rb#L126-L127
- the browser should use the same stack of mappers and serializers configured in RES, so there should not be an issue there
- the ecommerce sample app covers usage of JSONB (https://github.com/RailsEventStore/ecommerce/blob/master/rails_application/config/initializers/rails_event_store.rb#L21) and has a browser plugged in: https://ecommerce.arkademy.dev/res
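The separate timestamp handling above means timestamps can round-trip independently of the data serializer; in plain Ruby it amounts to roughly this (a sketch; the microsecond precision is an assumption, not taken from the linked source):

```ruby
require "time"

# A microsecond-precision timestamp, serialized as a string for a column...
timestamp  = Time.utc(2021, 10, 5, 14, 4, 51, 169_000)
serialized = timestamp.iso8601(6) # "2021-10-05T14:04:51.169000Z"

# ...and parsed back the same way regardless of the data serializer
restored = Time.iso8601(serialized)

puts restored == timestamp # true
```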
I think the core of the issue is this: https://github.com/palkan/active_event_store/blob/8d1a61f24687a09832a01337cb573b759928f150/lib/active_event_store/mapper.rb#L45
Copying behaviour from Transformation::DomainEvent is one possible solution: https://github.com/RailsEventStore/rails_event_store/blob/81c23fd0c2509abe5aa63092fa32308184626090/ruby_event_store/lib/ruby_event_store/mappers/transformation/domain_event.rb#L8-L27
A different one could indeed be migrating to PipelineMapper and passing your own to_domain_event to Pipeline: https://github.com/RailsEventStore/rails_event_store/blob/81c23fd0c2509abe5aa63092fa32308184626090/ruby_event_store/lib/ruby_event_store/mappers/pipeline.rb#L6
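A rough wiring sketch of that option (not runnable on its own; `MyToDomainEvent` is a hypothetical stand-in for a custom to_domain_event step, and the constructor shape follows the linked RES 2.x source):

```ruby
# Sketch only: PipelineMapper/Pipeline per the linked RES 2.x source.
# MyToDomainEvent is hypothetical; it would respond to #dump(event) and
# #load(record) like RubyEventStore::Mappers::Transformation::DomainEvent,
# parsing timestamp/valid_at back into Time objects in #load.
mapper = RubyEventStore::Mappers::PipelineMapper.new(
  RubyEventStore::Mappers::Pipeline.new(
    to_domain_event: MyToDomainEvent.new
  )
)
```

This would swap only the step that turns records back into domain events, keeping the rest of the default pipeline intact.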
> A different one could be indeed migrating to PipelineMapper and passing your own to_domain_event to Pipeline
Yeah, that sounds like the best option.

Thanks for such a detailed response!