DateTime not encodable in JSON
Calling encode with a datetime properly casts to a Google.Timestamp
Protobuf.encode(%MyProto{created_at: DateTime.utc_now()})
But calling the JSON module to encode results in an error
Protobuf.JSON.encode(%MyProto{created_at: DateTime.utc_now()})
# function DateTime.__message_props__/0 is undefined or private
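Until the JSON encoder handles DateTime, one workaround is to build a Google.Protobuf.Timestamp yourself. A minimal sketch of the conversion, using only the Elixir standard library (TimestampHelper is a hypothetical name, not part of the library):

```elixir
defmodule TimestampHelper do
  # Split a DateTime into the {seconds, nanos} pair used by
  # google.protobuf.Timestamp: whole seconds since the Unix epoch plus
  # a nanosecond remainder. floor_div/mod (rather than div/rem) keep
  # nanos non-negative for pre-epoch datetimes, as the Timestamp spec
  # requires (nanos must be in 0..999_999_999).
  def to_seconds_nanos(%DateTime{} = dt) do
    ns = DateTime.to_unix(dt, :nanosecond)
    {Integer.floor_div(ns, 1_000_000_000), Integer.mod(ns, 1_000_000_000)}
  end
end
```

The resulting pair can then be put into a `%Google.Protobuf.Timestamp{seconds: ..., nanos: ...}` struct before calling Protobuf.JSON.encode/1.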
It seems that Protobuf.JSON.Encode.encodable doesn't have a clause to handle DateTimes. Is there a reason for this? Would a PR be accepted to handle DateTimes?
I might be wrong about how the ergonomics of the JSON module are supposed to work. It just feels odd that the JSON module can only encode a proto that has first been encoded and decoded with the top-level encoder API before being run through the JSON encoder.
I have been wondering about the same issues recently using the JSON module.
What is the motivation for removing the new/1 factory function? I would expect such a factory to guarantee data integrity and prevent the situations we are running into. Constructing structs directly is a hassle: we can put in any value we want, and I am unsure how this is supposed to play out, at the very least with the JSON module as it stands.
Maybe @whatyouhide could share more insights since he is the author of https://github.com/elixir-protobuf/protobuf/pull/330
It seems that Protobuf.JSON.Encode.encodable doesn't have a clause to handle DateTimes. Is there a reason for this? Would a PR be accepted to handle DateTimes?
I think that's the solution here: add a clause to handle this. The reason for removing new/1 is that it was doing no validation.
Constructing structs directly is a hassle
Why?
we could put any value we want
You could do the same with new/1.
@whatyouhide I wouldn't expect new/1 to do "validation" but "parsing": ensuring that the data I pass in is well-formed. Using struct directly bypasses any integrity check on the data, which can cause a significant headache, since I can put in any value I want up until the data reaches Protobuf.JSON; a developer could forget to test a given branch, at which point it may be too late and cause a significant outage.
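To make the "parsing, not validation" idea concrete, here is a minimal sketch of the kind of factory I mean; Strict, new!/3, and the spec map are hypothetical names, and nothing like this exists in the library:

```elixir
defmodule Strict do
  # Build a struct only after checking each declared field against an
  # expected struct type, e.g. specs = %{created_at: DateTime}.
  # Anything that fails the check raises at construction time instead
  # of silently producing a bad struct.
  def new!(module, attrs, specs) do
    Enum.each(specs, fn {key, type} ->
      value = Map.fetch!(attrs, key)

      unless is_struct(value, type) do
        raise ArgumentError,
              "#{key} must be a #{inspect(type)}, got: #{inspect(value)}"
      end
    end)

    struct!(module, attrs)
  end
end
```

With something like this, a bad value (say, a string where a DateTime belongs) blows up where the struct is built rather than surfacing later inside Protobuf.JSON.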
Right now, I am using Ecto Schema and Changeset for all my events to ensure no broken data is constructed, using https://hexdocs.pm/one_piece_commanded/OnePiece.Commanded.Event.html
How could I have the same experience using Protobuf.JSON?
In an ideal scenario, only valid values (from the perspective of Protobuf, not domain-specific rules) would ever exist when constructing these structs.
Please advise 🙏🏻
@yordis this will be kind of fixed with types, so I'm not sure investing in it is helpful right now. Validation and parsing in this instance are the same thing, I believe: you want to make sure that the data has the correct types (hence why I think the type system will help).
@whatyouhide I love it; this is the precise situation where I would prefer the type system.
Is it ready for Elixir v1.18? I am figuring out what strategy to take here. I would love to take advantage of the type system, but I am still unsure whether I should take that risk professionally.
I can stay "discipline-driven" until things are ready (let me know if there is any work I can do to help), but I am concerned that may never happen.
What would you do?
@yordis it's hard to say what you should do given I don't know the use case here. However, the usual suspects: validate data at the boundaries so that you don't construct incorrect Protobuf structs, test your encoding, and so on. I don't think we should (re)add APIs here that will be completely obsolete once (if, but it's a small if) the type system lands.
I ended up creating a factory function in my test environment (please share your thoughts); I am OK giving things up as long as the test environment "enforces" correctness when constructing some of these messages.
The tricky bit for me is that there is a lot of unit testing going on; by the time I get to the "boundaries", many things may already be broken.
defmodule TestSupport do
  def from_decoded!(module, data) when is_map(data) and is_atom(module) do
    data
    |> to_proto_decoded()
    |> Protobuf.JSON.from_decoded(module)
    |> OnePiece.Result.unwrap_ok!()
  end

  # Map entries: stringify atom keys and recurse into values.
  defp to_proto_decoded({k, v}) when is_atom(k) do
    {Atom.to_string(k), to_proto_decoded(v)}
  end

  # nil and booleans are atoms in Elixir, so these clauses must come
  # before the catch-all is_atom clause below.
  defp to_proto_decoded(value) when is_nil(value) do
    value
  end

  defp to_proto_decoded(value) when is_list(value) do
    Enum.map(value, &to_proto_decoded/1)
  end

  defp to_proto_decoded(value) when is_boolean(value) do
    value
  end

  defp to_proto_decoded(value) when is_atom(value) do
    Atom.to_string(value)
  end

  # DateTime is a struct (and therefore a map), so it must be matched
  # before the is_map clause below.
  defp to_proto_decoded(%DateTime{} = value) do
    DateTime.to_iso8601(value)
  end

  defp to_proto_decoded(existing_map) when is_map(existing_map) do
    Map.new(existing_map, &to_proto_decoded/1)
  end

  defp to_proto_decoded(value) do
    value
  end
end
I tested with Elixir v1.18 RC.0, and passing a DateTime instead of a Timestamp does not cause a compilation error.
Is there any reason why atom keys aren't supported? The structs already register those keys as atoms, so it shouldn't be a concern as far as I can tell.
Is there any reason why atom keys aren't supported?
Can you elaborate?
I need to do something differently, as I shared https://github.com/elixir-protobuf/protobuf/issues/373#issuecomment-2528554689
I had to create a function that transforms atom keys into string keys to make it work with Protobuf.JSON.from_decoded/2
Ah yeah, I see. Semantically you would never really need to pass atom keys to that, because you are supposed to pass it the output of JSON decoding, but we can add support for it given that things like Jason.decode support decoding to atoms. I have zero time to do that now, though.
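For context on "the output of JSON decoding": JSON decoders produce string-keyed maps by default, which is the shape from_decoded/2 expects today. A quick illustration using Elixir 1.18's built-in JSON module (Jason.decode!/1 behaves the same unless you opt into keys: :atoms); the payload here is just an example:

```elixir
# Decoding JSON yields string keys, not atom keys.
decoded = JSON.decode!(~s({"created_at": "2024-01-01T00:00:00Z"}))
# decoded is %{"created_at" => "2024-01-01T00:00:00Z"}
```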
Right, the fact that Jason.decode allows atom keys as well is partly why I am asking.
I can give it a try; hopefully I can get it to work.