Add support for protobuf inputs
**Is your feature request related to a problem? Please describe.**
tctl only supports the plain JSON (json/plain) encoding for arguments. Workflows and signals whose inputs are protobuf-generated types cannot be started or signaled through it.
**Describe the solution you'd like**
Support protobuf arguments. This might require passing a protobuf file as an additional parameter.
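For illustration, a minimal sketch of the payload difference involved, assuming the `go.temporal.io/api/common/v1` types used by the Go SDK; the function name, message type, and JSON body are made up. The change relative to what tctl produces today is the `encoding` metadata value (recording the message type is an additional assumption):

```go
package main

import (
	"fmt"

	commonpb "go.temporal.io/api/common/v1"
)

// buildProtoJSONPayload wraps JSON produced from a proto-generated object in a
// Temporal Payload whose encoding metadata routes it to the proto JSON
// converter on the worker side instead of the default json/plain converter.
func buildProtoJSONPayload(jsonData []byte, messageType string) *commonpb.Payload {
	return &commonpb.Payload{
		Metadata: map[string][]byte{
			"encoding":    []byte("json/protobuf"), // tctl currently always sets "json/plain"
			"messageType": []byte(messageType),     // assumption: also record the fully qualified proto message name
		},
		Data: jsonData,
	}
}

func main() {
	p := buildProtoJSONPayload([]byte(`{"orderId":"123"}`), "example.OrderRequest")
	fmt.Printf("encoding=%s data=%s\n", p.Metadata["encoding"], p.Data)
}
```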
**Additional context**
User request (from Slack):
> I have a workflow that uses protobuf-generated objects for its arguments. I can start workflows fine using the Java client directly. However, when I attempt to start with tctl, I get an error that indicates it is using the Jackson data converter instead of the proto JSON data converter. It looks like the payloads get routed to data serializers based on the json/plain or json/protobuf encoding. Is there a way to set a different JSON encoding via tctl? Are there other possible options here?
>
> Caused By: io.temporal.common.converter.DataConverterException: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot find a (Map) Key deserializer for type [simple type, class com.google.protobuf.Descriptors$FieldDescriptor]
>
> To clarify, the serialization is still JSON, but serialized from a proto-generated object. It's encoded as json/protobuf for use with the ProtobufJsonPayloadConverter, whereas tctl sends JSON with the json/plain encoding, which routes to the JacksonJsonPayloadConverter. I believe this could be fixed by providing an option in tctl for setting the json/protobuf encoding, as opposed to fully serializing to proto binary (encoded as binary/protobuf and handled by ProtobufPayloadConverter).
>
> I see. Assuming the JSON passed as the argument is valid for the protobuf message to deserialize, this could work.
We should support loading a message from a binary file and specifying the encoding through the command line. This would support any future message format.
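A rough sketch of what that could look like, assuming the `go.temporal.io/api/common/v1` payload types; the `--input_file` and `--input_encoding` flags are hypothetical and only illustrate the proposed shape of the command line:

```go
package main

import (
	"flag"
	"fmt"
	"log"
	"os"

	commonpb "go.temporal.io/api/common/v1"
)

func main() {
	// Hypothetical flags; tctl does not expose these today.
	inputFile := flag.String("input_file", "", "file containing the already-serialized argument")
	encoding := flag.String("input_encoding", "json/plain", "payload encoding, e.g. json/plain, json/protobuf, binary/protobuf")
	flag.Parse()

	data, err := os.ReadFile(*inputFile)
	if err != nil {
		log.Fatalf("failed to read input file: %v", err)
	}

	// Pass the bytes through untouched; the encoding metadata tells the
	// worker-side data converter how to decode them, so any message format works.
	payloads := &commonpb.Payloads{
		Payloads: []*commonpb.Payload{{
			Metadata: map[string][]byte{"encoding": []byte(*encoding)},
			Data:     data,
		}},
	}

	fmt.Printf("would start the workflow with %d payload(s) using encoding %q\n",
		len(payloads.Payloads), *encoding)
}
```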