ts-proto
NestJS support for union style oneofs
After digging into the whole oneof story, I struggled to use the code generated by the union-style oneof option (`oneof=unions`). The types were generated just fine, so I could build my messages according to the `$case` specification.
I then realized that it apparently doesn't work too well with NestJS or the gRPC client I'm using to test my services (BloomRPC).
Initially, I thought the conversion of the `$case` properties would somehow magically be handled by grpc-js in the background, so I could just send out an object like
// based on https://github.com/stephenh/ts-proto/blob/main/integration/oneof-unions/oneof.ts
const response: PleaseChoose = {
  name: 'test',
  age: 1,
  choice: {
    $case: 'aBool',
    aBool: true,
  },
};
but that doesn't work. After digging a bit deeper, I realized that the actual case conversion is handled by the `encode` methods generated by ts-proto, right? There is no case handling in `pbjs.js`, and no client could understand messages sent like the above.
So to 'fix' this for outgoing messages, I could simply convert the message to JSON beforehand and transform it into the gRPC-valid format before handing the request over to the NestJS GrpcClientProxy service method; the same goes for the responses. Simply using the `toJSON`/`fromJSON` methods obviously works just fine here, but two main things about this whole process bother me:
- the `toJSON`/`fromJSON` conversions have to be applied manually each time I want to comply with the .proto schema
- I don't end up with the actual JSON type (as defined in the .proto) in my code (`toJSON` returns `unknown`)
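The manual conversion described above can be sketched as a pair of small hand-written helpers that flatten ts-proto's `$case` wrapper into the plain field shape grpc-js understands, and back. This is only an illustrative sketch: the `toWire`/`fromWire` names are made up (not ts-proto APIs), and the inlined `PleaseChoose` shape with an `aNumber` case is assumed for the example.

```typescript
// Shape based on the PleaseChoose example above (only two oneof cases shown).
interface PleaseChoose {
  name: string;
  age: number;
  choice?:
    | { $case: 'aBool'; aBool: boolean }
    | { $case: 'aNumber'; aNumber: number };
}

// Flatten the `$case` wrapper into the plain field grpc-js understands.
function toWire(msg: PleaseChoose): Record<string, unknown> {
  const { choice, ...rest } = msg;
  if (choice === undefined) return { ...rest };
  return { ...rest, [choice.$case]: (choice as any)[choice.$case] };
}

// Rebuild the `$case` wrapper from the flat wire shape.
function fromWire(obj: Record<string, unknown>): PleaseChoose {
  const { aBool, aNumber, ...rest } =
    obj as { aBool?: boolean; aNumber?: number; name: string; age: number };
  let choice: PleaseChoose['choice'];
  if (aBool !== undefined) choice = { $case: 'aBool', aBool };
  else if (aNumber !== undefined) choice = { $case: 'aNumber', aNumber };
  return { ...rest, choice };
}
```

The downside, as noted, is that these helpers have to be written and applied by hand for every message with a oneof.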
So I'm wondering whether there isn't a 'better', or just different, approach to the problem. So far, I've come up with two potential alternatives.
Alternative Approach
Consider the following .proto:
syntax = "proto3";
package hero;
service HeroService { rpc FindOne(HeroById) returns (FindOneResult); }
message HeroById { int32 id = 1; }
message FindOneResult {
  int32 code = 1;
  oneof result {
    Hero hero = 2;
    ErrorMessage error = 3;
  }
  oneof meta {
    string details = 4;
    string message = 5;
  }
}
message ErrorMessage {
  int32 code = 1;
  optional string message = 2;
}
message Hero {
  int32 id = 1;
  string name = 2;
}
the current implementation for `oneof=unions` generates
export interface HeroById {
  id: number;
}
export interface FindOneResult {
  code: number;
  result?:
    | { $case: 'hero'; hero: Hero }
    | { $case: 'error'; error: ErrorMessage };
  meta?:
    | { $case: 'details'; details: string }
    | { $case: 'message'; message: string };
}
export interface ErrorMessage {
  code: number;
  message?: string | undefined;
}
export interface Hero {
  id: number;
  name: string;
}
which requires the conversion steps described above to work with standard gRPC clients. What if we instead generated the following types?
export interface HeroById {
  id: number;
}
export type FindOneResult_oneof_base = {
  code: number;
};
export type FindOneResult_oneof_result =
  | { hero: Hero; error?: never }
  | { error: ErrorMessage; hero?: never };
export type FindOneResult_oneof_meta =
  | { details: string; message?: never }
  | { message: string; details?: never };
export type FindOneResult = FindOneResult_oneof_base &
  FindOneResult_oneof_result &
  FindOneResult_oneof_meta;
export interface ErrorMessage {
  code: number;
  message?: string | undefined;
}
export interface Hero {
  id: number;
  name: string;
}
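A self-contained sketch of how such a flat intersection behaves; note the `never` members are made optional here (`error?: never`), since a required property of type `never` would make the type impossible to construct:

```typescript
interface Hero { id: number; name: string }
interface ErrorMessage { code: number; message?: string }

// Optional `never` members make each union branch exclude the other oneof field.
type FindOneResultFlat = { code: number } & (
  | { hero: Hero; error?: never }
  | { error: ErrorMessage; hero?: never }
);

// Valid: exactly one field of the `result` oneof is set.
const ok: FindOneResultFlat = { code: 1, hero: { id: 1, name: 'Tony' } };

// Invalid: both oneof fields set at once; the compiler rejects this.
// const bad: FindOneResultFlat = { code: 1, hero: { id: 1, name: 'Tony' }, error: { code: 2 } };
```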
It's a bit messy, sure, and there are probably even better ways to implement it, but it would actually work and allow us to comply with the .proto schema without any JSON conversion at all.
As I'm still learning about all the parts of the whole gRPC story, I'm probably not aware of every circumstance at the moment, so I don't know whether this would cause problems with how everything works, but so far I don't think it would ... I've actually created a working example here for you to check out.
I'm wondering why you chose to generate an ADT for the oneof fields rather than the 'flat type' I suggested above. Am I missing something here? Also, what do you think about the approach in general?
I actually just realized there is an even cleaner way to implement this:
export type FindOneResult_oneof_base = {
  code: number;
};
export type FindOneResult_oneof_result =
  | { choice: 'hero'; hero: Hero }
  | { choice: 'error'; error: ErrorMessage };
export type FindOneResult_oneof_meta =
  | { meta: 'details'; details: string }
  | { meta: 'message'; message: string };
export type FindOneResult = FindOneResult_oneof_base &
  FindOneResult_oneof_result &
  FindOneResult_oneof_meta;
which is basically the same thing, but with better code completion and with the oneof discriminator (e.g. `choice: 'hero' | 'error'`) always present in the resulting JSON. That way the message reflects exactly what the gRPC clients 'generate' anyway; e.g. when you'd return
const response = {
  code: 1,
  error: { code: 1 },
  details: 'test',
};
the client would receive
const result = {
  code: 1,
  choice: 'error',
  error: { code: 1 },
  meta: 'details',
  details: 'test',
};
so why not just set the fields in the response right away?
The flat structure would end up making typescript's job much harder (and more expensive to type check). Imagine a proto message with 10 oneofs each with 10 fields. With the existing structure typescript doesn't need to consider how each oneof might affect the structure of the parent object as each union is self-contained. But with the flat structure you've proposed typescript now has to consider how each intersection of unions could affect the structure of the parent object.
We know based on the protobuf spec that none of the intersections of unions could collide with each other, but typescript doesn't and so it still needs to perform that verification.
Another thing to consider is that proto fields can be optional, which means this should be considered a valid FindOneResult object, but it isn't under the proposed flat type:
const messageToSend: FindOneResult = { code: 1 };
Additionally it makes mutations more cumbersome because we can no longer mutate a single oneof at a time, we must mutate multiple properties of the parent object at once which typescript doesn't care for:
const result: FindOneResult = {
  code: 1,
  choice: 'error',
  error: { code: 1 },
  meta: 'details',
  details: 'test',
};
delete result.details; // TS error
result.meta = 'message'; // TS error
result.message = 'foobar'; // TS error
Compared to the current structure where it is trivial to mutate the oneofs in a single expression that satisfies typescript:
const result: FindOneResult = {
  code: 1,
  result: {
    $case: 'error',
    error: { code: 1 },
  },
  meta: {
    $case: 'details',
    details: 'test',
  },
};
result.meta = {
  $case: 'message',
  message: 'foobar',
};
This is not to say that these tradeoffs wouldn't be worth it to have a nicer to construct/consume flat structure. This is more just playing devil's advocate to show what those tradeoffs would be.
Hey guys, coming across this much later, so sorry for being quiet on it...
I'm not personally a NestJS user, but I believe the issue, as I just mentioned in #411 and as @jsbrain originally pointed out, is that NestJS doesn't know about ts-proto's own `encode`/`decode` methods.
I think the best fix for this, instead of changing how ts-proto's messages/output look (about which I'm admittedly biased, but I think they are generally pretty good :-D ), is figuring out how to tell NestJS's internals to use ts-proto's `encode` and `decode` methods.
Note that we've done this with grpc-web, where we output `requestType.serializeBinary` and `responseType.deserializeBinary` adapters that we pass to grpc-web when calling RPC methods:
https://github.com/stephenh/ts-proto/blob/main/integration/grpc-web/example.ts#L684
That hooks ts-proto into the right spot, such that we handle the object <-> bytes conversion and grpc-web handles everything else.
It'd be great if someone could figure out the same adapter approach for NestJS. Thanks!
Do we have any news about this?
We tried to find a way to tell NestJS to use ts-proto's encode/decode; unfortunately, there is none. To allow it, changes would be needed first in @grpc/grpc-js and then in NestJS. Currently, NestJS does not handle the communication layer at all.
@danduh AFAIU someone figured out that NestJS's encode/decode can be intercepted using the protobufjs `wrappers`:
https://github.com/stephenh/ts-proto/issues/69#issuecomment-1096981570
But it hasn't been integrated into ts-proto yet. If you want to submit a PR to do so, that'd be great. FWIW, even just playing with the `wrappers` setup by hand and confirming whether it would help with this issue would be a good first step.
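To make the by-hand experiment concrete, here is a minimal stand-in that mimics protobufjs's wrapper registry (a map of per-type `fromObject`/`toObject` hooks keyed by fully-qualified message name) without importing protobufjs itself. The `.example.MyUnion` type name and the oneof fields are made up purely for illustration:

```typescript
// Minimal stand-in for protobufjs's global `wrappers` registry: per-type
// hooks that protobufjs consults when converting messages to/from plain
// objects. (In real code this comes from `import { wrappers } from 'protobufjs'`.)
type Wrapper = {
  fromObject?: (object: Record<string, unknown>) => Record<string, unknown>;
  toObject?: (message: Record<string, unknown>) => Record<string, unknown>;
};
const wrappers: Record<string, Wrapper> = {};

// Hypothetical wrapper that rewraps protobufjs's flat oneof output
// (e.g. `{ aBool: true }`) into ts-proto's `$case` shape, and back.
wrappers['.example.MyUnion'] = {
  toObject: (message) => {
    const key = Object.keys(message)[0];
    return { kind: { $case: key, [key]: message[key] } };
  },
  fromObject: (object) => {
    const kind = object['kind'] as { $case: string } & Record<string, unknown>;
    return kind ? { [kind.$case]: kind[kind.$case] } : {};
  },
};

// Simulate protobufjs invoking the hooks on receive and send.
const received = wrappers['.example.MyUnion'].toObject!({ aBool: true });
const sent = wrappers['.example.MyUnion'].fromObject!(received);
```

The real registry is consulted inside protobufjs's conversion code, so registering hooks like these before loading the proto is the whole integration point.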
@stephenh thanks for the response and for the reference.
Maybe I'm missing something, but here's what we've got so far.
My gaas.proto file:
package gaas;
message GeoUnions {
  oneof kind {
    string string_value = 1;
    double number_value = 2;
    bool bool_value = 3;
    string date_value = 4;
  }
}
message FilterParam {
  string field = 2;
  GeoUnions value = 3;
}
generated gaas.pb.ts with `--ts_proto_opt=oneof=unions`:
export interface GeoUnions {
  kind?:
    | { $case: 'stringValue'; stringValue: string }
    | { $case: 'numberValue'; numberValue: number }
    | { $case: 'boolValue'; boolValue: boolean }
    | { $case: 'dateValue'; dateValue: string };
}
export interface FilterParam {
  field: string;
  value: GeoUnions | undefined;
}
In my TS code, I'm expecting FilterParam to be
const filterParam = {
  field: 'username',
  value: {
    kind: {
      $case: 'stringValue',
      stringValue: 'myusername',
    },
  },
};
Instead, when I'm running NestJS with `{loader: {oneofs: true}}`, I get
const filterParam = {
  field: 'username',
  value: {
    kind: 'stringValue',
    stringValue: 'myusername',
  },
};
To be honest, this structure is good enough for me, but it's not compatible with the generated types.
So I can see two options:
- Create a wrapper for `GeoUnions` to convert the message so it aligns with the `GeoUnions` interface:
export const loadSync = (
  filename: string | string[],
  options?: ExtendedOptions
): PackageDefinition => {
  wrappers['.gaas.GeoUnions'] = {
    fromObject: () => {
      // not implemented
    },
    toObject: (message, options) => {
      const key = Object.keys(message)[0];
      const response = { $case: key, ...message };
      return { kind: response };
    },
  } as any;
  return _loadSync(filename, options);
};
- Or add a `--ts_proto_opt=oneofsUnions=keepKind` option and, based on this config, generate interfaces like this:
export type GeoUnions =
  | { kind: 'stringValue'; stringValue: string }
  | { kind: 'numberValue'; numberValue: number }
  | { kind: 'boolValue'; boolValue: boolean }
  | { kind: 'dateValue'; dateValue: string };
export interface FilterParam {
  field: string;
  value: GeoUnions | undefined;
}
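One nice property of the proposed keepKind shape is that it's a plain discriminated union, so TypeScript narrows on `kind` directly, without the extra `$case` nesting. A small sketch (the union type is copied from the proposal above; the `describe` helper is illustrative):

```typescript
// Discriminated union as proposed for --ts_proto_opt=oneofsUnions=keepKind.
type GeoUnions =
  | { kind: 'stringValue'; stringValue: string }
  | { kind: 'numberValue'; numberValue: number }
  | { kind: 'boolValue'; boolValue: boolean }
  | { kind: 'dateValue'; dateValue: string };

// Switching on `kind` narrows each branch to its concrete field; no casts needed.
function describe(value: GeoUnions): string {
  switch (value.kind) {
    case 'stringValue':
      return `string: ${value.stringValue}`;
    case 'numberValue':
      return `number: ${value.numberValue}`;
    case 'boolValue':
      return `bool: ${value.boolValue}`;
    case 'dateValue':
      return `date: ${value.dateValue}`;
  }
}
```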
Any comments and thoughts are welcome. Thanks!
Hi @danduh, yeah, the idea with this ticket would be to have ts-proto output this:
wrappers['.gaas.GeoUnions'] = {
  fromObject: () => {
    // not implemented
  },
  toObject: (message, options) => {
    const key = Object.keys(message)[0];
    const response = { $case: key, ...message };
    return { kind: response };
  },
} as any;
Automatically. Although really it might be a lot simpler, like:
wrappers['.gaas.GeoUnions'] = {
  fromObject: (bytes) => GeoUnions.encode(bytes),
  toObject: (message) => GeoUnions.decode(message),
} as any;
And doing that basically for every message in the schema.
I.e., ts-proto's `SomeMessage.encode` and `SomeMessage.decode` already know how to 1) encode (take a ts-proto-specific TS object and turn it into bytes) and 2) decode (take some bytes and turn them into a ts-proto-specific TS object).
So we shouldn't need to re-write any of that; we just need to tell NestJS to use each message's `encode`/`decode` methods.
Disclaimer: my snippet above of `fromObject` --> `encode` and `toObject` --> `decode` is just off the top of my head; I haven't dug into the details of what the actual parameters & return types of `fromObject` and `toObject` are.
Hello guys,
After days of trying to make proto oneOf declarations work in our NestJS project, we finally have something, and we've made it available here: https://github.com/Maze-Capital/protobufjs-nestjs-protoloader
It was a nightmare to debug, to be honest; we were inspired/helped by some comments from this issue.
Feel free to have a look at the repo!
Just wanted to chime in here since I ran into a similar issue. There is (almost) a really easy way to manage this. ts-proto is already doing something similar for common types (like `google.protobuf.Timestamp`). Basically, a wrapper could be generated somewhat similarly to the above:
wrappers['.mytypes.SomeType'] = {
  fromObject: (object) => SomeType.toJSON(object),
  toObject: (message) => SomeType.fromJSON(message),
} as any;
(The naming here is kinda weird because `fromObject` gets called on serialization, etc.)
This pretty much follows everything listed above for a decent solution. It seems like it could be generated fairly easily, especially since it follows similar generated code output already.
I say almost above because there's one drawback I've noticed: `protobufjs` still expects `Timestamp`s to come from the controller as `Date` objects. The above code converts the `Date` object to a `string`, which has the effect of `protobufjs` just inserting a `new Date()` object instead.
My two cents is that the approach @stephenh proposed above seems like the best solution, but what's really needed is a pure transform function (which could be used in the above suggestion in place of `toJSON`/`fromJSON`): basically what `toJSON`/`fromJSON` do, minus any stringification that happens as part of the process. AFAIK the `encode`/`decode` functions won't work in the wrapper, since at this stage `protobufjs` is still expecting a JS object, not serialized binary.
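A sketch of what such a pure transform might look like for one direction of a oneof: flatten ts-proto's `$case` wrapper without serializing the values, so e.g. `Date` instances pass through untouched. (`toObjectTransform`, the `Wrapped` shape, and the field names are all illustrative, not ts-proto APIs.)

```typescript
// Illustrative message shape: one ordinary Date field plus a `$case` oneof.
interface Wrapped {
  createdAt?: Date;
  kind?:
    | { $case: 'stringValue'; stringValue: string }
    | { $case: 'boolValue'; boolValue: boolean };
}

// Hypothetical "pure transform": the same restructuring toJSON does for the
// oneof, but without stringifying values, so Dates stay Dates.
function toObjectTransform(msg: Wrapped): Record<string, unknown> {
  const { kind, ...rest } = msg;
  if (kind === undefined) return { ...rest };
  return { ...rest, [kind.$case]: (kind as any)[kind.$case] };
}

const out = toObjectTransform({
  createdAt: new Date(0),
  kind: { $case: 'boolValue', boolValue: true },
});
// `out.createdAt` is still a Date instance; `out.boolValue` is true.
```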