protobuf.js
Optional/undefined/null props in generated TS interfaces
protobuf.js version: 6.8.8
My use case for the generated TypeScript types, specifically the interfaces, is to expose a less verbose API to a client that is consuming, under the hood, protobuf data. My TypeScript API would have methods that return those interface types.
When I generate classes and TypeScript types from my proto files, a generated class may look like:
export class CafeRegistration implements ICafeRegistration {
    public address: string;
}
while the generated interface looks like:
export interface ICafeRegistration {
    /** CafeRegistration address */
    address?: (string|null);
}
Why does the interface use optional properties for class properties that are not actually optional? Those optional values seem unnecessary, at least for my use case, and make the API much harder to consume.
I would be interested in writing a patch for this, as it's causing problems for me too. Admins, would a PR for this be considered assuming it's high quality? Or is there a reason that pbts generates null | undefined for protobuf messages that are not optional?
facing the same problem! the optional syntax is redundant with the null type.
In proto3, all fields are "optional", see https://github.com/protocolbuffers/protobuf/issues/2497#issuecomment-267422550
So the optional property is required for the pbts-generated model to match the proto3 definition. This, however, is not ideal, as you have found (I have the same dilemma). The proper solution is to convert a pbts object instance to your own TS model and handle the ambiguities with optional/null properties as you see fit.
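For what it's worth, a minimal sketch of that mapping could look like the following, assuming the generated ICafeRegistration interface from above; the model type and helper names are just illustrative:

interface CafeRegistrationModel {
    address: string;
}

function toModel(msg: ICafeRegistration): CafeRegistrationModel {
    return {
        // collapse the optional/null ambiguity to the proto3 default in one place
        address: msg.address != null ? msg.address : "",
    };
}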
isn't optional/required a separate issue?
let's say you have these proto3 messages:
message Foo {
    int32 a = 1;
}

message Bar {
    Foo foo = 1;
}
i would expect the TS interfaces for these messages to be:
export interface IFoo {
    a: number;
}

export interface IBar {
    foo: (Foo|null);
}
which is proto3's philosophy: Foo.a being unset is the same as Foo.a being set to zero (the default scalar value for int32). But embedded messages can be unset.
this mirrors the APIs for go, python: https://developers.google.com/protocol-buffers/docs/reference/go-generated#singular-scalar-proto3 https://developers.google.com/protocol-buffers/docs/reference/python-generated#message
which is proto3's philosophy: Foo.a being unset is the same as Foo.a being set to zero (the default scalar value for int32)
I don't think so. My link states "The whole idea of using protobuf is that it allows you to add/remove fields from your protocol definition while still being fully forward/backward compatible with newer/older binaries." Your example seems to only consider the proto definition at the time of creation, and to derive type definitions that suit that proto. However, protobufs are designed to be upgraded, as per the quote. So the absence of a field in a message doesn't necessarily mean null; it can also mean the sending/receiving system doesn't know of that field.
But proto files are quite handy for defining APIs, and this compatibility may not be required/desired by certain people (i.e., people in this thread). So I think it would be a nice feature to have either a command line parameter, or a special flag in the proto files (similar to what gogoproto does for Go generated files), that would alter the output of this package.
Based on the current proto3 language guide: each singular element has a default value. "When a message is parsed, if the encoded message does not contain a particular singular element, the corresponding field in the parsed object is set to the default value for that field." -- source
From my understanding each field value should be defined by a default value, so no undefined or null types.
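One way to see this behaviour is a small sketch using the protobuf.js reflection API (rather than the generated static code; the Foo message is the one from the earlier example):

import * as protobuf from "protobufjs";

const root = protobuf.parse('syntax = "proto3"; message Foo { int32 a = 1; }').root;
const Foo = root.lookupType("Foo");

// Decode an encoded message that does not contain field "a" at all.
const decoded = Foo.decode(new Uint8Array(0)) as unknown as { a: number };

console.log(decoded.a); // 0 -- the proto3 default for int32, not undefined or null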
pbjs respects the required directive in field declarations.
syntax = "proto3";
package test;

message SomeMessage {
    string one = 1;
    required string two = 2;
}
compiles to:
import * as $protobuf from "protobufjs";

/** Namespace test. */
export namespace test {

    /** Properties of a SomeMessage. */
    interface ISomeMessage {

        /** SomeMessage one */
        one?: (string|null);

        /** SomeMessage two */
        two: string;

        ...
    }
    ...
}
@civilizeddev required fields are not a thing anymore in Proto3:
Required fields are no longer available.
@civilizeddev
required fields are not a thing anymore in Proto3. Required fields are no longer available.
I see. This issue should be fixed properly then.
Pinging @nicolasnoble @dcodeIO @alexander-fenster
What do you expect me to do? Close this as fixed?
Wondering what would be the best approach to fix this. I don't see any real conclusions in #844.
This behaviour has been troublesome to work with, and there were no answers to @fractalmarc's questions last year:
I would be interested in writing a patch for this, as it's causing problems for me too. Admins, would a PR for this be considered assuming it's high quality? Or is there a reason that pbts generates null | undefined for protobuf messages that are not optional?
Is the current result wanted, or is it a bug that someone could grab and try to fix? (If it is indeed a bug, maybe a "bug" or "help wanted" label would be appropriate on this issue so people know they can grab it.)
My bad if you're not the correct person to contact; I pinged you because you appear as one of the 2 members of the protobuf.js team on GitHub.
IIRC it is required because one can create a CafeRegistration from a partial ICafeRegistration, relying on omitted required fields being filled in with default values, with the expectation being that any place requiring the full rules should be typed as CafeRegistration. From the issue it seems that we are missing a way to express this better, but given that required is also deprecated, it might make sense to weigh the expected benefit against the additional complexity a fix would introduce, I guess.
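If I understand that rationale correctly, the intended split is roughly the following (a sketch reusing the CafeRegistration example from above; CafeRegistration.create is the factory the generated static code exposes):

// A partial plain object is accepted; omitted fields fall back to the defaults
// defined on the class prototype.
const reg = CafeRegistration.create({});   // an ICafeRegistration with everything omitted
console.log(reg.address);                  // "" -- the default is present on the instance

// Code that needs the fully-populated shape is meant to be typed against the class:
function register(registration: CafeRegistration): void {
    // registration.address is a plain string here, with no null/undefined in the type
}
register(reg);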
Would love an update on this. It's not easy in TS to validate every field in a proto interface for every event handler, and it seems redundant since a field will have a default value if not passed. Validating whether a field's value is the "default" value is one thing, but having to add a type guard for every individual field is a burden.
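To illustrate the guard noise, a handler over what pbts currently generates for the earlier Foo message ends up looking roughly like this (handler and callee names are hypothetical):

// Roughly what pbts emits today for Foo:
interface IFoo {
    a?: (number|null);
}

declare function doSomethingWith(n: number): void;

function handleFoo(event: IFoo): void {
    // a is number | null | undefined, so every read needs its own guard,
    // even though proto3 would simply give us 0 when the field is unset.
    if (event.a == null) {
        return;
    }
    doSomethingWith(event.a);
}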
It appears object fields even on the concrete classes are Class|null|undefined. Could we at least choose one to represent not-set?
w.r.t. the partial objects for construction... could TypeScript's Partial be helpful here?
constructor(properties?: Partial<my.namespace.IProbeValue>);
so that fields, when reading, would be nullable only for non-scalars and never undefined, but everything would be |undefined when using the construction methods? That would of course be backwards incompatible - potentially another cmdline flag?
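A rough sketch of what that split could look like (illustrative types only, not what pbts emits today):

declare class Nested { value: number; }

// Read shape: scalars are plain, only embedded messages are nullable.
interface ProbeValueRead {
    scalar: number;
    nested: Nested | null;
}

declare class ProbeValue implements ProbeValueRead {
    scalar: number;
    nested: Nested | null;
    // Construction shape: everything may be omitted and falls back to defaults.
    constructor(properties?: Partial<ProbeValueRead>);
}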
It looks to me like the object fields can never be undefined, as the prototype has = null set for each of those fields. It can only be undefined if undefined is passed as a set field value in the constructor - is there a reason for that?
This is generated now:
function ProbeValue(properties) {
    if (properties)
        for (let keys = Object.keys(properties), i = 0; i < keys.length; ++i)
            if (properties[keys[i]] != null)
                this[keys[i]] = properties[keys[i]];
}
But I feel like there needs to be a check for undefined - then we can remove a significant amount of unions from the generated types...
function ProbeValue(properties) {
    if (properties)
        for (let keys = Object.keys(properties), i = 0; i < keys.length; ++i)
            if (properties[keys[i]] != null && properties[keys[i]] !== undefined)
                this[keys[i]] = properties[keys[i]];
}
FYI you can reduce the type guard noise by using --force-message with pbjs - at least then nested Message types are of the concrete class and not the interfaces (scalar fields are not-nullable).
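If it helps, a typical invocation with that flag might look something like this (file names and module format are just placeholders):

npx pbjs -t static-module -w commonjs --force-message -o compiled.js my.proto
npx pbts -o compiled.d.ts compiled.js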
Any update or workarounds for this?