Bytes decoded as Buffer even with env=browser
Search Terms
uint8array buffer env browser node nodejs
Expected Behavior
When I generate bindings with `--ts_proto_opt=env=browser`, I would expect bytes to be decoded as `Uint8Array`, even when running the code under Node.js. (I'm aware that `Buffer` is a subclass of `Uint8Array`, but I'd still like to get the same types in the browser and under Node.js, unless I use `env=both`. Or is this intended behavior?)
Actual Behavior
Bytes are decoded as `Buffer` instances.
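A minimal sketch of what I observe, assuming the generated `SimpleMessage` type from the repro below and a hypothetical bytes field named `payload`:

```ts
import { SimpleMessage } from "./messages";

// Round-trip a message with a bytes field (the field name `payload` is hypothetical).
const encoded = SimpleMessage.encode({ payload: new Uint8Array([1, 2, 3]) }).finish();
const decoded = SimpleMessage.decode(encoded);

// The declared TypeScript type is Uint8Array, but under Node.js the
// runtime instance is a Buffer (a subclass of Uint8Array).
console.log(decoded.payload instanceof Uint8Array); // true
console.log(decoded.payload.constructor.name);      // "Buffer" under Node.js
```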
Steps to reproduce the problem
See https://github.com/threema-danilo/ts-proto-buffer-repro:
- Generate script: https://github.com/threema-danilo/ts-proto-buffer-repro/blob/c6932cafa082beb1d2dda3eed0156f0b896c8b5f/generate-protobuf.sh#L10
- Generated interface field has type `Uint8Array`: https://github.com/threema-danilo/ts-proto-buffer-repro/blob/c6932cafa082beb1d2dda3eed0156f0b896c8b5f/messages.ts#L8
- However, when running `npm install && npm test`, the field is decoded as a `Buffer` instance.
Specifications
- ts-node v10.4.0
- node v16.8.0
- compiler v4.5.2
- Operating system and version: Arch Linux
Do I understand correctly that the types are produced the way you expect, and it's only that during decoding the instance is of the subclass `Buffer` rather than `Uint8Array`?
If so, what is the benefit of working with the parent class, rather than a substitutable child class? I think the current implementation is just a simple way of serving both environments, but if it causes issues it might warrant having a look.
Yes, that is correct. As long as the generated `Uint8Array` type is used everywhere, it should be fine. Otherwise, it's possible that code being tested under Node.js could start to use `Buffer`-specific APIs and not notice that these won't work in the web environment.
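A brief sketch of that risk, with a hypothetical bytes field named `payload`: under Node.js the decoded value really is a `Buffer`, so `Buffer`-only APIs appear to work in tests, while in the browser the same call silently misbehaves.

```ts
import { SimpleMessage } from "./messages";

declare const encoded: Uint8Array; // some wire-format bytes

const msg = SimpleMessage.decode(encoded);

// Under Node.js this produces a hex string, because the runtime instance is a Buffer...
const hex = (msg.payload as Buffer).toString("hex");

// ...but in the browser `payload` is a plain Uint8Array, whose toString() ignores
// the encoding argument and returns a comma-separated list instead, so tests run
// under Node.js never catch the problem.
```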
Note: It's perfectly fine if you decide that this is out of scope. I was just a bit confused initially because Node.js-specific types were used even though the env was explicitly set to `browser`. I assumed this was an oversight. Plus, I encountered some errors when running ts-proto based tests under Node, which I initially attributed to `Buffer` being used, but I was wrong about that.
> what is the benefit of working with the parent class, rather than a substitutable child class?
Having `Buffer`s in JS code compiled for the browser is an issue in multiple ways. The first is that testing frameworks like Jasmine can be picky about the prototype: a `Buffer` and a `Uint8Array` with the same content are not considered equal. The next big pain is bundling: when JS and CommonJS modules succeeded on the server and brought along the Node.js-specific `Buffer` type, it was made available to web browsers via alternative `Buffer` implementations. Webpack 4 gives you a global `Buffer` replacement by default; Webpack 5 does not, as there is no `Buffer` in the JS spec. It now requires much more discipline, either importing the `Buffer` type explicitly in every module or turning on a global `Buffer` implementation like in Webpack 4.
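A minimal sketch of that equality pitfall, using Node's `assert` module instead of Jasmine (my substitution, to keep the example self-contained):

```ts
import { deepStrictEqual } from "node:assert";

// Identical bytes, different prototypes.
const asBuffer = Buffer.from([1, 2, 3]);
const asUint8Array = new Uint8Array([1, 2, 3]);

deepStrictEqual(asUint8Array, new Uint8Array([1, 2, 3])); // passes
deepStrictEqual(asBuffer, asUint8Array);                  // throws: prototypes differ
```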
> Bytes are decoded as `Buffer` instances.
I encountered this problem a lot when I used protobuf.js. Looking at the generated implementation:
decode(input: Reader | Uint8Array, length?: number): SimpleMessage {
const reader = input instanceof Reader ? input : new Reader(input);
let end = length === undefined ? reader.len : reader.pos + length;
const message = createBaseSimpleMessage();
while (reader.pos < end) {
// …
it turns out that the `Reader` class from protobuf.js is used, which is most likely the reason.
In such cases I used to work around this problem by wrapping the bytes in `Uint8Array.from(bufferOrUint8ArraySource)` to get a consistent runtime type. However, it would be nice to avoid the use of `Buffer` entirely in the implementation.
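A sketch of that workaround at a call site (again with the hypothetical `payload` field):

```ts
import { SimpleMessage } from "./messages";

declare const encoded: Uint8Array;

const msg = SimpleMessage.decode(encoded);

// Copy the bytes into a plain Uint8Array so the runtime type is the same
// under Node.js and in the browser, at the cost of an extra copy.
const payload = Uint8Array.from(msg.payload);
console.log(payload.constructor === Uint8Array); // true in both environments
```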
In order to really address this problem, we probably want to implement the Reader/Writer ourselves instead of using them from `protobufjs/minimal`. We don't want to embed them in every file, so a different package would be required (ts-proto-runtime or something like that). This would bring a few more advantages as well.
By the way, installing the `buffer` package and making it available through `window.Buffer = Buffer` kinda fixes the issue. I consider this a dirty hack, though.
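For completeness, a sketch of that hack in a browser entry point, assuming the `buffer` npm package is installed:

```ts
// Entry point of the browser bundle: expose the `buffer` package's Buffer globally
// so that code expecting a Node.js-style global Buffer keeps working.
// This is the "dirty hack" mentioned above, not a recommended fix.
import { Buffer } from "buffer";

(window as unknown as { Buffer: typeof Buffer }).Buffer = Buffer;
```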