grpc-web
Reduce compiled JS file size
As it stands, our npm package `index.js` is 218KiB (compiled by the Closure compiler here).

With the Echo example `echo.proto`, which has about 7 message types, 1 service, 3 unary methods, and 5 streaming methods:

- The Closure example's final `compiled.js`, compiled by the Closure compiler: 317KiB
- The CommonJS example's final `dist/main.js`, compiled by webpack: 415KiB
- The same CommonJS example compiled by browserify: 479KiB
- The CommonJS example with `.d.ts` typings generated, then compiled by webpack: 499KiB
I tried babel-minify and uglify-js on some of these JS files and the output was not noticeably smaller.
There should be other things we can do to further reduce the compiled JS code size.
Reference: #79
A bit more info, as it stands:

- The generated protobuf messages `echo_pb.js`, produced by `--js_out`: 35KiB
- The generated gRPC-Web stub `echo_grpc_web_pb.js`, produced by `--grpc-web_out`: 9KiB
A couple more observations:

- Just by looking at `compiled.js` and `dist/main.js` produced by the Closure compiler, webpack, etc., the compiled JS files are already reasonably minimized. Most variables are already renamed, whitespace removed, etc.
- From a cursory look, the largest increase in code size is caused by 1. proto messages pulling in `google-protobuf` runtime classes and 2. the grpc-web service stub pulling in `goog.net.streams` and Closure library code. But again, after those dependencies are pulled in, the resulting code is already reasonably minimized. So how does 35KiB/9KiB get to 317KiB?
For the protobuf messages, could you check the compiled.js with only the protobuf messages?
I.e. it would be useful to find out the respective sizes of `google-protobuf` vs. `goog.net.streams`.
Would be super interesting if you could share any news here. The biggest hurdle to adopt grpc-web for our front end people is the size of the dependencies / generated code.
@codesuki could you share the generated code size vs. the dependency size, respectively, for your front-end project? Optimizing protobuf-JS is not something we could do easily, but the grpc-web stub and runtime may have room for optimization.
Will need to dig into the bundled code but at a glance I believe it's because of the same reason as this: https://github.com/protocolbuffers/protobuf/pull/5509
Currently in `build.js`, it looks like the Closure compiler's advanced optimizations feature is not turned on, so the bundle ends up containing redundant library code because there is no tree shaking. To fix this we will need to:
- Turn on advanced optimizations
- Add `@export` to the public APIs and let the Closure compiler's advanced mode safely perform tree shaking
Will get back on this thread once I've confirmed this. If it works, it should drastically reduce the bundle size.
I think @jjbubudi is right, I'm seeing smaller bundle sizes and we use advanced optimizations for all our code (I don't have exact numbers to share here because there is more compiled into our bundles, but our files are smaller than the sizes mentioned above).
If we go ahead and add `@export` annotations to the public APIs, can we do that in separate files, or at least add an option to turn that on/off? Otherwise, those of us who use it directly from Closure-style code will export these symbols when there is no need to.
IIUC, Closure Compiler will ignore `@export` unless `--generate_exports` is explicitly used when running the compiler, as per https://github.com/google/closure-compiler/wiki/Flags-and-Options. Would that resolve your concern @Yannic? (Please correct me if I am wrong)
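For reference, the flag combination under discussion would look roughly like this (an illustrative invocation only, not the project's actual build command; the jar name and file paths are placeholders):

```shell
# Illustrative only: @export annotations stay inert unless
# --generate_exports is passed along with advanced optimizations.
java -jar closure-compiler.jar \
  --compilation_level ADVANCED_OPTIMIZATIONS \
  --generate_exports \
  --js 'javascript/**.js' \
  --js_output_file index.js
```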
Finally got it working locally and passing all mocha tests. Here is the result so far:
`index.js` is now only ~24KB (uncompressed) after enabling advanced optimizations with proper annotations.
Will send a PR soon for review and further testing if anyone is interested.
Wow, that's great news @jjbubudi!
Unfortunately, `--generate_exports` won't help in my case. We use `@export` in our library to export public parts of the API.
However, `@export` is translated to `goog.export{Symbol,Property}`. Manually writing these and guarding them with a compile-time constant would be an option. It's a tiny bit more work, but doesn't increase the compiled size because of dead code elimination. I think it's worth it if we can avoid exporting unnecessary symbols.
```javascript
/** @const {boolean} */
const EXPORT_GRPC_WEB_SYMBOLS = true;

function foo() {}

if (EXPORT_GRPC_WEB_SYMBOLS) {
  goog.exportSymbol('foo', foo);
}
```
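A related Closure idiom would be to make the guard a `goog.define` flag, so it can be flipped from the compiler command line instead of by editing source. This is a sketch assuming the Closure library is on the compilation path; the flag name `grpc.web.EXPORT_SYMBOLS` is invented for illustration:

```javascript
/** @define {boolean} */
const EXPORT_GRPC_WEB_SYMBOLS = goog.define('grpc.web.EXPORT_SYMBOLS', true);

function foo() {}

// With the flag defined to false at compile time, ADVANCED mode
// eliminates this whole block as dead code.
if (EXPORT_GRPC_WEB_SYMBOLS) {
  goog.exportSymbol('foo', foo);
}
```

Compiling with `--define='grpc.web.EXPORT_SYMBOLS=false'` would then strip the exports for Closure-style consumers who don't need them.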
Let me know when you submit your PR.
Ah, ok. I've submitted PR #422. Looks like it needs to be further updated to avoid affecting existing Closure users.
@Yannic Actually, looking deeper at the optimized code, Closure Compiler has stripped away all the `goog.exportX` calls, leaving merely one little line of prototype alias code. I'd think that when using `@export` the increase in code size should be negligible, and we gain a lot in maintainability.
Here is the pretty printed version of the new bundle FYI: https://gist.github.com/jjbubudi/86549bd597ffc4d74b30990e7a385f44#file-grpc-web-js-L1287
Did @jjbubudi's fix ever end up being available? I've encountered the same issue — with a trivial hello world proto, the smallest my bundle got was ~180KB with webpack, of which only 8KB is actual application code. The rest appears to be grpc-web. This is a problem because our web app targets mobile devices; we simply can't add 180KB to our payload when our landing page in total is smaller than that.
I've also tried using Closure Compiler via webpack; however, this predictably fails at `google-protobuf` on the advanced mode required to get the size reduction.
Thanks for the hard work — it otherwise works very well. I would just love to be able to strip away the google-protobuf's unused parts somehow, preferably with Webpack.
https://github.com/grpc/grpc-web/pull/469#issue-254871907
@nehbit It went into 1.0.4, see https://bundlephobia.com/[email protected]
I built six single-field protos and my bundle.js grew by 332 KiB. This is unacceptable. So far, it seems to me that the only solution offered is to learn a new language (Closure) and translate our code to it. Is this correct?
Now I will try Protobuf.js.
I believe the references are to the Closure compiler
I just built a simple "hello world" test, and ended up with about 250k of minified Javascript. The comments above suggest that this issue should be resolved. Am I doing something wrong or is there still a problem somewhere?
By the way, thank you for all your work on this. I believe grpc-web will be the future of client-server communications on the web, even though there might be a few blocks at the moment.
@PeteX thanks for the update and your kind comment! :)
Not having much context on this issue myself, I just tried to build the helloworld example and I actually got a 292KB main.js file...
I did a quick analysis using webpack-bundle-analyzer and here are the results:
It looks like:

- The great majority (230KB, or 78%) of the size came from google-protobuf.js.
- buffer also seems to account for 47KB, but I'm not sure where it's pulled in from :)
- grpc-web itself is only <34KB in size (which makes sense since npm says its size is 44KB unpacked)

I'm not sure if this is expected or not since I don't have much context on the history, but just sharing my results :)
(Although from a quick look, it seems not necessarily easy to trim that size from the google-protobuf.js lib?)
P.S. The commands I used for webpack-bundle-analyzer:

```shell
# Under net/grpc/gateway/examples/helloworld/
npm install webpack-bundle-analyzer
webpack client.js --profile --json > stats.json
npx webpack-bundle-analyzer stats.json
```
Thanks, I think I understand the situation now.
In 2022, `google-protobuf` has a gzipped size of about 45KB... (230KB minified)
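For anyone reproducing these numbers, raw and gzipped sizes are easy to compare from the shell. The `bundle.js` below is a synthetic stand-in so the commands are self-contained; point `wc` and `gzip` at your real compiled output (e.g. `dist/main.js`) instead:

```shell
# Create a throwaway, highly repetitive "bundle"; substitute your
# real webpack/Closure output in practice.
yes 'console.log("hello");' | head -n 1000 > bundle.js

wc -c < bundle.js           # raw (minified) size in bytes
gzip -c bundle.js | wc -c   # approximate size over the wire with gzip
```

Gzip compresses minified JS well, which is how a 230KB minified runtime can end up around 45KB on the wire.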
Why was this issue closed? As it stands now this library is completely unusable for a serious, production web project due to the unreasonably large dependency size. I realize that this is not the fault of the developers of grpc-web. Having this issue open would at least let developers know to steer clear.
@egor-yolabs Hi Egor thanks for the input :)
I guess I closed it because the majority of the code size is due to the use of protobuf JS.
There isn't much we could do on our side since it's a required dependency.
If you'd like, maybe you could consider filing a bug against https://github.com/protocolbuffers/protobuf-javascript based on the comments in https://github.com/grpc/grpc-web/issues/270#issuecomment-996512398 and https://github.com/grpc/grpc-web/issues/270#issuecomment-1198428226?
Thanks! :)
I think we can keep it open for now, but we need to decide on a "reasonable" size so we know what's the target :) Input is welcome :)
I see that @sayjeyhi recently logged https://github.com/protocolbuffers/protobuf-javascript/issues/124 with them which is related to this issue.
What seems very odd to me is that the entire google-protobuf is being pulled in even if only a tiny part of the library is actually used, e.g. `goog.exportSymbol()` and `goog.object.extend()`. I wonder if tree shaking is not working properly for google-protobuf with webpack (which I assume we're all using here) because that lib's package.json doesn't list `"sideEffects": false`.
> What seems very odd to me is that the entire google-protobuf is being pulled in even if only a tiny part of the library was actually used.

I'm curious what's the reason you think the dependency we have pulled in is a tiny part of the library, and should not in turn pull in the majority of the protobuf library?
Also, if you are aware of any kind of static analysis that would show the percentage of the library we should in fact pull in, it would be helpful :)
> What seems very odd to me is that the entire google-protobuf is being pulled in even if only a tiny part of the library was actually used.
>
> I'm curious what's the reason you think the dependency we have pulled in is a tiny part of the library, and should not in turn pull in the majority of the protobuf library?
>
> If you are aware of any kind of static analysis that would show the percentage of the library we should in fact pull in, it would be helpful :)
You may not have understood what I meant. I see you ran webpack-bundle-analyzer above, which is what I've been running too on my project. In my project I have a very simple case which imports enums generated from some very simple protos (no service definitions). The generated code contains only two references to `google-protobuf`, namely `goog.exportSymbol()` and `goog.object.extend()`, as mentioned above. In theory, webpack should tree-shake away the unused parts of `google-protobuf`, as there's no way those two functions require the vast functionality provided by that package. Yet the entirety of `google-protobuf` is pulled into the bundle, which is what we're all experiencing here. This tells me that tree shaking is somehow broken here.

The typical culprit is forgetting to add `"sideEffects": false` to the package.json, which is a signal to webpack that the package doesn't do things like dynamically importing with variables, which make it difficult to figure out precisely which parts may be tree-shaken out. It doesn't appear that using that package.json property does anything here, though, so I'm not sure exactly why tree shaking is not working.
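For reference, the webpack hint under discussion is a single `package.json` field (shown here as a sketch; the published `google-protobuf` package does not actually set it):

```json
{
  "name": "google-protobuf",
  "sideEffects": false
}
```

One caveat worth noting: webpack's export-level tree shaking only operates on ES modules, and `google-protobuf` is published as CommonJS, so webpack cannot statically prove which exports are unused regardless of the `sideEffects` hint.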
Regardless of that, it looks like someone finally figured out how to do grpc in the browser correctly: https://github.com/bufbuild/connect-web I'm only waiting on them to add node.js support after which this grpc-web repo may as well be deprecated as far as I am concerned. I see that they wrote their own protoc plugin, which I'm guessing is the key to this issue. I have not tested any of it yet, but it sounds very promising.
> The generated code contains only two references to google-protobuf which are goog.exportSymbol() and goog.object.extend() as mentioned above. In theory here, webpack should tree shake away the unused parts of google-protobuf as there's no way that those two functions require the vast functionality that is provided by that package.
Yeah this is my question here. Would love to see an analysis of the dependency here to see if the above is indeed true.
> The typical culprit is forgetting to add sideEffects: false to the package.json ... (edited — please use respectful language in this forum, thanks :))
The grpc-web package is built using Closure compiler, and the parameters are specified here, where dependency pruning is already enabled.
If anyone is able to experiment with the flags and can showcase a code size reduction, it would be much appreciated.
> Regardless of that, it looks like someone finally figured out how to do grpc in the browser correctly: https://github.com/bufbuild/connect-web I'm only waiting on them to add node.js support after which this grpc-web repo may as well be deprecated as far as I am concerned. I see that they wrote their own protoc plugin, which I'm guessing is the key to this issue. I have not tested any of it yet, but it sounds very promising.
I'm glad you found a solution that works for you :)
This repo indeed may not have all the latest features, but it is the official solution used internally and will stay up-to-date as the grpc-web protocol evolves.
I know very little about JS bundling and tree-shaking, but it seems that some of the foundational technology choices here and in Google's JS protobuf runtime make controlling bundle size difficult. It's understandably hard to revisit those decisions without breaking existing users. This is a great place for third-party implementations to experiment and find a better path forward.
A good first step toward encouraging a robust gRPC-Web ecosystem would be to formalize the protocol specification: https://github.com/grpc/grpc/issues/30565
(I realize that this is a bit of a tangent, but I thought folks interested in this issue may also want to weigh in on the protocol formalization.)
I've opened a PR (https://github.com/protocolbuffers/protobuf-javascript/pull/128) on the `google-protobuf` repo to fix and remove lots of unused code there; maybe we can have it after a new release.
![Screen Shot 1401-07-16 at 12 01 21](https://user-images.githubusercontent.com/6254009/194696972-17e1870a-c388-4efc-9539-b9a44419987e.png)
Good news, guys!
The changes for reducing the bundle size of `google-protobuf` have been merged, and the new version is out there 😍
I am not sure whether we need to update this package to use the latest version of `google-protobuf` or not.
Edit: I've opened this PR: https://github.com/grpc/grpc-web/pull/1291 to use the new version and fix this huge bundle size issue.
Update: Due to an issue in CommonJs, the code size optimization was rolled back in https://github.com/protocolbuffers/protobuf-javascript/pull/143.
Good news is that the team is actively working on re-enabling that optimization (comment) 😃
@sayjeyhi FYI :)