algoliasearch-client-javascript
feat: add gzip as content-encoding
The engine is introducing a new feature that optionally compresses request payloads (i.e. POST/PUT bodies, over HTTP or HTTPS).
To test it: perform a saveObject with the body gzip-encoded, and don't forget to add "Content-Encoding: gzip" as a header.
We will discuss a specification for a more detailed implementation, like adding an enableGZIP flag to the client's configuration.
The flag should be contentEncoding: 'gzip' IMO :)
For Node: this is doable and could definitely be beneficial, especially for big batches.
For browsers: is there a native way to encode gzip in browsers?
@Haroenv the header is the HTTP one, not an option flag. Actually, there should be no option at all; just activate it when doable.
I think so as well: preferably no flag at all, but if one is necessary it shouldn't be specific to gzip :)
I've looked around and couldn't find a good way to send gzip from the browser, so I think it should just be disabled there.
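Auto-detection could look like this rough sketch (assuming a simple Node-vs-browser split; the helper name is made up):

```javascript
// Hypothetical capability check: enable gzip only where it's doable.
function canGzipRequests() {
  if (typeof window !== 'undefined') {
    return false; // browsers: no reliable native gzip encoder found
  }
  try {
    require('zlib'); // Node: zlib is built in
    return true;
  } catch (e) {
    return false;
  }
}

console.log(canGzipRequests()); // true in Node, false in a browser
```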
Referencing our main task to keep track of this issue's status: https://github.com/algolia/algoliasearch-client-specs-internal/issues/30
edit: dead link updated (still private though).
This feature is really important as soon as anybody expects to use Atomic Reindexing - https://www.algolia.com/doc/api-reference/api-methods/replace-all-objects/.
For example, we would need to upload about 100 MB of JSON; following this rule of thumb - https://gist.github.com/jordansissel/3044155 - we would only need to upload between 20 and 30 MB of gzipped content. That's night and day.
This issue will manifest more and more as people approach the 1 GB limit on total request size. https://www.algolia.com/doc/api-reference/api-methods/add-objects/#about-this-method
BTW the https://github.com/algolia/algoliasearch-client-specs/issues/30 link is dead or unreachable.