Upload and download progress events.

inian opened this issue 3 years ago

Feature request

How do we go about emitting these events from the server? Does S3 support this natively? What about other storage backends like Backblaze? I do not want to add S3-only features to our server if possible.

inian avatar May 24 '21 04:05 inian

The @aws-sdk/lib-storage library exposes an .on('httpUploadProgress', (progress) => {}) method on the Upload instance. This progress object contains the loaded and total bytes of the request.
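
For illustration, here's a minimal sketch of that lib-storage API (not storage-api code); the region, bucket, key, and file path are placeholders:

import { createReadStream } from "node:fs";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const upload = new Upload({
  client: new S3Client({ region: "us-east-1" }),
  params: {
    Bucket: "my-bucket",
    Key: "videos/demo.mp4",
    Body: createReadStream("./demo.mp4"),
  },
});

// Fires as parts of the multipart upload complete.
upload.on("httpUploadProgress", ({ loaded, total }) => {
  console.log(`Uploaded ${loaded} of ${total ?? "?"} bytes`);
});

await upload.done();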

I'm not quite sure on Backblaze's capabilities to expose the upload progress. But from their docs they claim to have an S3-compatible API. I also found a B2 library which succeeded in adding an upload progress tracker for Backblaze, so it seems doable.

mgm1313 avatar May 24 '21 08:05 mgm1313

Having upload progress would be helpful! I've looked into this a little bit and wanted to document my findings and thought process to hopefully advance this idea further.

The httpUploadProgress event in the AWS library is a good find! It seems like the difficulty would be passing those events back to the client somehow. Given that the storage-api API is a single POST request to /storage/v1/object/{bucket}/{path} (for uploading), it seems like the only way for the server to provide progress feedback to the client would be out-of-band, for example via a separate client connection to storage-api (could be polling or long-polling or websockets) or via a separate client connection to the database (could be realtime notifications of changes to a storage.object_progress table that storage-api updates). Those options add considerable server-side complexity (sharing upload-progress state between multiple storage-api connections or adding a noisy database table to act as pubsub) and client-side complexity (requiring multiple connections and maybe realtime-js).

The above ideas don't seem great to me, so exploring more widely brings us to the idea of changing the server-side API or, more likely, adding a separate API. Options that come to mind:

  • Upload files over a duplex connection like websockets, so the server can send messages to the client about progress uploading to the storage provider. This seems doable, but unusual, so there might be surprise obstacles.
  • Mimic the underlying AWS library, which bases its httpUploadProgress event on how many parts of a multipart upload are complete (sidenote: it does not actually emit a progress event for PutObject other than when it's complete, i.e. at 100% progress!). This approach would involve the client JS library splitting the file into chunks and making separate requests to upload each chunk. The server side could handle making CreateMultipartUpload and CompleteMultipartUpload requests and would attach auth and forward the chunk upload requests on to the storage provider (e.g. S3). A rough sketch of the multipart calls involved follows this list.
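
For concreteness, here's a rough server-side sketch of those multipart calls using @aws-sdk/client-s3 (this is not storage-api code; the bucket, key, and the single-part flow are placeholders for illustration):

import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

// 1. Start the multipart upload when the client begins.
const { UploadId } = await s3.send(
  new CreateMultipartUploadCommand({ Bucket: "my-bucket", Key: "big-file.bin" })
);

// 2. For each chunk the client sends, forward it as a part; the number of
//    completed parts is what would drive the progress reporting.
const { ETag } = await s3.send(
  new UploadPartCommand({
    Bucket: "my-bucket",
    Key: "big-file.bin",
    UploadId,
    PartNumber: 1,
    Body: chunk, // the chunk received from the client
  })
);

// 3. Once every part is in, complete the upload.
await s3.send(
  new CompleteMultipartUploadCommand({
    Bucket: "my-bucket",
    Key: "big-file.bin",
    UploadId,
    MultipartUpload: { Parts: [{ ETag, PartNumber: 1 }] },
  })
);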

Yuck, these ideas don't seem great either. Stepping back, I wonder if it makes sense to report progress from the server-side at all. If progress is measured client-side, we side-step the complexity of communicating progress from server to client, and we avoid adding storage-provider-specific functionality to storage-api. Assuming storage-api has relatively short timeouts and doesn't silence errors, then measuring progress client-side should be safe and reasonably accurate.

Measuring progress on the client doesn't quite fit in a storage-api issue, but I'll push on a bit further -- how could this work? There's a recent article about fetch accepting a stream body (https://web.dev/fetch-upload-streaming/), which should let us measure how many bytes fetch has read. Unfortunately, it seems no browsers support this feature yet; it only works in Chrome after enabling chrome://flags/#enable-experimental-web-platform-features. Also, this feature doesn't seem to have much momentum as far as I can tell -- related issues have been open for years (#1, #2), so this functionality might not land in browsers anytime soon.
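
To make the streaming idea concrete, here's a rough sketch of counting sent bytes with a TransformStream wrapped around the file's stream. It is experimental; url and headers are placeholders for the storage-api endpoint and auth headers, and newer Chromium versions additionally require the duplex option:

let sent = 0;
const progress = new TransformStream({
  transform(chunk, controller) {
    sent += chunk.byteLength;
    console.log(`Sent ${sent} of ${file.size} bytes`);
    controller.enqueue(chunk);
  },
});

await fetch(url, {
  method: "POST",
  headers, // auth headers for storage-api
  body: file.stream().pipeThrough(progress),
  duplex: "half", // required by newer Chromium versions for streaming bodies
});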

The alternative is XMLHttpRequest and the progress events that it emits. I'm not too familiar with node and isomorphic-javascript type stuff, so I'm wondering: is it feasible to have an option or separate function in supabase/storage-js for upload/download with XMLHttpRequest instead of fetch? (Should I open an issue about this on that repo?)

Thoughts? Hopefully I'm missing an easy, obvious answer that y'all can point out to me! If not, then hopefully the above helps clarify this issue and save time for others interested in this feature.

grschafer avatar Nov 18 '21 01:11 grschafer

In case it's helpful for others, here's an approach for getting upload progress with XMLHttpRequest in the browser.

In place of:

await supabase.storage.from(bucket).upload(path, data);

you could instead do the below:

// TypeScript will give errors for accessing protected members of supabase
const url = `${supabase.supabaseUrl}/storage/v1/object/${bucket}/${path}`;
const headers = supabase._getAuthHeaders();

const req = new XMLHttpRequest();
req.upload.onprogress = updateProgress;
req.upload.onload = transferComplete;
// You might want to also listen to onabort, onerror, ontimeout
req.open("POST", url);
for (const [key, value] of Object.entries(headers)) {
  req.setRequestHeader(key, value);
}
req.send(data);

function updateProgress(e) {
  const pct = (e.loaded / e.total) * 100;
  console.log(`Upload progress = ${e.loaded} / ${e.total} = ${pct}`);
}

function transferComplete(e) {
  console.log("The transfer is complete.");
}

Some of the code above is directly from the MDN docs (https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Using_XMLHttpRequest#monitoring_progress).

Note that events for listening to uploads are on XMLHttpRequest.upload and events for listening to downloads are on XMLHttpRequest.
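
For example, download progress for a GET to the same object URL could be tracked like this (a sketch reusing the url and headers from above, assuming the GET route serves the object):

const dl = new XMLHttpRequest();
dl.responseType = "blob";
dl.onprogress = (e) => {
  if (e.lengthComputable) {
    console.log(`Download progress = ${(e.loaded / e.total) * 100}%`);
  }
};
dl.onload = () => console.log("Download complete", dl.response);
dl.open("GET", url);
for (const [key, value] of Object.entries(headers)) {
  dl.setRequestHeader(key, value);
}
dl.send();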

grschafer avatar Nov 18 '21 21:11 grschafer

Since S3 provides signed urls for upload: https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html

Is it possible for storage-api to call this and give us the URL? We could then upload to this URL directly and would probably be able to track progress using e.g. axios on the web client.
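
Something like the following sketch, assuming a hypothetical server endpoint that returns a presigned PUT URL (storage-api does not expose one; the /api/presign path is made up for illustration, and bucket, path, and file are placeholders):

import axios from "axios";

// Hypothetical endpoint on your own server that returns a presigned PUT URL.
const { data } = await axios.post("/api/presign", { bucket, path });

await axios.put(data.signedUrl, file, {
  headers: { "Content-Type": file.type },
  onUploadProgress: (evt) => {
    const pct = Math.round((evt.loaded / (evt.total ?? file.size)) * 100);
    console.log(`Upload progress: ${pct}%`);
  },
});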

kk21 avatar Nov 27 '21 05:11 kk21

Has any progress been made on this issue yet? Not workarounds, but official support? @kiwicopple @kangmingtay @awalias

He1nr1chK avatar May 29 '22 09:05 He1nr1chK

Any progress on this? We need this in the Flutter Supabase package too.

iampopal avatar Jun 21 '22 09:06 iampopal

Progress events are much needed when uploading large files.

Nisthar avatar Oct 05 '22 10:10 Nisthar

We need this in the Flutter package!!!

On Wed, Oct 5, 2022, 6:18 PM Nisthar @.***> wrote:

idk why but my upload using this code is very slow.

iampopal avatar Oct 10 '22 18:10 iampopal

Yeah, I can never upload files of up to 90 MB, please fix it.

DeabitTech avatar Oct 21 '22 15:10 DeabitTech

In case someone is interested, I wrote this function to download from Supabase storage and track download progress:

https://gist.github.com/isaiasmatewos/c15c4d75ce501437bd2be6eea6d0acb9

isaiasmatewos avatar Nov 02 '22 18:11 isaiasmatewos

Any news on this?

An absolute must-have for cases where users must upload large files. We need a way to give them feedback on their upload progress. Thanks!

Devosaure avatar Jan 22 '23 14:01 Devosaure

Can you please give us an example of showing upload progress to the user based on the uploaded file… This is a must-have functionality we need.

iampopal avatar Jan 22 '23 17:01 iampopal

Any update?

colestriler avatar Jan 25 '23 16:01 colestriler

I also think this is a much-needed feature, and the Supabase dashboard supports the display of progress when uploading files.

Maybe some examples should be provided?

songhn233 avatar Jan 31 '23 20:01 songhn233

Hey, wanted to share how I was able to do this with React & Axios (much cleaner API than raw XMLHttpRequest) without too much trouble.

import axios from "axios";
import type { AxiosRequestConfig } from "axios";

async upload(file: File, bucket: string, name: string, config?: AxiosRequestConfig) {
  // Create form data
  const blob = new Blob([file], { type: "video/mp4" });
  const formData = new FormData();
  formData.append('cacheControl', '3600');
  formData.append('', blob);

  return axios.post(
    `${BASE_SUPA_URL}/storage/v1/object/${bucket}/${name}`,
    formData,
    {
      headers: {
        "Content-Type": "multipart/form-data",
        // @ts-ignore
        ...supabase.headers,
      },
      onUploadProgress: config?.onUploadProgress,
      onDownloadProgress: config?.onDownloadProgress,
    }
  );
}

Then, in my form's onSubmit handler:

const upload = await api.upload(
  file,
  "artifacts",
  `${fileID}/video.mp4`,
  {
    onUploadProgress: (evt) => {
      const _progress = (evt.loaded / (evt.total || Infinity)) * 100;
      console.log(_progress)
      setProgress(_progress)
    }
  }
);

Another example here: https://dev.to/jbrocher/react-tips-tricks-uploading-a-file-with-a-progress-bar-3m5p

One note: I'm uploading to a local Supabase instance in Docker and had to set an RLS policy via the UI for this to work: http://localhost:54323/project/default/storage/policies

I used the "Enable insert for authenticated users only" template. I haven't tested any other RLS policies, but it seems that supabase.headers has everything necessary.

cameronk avatar Feb 05 '23 17:02 cameronk

Great seeing you did this with JavaScript; how will we be able to check upload progress with the Dart package?

iampopal avatar Feb 06 '23 20:02 iampopal

@cameronk I tried your solution, but first I'm getting this error: error: "Error", "headers must have required property 'authorization'"

so I then tried using: ...supabase.auth.headers

yet that was giving me a different error of: error: "Invalid JWT", message: "new row violates row-level security policy for table \"objects\""

I have the same policy you suggested and tried a few other combinations too, also double-checking the bucket name, to no avail. But thanks for at least pointing me in the right direction with your snippets.

RieMars avatar Feb 09 '23 09:02 RieMars

The example provided by @grschafer above was working for me with Supabase v1, but after recently upgrading to v2 it barfed on this line:

const headers = supabase._getAuthHeaders();

After replacing it with const headers = supabaseClient.auth.headers, it is working again.

I made uploaders and downloaders for Vue.js (used with Quasar's uploader component: https://quasar.dev/vue-components/uploader).

I hope they help someone.

SupabaseUploaderDownloader.zip

getLoggedinUser() is in the Supabase boot file (used for Quasar plugins).

export async function getLoggedinUser() {
  const { data: { session } } = await supabaseClient.auth.getSession()
  if (!session) return null
  const { user } = session
  return user
}

paladyne-io avatar Mar 01 '23 06:03 paladyne-io

Hi all, with resumable uploads, there is a native way to listen to progress events now.
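
For anyone landing here, a minimal sketch with tus-js-client; the endpoint path, metadata keys, and 6 MB chunk size are taken from the Supabase resumable-upload docs and may change, so treat them as assumptions (file, bucket, path, and session are placeholders):

import * as tus from "tus-js-client";

const upload = new tus.Upload(file, {
  endpoint: `${SUPABASE_URL}/storage/v1/upload/resumable`,
  headers: { authorization: `Bearer ${session.access_token}` },
  chunkSize: 6 * 1024 * 1024, // chunk size required by the docs
  metadata: {
    bucketName: bucket,
    objectName: path,
    contentType: file.type,
  },
  onProgress: (bytesUploaded, bytesTotal) => {
    console.log(`${((bytesUploaded / bytesTotal) * 100).toFixed(1)}%`);
  },
  onSuccess: () => console.log("Upload finished"),
  onError: (err) => console.error("Upload failed", err),
});

upload.start();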

inian avatar Apr 18 '23 06:04 inian