Make ActiveStorage work for API only apps
I have a Rails API serving a React SPA. It works perfectly, but I had to make a modification to use ActiveStorage's direct uploads.
The problem appears when trying to create a direct upload (i.e. a Blob). ActiveStorage::DirectUploadsController
fails with errors that I believe are expected in a normal app but not in an API controller. These are the errors:
- HTTP Origin header (http://localhost:3001/) didn't match request.base_url (http://localhost:3000)
- Can't verify CSRF token authenticity.
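For what it's worth, the Origin-mismatch error on its own can be silenced with a Rails config switch (the CSRF token error still needs its own handling); a sketch, with the obvious security tradeoff:

```ruby
# config/application.rb
# Stops forgery protection from comparing the HTTP Origin header
# against request.base_url. Use with care: it weakens CSRF defenses.
config.action_controller.forgery_protection_origin_check = false
```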
My solution has been to change this line: https://github.com/rails/rails/blob/4ec8bf68ff92f35e79232fbd605012ce1f4e1e6e/activestorage/app/controllers/active_storage/direct_uploads_controller.rb#L6
and make:
class ActiveStorage::DirectUploadsController < ApplicationController
I think the problem is solved because my ApplicationController
inherits from ActionController::API
.
If my assumptions are correct, shouldn't ActiveStorage controllers inherit from ActionController::API
or ActionController::Base
depending on config.api_only = true
?
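In the meantime, the conditional behavior I am describing can be approximated from the app side; a minimal sketch as an initializer (the file name and approach are mine, not a Rails convention):

```ruby
# config/initializers/active_storage_api.rb (hypothetical)
Rails.application.config.to_prepare do
  if Rails.application.config.api_only
    # API clients send no CSRF token, so skip the check for direct uploads.
    ActiveStorage::DirectUploadsController.skip_forgery_protection
  end
end
```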
@hector Would you mind sharing how you used ActiveStorage outside of a Form field?
Sure, add the attribute data-direct-upload-url='/rails/active_storage/direct_uploads'
to the file inputs that you want to act as direct uploads.
Remember you need to import the ActiveStorage javascript as suggested in the readme: https://github.com/rails/rails/tree/master/activestorage#direct-upload-installation
For this importing step I had to do a variation: I am using React with ES6 imports, I needed more control over the success/failure of the direct uploads, and I am using React's controlled components, so I extracted the code from ActiveStorage.start()
. This is the general idea:
import React from 'react';
import {DirectUploadsController} from '../../lib/activestorage/direct_uploads_controller';
class NewProject extends React.Component {
componentDidMount() {
this.project = new Project(); // Optional: I store attributes in a Project model
document.addEventListener('direct-upload:end', this.setDirectUploadAttr);
// bind methods below here, I do it automatically with a decorator
}
setDirectUploadAttr(event) {
const fileInput = event.target;
const hiddenInput = document.querySelector(`input[type="hidden"][name="${fileInput.name}"]`);
// Here you have the value to assign to the has_one_attached attribute;
// assign it to whatever you use. I keep it in a Project model.
this.project[fileInput.name] = hiddenInput.value;
}
// You may use promises instead of async/await
async onFormSubmit(event) {
event.preventDefault();
try {
// Upload attached files to external storage service
const form = event.target;
const controller = new DirectUploadsController(form);
const {inputs} = controller;
if (inputs.length) {
await new Promise((resolve, reject) => {
controller.start(error => error ? reject(error) : resolve());
});
}
// Run your submit function. In my case:
await this.project.save();
} catch (error) {
// Do something with the error. For example:
alert(error);
}
}
render() {
return (
<form onSubmit={this.onFormSubmit}>
<input id="video" name="video" type="file" data-direct-upload-url="/rails/active_storage/direct_uploads" />
<button type="submit">Submit</button>
</form>
);
}
}
This would be the original Rails model:
class Project < ApplicationRecord
has_one_attached :video
end
I wrote this package to make it easier to use ActiveStorage in React components, in case anyone finds it helpful. https://github.com/cbothner/react-activestorage-provider
Thank you @hector !
That's very useful!
A decent workaround for this issue that I have been using: in the config/initializers dir, I created a file called active_storage_api_modification.rb
containing ActiveStorage::DirectUploadsController.instance_eval { skip_forgery_protection }
.
@cbothner will the react-activestorage-provider work with ReactNative?
Right now, I’m sorry to say, it certainly will not. The activestorage JavaScript package uses DOM Form APIs to perform the direct upload, so it won’t work with React Native.
Are there any recommendations on how to use ActiveStorage with a React Native app?
@tommotaylor I would like to use ActiveStorage with a React Native app as well.
Previously, I had a rather hacky setup combining react-native-fetch-blob
and react-s3-uploader
. This uploads to S3 directly via a presigned URL (you'd need to render the right response at /sign-s3
):
/**
* Taken, CommonJS-ified, and heavily modified from:
* https://github.com/flyingsparx/NodeDirectUploader
*/
import RNFetchBlob from "react-native-fetch-blob";
var Blob = RNFetchBlob.polyfill.Blob;
const fs = RNFetchBlob.fs;
S3Upload.prototype.server = "";
S3Upload.prototype.signingUrl = "/sign-s3";
S3Upload.prototype.signingUrlMethod = "GET";
S3Upload.prototype.signingUrlSuccessResponses = [200, 201];
S3Upload.prototype.fileElement = null;
S3Upload.prototype.files = null;
S3Upload.prototype.onFinishS3Put = function(signResult, file) {
return console.log("base.onFinishS3Put()", signResult.publicUrl);
};
S3Upload.prototype.preprocess = function(file, next) {
console.log("base.preprocess()", file);
return next(file);
};
S3Upload.prototype.onProgress = function(percent, status, file) {
return console.log("base.onProgress()", percent, status);
};
S3Upload.prototype.onError = function(status, file) {
return console.log("base.onError()", status);
};
S3Upload.prototype.scrubFilename = function(filename) {
return filename.replace(/[^\w\d_\-\.]+/gi, "");
};
function S3Upload(options) {
if (options == null) {
options = {};
}
for (var option in options) {
if (options.hasOwnProperty(option)) {
this[option] = options[option];
}
}
var files = this.fileElement ? this.fileElement.files : this.files || [];
this.handleFileSelect(files);
}
S3Upload.prototype.handleFileSelect = function(files) {
var result = [];
for (var i = 0; i < files.length; i++) {
var file = files[i];
this.preprocess(
file,
function(processedFile) {
this.onProgress(0, "Waiting", processedFile);
result.push(this.uploadFile(processedFile));
return result;
}.bind(this)
);
}
};
S3Upload.prototype.createCORSRequest = function(method, url, opts) {
opts = opts || {};
var xhr = new RNFetchBlob.polyfill.XMLHttpRequest();
xhr.open(method, url, true);
xhr.withCredentials = true;
return xhr;
};
S3Upload.prototype.executeOnSignedUrl = function(file, callback) {
var fileName = this.scrubFilename(file.name);
var queryString =
"?objectName=" + fileName + "&contentType=" + encodeURIComponent(file.type);
if (this.s3path) {
queryString += "&path=" + encodeURIComponent(this.s3path);
}
if (this.signingUrlQueryParams) {
var signingUrlQueryParams =
typeof this.signingUrlQueryParams === "function"
? this.signingUrlQueryParams()
: this.signingUrlQueryParams;
Object.keys(signingUrlQueryParams).forEach(function(key) {
var val = signingUrlQueryParams[key];
queryString += "&" + key + "=" + val;
});
}
var xhr = this.createCORSRequest(
this.signingUrlMethod,
this.server + this.signingUrl + queryString,
{ withCredentials: this.signingUrlWithCredentials }
);
if (this.signingUrlHeaders) {
var signingUrlHeaders =
typeof this.signingUrlHeaders === "function"
? this.signingUrlHeaders()
: this.signingUrlHeaders;
Object.keys(signingUrlHeaders).forEach(function(key) {
var val = signingUrlHeaders[key];
xhr.setRequestHeader(key, val);
});
}
xhr.onreadystatechange = function() {
if (
xhr.readyState === 4 &&
this.signingUrlSuccessResponses.indexOf(xhr.status) >= 0
) {
var result;
try {
result = JSON.parse(xhr.responseText);
} catch (error) {
this.onError("Invalid response from server", file);
return false;
}
return callback(result);
} else if (
xhr.readyState === 4 &&
this.signingUrlSuccessResponses.indexOf(xhr.status) < 0
) {
return this.onError(
"Could not contact request signing server. Status = " + xhr.status,
file
);
}
}.bind(this);
return xhr.send();
};
S3Upload.prototype.uploadToS3 = function(file, signResult) {
var xhr = this.createCORSRequest("PUT", signResult.signedUrl);
if (!xhr) {
this.onError("CORS not supported", file);
} else {
xhr.onload = function() {
if (xhr.status === 200) {
this.onProgress(100, "Upload completed", file);
return this.onFinishS3Put(signResult, file);
} else {
return this.onError("Upload error: " + xhr.status, file);
}
}.bind(this);
xhr.onerror = function(err) {
return this.onError("XHR error", file);
}.bind(this);
xhr.upload.onprogress = function(e) {
var percentLoaded;
if (e.lengthComputable) {
percentLoaded = Math.round(e.loaded / e.total * 100);
return this.onProgress(
percentLoaded,
percentLoaded === 100 ? "Finalizing" : "Uploading",
file
);
}
}.bind(this);
}
xhr.setRequestHeader("Content-Type", file.type);
if (this.contentDisposition) {
var disposition = this.contentDisposition;
if (disposition === "auto") {
if (file.type.substr(0, 6) === "image/") {
disposition = "inline";
} else {
disposition = "attachment";
}
}
var fileName = this.scrubFilename(file.name);
xhr.setRequestHeader(
"Content-Disposition",
disposition + '; filename="' + fileName + '"'
);
}
if (signResult.headers) {
var signResultHeaders = signResult.headers;
Object.keys(signResultHeaders).forEach(function(key) {
var val = signResultHeaders[key];
xhr.setRequestHeader(key, val);
});
}
if (this.uploadRequestHeaders) {
var uploadRequestHeaders = this.uploadRequestHeaders;
Object.keys(uploadRequestHeaders).forEach(function(key) {
var val = uploadRequestHeaders[key];
xhr.setRequestHeader(key, val);
});
} else {
xhr.setRequestHeader("x-amz-acl", "public-read");
}
this.httprequest = xhr;
return xhr.send(file);
};
S3Upload.prototype.uploadFile = function(file) {
var uploadToS3Callback = this.uploadToS3.bind(this, file);
if (this.getSignedUrl) return this.getSignedUrl(file, uploadToS3Callback);
return this.executeOnSignedUrl(file, uploadToS3Callback);
};
S3Upload.prototype.abortUpload = function() {
this.httprequest && this.httprequest.abort();
};
export default S3Upload;
This does not work at all with ActiveStorage...and using ActiveStorage would be much nicer.
The npm package activestorage
and anything that depends on it won’t work on React Native because FileReader.readAsArrayBuffer
is not implemented. But that does not mean you can’t use ActiveStorage on the Rails side by uploading your files with the fetch API or as an HTML form would.
No Direct Upload
This is easiest if you only expect to receive small files, or if you're not using Heroku and don't need direct upload to S3. React Native supports the fetch API, so you can use FormData
to send the file as part of a standard multipart request to your normal endpoint. After you get your image from one of the image picker components, do
let data = new FormData()
data.append('user[image]', fileObject)
fetch(/* user_url */, {
method: 'PUT',
body: data
})
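On the Rails side, a minimal sketch of the action receiving that request (assuming a User model with has_one_attached :image; the names are illustrative):

```ruby
class UsersController < ApplicationController
  def update
    # params[:user][:image] arrives as an ActionDispatch::Http::UploadedFile,
    # not a signed blob id, and attach accepts it directly.
    current_user.image.attach(params[:user][:image])
    head :ok
  end
end
```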
Direct Upload
It will be harder to do a direct upload, but these are the requests that the activestorage
package performs.
Get the signed upload URL
fetch('/rails/active_storage/direct_uploads', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    blob: {
      filename: "griffin.jpeg",
      content_type: "image/jpeg",
      byte_size: 1020753,
      checksum: /* base 64 of the MD5 hash of the file */
    }
  })
})
(The activestorage
package uses FileReader.readAsArrayBuffer
to calculate the checksum incrementally. Maybe you could skip that optimization and avoid the incompatibility? https://github.com/rails/rails/blob/master/activestorage/app/javascript/activestorage/file_checksum.js)
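If you do skip the incremental version, the checksum itself is simple: ActiveStorage expects the Base64 encoding of the raw (binary) MD5 digest of the file, not the hex digest. A minimal Ruby illustration of the value the server compares against:

```ruby
require "digest"

# Base64 of the binary MD5 digest: the value ActiveStorage stores in
# the blob's checksum attribute and verifies after upload.
def active_storage_checksum(data)
  Digest::MD5.base64digest(data)
end

active_storage_checksum("hello") # => "XUFAKrxLKna5cZ2REBfFkg=="
```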
That POST returns a JSON response with the keys signed_id
and direct_upload
that you'll need.
Upload the file
PUT the file to response.direct_upload.url
with the headers from response.direct_upload.headers
. There’s no other body but the file as a blob.
fetch(response.direct_upload.url, {
method: 'PUT',
headers: response.direct_upload.headers,
body: fileObject,
})
Update your Rails model with the signed_id of the blob you just made
fetch(/* user_url */, {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    user: {
      image: response.signed_id
    }
  })
})
That should be it. Create a blob, which gives you a signed upload URL. PUT the file to that signed URL. Then update the Rails model to assign the blob's signed id to the attachment relation.
I have the same requirement: I am looking to send files from an iOS app to a Rails REST API that uses ActiveStorage. All the responses here seem to use an HTML form. I'd like some guidance on how to do it without a form, from any external service.
Would I need to replicate/mimic a multipart form? My setup uses local disk storage, not an external provider.
@drale2k the Direct Upload section of my answer above should work for you, using a URLRequest or some other abstraction in place of JavaScript's fetch. You can still use the direct upload workflow with local disk storage; that's what I'm using in development and it works the same.
You might even have an easier time with the MD5 in native iOS than others might have on React Native https://stackoverflow.com/questions/1684799/generate-hash-from-uiimage
@cbothner Thanks!! I got ActiveStorage direct uploads working with React and redux-sagas with the help of your comments.
I have a promise-service with all the proper decorated headers for my requests including a bearer token, endpoint url, etc... I don't want to interpolate the strings into DirectUpload, does anyone know of a way to reuse the promise with DirectUpload?
Here is a write-up of what I did to get this working. This is my contribution to the Rails community for the rest of the year 😄.
My Project Setup
I want to provide some context first. I have my React app split away from Rails. In production, I will have my React app in S3 with Rails in Heroku. In summary, I am not using Rails to serve my single page application.
My rails server:
http://localhost:3001/
My react app:
http://localhost:3000/
High Level Overview
You will want to use DirectUpload
provided by ActiveStorage to help you manage the upload with ActiveStorage.
Behind the scenes, DirectUpload
makes a POST
to ActiveStorage and it will automatically follow up with a PUT
request. In the response to your POST
, it will give you the details of the file that was stored into ActiveStorage.
Here are examples of the requests with their responses, but again, you don't have to worry about this since the library handles it for you:
Request URL: http://localhost:3001/rails/active_storage/direct_uploads
Request Method: POST
Status Code: 200 OK
Response:
{
"id":1,
"key":"sxp2df7kkjEG8FWSzxonnYYp",
"filename":"testname",
"content_type":"image/png",
"metadata":{
},
"byte_size":6853,
"checksum":"FAt9JP8ARkvkapFYB/IDLw==",
"created_at":"2018-07-22T07:17:05.037Z",
"signed_id":"eyJfcmFpbHMiOnsibWVzc2FnZSI6IkJBaHBCZz09IiwiZXhwIjpudWxsLCJwdXIiOiJibG9iX2lkIn19--aa5d17cb4058350d7c3850c4bd644a59fe6e3676",
"direct_upload":{
"url":"http://localhost:3001/rails/active_storage/disk/eyJfcmFpbHMiOnsibWVzc2FnZSI6IkJBaDdDVG9JYTJWNVNTSWRjM2h3TW1SbU4ydHJha1ZIT0VaWFUzcDRiMjV1V1Zsd0Jqb0dSVlE2RVdOdmJuUmxiblJmZEhsd1pVa2lEbWx0WVdkbEwzQnVad1k3QmxRNkUyTnZiblJsYm5SZmJHVnVaM1JvYVFMRkdqb05ZMmhsWTJ0emRXMUpJaDFHUVhRNVNsQTRRVkpyZG10aGNFWlpRaTlKUkV4M1BUMEdPd1pVIiwiZXhwIjoiMjAxOC0wNy0yMlQwNzoyMjowNS4wODZaIiwicHVyIjoiYmxvYl90b2tlbiJ9fQ==--d6d1890294d1f650ba079208fca936cedd071e5f",
"headers":{
"Content-Type":"image/png"
}
}
}
Request URL: http://localhost:3001/rails/active_storage/disk/eyJfcmFpbHMiOnsibWVzc2FnZSI6IkJBaDdDVG9JYTJWNVNTSWRjM2h3TW1SbU4ydHJha1ZIT0VaWFUzcDRiMjV1V1Zsd0Jqb0dSVlE2RVdOdmJuUmxiblJmZEhsd1pVa2lEbWx0WVdkbEwzQnVad1k3QmxRNkUyTnZiblJsYm5SZmJHVnVaM1JvYVFMRkdqb05ZMmhsWTJ0emRXMUpJaDFHUVhRNVNsQTRRVkpyZG10aGNFWlpRaTlKUkV4M1BUMEdPd1pVIiwiZXhwIjoiMjAxOC0wNy0yMlQwNzoyMjowNS4wODZaIiwicHVyIjoiYmxvYl90b2tlbiJ9fQ==--d6d1890294d1f650ba079208fca936cedd071e5f
Request Method: PUT
Status Code: 204 No Content
After getting this far, then you just take the signed_id
from the POST
response and use it for your custom endpoint. You can find an example implementation below.
The Implementation
Prepping Your Controllers
@derigible had the right idea to skip forgery, but you guys need to understand the consequences of skipping this and removing the authenticity token. These protections help prevent arbitrary requests from being replayed and storing junk in your database through ActiveRecord. Now that said, I have all my endpoints protected with an access token granted through OAuth. More specifically, I am using Doorkeeper, and I will be decorating the controller to include it.
I didn't like @derigible's method, because it lives in an initializer. That doesn't follow convention well, and the next developer won't be able to find it where they would expect.
Here's what I did instead:
routes.rb
...
post '/rails/active_storage/direct_uploads' => 'direct_uploads#create'
...
direct_uploads_controller.rb
class DirectUploadsController < ActiveStorage::DirectUploadsController
protect_from_forgery with: :exception
skip_before_action :verify_authenticity_token
before_action :doorkeeper_authorize! # this is doorkeeper specific, but you can use any token authentication scheme here
end
Now after prepping my controllers, if I don't send an access/bearer token then this endpoint will respond with a 401
.
Frontend
Now in your frontend, we have a few things to do. One, we need to decorate the requests made by DirectUpload
with our access token since we protected the endpoint using Doorkeeper.
const { accessToken } = this.state;
const railsActiveStorageDirectUploadsUrl = "http://localhost:3001/rails/active_storage/direct_uploads";
const upload = new DirectUpload(file, railsActiveStorageDirectUploadsUrl, {
directUploadWillCreateBlobWithXHR: (xhr) => {
// This will decorate the requests with the access token header so you won't get a 401
xhr.setRequestHeader("Authorization", `Bearer ${accessToken}`)
}
})
upload.create((error, blob) => {
if (error) {
// Handle the error
console.log(error)
} else {
console.log(blob)
/*
By the time you reach here, DirectUpload has made the 2 requests that I mentioned above.
When you're here, just use your services to talk to rails normally.
When you reach this block of code, then you can expect a Blob to exist in your Rails backend now. You can run this in `rails c` to check:
ActiveStorage::Blob.last
*/
userAvatar(blob.signed_id) // This makes a request to my custom endpoint (look below): "http://localhost:3001/users/profile/avatar.json"
}
})
Backend
routes.rb
...
post 'users/profile/avatar', to: 'users_profile#avatar'
....
class UsersProfileController < ApiController
def avatar
current_user.avatar.attach(avatar_params) # this is just the signed_id
head :ok
end
private
def avatar_params
params.require(:avatar)
end
end
If you made any changes to your image using HTML5 canvas like I did, then you'll need to add a name
key to the blob before you pass it into DirectUpload
.
Yielding the following:
const { accessToken } = this.state;
const railsActiveStorageDirectUploadsUrl = "http://localhost:3001/rails/active_storage/direct_uploads";
file.name = 'testname';
const upload = new DirectUpload(file, railsActiveStorageDirectUploadsUrl, {
directUploadWillCreateBlobWithXHR: (xhr) => {
// This will decorate the requests with the access token header so you won't get a 401
xhr.setRequestHeader("Authorization", `Bearer ${accessToken}`)
}
})
upload.create((error, blob) => {
if (error) {
// Handle the error
console.log(error)
} else {
console.log(blob)
/*
By the time you reach here, DirectUpload has made the 2 requests that I mentioned above.
When you're here, just use your services to talk to rails normally.
When you reach this block of code, then you can expect a Blob to exist in your Rails backend now. You can run this in `rails c` to check:
ActiveStorage::Blob.last
*/
userAvatar(blob.signed_id) // This makes a request to my custom endpoint (look below): "http://localhost:3001/users/profile/avatar.json"
}
})
Good luck guys! Let me know if I can help with anything.
If this helped, could you drop a 🍆 ?
And thanks @cbothner for your write up.
This is something we should definitely consider adding to the documentation for ActiveStorage. I took the above and cleaned it up a little. I'm using redux-token-auth, and it's a relevant part of the conversation since using the JS package the way it's described in the docs won't work for token-authenticated requests.
Thanks @dagumak !
A little bit on how it differs:
- I promisified the `create` method so it can be used with async/await
- I'm grabbing my tokens from storage
- I had more headers I needed to add other than Authorization
- I wanted a way to simply get my image blob so I could then dispatch the user update method through thunk
import { DirectUpload } from 'activestorage/src/direct_upload';
import getHeadersFromStorage from './apiHeaders';
import { HOST, authHeaderKeys } from '../constants';
const RASURL = `${HOST}/rails/active_storage/direct_uploads`;
/**
* Promisify the create method provided by DirectUpload.
* @param {object} upload DirectUpload instance
* @return {promise} returns a promise to be used on async/await
*/
function createUpload(upload) {
return new Promise((resolve, reject) => {
upload.create((err, blob) => {
if (err) reject(err);
else resolve(blob);
});
});
}
/**
* Upload to service using ActiveStorage DirectUpload module
* @param {Object} file Image buffer to be uploaded.
* @return {Object} blob object from server.
* @see https://github.com/rails/rails/issues/32208
*/
async function activeStorageUpload(file) {
  const headers = await getHeadersFromStorage();
  const upload = new DirectUpload(file, RASURL, {
    directUploadWillCreateBlobWithXHR: xhr => {
      authHeaderKeys.forEach(key => {
        xhr.setRequestHeader(key, headers[key]);
      });
    }
  });
  // createUpload already returns a promise that resolves to the blob
  return createUpload(upload);
}
export default activeStorageUpload;
These are my authHeaderKeys
const authHeaderKeys = [
'access-token',
'token-type',
'client',
'uid',
'expiry'
];
Usage:
const imageBlob = await activeStorageUpload(imageData);
Edit
In terms of updating just the image record, as in current_user.avatar.attach(avatar_params) (where avatar_params is just the signed_id),
I recommend instead creating a method where you update your record and send just the params you want to update, like updateUser({ image: image.signed_id });
For me, that hits my controller in the following way
def update_profile
  if current_api_v1_user.update_attributes(user_params)
    user_serialized = UserSerializer.new(current_api_v1_user)
    render json: {
      data: user_serialized,
      is_success: true,
      status: 'success',
    }
  else
    render json: { error: "Failed to Update", is_success: false }, status: 422
  end
end
With the above, you bypass the id needed in the default Rails update route and capitalize on the current_user helper (in my case current_api_v1_user,
due to namespacing from Devise).
You could in fact do the same without creating a method or route using the default rails update
method api/v1/users#update
and pointing your update to /api/v1/users/:id
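For reference, a sketch of that default-route alternative (namespace names match my setup and may differ in yours):

```ruby
# config/routes.rb
namespace :api do
  namespace :v1 do
    # PUT/PATCH /api/v1/users/:id -> Api::V1::UsersController#update
    resources :users, only: [:update]
  end
end
```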
I also want to update folks who might have the same issue as I did, to make this function correctly.
Referencing @dagumak's post, I needed to include a create
method for this. Looks like that part was left out. My controller looks like:
class DirectUploadsController < ActiveStorage::DirectUploadsController
# I'm using JWTSession for auth and have to include it
# as this doesn't inherit due to `ActiveStorage::BaseController < ActionController::Base`
# where we are using `ApplicationController < ActionController::API` in API Mode.
include JWTSessions::RailsAuthorization
rescue_from JWTSessions::Errors::Unauthorized, with: :not_authorized
protect_from_forgery with: :exception
skip_before_action :verify_authenticity_token
# Calling JWTSession to authorize
before_action :authorize_access_request!
def create
blob = ActiveStorage::Blob.create_before_direct_upload!(blob_args)
render json: direct_upload_json(blob)
end
private
def blob_params
params.require(:blob).permit(
:filename,
:content_type,
# etc...
)
end
# Rescue the Auth Error
def not_authorized
render json: { error: 'Not authorized' }, status: :unauthorized
end
end
The create method comes right from the source code https://github.com/rails/rails/blob/301409a98f2bef4f431a10cae74a3430bbaca9c2/activestorage/app/controllers/active_storage/direct_uploads_controller.rb#L7
That comes back with all the JSON DirectUpload
expects when using direct uploads.
My route file then is just
Rails.application.routes.draw do
post '/rails/active_storage/direct_uploads' => 'direct_uploads#create'
end
My JS also looks like (Using React)
const url = `${API_URL}/rails/active_storage/direct_uploads`;
const upload = new DirectUpload(file, url, {
directUploadWillCreateBlobWithXHR: xhr => {
// Put my JWT token in the auth header here
xhr.setRequestHeader('Authorization', `Bearer ${this.state.accessToken}`);
// Send progress upload updates
xhr.upload.addEventListener('progress', event => this.directUploadProgress(event));
}
});
That should help others get started on 5.2 at least. Hope we can get something to work out here for API uploads to make this a little easier.
@robertsonsamuel You do not need to add the create
method, because your DirectUploadsController
is inheriting directly from ActiveStorage::DirectUploadsController
. Rails will just use the create
on ActiveStorage::DirectUploadsController
; if you must declare a create
method then I would suggest just doing:
def create
super
end
Edit: Actually @robertsonsamuel I noticed you are using include
in your controller. This is most likely the issue.
@dagumak that was entirely correct. I was able to remove that create method as well. Cleaned it up quite nicely. Thanks for pointing that out. 🎉 The include didn't affect this however; thanks for the insight! Still, hopefully this helps others, even via SEO haha.
This issue has been automatically marked as stale because it has not been commented on for at least three months.
The resources of the Rails team are limited, and so we are asking for your help.
If you can still reproduce this error on the 5-2-stable
branch or on master
, please reply with all of the information you have about it in order to keep the issue open.
Thank you for all your contributions.
For those on react native, I was able to get direct uploads working using rn-fetch-blob
for md5 hashing (which is output in hex), then converting its hex output into base64 using buffer
for calculating the checksum. To look up the content_type, I used react-native-mime-types
, and last but not least, used rn-fetch-blob
again for calculating the size. Then, just follow the communication guidelines pointed out by @cbothner, and if the files are big, use rn-fetch-blob
for efficiently uploading the file. I just might create a library for this, but at the minimum, I'll blog soon about solving this issue on react-native.
@Samsinite could you share a working example, please? I've spent all day trying to get direct upload working from a RN app.
For anybody who wants to make React Native work with Rails ActiveStorage (S3):
import RNFetchBlob from 'rn-fetch-blob';
import { Buffer } from 'buffer';
const hash = await RNFetchBlob.fs.hash(uri, 'md5');
const checksum = Buffer.from(hash, 'hex').toString('base64');
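A quick sanity check of that hex-to-Base64 conversion in plain Ruby: packing the hex digest back into bytes and Base64-encoding it gives exactly the checksum ActiveStorage computes directly.

```ruby
require "digest"
require "base64"

data     = "any file contents"
hex      = Digest::MD5.hexdigest(data)                # hex string, like RNFetchBlob's hash
from_hex = Base64.strict_encode64([hex].pack("H*"))   # hex -> raw bytes -> Base64
direct   = Digest::MD5.base64digest(data)             # what ActiveStorage computes

from_hex == direct # => true
```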
Here's how I got this to work with Expo ImagePicker (based on https://github.com/rails/rails/issues/32208#issuecomment-383737803 above)
import * as ImagePicker from "expo-image-picker"
import * as FileSystem from "expo-file-system"
import { Buffer } from "buffer"
const result = await ImagePicker.launchImageLibraryAsync({ base64: true })
// this is your fileObject
const fileObject = Buffer.from(result.base64, "base64")
// this is your checksum
const meta = await FileSystem.getInfoAsync(result.uri, { md5: true })
const md5 = meta.md5
const checksum = Buffer.from(md5, "hex").toString("base64") // btoa
// this is your byte_size
const byteSize = fileObject.length
On the rails side, just had to override the default ActiveStorage::DirectUploadsController
with
# config/routes.rb
post "/rails/active_storage/direct_uploads" => "direct_uploads#create"
# app/controllers/direct_uploads_controller.rb
class DirectUploadsController < ActiveStorage::DirectUploadsController
protect_from_forgery with: :exception
skip_before_action :verify_authenticity_token
end
Hope this helps someone!
@jbschrades if result.base64
is your file in base64 format, then large media could potentially crash the app when it uses up all of the memory on weaker Android devices; this will definitely occur for video media.
Recommend using the URI along with RNFetchBlob and Buffer to calculate the checksum:
const result = await ImagePicker.launchImageLibraryAsync()
const hash = await RNFetchBlob.fs.hash(result.uri, 'md5');
const checksum = Buffer.from(hash, 'hex').toString('base64');
@Samsinite then what do you send as the file?
Here's my solution in a React app, with progress:
const upload = new DirectUpload(file, railsActiveStorageDirectUploadsUrl, {
  directUploadWillCreateBlobWithXHR: (xhr) => {
    // This will decorate the requests with the access token header so you won't get a 401
    xhr.setRequestHeader("Authorization", `Bearer ${JSON.parse(localStorage.getItem('user')).auth_token}`)
  },
  directUploadWillStoreFileWithXHR(request) {
    request.upload.addEventListener("progress",
      event => this.directUploadDidProgress(event))
  },
  directUploadDidProgress(event) {
    console.log(event)
  }
})
I ended up using Shrine and Uppy instead of ActiveStorage because they support this use case better.
As we wait for this ☝🏽 to be merged, anyone got a guide for using Vue with ActiveStorage?
Thanks!
@Samsinite then what do you send as the file?
You want to buffer it from disk instead to calculate the checksum. Uploading to the server needs to be buffered as well; your code calculated the checksum by loading the entire file into memory instead of streaming it.