(Feature) Native cloud saving (Bounty)
Request: native cloud saving through services like Google Drive, or anything at all, for all RetroArch ports (Wii U, PS3, Raspberry Pi, PC, Android, and more), because this would be heaven 🙏 and very convenient.
Bounty
https://www.bountysource.com/issues/59506367-feature-native-cloud-saving
You could just add the RetroArch save directory to Dropbox, or something similar. Outside of Google Drive, are there any other services that you would recommend?
One of the main issues stopping this is TLS support. While we have actually had support for it for a while now, it is not compiled in by default and at the request of @twinaphex it has stayed that way. Until that stance changes I think there is no point in even working on such a feature.
Rob Loach Because then you can't use it on other things like the Wii U and so on, just PC really, which kills the point of a cloud save if you can't use it on other things, other than for PC users.
bparker06 Oh ok then, well that sucks, thanks for the info. I really hope you can get some kind of TLS support one day, because that would be perfect for on-the-go play on Switch and then back on your PC or whatever, and a very hype new thing to try and advertise once you get it.
Even with cloud saving... save states are more often than not architecture-dependent (i.e. PS3 and Wii U saves may not work on PC and vice versa).
Still would be nice to have
Obviously any solution is going to have to be mindful of save state portability; that portability might need to be addressed as a separate issue, or the management of architecture-specific save states will need to be thought through.
Considering this has been kicking around for four years, I just dropped $100 on the bounty to kick-start development of it. After all, what good is a cross-platform emulation tool if the data isn't cross-platform?
Best of luck.
FWIW, I'd like something more than this: remote file system support. I'd like to be able to speak to a WebDAV share or sshfs or something like that and store data there.
That way everyone can implement their own solution to the problem in any way they see fit.
For WebDAV or other network-mountable folders I think a better method would be to mount the drive through the OS and point RA's save dir to the mount. Adding WebDAV to RA seems like reinventing the wheel, since nearly every OS already has a robust implementation. I could provide a script that sets up a WebDAV drive automatically for most major OSes.
SSL is enabled by default now, so that solves one cloud hurdle. I am thinking of implementing this through Google Drive. I'd like some feedback before beginning, or some coding help if anyone else is up to working on this now.
For native cloud saving here is my plan.
- If HAVE_SSL and HAVE_GDRIVE:
- Add a config option under Settings > Saving to enable Google Drive.
- When toggled on, check for Google credentials; when toggled off, delete the credentials.
- If not authorized, open the default web browser and prompt for authorization. Google lets you log in through a secondary device if you don't have a browser or keyboard.
- If disabled, make no changes to the default save/load functions.
- If enabled, check for Google credentials at program launch, content launch, and before saving and loading.
- If authorized, then after RA saves to local disk, upload to Google Drive.
- Before loading a state, download from Google Drive; RA then loads the state from the local drive.
It is possible to skip the local drive entirely, but I think it is best to maintain a local copy so states are available when playing offline.
The save path in Google Drive would be /retroarch/[architecture]/states/[core]/[save file name] (see the sketch below).
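To make the flow concrete, here is a minimal sketch of the post-save hook, assuming hypothetical wrappers gdrive_is_authorized() and gdrive_upload_file() around the actual Drive calls (neither exists in RA today):

#include <stdio.h>
#include <stdbool.h>

/* Hypothetical Drive wrappers; placeholder names only. */
bool gdrive_is_authorized(void);
bool gdrive_upload_file(const char *local_path, const char *remote_path);

/* Runs after RA has written the state to local disk, so the local copy
 * stays available for offline play even if the upload fails. */
static void cloud_sync_after_save(const char *local_path,
      const char *architecture, const char *core_name,
      const char *file_name)
{
   char remote_path[512];

   if (!gdrive_is_authorized())
      return; /* feature disabled or credentials missing */

   /* /retroarch/[architecture]/states/[core]/[save file name] */
   snprintf(remote_path, sizeof(remote_path),
         "/retroarch/%s/states/%s/%s",
         architecture, core_name, file_name);

   gdrive_upload_file(local_path, remote_path);
}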
Some issues could cause local and cloud to go out of sync and end up with different save states:
- Multiple players sharing one Google account and playing the same game on the same architecture. We shouldn't worry about this; if it happens it is the user's fault, not RA's.
- A corrupt upload/download.
- A user playing both online and offline.
Options for handling out-of-sync states:
- Whenever a user creates a new save, always assume it is the desired state and overwrite anything in the cloud.
- Before loading, check the local and cloud states' modified dates and assume the newest version is the desired one. If the dates differ, overwrite the older file; if they are the same, load the local state (sketched below).
- Whenever a save is overwritten, it could be copied to a temp dir or to a Google Drive revision folder.
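A sketch of that second option, with placeholder lookups standing in for a stat() of the local file and a Drive metadata query:

#include <time.h>

/* Placeholder helpers; the real versions would stat() the local file
 * and query the cloud file's metadata. */
time_t local_state_mtime(const char *local_path);
time_t cloud_state_mtime(const char *remote_path);
void   download_state(const char *remote_path, const char *local_path);
void   upload_state(const char *local_path, const char *remote_path);

/* "Newest wins": make sure the newer copy exists on both sides before
 * loading. Equal timestamps fall through and the local state is used. */
static void resolve_state_conflict(const char *local_path,
      const char *remote_path)
{
   time_t local_t = local_state_mtime(local_path);
   time_t cloud_t = cloud_state_mtime(remote_path);

   if (cloud_t > local_t)
      download_state(remote_path, local_path); /* cloud is newer */
   else if (local_t > cloud_t)
      upload_state(local_path, remote_path);   /* local is newer */
}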
Any feedback before I begin would be awesome!
My only suggestion would be to split the functions that RA performs when dealing with cloud saves from the actual provider API itself, so as to allow extensibility to other cloud providers and reduce the technical debt incurred when cloud providers change their APIs.
Aping the libretro design goal of "rethink what code belongs to 'core land' and what should belong to 'frontend land'. Libretro cores ideally should have minimal to no dependencies on any system/OS-specific APIs so that the same libretro cores can work on any libretro-compatible frontend" could, I think, be appropriate in this scenario.
Otherwise it all looks like a rational minimum viable product to me.
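As an illustration of that split (the struct and names here are invented for the example, not an existing RA interface), the generic save/load code would only ever call through a small provider vtable:

#include <stdbool.h>
#include <time.h>

/* One implementation of this per provider; adding Dropbox or WebDAV
 * support then means writing a new driver, not touching save logic. */
typedef struct cloud_provider
{
   const char *ident; /* "gdrive", "dropbox", "webdav", ... */

   bool (*authorize)(void);
   bool (*upload)(const char *local_path, const char *remote_path);
   bool (*download)(const char *remote_path, const char *local_path);
   bool (*get_mtime)(const char *remote_path, time_t *mtime);
} cloud_provider_t;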
Unlike those other services, WebDAV is a standard protocol, and it just takes a little more HTTP support than what RA already has.
You can't "mount" a WebDAV share on every supported OS.
Unlike GDrive, you can roll your own WebDAV implementation or use ownCloud, Nextcloud, or Seafile; all of those have WebDAV support.
> If not authorized, open the default web browser and prompt for authorization. Google lets you log in through a secondary device if you don't have a browser or keyboard.
If you can do this, i.e. you have a web browser and all, you most likely can just sync by pointing a sync client at the folder, making built-in syncing redundant and pointless.
Not trying to stop you from using GDrive; if you want to do it, more power to you. Your game plan would only work on platforms that can already run official Google clients, though.
@twinaphex (and many users) have stated that they want cloud saves on mobile and consoles/handhelds too, so relying on the operating system features and/or having a browser is simply not an option.
The same goes for requiring external dependencies: everything needs to be baked in instead, using the existing net_http code, which will most likely need to be extended to support this. For example, it doesn't support redirects, authentication, or alternative methods like PUT as used by WebDAV.
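For reference, a WebDAV upload boils down to an HTTP PUT with (usually) Basic auth; this sketch shows the request header net_http would need to be able to emit (the host, path, and credentials are placeholders):

#include <stdio.h>

/* Illustrative only: builds the header block of a WebDAV PUT request.
 * auth_b64 is the base64 of "user:password"; the file body follows
 * the blank line. */
static int build_webdav_put_header(char *buf, size_t len,
      const char *remote_path, const char *auth_b64, size_t body_len)
{
   return snprintf(buf, len,
         "PUT %s HTTP/1.1\r\n"
         "Host: cloud.example.com\r\n"
         "Authorization: Basic %s\r\n"
         "Content-Type: application/octet-stream\r\n"
         "Content-Length: %zu\r\n"
         "\r\n",
         remote_path, auth_b64, body_len);
}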
Is it possible to use JSON to get information on the saves, use POST to send save data to the cloud, and download any files whose MD5 has changed?
I know that there are no C APIs for GDrive, AWS, OneDrive, etc., so an API might have to be made.
There are examples of GDrive wrappers for FUSE (FUSE itself is written in C):
https://github.com/dsoprea/GDriveFS
Might be useful for reference
@CammKelly GDrive, according to that code, requires an external browser for OAuth authentication, so that's not acceptable for us.
Is this a good solution? Base64-encode the file and upload it using the existing net_http POST function; the server then decodes it and saves it to a file.
I made a rough demo of this to test it and it is working. I was able to upload and download save states.
Once the file is uploaded to a web server, you can then send it to any backend you want: WebDAV, Dropbox, Google, etc.
I agree, GDrive is probably not a good solution, but for completeness, here is more info on it.
The external browser for OAuth can be on an external device. The device playing the game doesn't have to have a browser; you can trigger the request from any device and authorize it from your phone.
For the Google API there is this https://google.github.io/google-api-cpp-client/latest/
Not everyone may have access to a browser and I don't think it would be right to require that. Also we cannot forward files from our server to some other service because we don't want the responsibility of storing people's private account info. There are already people begging for a self-run server solution so that we never see anything they upload, but at the same time most people just want it to be easy to use and may not care about the security of a save state and whatnot.
For a roll-your-own server, this is the PHP I used for testing. It works with the existing task_push_http_post_transfer function after adding base64.c for the encoding.
<?php
// Prevent caching of responses
header('Expires: Mon, 26 Jul 1997 05:00:00 GMT');
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Cache-Control: post-check=0, pre-check=0', FALSE);
header('Pragma: no-cache');

// Simple shared-secret check to keep strangers out
if (!isset($_POST['password']) || $_POST['password'] != 'password')
{
    header("HTTP/1.1 404 Not Found");
    exit();
}

// Upload: $_POST['upload'] is the file name, $_POST['data'] the
// base64-encoded file contents.
if (isset($_POST['upload']) && isset($_POST['data']))
{
    // basename() keeps clients from writing outside this directory
    $name = basename($_POST['upload']);

    // PHP decodes '+' in the form body to a space (the client does not
    // urlencode the data), so undo that before base64-decoding.
    $fp = fopen($name, 'wb');
    fwrite($fp, base64_decode(str_replace(" ", "+", $_POST['data'])));
    fclose($fp);

    // Respond with the file size so the client can verify success
    echo filesize($name);
}
// Download: $_POST['download'] is the file name to send back.
else if (isset($_POST['download']))
{
    $name = basename($_POST['download']);

    if (file_exists($name))
    {
        header('Content-Length: ' . filesize($name));
        header("Content-type: application/octet-stream");
        readfile($name);
    }
    else
        echo 'Error: File Not Found';
}
else
    echo 'Error: Unknown Request';
?>
This is the code added to save_state_cb to initiate the upload. It still needs some work; I'm just checking whether this is the right direction.
#include <stdio.h>
#include <stdlib.h>
#include <file/file_path.h>
#include <net/net_http.h>
#include "base64.h" /* header for the added base64.c (Apache-style helpers) */

/* Encode the state so the raw bytes survive the form-encoded POST body. */
int encoded_data_length = Base64encode_len(state->size);
char *base64_string = (char*)malloc(encoded_data_length);
Base64encode(base64_string, (const char*)state->data, state->size);

char *file_name = NULL;
net_http_urlencode(&file_name, path_basename(state->path));

/* Heap-allocate the POST body; save states can be megabytes, so a
 * stack VLA here would risk overflowing the stack. */
size_t post_len = (size_t)encoded_data_length + 512;
char *post_data = (char*)malloc(post_len);
snprintf(post_data, post_len, "data=%s&upload=%s", base64_string, file_name);

/* url is the configured server endpoint; whether post_data can be freed
 * right away depends on whether the task copies it, so that needs checking. */
task_push_http_post_transfer(url, post_data, false, NULL, upload_cb, NULL);

free(base64_string);
free(file_name);
Duplicate of https://github.com/libretro/RetroArch/issues/2028
Re-opened with the linked bounty over at https://www.bountysource.com/issues/59506367-feature-native-cloud-saving
~~I have started tackling this bounty.~~
Here's an overview of what I WAS planning on doing.
1. RetroArch server software to be developed in many different supported languages.
2. Create 2 new files (cloud.c and cloud.h) that will reuse the existing base64.c code and the upload/download functions.
3. Saves to the cloud will happen when the save state and game save files are created. Downloads will happen before the ROM loading process.
Number 1 is definitely a pipedream though. Most dedicated servers have bandwidth caps, and we already pay over $300 per month on servers. We cannot rack up even more bills, especially as our Patreon has gotten smaller as of late, not bigger.
Also, we cannot be dependent on somebody else's own personal server, either, for various reasons.
Some guy's LLC presiding over a cloud service offered through RetroArch is completely unacceptable to us. Besides, this is again a pipedream: Article 13 is about to be instated in the EU soon, and you will need mandatory Google-style content ID filtering in place to screen every possible upload by an EU citizen for potentially copyright-infringing content. Every copyright violation that wasn't immediately dealt with would, by the way, mean a massive fine.
We are only going to accept this bounty if it's going to interface with the existing predominant cloud services. Having it go through some guy's private server is a complete non-starter, and will simply not be accepted.
I'm going to try to get @bparker06 involved in this conversation, so that we can come to some kind of consensus and some kind of gameplan on what it is we are going to do.
I discussed this with @bparker06 and @fr500. I think all three of us are in agreement that the most pragmatic way to go about this, is to actually implement additional VFS backend drivers. We could have VFS backend drivers for say webdav, ssh, ftp, any other cloud-based thing, etc., etc. This would neatly fit into our existing system and would also be usable by libretro cores as well.
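To sketch the shape of such a backend (purely illustrative; the actual libretro VFS interface has its own types and signatures):

#include <stdint.h>

/* Hypothetical per-protocol VFS backend driver; one of these each for
 * local disk, webdav, ssh, ftp, and so on. RA and cores go through the
 * vtable and never care where the bytes actually live. */
typedef struct vfs_backend
{
   const char *scheme; /* "file", "webdav", "sftp", "ftp", ... */

   void    *(*open)(const char *path, unsigned mode);
   int64_t  (*read)(void *handle, void *buf, uint64_t len);
   int64_t  (*write)(void *handle, const void *buf, uint64_t len);
   int      (*close)(void *handle);
} vfs_backend_t;

/* A path like "webdav://host/retroarch/states/..." would be routed to
 * the driver whose scheme matches; plain paths go to the "file" one. */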
Tip: on PCs you can use the rclone mount command. It supports many backends and works pretty well for me.
Here's what I was working on https://github.com/erfg12/RetroArch-network-transfer
The project managers want to take this in a different direction, but at least here's a working concept that people can use right now.
Dropbox or Google Drive would be great so nobody needs to host anything themselves. I pledged $5 to the bounty.
Any update on this? Is this dead, or is it worth adding to the bounty? This is the feature RetroArch lacks the most.
> Any update on this? Is this dead, or is it worth adding to the bounty? This is the feature RetroArch lacks the most.
Agreed. In this day and age I honestly don't know anyone personally who strictly plays on one system, except for my console friends who only know and want PlayStation no matter what.
Willing to pitch in as well, but won't do it if the concept is buried and simply not talked about.
I am all for this as well. This is probably the single greatest thing holding me back from using RetroArch more than I do. I managed to create a workaround for my phone and PC, but I would install this on almost everything if it had cloud saves.