motioneyeos
New bug report: Google Drive no longer accepts connections to get new key in Files, Uploads
Preliminary Docs
I confirm that I have read the CONTRIBUTING guide before opening this issue. Yes
I confirm that I have read the FAQ before opening this issue. Yes
motionEyeOS Version
I am running motionEyeOS version: dev20201026
Board Model
I am using the following board/model: Any
Camera
I am using the following type of camera: N/A
My camera model is: N/A
Network Connection
My motionEyeOS unit is connected to the network via: Either Ethernet or WiFi
Peripherals
I am using the following peripherals that I consider relevant to this issue:
N/A
When attempting to set up Files, Uploads, Google Drive, Obtain Key, before you can do anything else you get an error from Google, whose error page just points at its related developer documentation.
I've just encountered the error. Is there any manual way to obtain the key?
Not that I am aware of; you could check with Google...
In the developer notes from Google, it says you can add something to the request to 'bypass' the OAuth error (not recommended); however, it would entail decompiling the .pyc files to find where the request is made, and it isn't a real fix.
I am encountering the same issue running dev20201026
Same issue here, using motionEye in Home Assistant via the HACS community add-on. It was working before with the same setup, without issues. I had to reinstall Home Assistant due to an unrelated issue and now I get the "Access blocked: motionEye's request is invalid" message when clicking "Obtain Key".
I have one camera (camera1) that can upload files to google drive. Adding another camera requires me to obtain a new key from google drive. Is there a way to use the camera1 key for new cameras?
I'm not sure. From what I understood, all currently working auth keys will stop working at some point anyway: https://developers.google.com/identity/protocols/oauth2/resources/oob-migration
Hopefully the devs will have a solution soon; I really can't afford to pay Dropbox prices, even with Black Friday around the corner. Maybe this is a good time to consider implementing an AWS S3 bucket, Backblaze B2, or a self-hosted solution like Nextcloud. I'll keep my fingers crossed for Nextcloud or B2.
I don't know. The key should be in the camera1.conf file, and you could try to use it in the camera2 setup (don't bother with trying the Obtain Key function). I doubt the new devs will add functionality for AWS S3, B2, or the other options, as they aren't trying very hard to get the new release out there for motionEye, let alone motionEyeOS. They aren't even acknowledging new issues here.
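For anyone wanting to try the suggestion above, here is a minimal sketch of copying a stored key line from one camera config to another. The option name `@upload_authorization_key` is an assumption for illustration; open your own camera1.conf and check what motionEye actually calls the key line before using anything like this.

```python
# Sketch only, not verified against motionEye: copy the Google Drive
# authorization key line from one camera config file to another.
# The option name "@upload_authorization_key" is an assumption; check
# your camera1.conf for the real name of the key line.

def copy_upload_key(src_conf, dst_conf, option="@upload_authorization_key"):
    """Copy the `option` line from src_conf into dst_conf, replacing any
    existing line for the same option. Returns True if a key was found."""
    with open(src_conf) as f:
        # grab the first line in the source config that mentions the option
        key_line = next((line for line in f if option in line), None)
    if key_line is None:
        return False
    with open(dst_conf) as f:
        # drop any stale key line from the destination config
        dst_lines = [line for line in f if option not in line]
    dst_lines.append(key_line)
    with open(dst_conf, "w") as f:
        f.writelines(dst_lines)
    return True
```

Back up camera2.conf first and restart motionEye afterwards so it re-reads the config.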
I understand you used to be the dev (great work, btw), but it looks like you still spend a lot of your time here. I think the Google OAuth OOB migration is going to make Google Drive fail for all motionEye and motionEyeOS users at some point in the next month or so (or maybe they all stopped working on 5th October); most people won't even notice until they check. I don't know how much contact you have with the new devs, but this is probably going to affect a lot of users. If they are not going to deal with the issue, should we consider motionEye to be slow for updates and mOS to be an unmaintained project?
@xjon685 I am not, nor have I ever been, a dev; CCrisan was the originator. I have been here for a number of years, originally helping CCrisan with tech support (my specialty). As for current dev work, the new devs are almost exclusively focused on motionEye (not mEOS), converting from Python 2 to Python 3. I created tickets for both mE and mEOS concerning the issue. I really hate to think it's been abandoned by the new devs, but it is a possibility (depressing).
Looking forward to having this sorted out. My motionEye is not usable anymore without Google Photos, or at least Drive, upload.
I wouldn't hold your breath; I'm considering this to be a dead project now, which is sad.
I'm looking at workarounds after concluding this won't be fixed, realistically ever. I'm now saving videos to my local NAS (OpenMediaVault 6), then using rclone to move them to Google Drive or Nextcloud. You get the same result as the "clean cloud" option by using the local "save videos for X time" setting together with rclone's sync option.
I tried Google Drive as a workaround and it was horrible. Maybe it isn't designed for large files; it would just stop. However, I like my current solution even more than having this feature in motionEye: I learned how to use rclone.

Note: do not use the rclone that comes with the latest Raspberry Pi OS. It is super outdated; get the one from https://rclone.org/, as Google changed their API between those versions.

I followed the rclone setup for Google Drive: https://rclone.org/remote_setup/. I used the name "google" for my Google Drive remote, used the defaults, and had to port forward as explained in the setup above. After setting it up I use the following script. The folder in my Google Drive is called garage because that is the area I monitor with my motionEye setup: /scripts/move_motioneye_to_google.sh
```bash
#!/bin/bash
google_folder="google:/garage/"

# so it only gets run once for the loop
yesterday=$(date --date="yesterday 00:00:00" +%s)

# copy all new files over
rclone --log-level INFO --log-file=/var/log/rclone.log copy /motioneye/ "${google_folder}"

# remove all local date folders older than today
while read folder; do
    if [ "$(echo "${folder}" | egrep '^[0-9]{4}-[0-9]{2}-[0-9]{2}$')" != "" ] && [ "$(date -d "${folder} 00:00:00" +%s)" -lt "${yesterday}" ]; then
        rm -rf /motioneye/${folder}
    fi
done <<< $(ls -1 /motioneye)

# so it only gets run once for the loop
fortnight_ago=$(date --date="14 days ago 00:00:00" +%s)

# remove all remote date folders older than a fortnight
while read folder; do
    if [ "$(echo "${folder}" | egrep '^[0-9]{4}-[0-9]{2}-[0-9]{2}$')" != "" ] && [ "$(date -d "${folder} 00:00:00" +%s)" -lt "${fortnight_ago}" ]; then
        rclone delete "${google_folder}${folder}"
        rclone rmdir "${google_folder}${folder}"
    fi
done <<< $(rclone lsf --dirs-only "${google_folder}" | cut -d/ -f1)
```
I set the script in motionEye to run every time a video finishes. motionEye writes the files to /motioneye/; my script syncs them from there to Google Drive, removes old local directories (named by the default date settings for file names), then removes all two-week-old directories from my Google Drive. This works really well for me.
I'm also experiencing this issue. It's a pity.
Yep, I also just added another camera and found this. I was about to delete the container and start again (lucky I didn't), but unless a statement gets released, it does look like motionEyeOS is dead. Can anyone recommend an alternative?
Same here. Please fix
Same issue, 1/13/2023.
I was able to solve it by writing a Python script that copies the file to my local server, then syncing that folder to Google Drive.
mkn025
Could you share this code please?
Sure. It is quite simple really, it just copies the newest file from where the camera stores it to the server using scp. I am at my cabin right now, but I will share the code when I get back home on Monday.
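In the meantime, here is a rough sketch of the approach described above: find the newest file under the storage directory and hand it to scp. The storage path and the `user@server:` destination are placeholders for illustration, not the actual values from this thread.

```python
# Hypothetical sketch of the approach described above: copy the newest
# recording to a server with scp. The paths and the destination string
# are placeholders; substitute your own.
import subprocess
from pathlib import Path

def newest_file(root):
    """Return the most recently modified regular file under root, or None."""
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    return max(files, key=lambda p: p.stat().st_mtime, default=None)

def push_newest(root="/var/lib/motioneye", dest="user@server:/srv/recordings/"):
    """Copy the newest recording to the server; assumes key-based SSH auth
    is already set up so scp runs without a password prompt."""
    latest = newest_file(root)
    if latest is None:
        return False
    subprocess.run(["scp", str(latest), dest], check=True)
    return True
```

Hooked up as motionEye's "movie end" command, something like this would fire after every recording; rclone on the server can then sync that folder to Google Drive, as described earlier in the thread.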
Much appreciated 👍
I am just trying out this Notion thing: https://nitramknutsen.notion.site/Transfer-files-to-the-server-from-Motion-Eye-using-SCP-12753569ad4f47fb913453acabbf3eb4?pvs=4