First, a word of caution: the solution presented below does not fully work for me at the moment. Issue #1: it only downloads a subset of all my photos, so it seems to randomly miss some pictures. Issue #2 is a known issue documented on the project's webpage: GPS data is stripped from the pictures. I haven't had time to test out the experimental functionality to grab that data another way.
In the meantime, I'm thinking about automating the handling of Google Takeout export emails. This service can be set up to send you an email every two months with a download link for all your data stored in the Google cloud. Two caveats though: 1) the service can only be set up for a year (6 backups) and needs to be restarted thereafter, and 2) from what I understand, you would download a full backup every second month. But hey, with today's bandwidth and unlimited volumes, that's not such a big issue.
I finally gave in to the amazing functionality that Google Photos provides, using AI for object detection, location data and much more to help you find a picture you're looking for. Also, the way it integrates with Android is superb: if I switch to a new phone or use my tablet every now and then, my photos are just there, and I don't need to worry anymore about syncing anything. Even though I generally like to stay on top of my data and manage myself what is synced to where, it is nice to have something that is just there and works automatically for once :)
However, I want my data secured on my own server so I can take responsibility for backups. Should I ever lose data, I want it to be my fault entirely, so I have nobody to blame but myself. I believe that with local backups reaching far back in time and a current dataset backed up to an offsite server, I keep my data pretty safe, so I want to be able to use this infrastructure for the pictures I store on Google Photos as well.
I'm using a tool called gphotos-sync. It syncs all your pictures to a local folder, or to be more precise, to a local folder structure, and it also syncs albums by symlinking the photos contained in each album. So in the end you have a nice, sleek directory structure which is perfect for further backups, and for eventually moving to something else if I don't like Google Photos anymore in the future :)
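From memory, the resulting layout looks roughly like this (the exact folder naming is an assumption on my part and may differ between gphotos-sync versions):

```
photos/
  2019/
    07/
      IMG_1234.jpg
albums/
  2019/
    0714 Hiking trip/
      IMG_1234.jpg -> ../../../photos/2019/07/IMG_1234.jpg
```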
First I needed to install pip for Python 3:
apt install python3-pip
followed by gphotos-sync (I've installed this as an unprivileged user):
pip3 install gphotos-sync
Then add the ~/.local/bin folder to your user's PATH environment variable (e.g. in .bashrc).
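For example, assuming bash and the default pip user-install location, a line like this in .bashrc does the trick:

```shell
# pip places scripts for per-user installs into ~/.local/bin;
# prepend it to PATH so the gphotos-sync command is found.
export PATH="$HOME/.local/bin:$PATH"
```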
Now we need to create our own client ID with Google for our instance of gphotos-sync and save the resulting credentials to:
~/.config/gphotos-sync/client_secret.json (I had to create the directory first)
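For reference, the file you download from the Google API console for an "installed app" OAuth client looks roughly like this (all values below are placeholders, not real credentials):

```json
{
  "installed": {
    "client_id": "1234567890-example.apps.googleusercontent.com",
    "client_secret": "PLACEHOLDER-SECRET",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "redirect_uris": ["http://localhost"]
  }
}
```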
Now run gphotos-sync, passing the folder where you want to keep your local copy of your photos as an argument:
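For example (the target folder ~/gphotos is just my choice, use whatever you like):

```
# first run triggers the login flow, then downloads
# the library into the given folder
gphotos-sync ~/gphotos
```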
It will now ask you to open a link and allow the app to access your Google Photos. Do that, and paste back the code you are given after logging in.
Last but not least, to run this automatically in the background, add the above command to your user's crontab (by running crontab -e as that user).
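As a sketch, an entry that syncs every night at 3:00 could look like this (paths are my examples; note that cron does not read your .bashrc, so the full path to the binary is spelled out):

```
# m  h  dom mon dow  command
  0  3  *   *   *    $HOME/.local/bin/gphotos-sync $HOME/gphotos
```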