I want to have a mirror of my local music collection on my server, and a script that periodically updates the server to, well, mirror my local collection.
But crucially, I want to convert all lossless files to lossy, preferably before uploading them.
That’s the one reason why I can’t just use git - or so I believe.
I also want locally deleted files to be deleted on the server.
Sometimes I even move files around (I believe in directory structure), and again, git would handle this perfectly if it weren’t for the lossless-to-lossy caveat.
It would be ideal if my script could recognize moves the way git does, instead of deleting and re-uploading the same file at a different location.
My head is spinning round and round, and before I continue messing around with find and scp, it’s time to ask the community.
I am writing in bash, but if some Python module could help with this, I’m sure I could find my way around it.
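To make it concrete, here is roughly the shape of what I’ve been messing with so far. It’s a naive sketch; ffmpeg with Opus at 128k as the lossy target, and all paths and host names, are just placeholders I happen to be trying. It re-uploads everything on every run and handles neither deletions nor moves, which is exactly my problem:

```bash
#!/usr/bin/env bash
# Naive one-way pass: build a lossy staging tree locally, then push it.
set -euo pipefail

SRC="$HOME/Music"          # home of the collection
STAGE="$HOME/.music-stage" # local staging tree mirroring SRC, but lossy

find "$SRC" -type f -print0 | while IFS= read -r -d '' f; do
  rel="${f#"$SRC"/}"
  mkdir -p "$STAGE/$(dirname "$rel")"
  case "$f" in
    *.flac|*.wav)
      out="$STAGE/${rel%.*}.opus"
      # re-encode only if the staged copy is missing or older than the source
      [ "$out" -nt "$f" ] || ffmpeg -loglevel error -y -i "$f" -c:a libopus -b:a 128k "$out"
      ;;
    *)
      # already-lossy formats (mp3, ogg, ...) are copied as-is
      cp -u -- "$f" "$STAGE/$rel"
      ;;
  esac
done

# scp re-sends the whole tree every time and never deletes anything remotely
scp -r "$STAGE"/. user@server:/srv/music/
```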
TIA
additional info:
- Not all files in the local collection are lossless; it’s a variety of formats.
- The purpose of the remote is for listening/streaming with various applications
- The lossy version is for reducing both upload and download (streaming) bandwidth. On mobile broadband, FLAC tends to buffer a lot.
- The home of the collection (and its origin) is my local machine.
- The local machine cannot act as a server
If you were to use Git, deleted files would be removed from the working copy, but not from history. The data would still be there, taking up disk space, even though it wouldn’t be transmitted again.
I’d look at existing backup and file sync solutions. They may have what you want.
For an implementation, I would work with an index. If you store path + file size + content checksum, you can match files that live under different paths. By comparing the local index against the remote one, you can identify file moves and replay them on the remote side; see the sketch below.
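A rough sketch of that index in bash (since that’s what you’re working in). Everything here is illustrative: md5sum/GNU stat, the tab-separated layout, and the file and host names are my own choices. One wrinkle specific to your setup: since the remote files are transcoded, the index you compare against has to record the checksum of the local source as it was at upload time; checksumming the lossy remote files would never match. So keep a copy of the index from the last successful sync and diff against that:

```bash
#!/usr/bin/env bash
set -euo pipefail
export LC_ALL=C  # consistent sort order for join

# Emit a sorted index of "checksum <TAB> size <TAB> relative path".
index() {
  local root=$1
  find "$root" -type f -print0 |
  while IFS= read -r -d '' f; do
    printf '%s\t%s\t%s\n' \
      "$(md5sum < "$f" | cut -d' ' -f1)" \
      "$(stat -c%s "$f")" \
      "${f#"$root"/}"
  done | sort
}

index "$HOME/Music" > current.idx
touch previous.idx  # first run starts from an empty baseline

# Same checksum+size under a different path = a move. Assumes no duplicate
# files; with duplicates the join fans out and needs more care.
join -t $'\t' previous.idx current.idx |
  awk -F'\t' '$2 == $4 && $3 != $5 { printf "%s\t%s\n", $3, $5 }' |
  while IFS=$'\t' read -r old new; do
    # replay the move on the server; drop the echo to actually run it
    # (the lossless -> lossy extension mapping still needs to be applied)
    echo ssh user@server mv -- "/srv/music/$old" "/srv/music/$new"
  done

cp current.idx previous.idx  # baseline for the next run
```

Paths that appear only in previous.idx are deletions, and paths only in current.idx are new files to transcode and upload; comparing the path columns of the two indexes (e.g. with comm) gets you those lists.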