I recently bought myself a camera that can shoot RAW pictures. That means that before I publish them, I need to pass them through a photo processing application that lets me fix some of the mistakes I made while taking the picture. I’m using darktable for that. When I was downloading pictures from my recent trip [PL] to my disk, it turned out that I had run out of disk space. Of course, the main culprits were my precious RAW pictures lying on my disk. The thing is, I don’t really expect to use them anymore, as I’ve already created JPG files out of them, but I’d like to keep backups somewhere just in case. I don’t need to be able to access those backups in an instant – I can wait.
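Before uploading, the RAW files can be bundled into compressed archives. A minimal sketch of how I do it – the source path and the foto-YYYYMMDD naming are just examples:

```shell
# Bundle a directory of RAW files into a dated .tar.gz archive for upload.
# SRC and the archive name are example values, not anything fixed.
SRC=~/Pictures/Darktable
OUT=foto-$(date +%Y%m%d).tar.gz

tar czf "$OUT" -C "$SRC" .   # -C: archive contents relative to SRC
ls -lh "$OUT"                # sanity-check the archive size before uploading
```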
That’s why I’ve been using Amazon Glacier for backups for some time now. Right now I have 40 GB of data there, paying the amazing amount of 20 US cents each month. These are mostly my university assignments and some old photos that I also have backed up on Google Photos. Doesn’t sound like something I’d retrieve too often, right? Yup, this is only so I can be certain that I still have access to that data if I ever get sentimental and want to take a look at my first-semester C++ code. There is one catch, though: if I want to retrieve my data, I’ll need to wait up to several hours for it to become available.
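The 20 cents for 40 GB lines up with Glacier’s storage price at the time, roughly $0.004 per GB-month (the exact rate varies by region, so treat it as an assumption and check current pricing):

```shell
# Rough monthly storage cost: size_in_GB * price_per_GB_month.
# $0.004/GB-month is my recollection of the us-east-1 Glacier rate; verify.
awk -v gb=40 -v rate=0.004 'BEGIN { printf "$%.2f per month\n", gb * rate }'
```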
There’s a nice article at howopensource describing how to set up an Amazon Glacier client on Linux. I used it to set up my own Linux client and send some backups.
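The setup boils down to pointing glacier-cmd at your AWS credentials and creating a vault. A sketch of what that looked like for me – the config section and key names are from memory of the tutorial, so double-check them against `glacier-cmd --help` and the article itself:

```shell
# ~/.glacier-cmd -- glacier-cmd's config file (section/key names as I recall
# them; verify against the howopensource article before relying on this):
#
#   [aws]
#   access_key=YOUR_ACCESS_KEY
#   secret_key=YOUR_SECRET_KEY
#
#   [glacier]
#   region=us-east-1

# With credentials in place, create a vault and push an archive into it:
glacier-cmd mkvault backup
glacier-cmd upload backup foto-20170210.tar.gz
```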
On Windows, I’ve used a client that’s called Fastglacier. Worked quite smoothly, I must admit.
- You need to remember that the upload process is quite slow too. I’m uploading some .tar.gz packages right now and the speed isn’t extraordinary, even though I have a really fast Internet connection:
mat@mat-G41MT-D3:~/Pictures/Darktable$ glacier-cmd upload backup foto-20170210.tar.gz
Wrote 677.0 MB of 1.7 GB (38%). Rate 644.25 KB/s, average 617.04 KB/s, E
- Retrieval of files can take up to several hours. This isn’t something you’d want as a failover scenario when you have a failure that wipes out your entire site.
- It isn’t a good fit if you want to retrieve your entire backup at once, either. There was a person who paid $150 to retrieve 60 GB of data.
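How a 60 GB download could cost $150: under Glacier’s original (pre-2016) pricing, you were billed on your peak hourly retrieval rate, extrapolated over the whole month. The formula and rate below are my recollection of that old scheme, so take them as an illustration of the peak-rate multiplier rather than exact pricing – pulling everything down quickly was what made it expensive:

```shell
# Old Glacier retrieval pricing, roughly:
#   cost ~= peak_GB_per_hour * $0.01 * 720 hours_in_month
# (free allowance ignored; rate and formula are recollections -- verify).
# Example: fetching 60 GB in a 4-hour burst means a 15 GB/h peak rate.
awk -v gb=60 -v hours=4 'BEGIN { printf "~$%.0f\n", (gb/hours) * 0.01 * 720 }'
```

Spreading the same retrieval over days instead of hours would have lowered the peak rate, and with it the bill.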
What are your backup options? I considered external hard drives, but they require quite an expensive up-front payment.