Take me to Pitcairn

Ever since I was given a compendium of the world's countries as a child, my daydreams have often involved visiting the smallest and/or unrecognized entries on the list. One of the islands that fired my imagination the most was Pitcairn – described in the compendium as inhabited by 37 people and as one of the most isolated inhabited territories in the world. The compendium said nothing about the child sexual abuse scandal that was reaching the media at roughly the same time. An excerpt from Kathy Marks's book describes the whole affair.

Since then I've learned a great deal more about Pitcairn. I especially recommend the book „Jutro przypłynie królowa" („The Queen Will Arrive Tomorrow"); some time ago there was also an AMA on Reddit with one of the island's inhabitants. Today I came across the film "Take me to Pitcairn", which follows a Briton who wanted to visit the island. Although Pitcairn does appear in the film, the journey there strikes me as just as interesting.

After watching the film, my desire to ever go there – already strongly dampened by Wasielewski's book – shrank considerably. Still, I do recommend the documentary.

Amazon Glacier: storing backups you don’t need too often

I recently bought myself a camera that can shoot RAW pictures. That means that before I publish them, I need to pass them through a photo-processing application that lets me fix some mistakes I made while taking the picture. I’m using darktable for that. When I was downloading pictures from my recent trip [PL] to my disk, it turned out I had run out of disk space. Of course, the main culprit was my precious RAW pictures lying on my disk. The thing is, I don’t really expect to use them anymore, as I’ve already created JPG files out of them, but I would like to have backups somewhere just in case. There’s no need for me to be able to access those backups in an instant – I can wait.

That’s why I’ve been using Amazon Glacier for backups for some time already. Right now I have 40GB of data there, paying an amazing 20 US cents each month – Glacier’s storage price is a fraction of a cent per GB per month, so 40GB really does come out to pocket change. These are mostly my university assignments and some old photos that I also have backed up on Google Photos. Doesn’t sound like something I might retrieve too often, right? Yup, this is only for me to be certain that I still have access to that data if I ever get sentimental and want to take a look at my first-semester C++ code. There is one issue, though: if I want to retrieve my data, I’ll need to wait up to several hours for it to become available.

Linux setup

There’s a nice article at howopensource describing how to set up Amazon Glacier on Linux. I simply followed it to set up my Linux client and send some backups.
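In short, the setup boils down to installing the glacier-cmd client and pointing it at your AWS credentials. A minimal sketch of the config file, as I recall it from the glacier-cmd README – treat the exact key names as an assumption and double-check them against the article:

    # ~/.glacier-cmd – credentials and region for the glacier-cmd client
    [aws]
    access_key=YOUR_AWS_ACCESS_KEY
    secret_key=YOUR_AWS_SECRET_KEY

    [glacier]
    region=us-east-1

With that in place, glacier-cmd mkvault backup creates a vault and glacier-cmd upload backup <archive> sends files to it – that’s the command you’ll see in the Issues section below.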

Windows setup

On Windows, I’ve used a client called FastGlacier. It worked quite smoothly, I must admit.

Issues

  • You need to remember that the upload process is quite slow too. I’m uploading some .tar.gz packages right now and the speed isn’t extraordinary, even though I have a really fast Internet connection:
    mat@mat-G41MT-D3:~/Pictures/Darktable$ glacier-cmd upload backup foto-20170210.tar.gz
    Wrote 677.0 MB of 1.7 GB (38%). Rate 644.25 KB/s, average 617.04 KB/s, E
  • Retrieval of files can take up to several hours. This isn’t something you’d want to rely on as a failover when a disaster wipes out your entire site.
  • It isn’t a good fit if you want to retrieve your entire backup at once – one person ended up paying $150 to retrieve 60GB of data.

Other backups

What are your backup options? I considered external hard drives, but they are quite expensive as a single up-front payment.

Content-Security-Policy: manage security settings of your app

There are already lots of ways to prevent XSS. We can make sure data input by the user is never rendered with any markup they might have provided, encode the data, prevent scripts from being saved to the DB, or add the X-XSS-Protection header. Some of these protections are now incorporated into modern frameworks, which is why adding a team with a script in its name didn’t cause any harm on Matchlogger.

One of the still lesser-known ways to secure your application further is the Content-Security-Policy header. As the reference guide states,

The new Content-Security-Policy HTTP response header helps you reduce XSS risks on modern browsers by declaring what dynamic resources are allowed to load via a HTTP Header.

The header is now supported by all modern browsers (note: IE11 supports only the deprecated X-Content-Security-Policy header). It is fairly easy to add to a Spring Boot application – the only part I’ve added to the config is the headers() section.
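For illustration, a minimal sketch of what such a config can look like – the class is hypothetical, not Matchlogger’s actual code, and the policy value is just an example; Spring Security exposes the header via contentSecurityPolicy():

    import org.springframework.context.annotation.Configuration;
    import org.springframework.security.config.annotation.web.builders.HttpSecurity;
    import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
    import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;

    @Configuration
    @EnableWebSecurity
    public class SecurityConfig extends WebSecurityConfigurerAdapter {

        @Override
        protected void configure(HttpSecurity http) throws Exception {
            // headers() is where Spring Security's response-header support lives;
            // contentSecurityPolicy() attaches the Content-Security-Policy header
            // with the given policy to every response.
            http.headers()
                .contentSecurityPolicy("default-src 'self'");
        }
    }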

Let’s check what happens if we use a default policy on Matchlogger:

Boom. We lose all styles, images, and JS, as the browser blocked those resources because of the Content-Security-Policy header in the response. We can still add teams, though, which is good – the application seems to work without all that.

Each of the issues reported in the developer console shows us something that might be important from an application-security perspective. While in such a small app all of those issues can be dealt with – with more or less effort – in a big application with lots of legacy code we might take a different approach. Here the best option would be to bundle all the libraries and styles the app uses into the application itself and, instead of allowing users to point to any crest URL, let them upload pictures (making sure the upload is secured too!).

Content-Security-Policy both lets us define a host whitelist for script/image sources and provides some keyword sources we can use. For example,

    script-src 'self' unpkg.com 'unsafe-inline' 'unsafe-eval'

would allow us to use JS from our own origin and from unpkg.com, as well as inline scripts and the eval() instruction. It still maintains a whitelist of script sources, but it is useful in cases where we either explicitly want to allow inline scripts and eval(), or we know we have them in our codebase and want to manage them from now on as a risk we intend to mitigate in the future.
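Put together as a full response header (with unpkg.com standing in for any third-party host we trust), this would look like:

    Content-Security-Policy: default-src 'self'; script-src 'self' unpkg.com 'unsafe-inline' 'unsafe-eval'

Everything not covered by script-src falls back to default-src, so styles and images would still be restricted to our own origin here.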

So, is this header helpful? From two perspectives it is.

First of all, we add a new layer of protection – the browser compares the content it received from the server against the declared policy and blocks any scripts that were added without authorization.

Another good thing here is that while implementing Content-Security-Policy, we get to know our application and its potential security risks better. That’s always a good thing as we want to make sure we don’t harm our users.