About this blog

'Going Spatial' is my personal blog; the views on this site are entirely my own and should in no way be attributed to anyone else or taken as the opinion of any organisation.


Thursday 17 January 2013

Show me security in the cloud!



Padlock the cloud!
Security in the cloud for client data and applications is of paramount concern for many of my customers. Many are large central government departments used to having their own data, servers and networks in their own silos, secure from everyone else. They hold sensitive data about you and me and everything in between, and they are obliged to make sure that these precious resources are only used for their intended purpose and are not leaked to others. That's not a bad thing now, is it?

Another group of my users are defence and security agencies, and once again, operationally sensitive data cannot be leaked or compromised. Their servers and resources cannot be allowed to be tampered with, and if there's a real emergency, at least they can post guards around servers that they own!
So it is a challenge for cloud specialists to advocate moving some (or all) of an organisation's resources to the cloud. Certainly, despite initial reservations, the UK government has started to look seriously into the cloud through its 'G-Cloud Programme'[1], with quite ambitious and far-sighted goals. Clearly, the much-heralded advantages of cloud computing, its much lower TCO, elasticity of use, pay-as-you-go model and lower barriers to rapid deployment of applications and services, have the austerity-minded coalition government looking into this area with great interest.

The G-Cloud strategy document is available for download here while their website is here: http://gcloud.civilservice.gov.uk/
 
Since a lot of my company's business is in the public sector, the scene was therefore set. Over the last couple of years, a common perception started to entrench itself amongst some IT practitioners: the usual mantra that the cloud was insecure, that the servers were located in another country where physical security would be more lax and therefore prone to theft, and that potentially poor infrastructure meant cloud computing was only useful for photo-sharing sites like Flickr and Picasa.
I have been challenging some of the more entrenched positions internally, and while security is a number-one concern for me, I found that the perception of the cloud's insecurity did not always match the reality.
Let me therefore highlight some very common-sense approaches to securing your application and data in the cloud. I will use Amazon Web Services (AWS) as my main public cloud provider example, but I am sure most of this applies to the other providers out there. As I discuss the various options, imagine if you will the layering up of a 'sandwich' of defensive security measures.

Assumption: ‘The servers are physically not in this country! How can we be sure that they are not being copied onto USB drives and sold on the black market?’

Securing your data on the cloud machines

All your valuable data on any server should be securely encrypted, so that even if someone runs off with a copy of the volume, they will need to know the password to decrypt it. On Windows, for example, you can use the Encrypting File System (EFS)[2].

Figure 1 How to use Windows EFS

An alternative is the popular freeware encryption tool TrueCrypt[3]. I personally use TrueCrypt at home as it is fast, easy to use and free. It creates a virtual encrypted disk within a file and then mounts it as a disk under Windows, appearing as a volume. Of course, thinking securely, one should disable the 'auto-mount-and-decrypt' option for TrueCrypt, as it rather defeats the point of securing the system! Best of all, the file can be copied and stored on a portable drive. TrueCrypt used to require administrative rights on the machine in question, but I think this has changed now.
Also, I have yet to hear even a rumour that it is possible to hack into and copy an EBS volume that you don't own.
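
If you prefer to script this idea of encrypting data before it ever touches a cloud volume, here's a minimal Python sketch using the third-party cryptography library (my choice purely for illustration; the file names are hypothetical):

```python
# A minimal sketch of application-level encryption at rest, assuming the
# third-party 'cryptography' package is installed (pip install cryptography).
# Same principle as EFS/TrueCrypt: a copied volume is useless without the key.
from cryptography.fernet import Fernet

# Generate the key once and store it somewhere safe, NOT next to the data.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a (hypothetical) sensitive file before writing it to the volume.
with open("customers.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())
with open("customers.csv.enc", "wb") as f:
    f.write(ciphertext)

# Decryption only works on a machine that holds the key.
with open("customers.csv.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
```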

Assumption: the public cloud is on the internet, which means anyone can get into the dashboard and then into your instances!

Make sure you are connecting to the Amazon Web Services Dashboard

An obvious one, but there are plenty of fake websites out there that you can stumble into. Avoid using shortcuts and make sure you know the entry point of the AWS dashboard. It should always use the HTTPS protocol. Check the certificate and make sure it was issued to amazon.com.

Figure 2 Make sure the dashboard is using HTTPS and that the certificate is valid
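
If you want to go one step further than eyeballing the padlock icon, you can check the certificate programmatically. A small sketch using only the Python standard library (I'm assuming console.aws.amazon.com as the dashboard endpoint for illustration):

```python
# Sketch: verify that the host presents a valid certificate chain for the
# expected domain. create_default_context() enforces both chain validation
# and hostname matching, and raises an SSL error on failure.
import socket
import ssl

HOST = "console.aws.amazon.com"  # assumed dashboard endpoint

context = ssl.create_default_context()
with socket.create_connection((HOST, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()
        subject = dict(pair for rdn in cert["subject"] for pair in rdn)
        issuer = dict(pair for rdn in cert["issuer"] for pair in rdn)
        print("Issued to:", subject.get("commonName"))
        print("Issued by:", issuer.get("commonName"))
```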


Assumption: the AWS passwords are not part of our Active Directory domain, and therefore they are not governed by the same rigorous password policy!

Utilise an Access Control List (ACL)

Use AWS Identity and Access Management (IAM) to control who has access to the AWS Dashboard. For a while, my team was using the main AWS account (let's call it the root account) to access the dashboard and the EC2 platform. Then, after a security audit, we realised how exposed this was. At about the same time, AWS came out with what was then a new product, IAM, which lets you create users and groups under the main root AWS account. These new users have their own passwords and varying levels of access to products and services. They will NOT be able to access the account profile information or (importantly) the billing and metering information; for that, you will still need the main root AWS account.
Even better, we had users within the organisation who were only interested in using a subset of AWS products (i.e. just EC2, not S3 or RDS), and IAM was perfect for them. There's a whole raft of stuff about IAM that I could talk about, but that will be for another time[4].
Finally, there is a password policy available; it is best one uses it.


Figure 3 The AWS IAM password policy editor
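
For those who like to script their setup, here's roughly what the same arrangement looks like with the boto3 Python SDK (a modern assumption on my part; the user and group names are hypothetical):

```python
# Sketch: create a limited IAM user and group instead of sharing the root
# account, then enforce an account-wide password policy (as in Figure 3).
# Requires credentials that are allowed to administer IAM.
import boto3

iam = boto3.client("iam")

# A group whose members can use EC2 but not S3, RDS, billing, etc.
iam.create_group(GroupName="ec2-operators")
iam.attach_group_policy(
    GroupName="ec2-operators",
    PolicyArn="arn:aws:iam::aws:policy/AmazonEC2FullAccess",
)

iam.create_user(UserName="alice")  # hypothetical user
iam.add_user_to_group(GroupName="ec2-operators", UserName="alice")

# The password policy applies to every IAM user on the account.
iam.update_account_password_policy(
    MinimumPasswordLength=14,
    RequireSymbols=True,
    RequireNumbers=True,
    RequireUppercaseCharacters=True,
    RequireLowercaseCharacters=True,
    PasswordReusePrevention=5,  # block re-use of the last five passwords
    MaxPasswordAge=90,          # force rotation every 90 days
)
```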


Assumption: username and password management can be notoriously lax, with important information being slapped onto post-it notes all over one’s monitor. I might not be able to guess your password, but I doubt you will be able to remember it either. You’ve written it down somewhere obvious. I am going to find it.

Securing the user account

Related to the ACL, each of our user accounts is subject to our password policy procedures, which include minimum password length, limitation of re-use (come on, you're not using the SAME password for AWS as for your work domain, are you?) and the use of a multi-factor authentication device[5]. The latter is important: even if someone DID manage to phish your username and password, they cannot access the dashboard since they also need the MFA device. You need all three to log in:


Figure 4 The AWS login using IAM



Figure 5 The MFA login right after
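
The MFA association itself can also be scripted. A sketch with boto3, again as illustration rather than gospel:

```python
# Sketch: attach a virtual MFA device to an IAM user. The create call returns
# a secret seed (and a QR code) that the user loads into an authenticator;
# the enable call then requires two consecutive codes to prove possession.
import boto3

iam = boto3.client("iam")

device = iam.create_virtual_mfa_device(VirtualMFADeviceName="alice-mfa")
serial = device["VirtualMFADevice"]["SerialNumber"]

iam.enable_mfa_device(
    UserName="alice",              # hypothetical user from earlier
    SerialNumber=serial,
    AuthenticationCode1="123456",  # first code shown on the device
    AuthenticationCode2="654321",  # the very next code, to confirm sync
)
```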



Assumption: data transfer from your internal network to the cloud can be intercepted!

Transferring data to the cloud securely

There are a number of ways to transfer data to the cloud. For small files, one can use the built-in remote desktop options of copy and paste, or remotely mount a drive. You can even email files to yourself and access the same email from the AWS instance, or use Dropbox or some other cloud-enabled storage provider. For large files, and to transport them securely, it is recommended that you use a secure file transfer system. Examples include WinSCP, WinSSHD and OpenSSH.
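
If you'd rather script a transfer than click through a GUI, here's a sketch using the paramiko Python library, the scripted cousin of WinSCP (the hostname, key path and file names are all hypothetical):

```python
# Sketch: upload a file to a cloud instance over SFTP with paramiko
# (pip install paramiko). Note that we send the already-encrypted file.
import os
import paramiko

HOST = "ec2-203-0-113-10.compute-1.amazonaws.com"  # hypothetical instance

ssh = paramiko.SSHClient()
# Only trust hosts already in known_hosts; unknown hosts raise an error
# rather than being silently accepted.
ssh.load_system_host_keys()
ssh.connect(
    HOST,
    username="ec2-user",
    key_filename=os.path.expanduser("~/.ssh/mykey.pem"),
)

sftp = ssh.open_sftp()
sftp.put("customers.csv.enc", "/data/customers.csv.enc")
sftp.close()
ssh.close()
```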
AWS also offers an import/export service where one can copy data to a portable USB drive and ship it to one of AWS's data centres. Depending on where you are, the data can be copied over to either your S3 bucket or an EBS volume; the latter service is currently only available in the US regions. Sending any media, even by courier, requires that the drive and its contents be secured. At a minimum, consider using TrueCrypt to encrypt your data. Avoid any USB drive that requires fingerprint decryption, unless you want to send your thumb along with the drive!

Assumption: with no physical firewall, you are open on all ports!

Utilise the security groups in AWS

Despite the name, an AWS Security Group (SG) isn't related to users or groups of users. It is the AWS term for a firewall, controlling all traffic to and from any AWS instance associated with the security group. By default, all inbound traffic on all ports is blocked, including remote desktop, so it is a good idea to enable RDP via the dashboard before you start logging into the machine. Security groups can also restrict access on these ports to a set of IP addresses. This is extremely useful: at my company we've restricted all incoming RDP to our instances to the address of our external firewall. Thus, only those who are physically at the office, with the right username and password and possessing a multi-factor authentication device, will be able to access any instance.
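
For completeness, here's roughly what that setup looks like scripted with boto3 rather than clicked through the dashboard (the group name and office IP are made up for illustration):

```python
# Sketch: a security group that only allows RDP from the office firewall.
import boto3

ec2 = boto3.client("ec2")

# A new security group allows no inbound traffic at all by default.
sg = ec2.create_security_group(
    GroupName="rdp-from-office",
    Description="RDP restricted to the office external firewall",
)

# Open RDP (TCP 3389) to exactly one address: /32 means this host only.
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpProtocol="tcp",
    FromPort=3389,
    ToPort=3389,
    CidrIp="198.51.100.7/32",  # hypothetical office firewall address
)
```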

Assumption: all code written has bugs in it.

How secure is your application?

This is a tricky one, as the security of any application comes down to how security-conscious your developers are and what QA/QC one has on unreleased software. Providing basic security principles are adhered to when the application is put together, and an established programme of review and release is followed, then it should be okay. Having an established procedure for testing, staging and release should be sufficient to tease out the obvious security flaws in an application before it goes live.


Assumption: the safest computer is one switched off!

Your operating system is vulnerable!

This is an easy one: keep on top of patches and updates, apply them as soon as possible, and keep your eyes and ears on the various blogs, webinars and news items on security. It is vitally important that everyone takes security seriously. We have a patch day that coincides with Microsoft's own patch releases, and we dedicate time to keeping up to date on this. Security audits should be carried out on a regular basis, covering password renewal and log analysis. The latter is important: how do you know whether someone has been trying to hack into your system, even if they've been unsuccessful? We have a process where the logs are skimmed for any errors, and if any are found, the entire log is analysed in detail.
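
Our actual process is internal, but a toy version of the 'skim first, analyse in detail on a hit' idea looks something like this in Python (the patterns and the log path are illustrative only):

```python
# Sketch: skim a log for suspicious entries so a human knows which logs
# deserve a detailed read. The patterns are examples, not a real format.
import re

SUSPICIOUS = re.compile(
    r"failed (login|password)|authentication failure|invalid user",
    re.IGNORECASE,
)

def skim(log_path):
    """Return (line number, line) pairs worth a closer look."""
    hits = []
    with open(log_path, "r", errors="replace") as log:
        for lineno, line in enumerate(log, start=1):
            if SUSPICIOUS.search(line):
                hits.append((lineno, line.rstrip()))
    return hits

for lineno, line in skim("/var/log/auth.log"):  # hypothetical path
    print(f"line {lineno}: {line}")
```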


Let people attack it – within reason

As an important part of securing an application in the cloud, we've engaged a number of security companies that perform penetration tests ('pen tests') and attempt to ethically hack into one's application. Most of these companies are sensible and will try the more subtle approach rather than a brute-force, denial-of-service attack; the latter will cause your ISP to react with some alarm and pull up the drawbridge. Once the pen test has completed, a detailed report is made available of any vulnerabilities detected. We tried our own pen test by inviting a different team from the company to try to hack into our new application, with an Amazon Kindle as a prize for the most impressive and successful attempt. It was very interesting that phishing[6] attempts were the most successful, illustrating that social engineering attacks the weakest part of any security apparatus: the people.

Finally - disaster recovery

Data centres sometimes fail; look at my favourite public cloud provider, AWS[7],[8]. They experienced an outage that affected a number of clients. Beyond securing your service against malicious attacks, it is equally important to secure against force majeure, but that ambles the conversation towards business continuity planning and disaster recovery. There are so many things that can affect the security, and therefore the viability, of your cloud-based services, but good business practice, adherence to standards and known processes, and everyone in the organisation taking security seriously should lessen the impact when there is a breach, and ensure fast recovery and safeguarding of sensitive business data.

The cloud isn’t the magic bullet for application uptime or cost savings but it is going to be the environment for computing now and in the future. It’s much heralded advantages and the need to squeeze efficiencies out of current systems ensures that all organisations need to embrace the elastic computing model sooner or later. The fear that many people feel about the cloud is probably magnified by the physical absence of the hardware, previously tucked away safely in one’s server room. However, many of us now have our money stored digitally and not under our beds or on our physical persons and most people are happy with it. Why not with your data?


[5] This is a device that continually generates a random six-digit number that you need to enter alongside your username and password. So even if your username and password have been compromised, the physical device needs to be acquired before anyone else can use them.

Monday 14 January 2013

All Your Base - Videos available

Happy New Year!

I hope everyone had a fantastic time and ate loads and made merry. 

Linux or Windows?
I've been quite busy with my London 2013 Marathon training. Work life is pretty busy too, with lots of exciting plans for 2013. I think Linux is going to play a big part at work (finally), but worldwide, I share the view that Windows 8 will (ironically) herald the mass take-up of Linux on the PC. Read about it at TechBeat, ExtremeTech and Internet News, but there are of course counter-arguments, and a good one comes from the much-respected Ars Technica.

Actually, I like both OSes - we should all just get along.

MySQL and Twitter
I came across an interesting video on how Twitter handles its backend databases. It's based on MySQL, and the video that follows comes from Lisa Phillips, a senior DBA (I think she's been promoted since the video was made); there are some interesting facts and figures.

My FOSS DBA of choice is PostgreSQL, but MySQL (or MariaDB) is a close second. The fact that Twitter is using it is enlightening. Here it is: http://vimeo.com/56652838 and then there's the conference itself: http://allyourbaseconf.com

ArcGIS Online
Lots of new and exciting things for 2013: the latest update in December 2012 added a slew of new capabilities and functions. Examples include the ability to show interactive tables, new features for handling feature services and, for me, a super important one: the ability to register secured map services with ArcGIS Online and store the credentials for single sign-on.

Amazon Web Services


I've just read about the release of the High Storage Eight Extra Large (hs1.8xlarge) instance type. Each instance includes 120 GiB of RAM, 16 virtual cores (providing 35 ECU of compute performance) and 48 TB of instance storage across 24 hard disk drives, capable of delivering up to 2.4 GB per second of I/O performance. It is currently available in US East but will be rolled out to other regions over the coming months. Read a cool post here: http://aws.typepad.com/aws/2012/12/the-new-ec2-high-storage-instance-family.html
Yes please.