Posts Tagged 'EC2'

EC2 Security Revisited

A couple of weeks ago I was teaching Learning Tree’s Amazon Web Services course at our lovely Chicago-area Education Center in Schaumburg, IL. In that class we provision a lot of AWS resources, including several machine instances on EC2 for each attendee. Usually everything goes pretty smoothly. That week, however, we received an email from Amazon. They had received a complaint that one of the instances we launched was mounting Denial of Service (DoS) attacks against other hosts on the Internet. This is specifically forbidden in the user agreement.

I doubted that any of the course attendees were doing this intentionally, so I suspected that the machine had been hacked. The machine was based on an AMI from Bitnami and used public key authentication, though, so it was puzzling how someone could have obtained the private key. In any case, we immediately terminated the instance and launched a new one to take its place for the rest of the course.

In Learning Tree’s Cloud Security Essentials course we teach that the only way to truly know what is on an AMI is to launch an instance and do an inventory of it. I was pretty sure we had done that for this AMI but we might have missed something. I decided that I would do some further investigation this week when I got a break from teaching.

Serendipitously when I sat down this morning there was another email from Amazon:

>>

Dear AWS Customer,

Your security is important to us.  Bitrock, the creator of the Bitnami AMIs published in the EC2 Public AMI catalog, has made us aware of a security issue in several of their AMIs.  EC2 instances launched from these AMIs are at increased risk of access by unauthorized parties.  Specifically, AMIs containing PHP versions 5.3.x before 5.3.12 and 5.4.x before 5.4.2 are vulnerable and susceptible to attacks via remote code execution.   It appears you are running instances launched from some of the affected AMIs so we are making you aware of this security issue. This email will help you quickly and easily address this issue.

This security issue is described in detail at the following link, including information on how to correct the issue, how to detect signs of unauthorized access to an instance, and how to remove some types of malicious code:

http://wiki.bitnami.com/security/2013-11_PHP_security_issue

Instance IDs associated with your account that were launched with the affected AMIs include:

(… details omitted …)

Bitrock has provided updated AMIs to address this security issue which you can use to launch new EC2 instances.  These updated AMIs can be found at the following link:

http://bitnami.com/stack/roller/cloud/amazon

If you do not wish to continue using the affected instances you can terminate them and launch new instances with the updated AMIs.

Note that Bitnami has removed the insecure AMIs and you will no longer be able to launch them, so you must update any CloudFormation templates or Autoscaling groups that refer to the older insecure AMIs to use the updated AMIs instead.

(… additional details omitted …)

<<

So it seems there was a security issue in the AMI that had gone undetected. This is not uncommon, as new exploits are continually being discovered. That is why software must be continually patched and updated with the latest service releases, and since Amazon EC2 is an Infrastructure as a Service (IaaS) offering, that is the user’s responsibility.

It was nice to have a resolution, since the issue had been bothering me since it occurred. It was also nice that Amazon sent out this email and specifically identified the instances that could have a problem. They also provided links to specific instructions I could follow to harden each instance, as well as an updated AMI I could use to replace them.
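
Amazon’s email listed the affected instance IDs for us, but it is easy to check for yourself. Here is a minimal sketch using boto3, the AWS SDK for Python (not what we used in class; the region and AMI ID below are placeholders), that lists any instances launched from a given image:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Placeholder for one of the affected Bitnami AMI IDs
    affected_ami = "ami-xxxxxxxx"

    # Find any instances in the account launched from that image
    response = ec2.describe_instances(
        Filters=[{"Name": "image-id", "Values": [affected_ami]}]
    )
    for reservation in response["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"], instance["State"]["Name"])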

In the end I think we will be replacing the AMI we use in the course. This situation was an example of the shared responsibility for security between the cloud provider and the cloud consumer. You don’t always know whether you have a potential security issue until you look for it, and even then you may not be totally sure until something actually happens. In this case, once the threat was identified, the cloud provider moved quickly to mitigate the damage.

Kevin Kell

Importing Custom Images into Amazon EC2

A current project I am working on has a requirement that custom machine images be built and maintained so that they are usable both within Amazon EC2 and on virtual machines hosted outside of EC2. These images are all based on the Windows operating system. Since we want to build each machine image only once (we will have about 200 of them!), this left us with a couple of options:

  1. Build the custom image on EC2 and export it for use on outside virtual machines
  2. Build the custom image on an outside virtual machine and import it for use in EC2

This article explores the second option. I will outline some of the challenges I experienced along the way and how I resolved them. Hopefully this may help someone else who is trying to do the same sort of thing.

In theory, the process is simple. Amazon has provided command line tools and decent documentation on how to do this. As with many endeavors, however, the devil is often in the details.

I had wanted to start from VMware images. VMware virtual disk files use the vmdk format. I soon discovered, however, that not all vmdk files are created equal: vmdk files used by vSphere are not the same as those used by VMware Workstation, and the EC2 command line tools will complain if you try to use a Workstation vmdk. Unfortunately, I did not have vSphere at my disposal.

So, I decided instead to start from a vhd format disk. I know there are products that claim to convert one format to the other, but I did not want to go there at this point. I used Microsoft Virtual PC 2007 to create a base Windows Server 2008 virtual machine from an ISO image I downloaded using my MSDN subscription. That, at least, was a relatively easy way to get started. I then went on to customize the image for my requirements.

Next just use the tools and upload the image, right?

Well, for me it took a few tries. I learned after the first attempt that running ec2-upload-disk-image from my local machine takes over 24 hours to complete. My vhd file was about 5.5 GiB. Not huge, but pretty big; I guess I have a slow upload speed. After the upload completes, some processing takes place on Amazon’s servers, which requires additional time. You monitor progress using ec2-describe-conversion-tasks. My first attempt seemed to get stuck: it never advanced beyond 6% complete.
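
The tooling has moved on since then, but for anyone attempting this today, here is a rough sketch of the same idea using the VM Import/Export API via boto3, the AWS SDK for Python. The bucket, key and description are hypothetical, and note that import_image produces an AMI rather than a running instance, unlike the flow described above:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Start the import; the VHD must already be uploaded to S3
    task = ec2.import_image(
        Description="Custom Windows Server 2008 image",
        DiskContainers=[{
            "Description": "Base VHD",
            "Format": "VHD",
            "UserBucket": {"S3Bucket": "my-import-bucket", "S3Key": "images/base.vhd"},
        }],
    )
    task_id = task["ImportTaskId"]

    # Poll for progress, much like ec2-describe-conversion-tasks did
    status = ec2.describe_import_image_tasks(ImportTaskIds=[task_id])
    print(status["ImportImageTasks"][0].get("Progress"),
          status["ImportImageTasks"][0].get("Status"))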

For subsequent attempts I zipped the vhd file, uploaded it to S3 and then downloaded it to an EC2 instance I had provisioned with the command line tools. There I could unzip the file and run ec2-upload-disk-image. That whole process, end to end, took about 5 hours, so at least that was some improvement. My second effort spun up, and I thought I was good to go.

Not so fast! It seemed now that even though the machine was running I had no way to connect to it. I had read in the documentation that Remote Desktop had to be enabled and that port 3389 needed to be opened on the Windows firewall. I had done all that. Still, no go.

For my next attempt I decided to have IIS started on the image so I could at least know that it was alive and communicating on the network. I also double-checked the remote connection settings, made sure that there were no conflicts on port 3389 and that it was definitely open on the Windows firewall.

This time I could see the web server but still couldn’t connect via RDP! To me that meant it had to be a firewall issue. After verifying that the EC2 security group had port 3389 open, I decided to try again, this time with the Windows firewall turned completely off. That worked! I was able to connect to my custom-created instance using RDP.
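
For reference, opening RDP in the EC2 security group itself can also be scripted. A minimal sketch with boto3, the AWS SDK for Python (the security group ID and CIDR range are placeholders; in practice you would restrict the source range rather than open it to the world):

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Allow inbound RDP (TCP 3389) on the instance's security group
    ec2.authorize_security_group_ingress(
        GroupId="sg-0123456789abcdef0",   # placeholder security group ID
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 3389,
            "ToPort": 3389,
            "IpRanges": [{"CidrIp": "203.0.113.0/24"}],  # restrict to your own network
        }],
    )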

Is there a better way to do this? Probably. However, at least now I know there is a way to achieve the goal! Of course I am not done yet. Make it work, make it right, make it fast!

For more about cloud computing with Amazon Web Services, Learning Tree is developing a new course dedicated to that very topic!

Kevin Kell

Interoperability in the Cloud

One of the nice things about cloud computing is that it allows for choice.

That is, I am free to choose from any and all available technologies at any time. Yes, vendor lock-in is a concern, but I am not really that worried about it! Here’s why: in the cloud, there are almost always multiple ways to make something work. A bold assertion, perhaps, but here is what I mean.

Let’s say you come from a Windows programming background. Let’s say you want to deploy a simple contact management application to the Cloud. Cool. The Azure Platform has you covered. You could easily create and deploy your app to Azure. Probably you need some kind of persistent storage and, being a relational database kind of person, you choose SQL Azure.

So, here is that app: http://mycontacts.cloudapp.net/ (you may see a certificate warning you can ignore — I assure you the site is safe!)

Now let’s say you really like the relational database that SQL Azure offers but, for some reason, you don’t want to host your application on Windows Azure. Why not? Well, for one thing, it may be too expensive, at least for what you want to do right now. So how can you reduce the startup cost? Sure, if this application goes viral you may need to scale it, but what about for now? Maybe you could choose to deploy to an EC2 t1.micro instance, monitor it, and see what happens.

So, here is that app: http://50.18.104.190/MyContacts/

If some readers recognize this application as one created with Visual Studio LightSwitch they are correct! The same app has been seamlessly deployed both to Azure and EC2 right from within Visual Studio. They both hit the same backend database on SQL Azure.

Here are the economics:

    Option                    Azure Small    Azure Extra Small    EC2 t1.micro
    Hourly Cost               $0.12          $0.05                $0.03
    Monthly Cost              $86.40         $36.00               $21.60
    SQL Azure Monthly Cost    $9.99          $9.99                $9.99
    Total                     $96.39         $45.99               $31.59
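
The monthly figures are simply the hourly rate multiplied by roughly 720 hours (a 30-day month), plus the flat SQL Azure charge. A quick sketch of that arithmetic:

    HOURS_PER_MONTH = 720          # 30-day month
    SQL_AZURE_MONTHLY = 9.99

    options = {"Azure Small": 0.12, "Azure Extra Small": 0.05, "EC2 t1.micro": 0.03}

    for name, hourly in options.items():
        compute = hourly * HOURS_PER_MONTH
        print(f"{name}: compute ${compute:.2f}, total ${compute + SQL_AZURE_MONTHLY:.2f}")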

There are differences, of course. Azure is a PaaS whereas EC2 is IaaS. If you are unclear on the difference please refer to this excellent post by my colleague Chris Czarnecki.

The point is developers (and organizations) have choice in the cloud. Choice is a good thing. In the future perhaps I will port the front end to Java and host it on Google App Engine, but that is a topic for another time!

Go ahead … add yourself to my contacts. Let’s see how this thing scales!

Kevin

Using EC2 Micro Instances for ASP.NET Hosting

I’m the author of Learning Tree course 502: Programming with .NET: A Comprehensive Hands-On Introduction. The case study in that course has the students create a program called Flash Cards, which is intended to help kids learn basic math skills. Good case studies are hard to come up with: they need to be complex enough to demonstrate what you want to teach, but simple enough for students to understand and program in a short period of time. It also helps if they seem real-world and are a bit fun.

Over the years this case study has worked really well.  In the course, students create both Web and Windows versions of the program choosing either C# or Visual Basic to do the coding.  They create a database to keep user scores, and use ASP.NET Forms authentication for security.  It also provides an example of good object-oriented design.

I got the idea for the case study when my son was very young and just learning his numbers.  Now, he is in third grade and is learning his multiplication tables, and he’s also interested in programming.  So, as a little project he and I decided to enhance the Flash Cards case study and put it online.  He had lots of good ideas for enhancements.  If you have taken our .NET introduction course or think you might like to learn about ASP.NET programming, you can go to www.bbqmath.com and see Flash Cards live.

This also gave me a great opportunity to use an Amazon EC2 Micro instance. Micro instances are so cheap they are great for ASP.NET hosting, especially if you’re like me and want your own server. (See my earlier post, Save Even More with Amazon EC2’s Micro Instances.) The server costs about $14 per month. Plus, I’m hosting a couple of other sites on it as well.

Now the hard part.  I want to test how scalable micro instances are.  So tell all the third-graders you know to visit www.bbqmath.com and create an account and use it like crazy.  Maybe if we can get a couple million kids we can test how well Elastic Load Balancing works.  Hey, that’s a good topic for a blog post.  Maybe next month.

Doug

Using the Amazon EC2 Command Line Tools and API

There are three ways to interact with Amazon EC2:

  1. Web based AWS Management Console
  2. Command line tools
  3. An API for programmers

The Web-based console is very easy to use, and you can (and should!) do a lot with it. There are times, however, when what you want to do is beyond what is currently possible with the AWS console, or when you need to manage multiple instances in an automatic, repeatable and consistent manner. In these cases you need to use the other options.

The command line tools and API allow developers and operations people to program or script robust solutions that leverage the IaaS that Amazon provides. These developer and operations roles (sometimes referred to collectively as “DevOps”) may be filled by technical folks internal to an organization or by third parties. Indeed, there are numerous providers of value-added services for managing and configuring EC2 environments, and these solutions are often custom applications which involve, among other things, calling into the API.

To get started using the command line tools you need to take a look at the documentation that Amazon provides. It is not necessary, in my opinion, to read it start to finish. You must, however, be at least somewhat familiar with what the commands are.

Then, you need a machine with the tools installed. One option is to download and install the tools on your own machine. Another option is to use one of the pre-configured Amazon Linux AMIs Amazon provides with the tools already installed.

In either case you will ultimately need to roll up your sleeves and get busy with the Windows or Linux command prompt. For one thing, you will need to set up your machine so that it can securely access your account. This involves downloading a certificate and private key file and setting some environment variables to establish your identity and the location of the Java runtime. You also need to set your path to include the tool script location.

Amazon provides both “API Tools” and “AMI Tools”. The API tools bring the API functionality to the command line; essentially the commands are just lightweight wrappers that call into the API. They are implemented as cmd files on Windows or shell scripts on Linux machines. The AMI tools provide utilities for creating, bundling and uploading AMIs. Unfortunately, I’ve never been able to get the AMI tools to work completely on Windows.

To get started using the API you will need to download and install the appropriate SDK for your particular programming language. There are SDKs and/or libraries for Java, PHP, Python, Ruby and .NET. If you use Eclipse as your development environment you can also download and install the AWS Toolkit for Eclipse. Oh, by the way, you also do need to have a look at the documentation!
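
As a small taste of what calling the API looks like, here is a minimal sketch using boto3, the current AWS SDK for Python (the AMI ID and key pair name are placeholders), that launches an instance and then lists what is running:

    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")

    # Launch a single t1.micro instance from a placeholder AMI
    instances = ec2.create_instances(
        ImageId="ami-xxxxxxxx",        # placeholder AMI ID
        InstanceType="t1.micro",
        KeyName="my-keypair",          # placeholder key pair name
        MinCount=1,
        MaxCount=1,
    )
    print("Launched:", instances[0].id)

    # Roughly the equivalent of ec2-describe-instances
    for instance in ec2.instances.all():
        print(instance.id, instance.state["Name"], instance.public_ip_address)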

These screencasts demonstrate using the command line tools and the API.

If you want to learn more about how you can bring the power of the cloud to your organization please consider attending Learning Tree International’s Introduction to Cloud Computing course coming soon to a venue (hopefully) near you! Or, if you’d prefer, attend from the comfort of your own location via Learning Tree’s AnyWare system.

Kevin Kell

As cloud computing continues to make information technology headlines, vendors are aggressively promoting the many benefits it can provide organizations.  Our White Paper, Cloud Computing Promises: Fact or Fiction, addresses the claims and questions that are often raised in relation to cloud computing and provides a clear view of what the cloud can—and can’t—deliver in reality.

Amazon AWS New Storage Features

The rate at which Amazon has been adding to and improving its cloud computing services is impressive. Consider just some of the announcements made about the storage services in June 2010. First, the Relational Database Service (RDS) is now available in all AWS regions. RDS can now be managed from the AWS Management Console, allowing MySQL instances to be launched, real-time snapshots of instances to be taken, and key database statistics to be monitored. A superb feature now available is the ability to create a MySQL instance that is synchronously replicated across availability zones to provide enhanced data protection and availability during both planned and unplanned outages.
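
Creating such a cross-AZ replicated MySQL instance is a single API call. A minimal sketch using boto3, the AWS SDK for Python (the identifier, instance class and credentials are placeholders):

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # Create a MySQL instance replicated synchronously across availability zones
    rds.create_db_instance(
        DBInstanceIdentifier="my-mysql-db",      # placeholder identifier
        DBInstanceClass="db.t3.micro",           # placeholder instance class
        Engine="mysql",
        MasterUsername="admin",
        MasterUserPassword="change-me-please",   # placeholder credential
        AllocatedStorage=20,
        MultiAZ=True,                            # the cross-AZ replication feature
    )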

Another new feature is the Amazon S3 management console. This enables the management of S3 resources from a simple console, with tasks such as bucket creation and data object upload available directly from it. Offsite backup and disaster recovery planning have also been enhanced with the addition of the Amazon Import/Export feature, which allows data to be imported into and exported out of S3 using portable storage devices.

Another new storage-related feature is the just-announced Reduced Redundancy Storage (RRS) facility. RRS allows customers to store non-critical data at lower levels of redundancy, and therefore at a lower cost, than standard S3 storage. Both S3 and RRS store data in multiple facilities and on multiple devices. S3 is designed to provide 99.999999999% durability; RRS is designed to provide 99.99% durability, and both are backed by Amazon’s S3 service level agreement.
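
Opting into RRS is just a storage-class choice made when an object is written. A minimal sketch with boto3, the AWS SDK for Python (the bucket and key are placeholders):

    import boto3

    s3 = boto3.client("s3")

    # Store a non-critical object at the reduced redundancy level
    s3.put_object(
        Bucket="my-bucket",                  # placeholder bucket name
        Key="logs/archive-2010-06.gz",       # placeholder object key
        Body=b"example payload",
        StorageClass="REDUCED_REDUNDANCY",
    )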

There have been other enhancements too, to services such as CloudFront and Elastic MapReduce, all in the same month. It is this continuous, rapid development and enhancement of the AWS services that is convincing more and more organisations to move to EC2. The rapid growth in the AWS business has led to recent predictions that within 5 years the Amazon AWS business will be larger than its e-commerce business. Such rapid growth provides concrete evidence that cloud computing is not just the current computing buzzword but a serious technology being rapidly adopted by a large number of organisations.

If you are interested in how Amazon AWS can be leveraged for your projects with minimal risk, why not take a look at the following overview or, for a more detailed insight, take the Learning Tree class – it will really fast-track you into exploiting the benefits of the cloud.

Chris

Cloud Computing Adoption within Learning Tree

In this post I am going to change tack slightly. Rather than focus on Windows Azure I am going to illustrate one simple way that Learning Tree International is using cloud computing technology internally.

One of the most compelling use cases for cloud computing is the so-called “Dev/Test” scenario, in which a machine (or machines) running in the cloud is configured for development and testing. Dev/Test is appealing because of the relative ease and low cost of provisioning cloud-based resources compared to buying and configuring physical machines. It is also one of the simplest and least risky ways for organizations to gain experience with cloud computing.

Internally Learning Tree uses SharePoint. Applications range from simple document sharing, team discussions and wikis to complex reporting, business intelligence and dashboards. Currently there is no automated system in place for employee expense submissions. It is natural to consider using the document workflow capabilities of SharePoint for this purpose.

Anyone who has done .NET development for SharePoint using Visual Studio (prior to the 2010 versions) knows that the process can be somewhat complicated. SharePoint 2007, for example, requires a server OS, and development and debugging with Visual Studio 2008 are best accomplished when the development environment runs on the same machine that is hosting SharePoint. For developers, this has usually meant doing development on some kind of virtual machine anyway.

So, why not provision a Windows Server machine in the cloud, install SharePoint and Visual Studio on that machine and party on? That is exactly what we did.

We chose Amazon EC2 as our IaaS provider. Amazon makes it incredibly easy for anyone with a credit card to rapidly provision an instance of one of their ever-increasing inventory of machine images. We chose a Windows Server 2008 machine also running SQL Server 2008 Express Edition, which became the starting point of the development platform. Additional components, including SharePoint 2007 and Visual Studio 2008, were installed on this instance under our development license agreement. For a few dollars and a few hours spent installing software we had a development and testing environment suitable for our purposes. It would have taken more time and cost more to commission the equivalent physical resources.

One appeal of this approach was the ability to take snapshots of the current state of the machine instance at various times during development. These snapshots became the basis of our own private machine images. In this way we had a built-in backup mechanism and could very quickly spin up an instance of our project at various stages of development.
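
Capturing that state as a private image is a single call against a running instance. A minimal sketch of that snapshot step using boto3, the AWS SDK for Python, with a placeholder instance ID and image name:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Snapshot the running dev/test instance as a private machine image
    image = ec2.create_image(
        InstanceId="i-0123456789abcdef0",            # placeholder instance ID
        Name="sharepoint-dev-checkpoint-01",         # placeholder image name
        Description="SharePoint 2007 + VS 2008 dev environment checkpoint",
    )
    print("New private AMI:", image["ImageId"])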

So, how did it all work out? Well, so far, so good! There were a few hiccups along the way. For one thing, we learned about the benefits of using Elastic IP Addresses. We also got to experience the fun of having to rename a SharePoint server after stopping and re-starting an Amazon machine instance!

All in all, though, the development experience was a good one. Using Visual Studio through a Remote Desktop connection was almost like working on a development machine directly. The bottom line is that we were able to provision a fully functional, self-contained development environment cheaper and faster than we otherwise could have.

For a brief demo (< 10 minutes) of an early stage in the development process click here.

Kevin

