Posts Tagged 'AWS'

The Cloud goes to Hollywood

Earlier this week I attended a one-day seminar presented by Amazon Web Services in Los Angeles entitled “Digital Media in the AWS Cloud”. Since I was involved in a media project recently, I wanted to see what services Amazon and some of their partners offer specifically to handle media workloads. Some of these services I had worked with before; others were new to me.

The five areas of consideration are:

  1. Ingest, Storage and Archiving
  2. Processing
  3. Security
  4. Delivery
  5. Automating workflows

Media workflows typically involve many huge files. To facilitate moving these assets into the cloud, Amazon offers a service called AWS Direct Connect. This service allows you to bypass the public Internet and create a dedicated network connection into AWS, with transfer speeds of up to 10 Gbps. A fast file transfer product from Aspera and an open-source solution called Tsunami UDP were also showcased as ways to reduce upload time. Live data is typically uploaded to S3 and then archived in Glacier. It turns out the archiving can be accomplished automatically by simply setting a lifecycle rule on a bucket that moves objects to Glacier at a certain date or when they reach a specified age. Pretty cool. I had not tried that before but I certainly will now!
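A lifecycle rule like the one described above can be sketched as follows. The bucket prefix and the 30-day threshold are made-up examples, and the dict mirrors the shape that boto3's `put_bucket_lifecycle_configuration` call accepts:

```python
def glacier_archive_rule(prefix, days):
    """Build a lifecycle rule that transitions objects under
    `prefix` to Glacier once they are `days` days old."""
    return {
        "ID": "archive-%s-to-glacier" % prefix.strip("/"),
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": days, "StorageClass": "GLACIER"}],
    }

# Hypothetical prefix and age threshold for raw footage uploads
lifecycle_config = {"Rules": [glacier_archive_rule("raw-footage/", 30)]}

# With boto3 and credentials configured, this would be applied as:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-media-bucket", LifecycleConfiguration=lifecycle_config)
```

Once the rule is in place, S3 does the rest: no cron jobs, no scripts, objects simply migrate to Glacier as they age.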

For processing Amazon has recently added a service called Elastic Transcoder. Although technically still in beta, this service looks extremely promising. It provides a cost-effective way to transcode video files in a highly scalable manner using the familiar cloud on-demand, self-service payment and provisioning model. This lowers the barriers to entry for smaller studios that may have previously been unable to afford the large capital investment required to acquire on-premises transcoding capabilities.
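Submitting a transcode job can be sketched like this. The pipeline ID, object keys and preset ID are all placeholders, and the dict maps onto the parameters of Elastic Transcoder's `create_job` API:

```python
def make_transcode_job(pipeline_id, input_key, output_key, preset_id):
    """Build the request for an Elastic Transcoder job: take one
    source object, produce one output encoded with a given preset."""
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": input_key},
        "Outputs": [{"Key": output_key, "PresetId": preset_id}],
    }

# Hypothetical keys: transcode a master file to a web-friendly MP4
job = make_transcode_job("pipeline-id", "masters/scene01.mov",
                         "web/scene01.mp4", "preset-id")

# With boto3: boto3.client("elastictranscoder").create_job(**job)
```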

In terms of security I was delighted to learn that AWS complies with the best practices established by the Motion Picture Association of America (MPAA) for storage, processing and privacy of media assets. This means that developers who create solutions on top of AWS are responsible only for maintaining compliance at the operating system and application layers. It seems that Hollywood, with its very legitimate security concerns, is beginning to trust Amazon’s shared responsibility model.

Delivery is accomplished using Amazon’s CloudFront service. This service offers caching of media files to globally distributed edge locations which are geographically close to users. CloudFront works very nicely in conjunction with S3 but can also be used to cache static content from any web server whether it is running on EC2 or not.

Finally, the workflows can be automated using the Simple Workflow Service (SWF). This service provides a robust way to coordinate tasks and manage state asynchronously for use cases that involve multiple AWS services. In this way the entire pipeline from ingest through processing can be specified in a workflow, then scaled and repeated as required.
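The decider idea at the heart of SWF can be illustrated with a deliberately simplified sketch. This is plain Python, not the SWF API: in SWF a decider examines the workflow history and schedules the next activity, and workers carry the activities out.

```python
# The pipeline stages from the post: ingest, then processing, then delivery
PIPELINE = ["ingest", "transcode", "publish"]

def next_activity(completed):
    """Decide the next task given the activities completed so far,
    the way an SWF decider inspects workflow history."""
    for step in PIPELINE:
        if step not in completed:
            return step
    return None  # workflow complete

completed = []
step = next_activity(completed)
while step is not None:
    completed.append(step)  # in SWF, a worker would perform the task here
    step = next_activity(completed)
```

The value SWF adds over this toy loop is that the state lives in the service, so tasks can run on many machines, fail, retry and resume without the coordinator losing track.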

So, in summary, there is an AWS offering for many of the requirements needed to produce a small or feature length film. The elastic scalability of the services allows both small and large players to compete by only paying for the resources they need and use. In addition there are many specialized AMIs available in the AWS Marketplace which are specifically built for media processing. That, however, is a discussion for another time!

To learn more about how AWS can be leveraged to process your workload (media or otherwise) you might like to attend Learning Tree’s Hands-on Amazon Web Services course.

Kevin Kell

Cloud Skills in Demand

According to a recent analysis by CyberCoders, cloud computing will be the second most sought-after IT skill in 2013. Interestingly, the analysis specifically calls out AWS and Azure. That these two are included is no surprise to those of us who have followed the evolution of cloud computing over the past few years. Conspicuously absent from the list, however, is Google, but let’s not go there right now.

In the early days it seemed that Amazon and Microsoft took fundamentally different approaches to where they would operate in the cloud marketplace. Amazon was primarily an IaaS play whereas Microsoft Azure was a PaaS play. Today the distinction has blurred: Amazon has many offerings which are certainly more than just infrastructure, and Microsoft now offers true virtual machines (including Linux!) as part of Azure. Obviously competitive pressures and market realities have caused this convergence.

It is my belief that Amazon is still the market leader in public cloud computing. Don’t count Microsoft out, though. Azure has many technical merits, and the new Web Sites service (which allows you to host up to 10 simple ASP.NET web applications for free) is totally awesome. I am also pretty happy with the SQL Reporting service, as I have previously said.

When an IT skill is in demand there is a good possibility that someone will develop a certification program around it so that individuals claiming competence can be validated in some sense. This is now the case with Amazon AWS. Amazon has introduced a certification program for those who would like to demonstrate competence in AWS. The first certification on offer is called “AWS Certified Solutions Architect – Associate Level”. Microsoft also seems to be adjusting its certification programs for the new reality of the cloud. Even Learning Tree offers a Cloud Computing Certification Program!

Obtaining the Amazon AWS credential involves passing a comprehensive exam that demonstrates in-depth knowledge of AWS services as well as general IT knowledge. Learning Tree’s course Cloud Computing with Amazon Web Services is a good introduction to AWS and will definitely help you on your way towards success on the AWS exam. On the other hand, Learning Tree’s course Cloud Computing Technologies: A Comprehensive Hands-On Introduction will give you exposure to a broad spectrum of cloud offerings, including Microsoft Azure. Both courses offer hands-on experience in what the data shows is one of the hottest technology areas for 2013.

Hope to see you soon at a Learning Tree Education Center or online via AnyWare!

Kevin Kell

Creating Items and Exploring Tables in Amazon DynamoDB

Having earlier created some tables for our application we will now quickly see how to store some items.

Using the AWS SDK it could be done programmatically very simply as follows:

Figure 1. Code to populate the Players Table with sample Items.

A couple of points to reinforce:

  • Not all items need to have the same set of attributes
  • Attributes can be single or multi-valued
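The sample items from Figure 1 can be sketched along these lines (the attribute names and values here are illustrative, not taken from the figure). Note the differing attribute sets and the multi-valued Teams attribute, modeled as a set whose order is not preserved:

```python
players = [
    {"Name": "Alice", "Score": 42, "Teams": {"Red", "Blue"}},  # multi-valued Teams
    {"Name": "Bob", "Level": 3},  # different attribute set: no Score, no Teams
]

# With boto3 each dict would be written to the table with something like:
# table = boto3.resource("dynamodb").Table("Players")
# for item in players:
#     table.put_item(Item=item)
```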

Tables can be explored interactively using the AWS Explorer from within Visual Studio.

Figure 2. Explore the Players Table using AWS Explorer from within Visual Studio

Note that item order is not preserved nor is order of list entries in multi-valued attributes.

Alternatively, as of May 22, 2012 you can also use the AWS Management Console to explore, monitor and configure DynamoDB tables.

Okay, cool. So now what? Well, we could (and we will) consider ways to query the data. We could also talk about the usual CRUD stuff, but we are not going to do that right now. Instead, our next game will be to wrap our storage in an interface that can be implemented as a Web Service. This will allow us to further abstract from DynamoDB and define our interface in terms of objects in our problem domain. Under the covers we will be using DynamoDB, but we will have isolated the specific code so that if, in the future, we wanted to use something else for storage (SimpleDB, Azure Tables or even a relational database) it would be relatively straightforward to make the necessary changes.
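That interface idea can be sketched as follows. The class and method names are hypothetical, and an in-memory class stands in here for the DynamoDB-backed implementation:

```python
from abc import ABC, abstractmethod

class PlayerStore(ABC):
    """Storage interface expressed in problem-domain terms,
    with no DynamoDB types leaking through."""
    @abstractmethod
    def save(self, player): ...
    @abstractmethod
    def find(self, name): ...

class InMemoryPlayerStore(PlayerStore):
    """Stand-in implementation; a DynamoDbPlayerStore would satisfy
    the same interface using the AWS SDK under the covers."""
    def __init__(self):
        self._items = {}

    def save(self, player):
        self._items[player["Name"]] = player

    def find(self, name):
        return self._items.get(name)

store = InMemoryPlayerStore()
store.save({"Name": "Alice", "Score": 42})
```

Swapping in SimpleDB, Azure Tables or a relational database then means writing one new class, not touching the rest of the application.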

That will be the subject of my next post. In the meantime you might want to check out some of the supplementary course materials available for Learning Tree’s Amazon Web Services course. While some of the programming references there are for Java and not C# you will find that the concepts are equally relevant.

Kevin Kell

Billing Alerts Help Prevent Surprise Bills

Recently I was travelling from my home town of Leicester to London by train for a meeting with a consulting client. I arrived at the train station at 6am for a 6.30am train, purchased a ticket and then heard an announcement that the train drivers were on strike and a very limited service was running. A little frustrated, I went to the train company’s Web site to see if any schedule of this limited service was available, only to be greeted by the screen below.

My immediate thought was that this could easily have been averted with the use of cloud computing’s auto-scaling, which works perfectly in such times of large spikes in traffic to a Web application.

Autoscaling of resources is a great facility, but it is not as straightforward as it may first appear. Technically, it requires configuration on most cloud services. Take, for instance, Amazon’s AWS. Here a load balancer must be configured and then the CloudWatch service must be enabled, with thresholds set for scaling the number of server instances up and down. Added to this, there are business concerns too, the obvious one being how much the services will really cost in any particular month. We could be happily running our systems, but at what financial cost?
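The threshold logic those CloudWatch alarms encode boils down to something like the sketch below. The 70%/30% average-CPU thresholds are illustrative choices, not AWS defaults:

```python
def scaling_decision(avg_cpu, scale_up_at=70.0, scale_down_at=30.0):
    """Return +1 to add an instance, -1 to remove one, 0 to hold,
    based on average CPU utilisation across the group."""
    if avg_cpu > scale_up_at:
        return +1
    if avg_cpu < scale_down_at:
        return -1
    return 0
```

In AWS this decision lives in two CloudWatch alarms (one per threshold) wired to scale-up and scale-down policies on the Auto Scaling group behind the load balancer.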

Cloud providers such as Amazon and Microsoft do not enable the setting of spending thresholds, although Google’s App Engine does. This means paying by credit card or by invoice may result in a surprise at the end of each month: the bill may be much larger than expected if usage is heavy. As a user of Amazon AWS, my company have been aware of this for some time and regularly check our billing data for abnormal patterns. We were thus delighted to hear that Amazon have now announced billing alerts. This service allows you to configure spending thresholds which, when reached for any particular service, will send you an immediate notification. This means that you will be aware as soon as spending is above your accepted limit and can take appropriate action at that time. The billing service makes use of the standard CloudWatch alarms and Amazon Simple Notification Service (SNS) for sending alerts. The free monthly usage tier of CloudWatch covers 10 alarms, and up to 1,000 email notifications can be sent before charges are incurred. This facility is a much needed and welcome addition to the Amazon Web Services portfolio.
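Since billing alerts are built on standard CloudWatch alarms, one can be sketched as an alarm definition on the estimated-charges metric. The SNS topic ARN and the $100 threshold below are placeholders, and the field names follow CloudWatch's `put_metric_alarm` parameters:

```python
def billing_alarm(threshold_usd, topic_arn):
    """Shape of a CloudWatch billing alarm: fire when estimated
    charges exceed the given dollar threshold, notify an SNS topic."""
    return {
        "AlarmName": "billing-over-%d-usd" % threshold_usd,
        "Namespace": "AWS/Billing",
        "MetricName": "EstimatedCharges",
        "Statistic": "Maximum",
        "Period": 21600,           # six hours, in seconds
        "EvaluationPeriods": 1,
        "Threshold": float(threshold_usd),
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [topic_arn],
    }

# Hypothetical topic ARN; with boto3 this would be passed to
# boto3.client("cloudwatch").put_metric_alarm(**alarm)
alarm = billing_alarm(100, "arn:aws:sns:us-east-1:123456789012:billing")
```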

If you would like to understand more about cloud computing, consider attending Learning Tree’s course, Cloud Computing Technologies: A Comprehensive Hands-On Introduction, which provides thorough coverage of the business and technical benefits of cloud computing as well as exposure to the products from the major vendors. For those looking to use Amazon Web Services, Learning Tree also have an excellent four-day hands-on Amazon Web Services course where the lower-level details of using and integrating these services are covered.

Hope to see you at one of these courses soon.

Chris Czarnecki

Amazon AWS Releases Beanstalk for .NET

I have written about Amazon’s Elastic Beanstalk for Java and, more recently, for PHP. This Platform as a Service (PaaS) is incredibly good and eliminates much of the traditional administration required when running Web applications. For Java and PHP developers, as well as those working in Ruby, Python and other languages, there is a wide choice of PaaS offerings from different vendors, which means there is little fear of vendor lock-in when selecting a PaaS.

For .NET developers, the choice of PaaS has been limited to Microsoft Azure (or, as of this week, Cloud Services). This is not a reason not to use the Azure PaaS, but with little competition, price, quality and performance pressures do not come to bear as readily as they do in a competitive market. Using other cloud vendors, such as Amazon, for deploying .NET applications has meant using their Infrastructure as a Service (IaaS), where servers are provisioned and configured behind a load balancer. This leaves all responsibility for updates and patches to server software with the end user, something PaaS eliminates. Today the landscape changes significantly for the better for .NET developers: Amazon has announced the release (in beta) of Elastic Beanstalk for .NET. This is a significant move by Amazon in the PaaS market and immediately provides a proven platform for .NET application deployment.

Beanstalk for .NET uses Windows Server 2008 R2 virtual machines with IIS 7.5 installed for hosting the .NET applications. The AWS toolkit for Visual Studio enables the development of standard .NET Web applications, including Web Forms and ASP.NET MVC, and via the toolkit these applications can be deployed to Beanstalk. There is no additional cost for using Beanstalk. You just pay for the AWS resources provisioned. By default this is a micro instance machine and a load balancer. For new accounts a micro instance is free for the first year.

For anybody wanting to use SQL Server in the cloud, SQL Azure (or, as it is known as of this week, SQL Database) has been the zero-administration solution. Again this has changed this week, as Amazon has announced that it has expanded its Relational Database Service (RDS), a cloud-managed database service offering MySQL and Oracle, to also include Microsoft SQL Server. Multiple editions of SQL Server 2008 R2, including Express, Web, Standard and Enterprise, are available, and support for SQL Server 2012 will follow later this year. For organisations that already have licenses for SQL Server, Amazon has a “bring your own license” arrangement where you pay just for the compute on an hourly basis. If you are not familiar with Amazon’s RDS, it handles the administrative side of databases, that is, the deploying, scaling, patching and backing up, as part of the price plan.

So, in summary, Amazon’s releases relating to Microsoft products provide a direct, proven alternative to Microsoft’s cloud services, one that enables developers to work with their standard tools, build applications using proven techniques and deploy with minimal effort. It will be interesting to see how this pushes Microsoft, both on pricing and innovation. My own view is that Amazon may well have just eaten Microsoft’s lunch in the PaaS and relational data area of Cloud Computing.

Chris Czarnecki

New Amazon Web Services and EC2 Training Class

Great news!  Learning Tree recently decided to write a new class on cloud computing with Amazon Web Services. As far as I know, no author has been hired, but the tentative list of topics includes:

  • Amazon EC2 for Infrastructure as a Service (IaaS)
  • Amazon S3 for cloud-based storage
  • Amazon Elastic Beanstalk for Platform as a Service (PaaS)
  • Amazon RDS for Database as a Service (DBaaS)

I’m sure there will be many other topics covered as the course content becomes more finalized.  If you have any suggestions or requests please let us know!

Over the last year, we’ve written quite a few articles on Amazon Web Services. Here are a few you might be interested in.

Understanding Amazon EC2 Security Groups and Firewalls
Using the Amazon EC2 Command Line Tools and API
Amazon Mechanical Turk – Artificial Artificial Intelligence
Amazon’s Relational Database Service (RDS)
Amazon AWS and ISO 27001 Certification
Save Even More with Amazon EC2’s Micro Instances
Elastic IP in Amazon EC2
Amazon Launches Email Service
Amazon Moves Towards PaaS with Elastic Beanstalk
Creating Amazon EC2 Machine Instances (AMIs) for Test Servers
Using EC2 Micro Instances for ASP.NET Hosting
Amazon AWS and Amazon EC2
Amazon’s VM Import: A Step Towards Cloud Interoperability

Once again, please leave us a comment if you have any good ideas for this new training class.

Doug Rehnstrom

Amazon Mechanical Turk – Artificial Artificial Intelligence

One of the more interesting and unique products offered through Amazon Web Services (AWS) is Amazon Mechanical Turk. Perhaps more accurately described as “crowd computing” rather than “cloud computing”, Mechanical Turk, named after the famed fake chess-playing automaton, leverages the power of massive numbers of humans connected to the Internet to solve problems that require human intelligence.

For a simple example, you may see an image in a YouTube video and wonder where it was taken.

Figure 1. Does anybody know where this location is?

This is difficult to enter into a search engine. But imagine you could somehow post this image and then instantly ask thousands of people if they recognize the place. This is the sort of problem that Mechanical Turk is geared towards solving. Of course there are other, more practical, examples as well, but they are not as much fun!

Amazon Mechanical Turk defines two roles: Requestor and Worker. The fundamental unit of work is a “Human Intelligence Task” (HIT). A Requestor can specify a HIT using a template and then publish that HIT for Workers to see. Requestors create HITs and Workers work on them.

Becoming a Requestor is easy. All you have to do is sign up using your Amazon account. First, you need to buy some pre-paid HITs. Then you can create HITs using one of the existing templates or you can create your own template. Finally, you can publish your HITs. This allows Workers to respond to them at the price you have specified (usually a few cents per HIT).
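The fields a published HIT carries can be sketched like this. The title, reward, assignment limits and placeholder question are all illustrative, and the dict maps onto the parameters of the MTurk `create_hit` API:

```python
def make_hit(title, description, reward_usd, question):
    """Build the request for a HIT: the task shown to Workers,
    what it pays, and how many Workers may answer it."""
    return {
        "Title": title,
        "Description": description,
        "Reward": "%.2f" % reward_usd,       # dollars, as a string
        "MaxAssignments": 3,                 # ask three Workers
        "AssignmentDurationInSeconds": 600,  # ten minutes per Worker
        "Question": question,                # QuestionForm XML payload
    }

# Hypothetical HIT for the location-identification example above;
# the Question body would be real QuestionForm XML in practice
hit = make_hit("Identify this location",
               "Look at the image and name the place shown.",
               0.05, "<QuestionForm>...</QuestionForm>")

# With boto3: boto3.client("mturk").create_hit(**hit)
```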

Becoming a Worker is more difficult. You need to pass a test that shows, among other things, that you can look stuff up on Wikipedia. Then you get a rating which allows you to respond to HITs. As you respond to HITs successfully your rating will increase. Make a bunch of goofs and your rating goes down.

In this way, Requestors and Workers are brought together in a marketplace. On-demand services delivered at the going rate.  It is almost frictionless free enterprise.

In addition to the web-based interface, Amazon Mechanical Turk also has a command line and an API. The API allows applications to be written that leverage the power of Mechanical Turk. Think of, for example, a smartphone application that lets the user take a snapshot and upload the image as a HIT asking “what is this object?” or “who is this person?” It could be used while walking through a museum or while sitting in a café on Sunset Boulevard.

So, I guess the point of this article is that computers are not the only thing in the cloud. Humans are there too.

To get a good grounding in cloud computing fundamentals I recommend Learning Tree Course 1200 – Cloud Computing Technologies: A Comprehensive Hands-On Introduction.

