Lately I have been getting more familiar with the various services on AWS, more specifically EC2 Container Service, aka ECS. I came across a workshop on the awslabs GitHub page that walks you through deploying a deep learning framework on ECS. I found it extremely interesting and thought I would share it with you.
Deep Learning (DL) is an implementation of Machine Learning (ML) that uses neural networks to solve difficult problems such as image recognition, sentiment analysis and recommendations. Neural networks simulate the functions of the brain to detect patterns in data, which allows deep learning algorithms to classify, predict and recommend more accurately as the network is trained on more data. The workshop walks you through the deployment of a deep learning library called MXNet on AWS using Docker containers. If you have been following any of my previous posts, you should be familiar with Docker by now. This just reiterates how containers can really simplify spinning up an environment with very little effort. The intent was to deploy a deep learning environment that would perform image recognition.
At a high level, I used a CloudFormation template that set up a VPC, IAM roles, an S3 bucket, an ECR container registry and an ECS cluster comprising two EC2 instances, each running the Docker daemon. To keep costs to a minimum, the EC2 instances were spot instances deployed by a spot fleet, which is simply a collection of spot instances. This is what the deployed environment looks like.
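If you want to try launching a similar stack yourself, the AWS CLI can create it straight from the template. This is only a sketch: the stack name, template file name and key pair parameter below are placeholders I made up, not the exact names from the workshop repo.

```shell
# Launch the workshop stack from its CloudFormation template.
# Stack name, template file and parameter values are placeholders.
aws cloudformation create-stack \
  --stack-name ecs-deep-learning \
  --template-body file://ecs-deep-learning-workshop.yaml \
  --capabilities CAPABILITY_IAM \
  --parameters ParameterKey=KeyName,ParameterValue=my-key-pair

# Block until the VPC, IAM roles, S3 bucket, ECR registry and
# ECS cluster have all been created
aws cloudformation wait stack-create-complete --stack-name ecs-deep-learning
```

The `--capabilities CAPABILITY_IAM` flag is required because the template creates IAM roles, and CloudFormation makes you acknowledge that explicitly.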
Once the stack was complete, I built an MXNet Docker image and pushed it to the EC2 Container Registry, aka ECR. Next, I deployed the MXNet container with ECS, which allowed me to perform image classification. Here are the results of an image that I fed into the model.
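The build-and-push step looks roughly like this. The account ID, region, repository name and task definition name are placeholders standing in for the values your own stack produces, and I am using the newer `get-login-password` authentication flow rather than the older `aws ecr get-login` helper the workshop predates:

```shell
REGION=us-east-1
ACCOUNT_ID=123456789012                                  # placeholder account ID
REPO=$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/mxnet     # placeholder repo name

# Authenticate the local Docker client against the ECR registry
aws ecr get-login-password --region $REGION \
  | docker login --username AWS --password-stdin \
      $ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com

# Build the MXNet image from the workshop's Dockerfile, tag it for ECR, push it
docker build -t mxnet .
docker tag mxnet:latest $REPO:latest
docker push $REPO:latest

# Run the container as a one-off task on the ECS cluster
# (cluster and task definition names are placeholders)
aws ecs run-task --cluster ecs-deep-learning --task-definition mxnet
```

Once the task is running, the classification itself happens inside the container, which is exactly the point: the MXNet environment is baked into the image, so ECS only has to schedule it.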
Hopefully this gives you an idea of how powerful machine learning can be. Can you think of ways this could be useful? I hope this has been enlightening for you. If you are interested in learning more or testing it yourself, you can find the GitHub repo here.