How to Deploy Docker on AWS Whitepaper


Docker on AWS

Container technology is steadily moving into the DevOps realm, creating a new methodology for designing, managing, deploying, and scaling services. Container technologies such as Docker were developed with exactly that goal in mind: to make it easier to develop and deploy applications. In this whitepaper, we'll explain what containers are and why you should consider adopting the technology.

What is a Container?

A container should not be thought of as a virtual machine (VM); instead, think of it as a single application process running inside its very own sandbox. The sandbox, or container, has controlled access to its host's resources: you can specify how much compute and memory it may use, as well as what kind of networking should occur. Containers can also have their own environment variables and can share volumes with the host or with other containers. You can run as many containers as your host's resources permit, and you can isolate them from one another or compose them to work together as a single cohesive application. One important point to remember is that a container lives only as long as the application process inside it is running. As soon as that process stops, the container stops too.
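As a concrete sketch of these ideas, a single `docker run` invocation can set the resource limits, environment variables, and shared volumes described above. The image, ports, and paths here are illustrative, and the commands assume a running Docker daemon:

```shell
# Run an nginx container with explicit resource limits, a private
# environment variable, and a read-only shared volume.
# --cpus caps CPU usage; --memory caps RAM.
docker run -d --name web \
  --cpus 0.5 --memory 256m \
  -e APP_ENV=production \
  -v "$PWD/html:/usr/share/nginx/html:ro" \
  -p 8080:80 \
  nginx:alpine

# The container lives only as long as its main process (nginx here);
# stopping that process stops the container.
docker stop web
```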

Containers are like building blocks and all start with a base. A base container is a simplified and smaller version of an operating system or application. These base containers have been stripped of all unnecessary programs to make them extremely lightweight and portable. When you take these pre-compiled base containers and then layer your application or processes on top of them, you create your own unique container to meet your needs. These containers can then be uploaded and downloaded as many times as needed and can run on many different environments.
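The layering described above is exactly what a Dockerfile expresses: start from a stripped-down base image and layer your application on top of it. The file below is a hypothetical example for a small Python application:

```dockerfile
# Start from a small, stripped-down base image.
FROM python:3.12-slim

WORKDIR /app

# Layer the dependency list first so this step is cached
# between builds when dependencies don't change.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Layer the application code on top.
COPY . .

# The container runs for as long as this process runs.
CMD ["python", "app.py"]
```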

Container Pros & Cons

Containers are designed to be small and mobile, a major pro. Small containers typically mean fast builds and quick deploys. This allows a development team to go from a code commit to a running, testable environment incredibly fast. Small containers also mean that deploying and scaling events can be incredibly fast. Instead of waiting for an entire server to boot and your code to install, you simply copy the container and run it on the same server or inside of a cluster of servers.

The concept behind container deployment is a "build once and deploy many" strategy. That means your developers can build and commit an image, and that image will be deployed and tested in a development environment. Once it is tested, it can be copied and tested in a staging environment, and finally deployed into production. All environments can be covered by a single build a developer compiles on a local machine, and the container runs identically in every environment. While it is possible to run Linux containers on a non-Linux environment, and many people do, Docker does require a Linux kernel to operate. In non-Linux environments, Docker will install and manage a small Linux VM where your containers will run.
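In practice, "build once and deploy many" means building a single image, tagging it, and promoting that identical image through each environment. The registry and image names below are placeholders:

```shell
# Build the image once from the committed code.
docker build -t myapp:1.4.2 .

# Tag the same image for a registry (placeholder registry name).
docker tag myapp:1.4.2 registry.example.com/myapp:1.4.2

# Push it once; dev, staging, and production all pull
# this identical image rather than rebuilding it.
docker push registry.example.com/myapp:1.4.2
```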

The pros of containers can also become cons, depending on how you use them. The fact that containers are intended to be tiny applications that scale horizontally can be bad if you are running a large application or one that takes a long time to initialize. Containers are also not a good place to maintain state. An application should not save anything locally inside a container, because a single scaling event can destroy that container and lose any local data. The final con is debugging. Until you add more tooling around your container environment, trying to debug a rogue container can be difficult, especially once containers have scaled out, because logs are stored on a per-container basis by default.
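To illustrate the logging point: out of the box, you inspect each container's logs individually, which becomes tedious once many replicas are running. The container names below are examples:

```shell
# Logs are collected per container by default, so each
# replica must be inspected separately.
docker logs --tail 100 web-1
docker logs --tail 100 web-2

# Follow a single container's log stream in real time.
docker logs -f web-1
```

Centralized log aggregation (for example, shipping logs to a shared backend via a logging driver) is one of the peripherals teams typically add to address this.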

When to Build a Container

Hopefully you now understand what containers are and how powerful and useful they can be, but it should also be said that containers are not ideal for every situation. Just because you can containerize something doesn't mean that you should. We mentioned that one of the large pros of containers is their size and mobility. If you decide to package up a multi-site WordPress container with all of its PHP code, plugins, themes, and so on, you are going to lose some of the benefits Docker inherently provides. The larger a container becomes, the more difficult it is to move around and deploy. That said, it can still be done; it simply forgoes some of the inherent benefits of containerization.

Containers also don't provide all of their benefits when they take a long time to initialize. If your containers have to pull down many configuration files and install several packages at startup, this will slow down any scaling event and consume a lot of resources on your host.

Docker shines when you are building and deploying small, lightweight services, often called microservices. Microservices are great because they usually have a small code base, have very short development cycles, and can often benefit from scaling horizontally (adding more instances of a service) instead of vertically (adding more power to a single instance). Docker makes it extremely easy to deploy, update, scale, and destroy microservices. If your application can survive being scaled horizontally and doesn't require a large server footprint, Docker is a great way to run it.
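As a sketch of horizontal scaling, Docker Compose can run several identical replicas of a stateless service with a single flag. The service name here is an example from a hypothetical compose file:

```shell
# Scale a stateless service out to five identical replicas.
docker compose up -d --scale web=5

# Scale back down; surplus replicas are destroyed, which is
# why containers must not hold local state.
docker compose up -d --scale web=2
```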

Deploying Docker Containers on AWS

Amazon Web Services (AWS) enables the deployment of containerized application clusters. Download our whitepaper to learn more about deployment strategies with containers and how to execute them on AWS with our step-by-step guide to deploying Docker containers on AWS.

Download our Deploying Docker on AWS Whitepaper

We are an AWS Premier Consulting Partner. We specialize in guiding our customers through DevOps challenges on their journey into the cloud. Our goal is to increase automation and move past 20th-century approaches to technology. We strive to give you continual measurements for achievement, on-demand demonstrations, and milestones for approvals and rejections. We want to help your teams' talent break through by automating your workloads, because we know that downtime and repetition cost organizations money. Contact us to learn more.

