In 2002, VMware introduced its Type 1 hypervisor, which made server virtualization mainstream and, eventually, a requirement for enterprise IT organizations. Although cost savings are often cited as the driver, virtualization became a big deal for businesses because it enables continuous IT services. Using virtualization, IT departments could now offer zero-downtime services, at scale, on commodity hardware.
In 2014, Docker, Inc. released Docker 1.0. Docker provides efficient image management for Linux containers, along with a standard interface that can be used to solve several problems in application delivery and management.
Much like VMware made virtualization mainstream, Docker is rapidly making containerization mainstream. In this post I will discuss four reasons why you should consider making containerization part of your business strategy.
Continuous Delivery of Software
Virtualization enabled the automation and standardization of infrastructure services. Containerization enables the automation and standardization of application delivery and management services (a.k.a. platform services).
Faster software delivery leads to faster innovation. If your business delivers software applications as part of its product offerings, the speed at which your teams can deliver new features and bug fixes is a key competitive differentiator.
Virtualization, service catalogs, and automation tools can provide self-service, on-demand virtual machines, networks, and storage. But rapid access to virtual machines and infrastructure is not sufficient to deliver applications; a lot of additional tooling is required to deliver applications in a consistent and infrastructure-agnostic manner.
Application platform and configuration management solutions have tried to address this area, but have not succeeded en masse, because until recently there was no standard way to define application components. Docker addresses this gap and provides a common, open building block for application automation and orchestration. This fundamentally changes how enterprises can build and deliver platform services.
Another fast-growing trend is that cloud applications are being written in a microservices architectural style, where applications are composed of multiple cooperating, fine-grained services (http://bit.ly/1zPPzQH). Containers are the perfect delivery vehicle for microservices. Using this approach, your software teams can independently version, test, and upgrade individual services. This avoids large integration and test cycles, because the focus is on making incremental, but frequent, changes to the system.
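To make this concrete, packaging a microservice as a container image often comes down to a short Dockerfile. The sketch below is hypothetical: the service name, port, and Python-on-Ubuntu stack are illustrative assumptions, not a prescription.

```
# Hypothetical Dockerfile for a small "orders" microservice
FROM ubuntu:14.04

# Install the service's runtime (illustrative choice)
RUN apt-get update && apt-get install -y python python-pip

# Add the service code and its dependencies
COPY . /opt/orders
WORKDIR /opt/orders
RUN pip install -r requirements.txt

# Each service exposes its own port and is versioned, tested,
# and upgraded independently of the other services
EXPOSE 8080
CMD ["python", "app.py"]
```

Because each service ships as its own image, a change to one service can be built, tested, and rolled out without rebuilding or re-testing the rest of the system.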
Application Portability
Businesses are adopting cloud computing for infrastructure services. Public cloud providers are continuously expanding their offerings and constantly reducing their pricing. Some cloud providers may have better regional presence, and others may offer specialized services for certain application types. And at a certain spend, and for some application types, private cloud remains an attractive option. For all of these reasons, it makes sense to avoid being locked into a single cloud provider.
Containerization allows application components to be portable to any cloud that offers a base operating system capable of running the container. Using containers avoids deep lock-in to a particular cloud provider or platform solution, and enables application runtime portability across public and private clouds.
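As a sketch of what that portability looks like in practice (the image name and port below are made up), the same image can be pulled and run unchanged on any host with a Docker engine, whether that host is a laptop, a private cloud VM, or a public cloud instance:

```
# Pull and run the same (hypothetical) image on any Docker host,
# regardless of which cloud provider owns the underlying machine
docker pull myorg/orders:1.2.0
docker run -d -p 8080:8080 --name orders myorg/orders:1.2.0
```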
DevOps Culture
The DevOps movement builds on Agile software development, where small, incremental releases are favored over long release cycles, and on the Lean Enterprise philosophy, where constant customer feedback loops foster a culture of innovation.
With DevOps, developers are also responsible for operating their code. As Adrian Cockcroft explains (http://slidesha.re/1v540RL), the traditional definition of “done” was when the code was released to production. Now, “done” is when the code is retired from production.
However, DevOps for a startup delivering a single web application is very different from DevOps for an enterprise delivering several applications. In larger environments, and for more complex applications, a common platform team is required to serve multiple DevOps teams.
Containerization with Docker provides a clean separation of control between DevOps and platform concerns. A container image becomes the unit of delivery and versioning. DevOps teams can focus on building and delivering containers, while the platform team builds automation for operating the containerized applications across public and private clouds, as well as the shared services used by multiple DevOps teams.
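A minimal sketch of that hand-off, with a hypothetical registry and image name: the DevOps team builds and publishes a versioned image, and the platform team deploys that exact version wherever it operates the application.

```
# DevOps team: build and publish a versioned image (names are illustrative)
docker build -t registry.example.com/orders:1.3.0 .
docker push registry.example.com/orders:1.3.0

# Platform team: pull and run that exact version in any environment it manages
docker pull registry.example.com/orders:1.3.0
docker run -d -p 8080:8080 registry.example.com/orders:1.3.0
```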
Cost Savings
Virtualization allows several virtual machines to run on a large physical server, which can lead to significant consolidation and cost savings. Similarly, containerization allows several application services to run on a single virtual or physical machine, or on a large pool of virtual or physical machines.
Container orchestration solutions can provide policies for packing different types of services together. This is exactly what Platform-as-a-Service (PaaS) vendors, like Heroku, have been doing under the covers. Container orchestration tools built on open technologies like Docker can now make this transparent to end users and pass the cost savings along to them.
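As a rough illustration (the service names and memory limits are invented), several containerized services can be co-located on a single host, with per-container resource limits keeping them from interfering with one another:

```
# Three hypothetical services sharing one virtual or physical machine,
# each constrained by a memory limit (-m)
docker run -d -m 256m --name orders  myorg/orders:1.2.0
docker run -d -m 256m --name billing myorg/billing:2.0.1
docker run -d -m 512m --name search  myorg/search:0.9.4
```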
Current Challenges
Recently, perhaps influenced by the buzz around Docker, Google announced (http://bit.ly/UY030m) that all of their applications, from Search to Gmail, run in Linux containers. However, Google and others have spent several years building and fine-tuning their platforms and tools around containers, and until recently have treated these tools as a competitive advantage.
For mainstream adoption of containerization, better general-purpose container orchestration and management tools are required. Application networking and security also remain key areas of development. Finally, the options for non-Linux applications are currently limited.
Summary
Infrastructure virtualization enabled continuous IT services. Containerization enables continuous application delivery.
Containerization also enables application portability, and can be a key architectural building block for cloud native applications. Once an application is containerized, the containers can be run on a pool of virtual or physical machines, or on Infrastructure-as-a-Service based public clouds.
For new applications, packaging the application components as containers should be strongly considered. Just as with virtualization, the list of reasons not to containerize is already shrinking rapidly. Containerization can also help transform traditional applications that now need to be delivered as software-as-a-service (http://bit.ly/1pVaeeK).
If your business delivers software, you can leverage containerization to develop and operate software more efficiently and in a highly automated fashion across public and private clouds.