Chasing Automation with Docker Containers
Jason Stevens discovers the open-source project that lets you pack, ship and run any application as a lightweight container…
One consequence of virtualisation in particular, and cloud computing in general, is the steady march towards automation in IT departments and web-hosting datacentres worldwide. The trend owes much to open-source technology, including Linux, which forms the backbone operating system of leading web hosts running the LAMP (Linux, Apache, MySQL and PHP) stack.
The ability to automate datacentre tasks and processes lets hosting companies and businesses alike cut costs and complexity, and focus on revenue-generating work rather than time-consuming IT chores across hardware, software and networks.
One particularly innovative software company, Docker, is spearheading an open-source engine that automates the deployment of software applications as lightweight, portable, self-sufficient containers that will run “virtually” anywhere.
In simple terms, it’s a shipping container for code that can run on just about any piece of hardware. Unlike the virtual machines at the core of virtualisation technology and cloud computing, however, a container does not emulate a whole machine: it shares the host’s Linux kernel and isolates applications using kernel features such as namespaces and control groups, which is what keeps it so lightweight.
This lets web developers and programmers run applications in safe isolation inside these virtual “cargo” containers, which offer a far more resource-efficient way of deploying and running multiple applications side by side. That eases front-end and back-end headaches for anyone building static websites, dynamic websites or database-driven analytics programs, and makes those applications easier to manage across development and staging environments on various hardware platforms.
Common uses for Docker include…
- Automating the packaging and deployment of applications (see the sketch after this list)
- Creating lightweight, private PaaS (platform-as-a-service) environments
- Automated testing and continuous integration/deployment
- Deploying and scaling web apps, databases and back-end services
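To make the first of those uses concrete, here is a minimal sketch of how a simple application might be packaged. The Dockerfile instructions (FROM, COPY, WORKDIR, RUN, EXPOSE, CMD) are standard Docker; the base image, file names and port are illustrative assumptions, not anything the project prescribes.

```dockerfile
# Dockerfile: packaging a hypothetical Python web app as a container
# Start from an official Python base image (the tag is an assumption)
FROM python:3

# Copy the application source into the image and work from there
COPY . /app
WORKDIR /app

# Install the app’s dependencies inside the container, not on the host
RUN pip install -r requirements.txt

# Document the port the app listens on (8000 is an assumption)
EXPOSE 8000

# The command executed when the container starts
CMD ["python", "app.py"]
```

Building and running it then takes two commands: `docker build -t myapp .` produces an image, and `docker run -p 8000:8000 myapp` starts an isolated container from it (`myapp` is just an example tag). The same image behaves identically on a laptop, a staging server or a production host.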
The rationale for adopting Docker hinges on the sheer number of combinations of applications and hardware environments that must be considered every time an application is written or rewritten.
“This creates a difficult situation for both the developers who are writing applications and the folks in operations who are trying to create a scalable, secure, and high performance operations environment,” said Docker.
Today, Docker allows developers to build their application once and run it consistently anywhere. In turn, operators can configure their servers once and know they can run any application.
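As a hedged sketch of that workflow, assuming the image built earlier and a private registry (the registry host and tag below are placeholders), moving an application between machines amounts to a handful of standard Docker commands:

```sh
# On the developer’s machine: build the image and publish it
# (registry.example.com/myapp is a placeholder name)
docker build -t registry.example.com/myapp:1.0 .
docker push registry.example.com/myapp:1.0

# On any server running the Docker engine: fetch and run it unchanged
docker pull registry.example.com/myapp:1.0
docker run -d -p 8000:8000 registry.example.com/myapp:1.0
```

Because everything the application needs ships inside the image, the operator never has to install language runtimes or libraries on the host itself.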
These advances in encapsulating applications and delivering them seamlessly across any type of hardware are allowing IT teams inside businesses and hosting datacentres alike to launch new products and services faster and at lower cost.