Docker is a big thing in the DevOps community. It’s one of those technologies that has genuinely enabled DevOps ways of working. It has helped simplify the building and deployment of software, particularly moving software from one environment to another. This all sounds good, but what exactly is Docker?
Launched in 2013, Docker is the number one software container platform in the industry today. The term ‘container’ is borrowed from the shipping container, which is of a standard size and shape and lets people ship their goods easily, as long as the goods fit inside. A software container like Docker works the same way: everything you need to get a piece of software to run is packaged into the container.
The container allows you to run the software the same way regardless of where it is deployed. For developers this has historically been a big problem, as software often behaves differently when put into different environments. The container, however, bundles all the necessary libraries and settings to remove this issue. Containers are also more lightweight than Virtual Machines (VMs), as they don’t bundle a full operating system – this reduces storage needs and means they are faster to start. You can run multiple containers on one machine, each with its own isolated application.
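To make this concrete, here is a minimal sketch of what such a package might look like – a Dockerfile for a hypothetical Python service (the file names `requirements.txt` and `app.py` are illustrative assumptions, not from the original article):

```dockerfile
# Start from a lightweight official Python base image
FROM python:3.12-slim

# Set the working directory inside the container
WORKDIR /app

# Copy the dependency list and install the libraries the app needs
# (hypothetical requirements.txt for this example)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code into the image
COPY . .

# The same command runs identically wherever the container is deployed
CMD ["python", "app.py"]
```

Built once with `docker build -t my-service .`, the resulting image runs the same way with `docker run my-service` on a developer laptop, a test server or a cloud host – which is exactly the portability being described here.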
Containers have been around for a long time, but Docker has finally broken through with a platform which is easy to use, is secure and has brought much needed standardisation. It’s also worth noting that Docker happened to come into play as cloud adoption was going into hyperdrive and Docker works really well with the cloud, which has certainly helped its uptake!
Docker has also focused on ensuring it can easily be incorporated into most DevOps tools of choice, like Puppet, Chef and Ansible. This means a big thumbs up from the increasingly important DevOps community.
So what does all this mean from a business perspective? It means faster and easier deployments, which means reduced costs and increased speed to market. It means easier handoffs between teams. It means fewer mistakes. It means Docker is big news.
The devil is in the detail. Many organisations have long-standing legacy systems which aren’t architected to sit naturally in a Docker container. They can be reworked, but is there a business case to do so? If most of your technology estate is legacy and it doesn’t need Docker to keep running, is there a case to invest in Docker even for your new software? It’s not just the Docker software costs you need to consider – think about the costs to update people’s skills, security assessments to ensure Docker is secure, deployment processes, monitoring solutions, patching approaches and so on.
Another reason not all companies are using Docker is that they are taking a wait-and-see approach. The big banks are using Docker, as are major media organisations, but many still want to give it a bit more time before taking the plunge. People want to see how secure Docker really is – is it ready for enterprise use? They want to be very clear on the operational implications of moving to a container world, and whether Docker really is the right platform to choose.
Having said all this, the advantages of Docker containers are clear, and big organisations are using them. Docker should be on every IT department’s radar, because containers can mean faster and more effective deployment of software – and in today’s world that means competitive advantage.