My thoughts on Docker
Vamshi Yemula
Senior DevOps Engineer | SRE | Kubernetes, Docker, Terraform, AWS and CI/CD Specialist | Driving Reliability & Performance
- "Docker" has been a buzzword in tech for the last several years, and the more time goes by, the more often you hear about it. We're seeing it in job requirements, training centers, and company websites. Nowadays it feels so basic and common in the development world that if you don't know it, you're behind everybody else. Many companies have moved towards Docker and incorporated it into their DevOps application deployment process.
Docker is a platform for developers and sysadmins to develop, deploy, and run applications with containers. Containers let you package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package.
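As a minimal sketch of what "one package" means in practice, here is a hypothetical Dockerfile for a small Python app (the file names `app.py` and `requirements.txt` are illustrative, not from the original article):

```dockerfile
# Hypothetical sketch: package a small Python app with its dependencies
# The base image pins the interpreter version
FROM python:3.12-slim
WORKDIR /app
# Copy and install the dependency list first so Docker caches this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the application code itself
COPY app.py .
CMD ["python", "app.py"]
```

Building this file produces a single image that carries the code, its libraries, and the runtime together.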
Let us consider a real-world issue
- Imagine you are working on code written in Java and Python and you want to send it to a colleague, friend, or teammate. Your friend runs the same project code on exactly the same data set but gets a slightly "different result".
- This can happen for many reasons, such as a different operating system or a different version of Java, Python, or any dependent files/libraries.
Docker is trying to solve problems like that.
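A hedged sketch of how it solves them: pin the exact runtime and library versions in the image, so everyone who builds it runs an identical stack. (The version tags and the script name `analysis.py` below are illustrative examples, not recommendations.)

```dockerfile
# Illustrative: everyone who builds from this file gets the same
# interpreter and the same library versions, so the
# "slightly different result" problem disappears
FROM python:3.11-slim
RUN pip install numpy==1.26.4 pandas==2.2.2
COPY analysis.py /analysis.py
CMD ["python", "/analysis.py"]
```

Whether it runs on your laptop, your colleague's laptop, or a CI server, the OS, interpreter, and dependencies inside the container are the same.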
- Docker is a tool that is designed to benefit both developers and system administrators, making it a part of many DevOps (developers + operations) tool-chains. For developers, it means that they can focus on writing code without worrying about the system that it will ultimately be running on. For ops people, Docker gives flexibility and potentially reduces the number of systems needed because of its small footprint and lower overhead.
Reasons to use Docker
1. Runs on my machine = Runs anywhere
- If you have correctly dockerized your app and it runs without problems on your machine, 99% of the time it will run smoothly anywhere: on your friend's machine, in the staging environment, and in production too. As long as all of them have Docker installed and configured correctly, your app will run. Using Docker also makes the application code cloud-provider agnostic.
2. Easy environment setup
- When you join a new company, it can take more than a day to set up a machine with the right OS, the language(s) the company uses, and the database(s) on top. Two or three days are often wasted just getting the environment right. With Docker, you install it, run a handful of commands, grab some coffee, and your app(s) are running.
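For example, a hypothetical `docker-compose.yml` (service names, ports, and the password below are made-up placeholders) can bring up an app and its database with a single `docker compose up`:

```yaml
# Hypothetical sketch: one command brings up the whole dev environment
services:
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
```

A new hire clones the repo, runs `docker compose up`, and has the same environment as everyone else.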
3. Application compatibility with newer versions of a language/database
- Suppose a new version of a language has just been released: you were using PHP 5.6 and 7.0 has come out. You don't know how much work it will take to make your application compatible with the new version. With Docker, you just run two containers, one with the current version and another with the newer one. You can even test the app side by side to compare performance.
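A sketch of what that looks like on the command line (this assumes Docker is installed and your code is in the current directory; the container names and host ports are illustrative):

```shell
# Run the same codebase under two PHP versions side by side
docker run -d --name app-php56 -p 8056:80 -v "$PWD":/var/www/html php:5.6-apache
docker run -d --name app-php70 -p 8070:80 -v "$PWD":/var/www/html php:7.0-apache
# Compare the two at http://localhost:8056 and http://localhost:8070
```

Both containers mount the same source directory, so any incompatibility shows up immediately without touching your main environment.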
4. Lightweight
- Docker images are typically very small, which facilitates rapid delivery and reduces the deployment time for new application containers.
- Because Docker containers don't include a full OS and all share the host's kernel, they are very light and take seconds to start.
5. Isolation
- Docker ensures your applications and resources are isolated and segregated.
- Docker also ensures that each application only uses the resources assigned to it. A single application can't consume all of your available resources, which would otherwise lead to performance degradation or complete downtime for other applications.
6. A Valued Tool for DevOps
- DevOps teams are finding it efficient to configure development and testing environments based on Docker containers.
- Instead of deploying raw binaries such as RPM and JAR files to the target environment, they now package the entire application as a Docker image along with all its dependencies and even the required operating system.
- The image is tagged with a build version before being published to a central registry, and is then pulled by the various environments (development, testing, staging, and production) for final deployment.
- Also, by leveraging tight integration with source-control systems such as Git and build tools like Jenkins, Docker can automate build and deployment scenarios and scale them effortlessly. Using these automation tools and frameworks, you can build new Docker images as soon as new code is available.
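As a hedged illustration of that Git-plus-Jenkins flow (the registry hostname and stage names below are made up), a minimal Jenkinsfile that builds and pushes an image on every commit might look like:

```groovy
// Hypothetical Jenkinsfile: build and publish a Docker image per commit,
// tagged with the Git commit hash so every build is traceable
pipeline {
    agent any
    stages {
        stage('Build image') {
            steps {
                sh 'docker build -t registry.example.com/myapp:${GIT_COMMIT} .'
            }
        }
        stage('Push image') {
            steps {
                sh 'docker push registry.example.com/myapp:${GIT_COMMIT}'
            }
        }
    }
}
```

Each environment then pulls the image by its commit tag, so what was tested is exactly what gets deployed.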
- Docker adoption in the market continues to grow (source: Datadog; chart not reproduced here).
Conclusion
Docker isn’t too hard to learn. It’s easy to play with, and the learning curve is relatively small.