Kubernetes Deployments & Strategies!
Pavan Belagatti
The buzz today is all about Kubernetes. But before you start embracing it, it is essential to know how to get started with Kubernetes the right way.
Before starting down the Kubernetes path, it is important to know if Kubernetes, and therefore containers, will help you solve the problem you have. Kubernetes was created in response to the rapid adoption and comparative difficulty of running Docker containers. If your application is designed to run in many containers, and you require help automating the steps when starting them, Kubernetes is likely a good fit. Think of a large microservices-based app with many components running in separate containers. With so many containers, Kubernetes' ability to orchestrate the deployment of your containers will likely prove essential.
The best way to start with Kubernetes is to get a local Kubernetes cluster up and running with Docker Desktop, kind, or Minikube. You will be able to run a single-node cluster inside a virtual machine on your laptop or local machine. To deploy larger applications, you can use Kubernetes clusters from any cloud platform and the kubectl command-line tool to issue commands that set up and deploy applications. Kubernetes Documentation and Kubernetes Quickstart are two excellent resources to help you get started.
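For example, once a local cluster is running, a minimal Deployment manifest such as the sketch below (the name hello-web and the nginx image are just placeholders for this example) can be applied with kubectl apply -f hello-deployment.yaml to get a first workload running:

# hello-deployment.yaml — a minimal example Deployment
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-web              # hypothetical name for this example
spec:
  replicas: 2
  selector:
    matchLabels:
      app: hello-web
  template:
    metadata:
      labels:
        app: hello-web
    spec:
      containers:
        - name: web
          image: nginx:1.25    # substitute your own container image
          ports:
            - containerPort: 80

Running kubectl get pods afterwards should show the two replicas once the rollout completes.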
[Credits: Toolbox]
Kubernetes Deployment Strategies! How important are they?
Application deployment is a challenging task for developers, and Kubernetes makes it easier with its built-in deployment strategies.
To keep your application available and make sure users aren't affected by downtime while new software is deployed, three deployment strategies are commonly used with Kubernetes: Blue-Green, Canary, and Rolling.
• The rolling deployment strategy is the default strategy in Kubernetes: it gradually replaces the pods of the previous version with pods of the new version (see the sample rolling-update configuration after this list).
• In the blue-green technique, the blue and green versions are deployed side by side, but only one version is active and live at a time. Consider blue the old version and green the new one. All traffic is sent to blue by default at first, and once the new version (green) meets all the requirements, traffic is switched from blue to green (a minimal blue-green sketch follows the practical links below).
• The canary deployment strategy is used to carry out A/B testing and dark launches. It is similar to the blue-green approach but more controlled: traffic is shifted gradually from version A to version B.
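As a minimal sketch of the default rolling strategy (the replica count and surge numbers are illustrative), the behaviour is configured directly on the Deployment spec:

# fragment of a Deployment's spec section
spec:
  replicas: 4
  strategy:
    type: RollingUpdate        # the Kubernetes default
    rollingUpdate:
      maxSurge: 1              # at most 1 extra pod above the desired count during a rollout
      maxUnavailable: 1        # at most 1 pod may be unavailable while old pods are replaced

Updating the pod template (for example, the container image tag) then triggers the rolling replacement automatically, and kubectl rollout status or kubectl rollout undo can be used to watch or revert it.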
Let's do some practicals -
Create a Kubernetes canary deployment: https://lnkd.in/g4rwXhtm
Do Blue-Green Deployments With Kubernetes And Harness: https://lnkd.in/g5zUJM-R
Create a Kubernetes Rolling Deployment: https://lnkd.in/gTaYNAZi
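As mentioned above, here is a minimal blue-green sketch in plain Kubernetes (all names and labels are illustrative): two Deployments, my-app-blue and my-app-green, run side by side with the labels app: my-app plus version: blue or version: green, and a single Service decides which one receives traffic.

apiVersion: v1
kind: Service
metadata:
  name: my-app
spec:
  selector:
    app: my-app
    version: blue      # flip this to "green" once the green pods pass their checks
  ports:
    - port: 80
      targetPort: 8080

Changing the selector (for example with kubectl patch) diverts all traffic from blue to green in one step, and switching it back is an equally quick rollback. Tools such as Harness automate this switch and the verification around it.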
Efficiently Exposing Services on Kubernetes
This was a headache when I started out learning Kubernetes.
There are several Service types that can be used to expose apps for different requirements.
The ClusterIP service type is the default and only provides access internally, on a cluster-internal IP.
The NodePort type exposes the Service on a static port on each node; every node proxies that port into the Service. This type does expose the app to the outside world; however, it is only good for short-term public access and is not recommended for production applications.
The LoadBalancer service exposes the app using a cloud provider’s load balancer. The external load balancer directs the traffic to the backend Pods.
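As an illustrative manifest (the name my-app and the ports are made up), the same Service definition covers all three cases; only the type field changes:

apiVersion: v1
kind: Service
metadata:
  name: my-app
spec:
  type: LoadBalancer   # omit for ClusterIP (the default), or set to NodePort
  selector:
    app: my-app        # matches the pods' labels
  ports:
    - port: 80         # port exposed by the Service
      targetPort: 8080 # port the container listens on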
A more efficient way of exposing services is to use Ingress. Instead of creating many Services of type LoadBalancer, we can route traffic based on the request host or path using an Ingress controller and Ingress rules (see the sketch below).
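A minimal Ingress sketch (the hostname, paths, and service names are placeholders) that routes by host and path could look like this, assuming an Ingress controller such as ingress-nginx is already installed in the cluster:

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: my-app-ingress
spec:
  rules:
    - host: shop.example.com
      http:
        paths:
          - path: /api
            pathType: Prefix
            backend:
              service:
                name: api-svc    # ClusterIP Service in front of the API pods
                port:
                  number: 80
          - path: /
            pathType: Prefix
            backend:
              service:
                name: web-svc    # ClusterIP Service in front of the frontend pods
                port:
                  number: 80

Both backends can stay as plain ClusterIP Services; only the Ingress controller itself needs to be reachable from outside the cluster.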
Learn more about efficiently exposing services on Kubernetes in the articles below.
CI/CD with Kubernetes
Leverage Kubernetes' strengths to further your CI/CD journey.
Taking advantage of Kubernetes to build and package software is a great use case, because modern CI tools focus on creating ephemeral build runners/nodes in Kubernetes. As build requests come in, a new instance is spun up to create the build artifacts and then spun down when the job is complete.
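As a rough sketch of that idea (the image, commands, and name are placeholders, and in practice a CI tool such as Jenkins or a GitLab runner manages this for you), an ephemeral build step can be modelled as a Kubernetes Job that exists only for the duration of the build:

apiVersion: batch/v1
kind: Job
metadata:
  name: build-my-app
spec:
  ttlSecondsAfterFinished: 300     # clean up the finished Job automatically
  backoffLimit: 1                  # retry a failed build at most once
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: build
          image: golang:1.22       # placeholder build image
          command: ["sh", "-c", "go build ./... && go test ./..."]   # placeholder build/test commands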
Similar to running Continuous Integration steps on Kubernetes, running certain Continuous Delivery steps on Kubernetes itself is prudent. Standing up test infrastructure and then spinning it down is easily achievable on a Kubernetes cluster.
Depending on how far you take isolation versus running a single cluster, Kubernetes lets you run the build, the confidence-building steps, and the deployment on and into the same cluster using Namespace separation (a small illustration follows below). Kubernetes can provide much more to power up your delivery pipeline.
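As a small illustration (the namespace names are arbitrary), pipeline stages can be kept apart on the same cluster simply by creating one Namespace per stage and deploying into it:

apiVersion: v1
kind: Namespace
metadata:
  name: build
---
apiVersion: v1
kind: Namespace
metadata:
  name: staging
---
apiVersion: v1
kind: Namespace
metadata:
  name: production

The pipeline then targets a stage with the -n flag, for example kubectl apply -f app.yaml -n staging, so the same manifests can move through build, staging, and production without needing separate clusters.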
Be a part of this next-gen evolution: https://lnkd.in/gs4rtV5u
Do continuous delivery of Kubernetes-based applications easily with Harness.