Introduction to cloud-native apps on Azure
Abhishek Shrivastava
FinOps Cloud Specialist at TCS | 4x GCP, 2x Azure, 1x AWS, Gen AI Badge, MCT (Microsoft Certified Trainer), MST, AWS Community Builder, MIE Expert, Oracle AI Cloud Associate, TCS Gold Certified Mentor
Introduction
Cloud-native apps represent a modern approach to app development, where software systems are designed with cloud technologies in mind. Unlike apps that were originally designed to run on premises, cloud-native apps can take full advantage of the many services Azure has to offer.
Many cloud architects opt for Open-Source Software (OSS) like Kubernetes and Docker when designing cloud-native apps, but turn to proprietary offerings like Cosmos DB when the benefit is overwhelming. Because of this, cloud-native apps make the end-to-end process of building an application easier, with a focus on architectural modularity, rather than monolithic, all-in-one applications. You can utilize the technologies that suit your skill set and situation, without being locked into technology choices.
Scenario: Smart refrigerators, smarter service, at scale
Suppose you work for Adatum Corporation, a manufacturer of home appliances, where you lead a small development team and you've been tasked with building an app for smart refrigerators.
We could, for example, start by creating a small inventory management app for the refrigerators, so businesses know what needs to be restocked or, potentially, have items reordered automatically. It's the nature of cloud-native apps to have loosely coupled functionality, so we can be more agile in our design and avoid having to predict future requirements. Instead, we can extend the app if it becomes necessary. Later, we can add functionality to the app, such as connecting to refrigerator telemetry and onboard sensors.
What are cloud-native apps?
The cloud-native approach allows you to build cloud-based applications where you choose the components you want to use. Components, such as a database and a .NET Function app, could be coupled together as a service, to form an isolated part of a system. For instance, you might have an inventory service, an ordering service, and a payment service, each with their own technology choices.
Additionally, cloud-native apps are modular in nature. You choose the cloud services and technologies, and loosely couple them together as shown in the diagram. Cloud-native apps often employ another pattern, called Microservices. Notice how each service has its own technology stack, independent of what other services use, meaning you choose the technologies that suit each individual service, rather than a one-size-fits-all solution.
With cloud-native apps, we use many pre-built services, or services with pre-built infrastructure. For example, we can get scaling from Kubernetes or Azure Function Apps, and geo-redundant data storage from Cosmos DB or Hyperscale for PostgreSQL.
So, while apps built on microservices in general share many of the same characteristics, cloud-native apps can have parts of their tool chain where little to no custom code is needed to get advanced functionality or operational excellence.
Further, with different components loosely coupled together to create an application, you can change technologies as required without rewriting the entire application. With our smart fridges, for example, each service can be upgraded, deployed, scaled, and restarted with no effect on the other services, allowing for frequent updates.
Use technologies that you’re strong with
Most cloud-native services support a wide range of technologies. Kubernetes supports multiple client operating systems and almost any tech stack, such as .NET, Node.js, Ruby, and Java. There's also a vast range of database options that can be connected to from any major programming language.
You can connect a backend to your relational database of choice for one service, while also using a NoSQL database and a pre-built analytics service when it’s a better fit for another service. You can do it all quickly and simply, within the same overall cloud application.
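As a rough sketch of what that mixing looks like in practice, the snippet below pairs a relational inventory query with a NoSQL document write inside the same application. It assumes the open-source pg client and the @azure/cosmos SDK; the connection strings, table, database, and container names are hypothetical placeholders.

    // Sketch: one cloud-native app talking to two different data stores.
    // Assumes the "pg" and "@azure/cosmos" npm packages and hypothetical
    // connection strings supplied via environment variables.
    import { Pool } from "pg";
    import { CosmosClient } from "@azure/cosmos";

    // Relational store for the inventory service (for example, Azure Database for PostgreSQL).
    const inventoryDb = new Pool({ connectionString: process.env.INVENTORY_PG_CONNECTION });

    // NoSQL store for another service (for example, Azure Cosmos DB).
    const cosmos = new CosmosClient(process.env.COSMOS_CONNECTION ?? "");
    const events = cosmos.database("adatum").container("events");

    export async function getLowStockItems() {
      // Standard SQL against the relational database.
      const result = await inventoryDb.query(
        "SELECT item_id, quantity FROM inventory WHERE quantity < $1",
        [5]
      );
      return result.rows;
    }

    export async function recordEvent(event: { deviceId: string; type: string }) {
      // JSON document written to the NoSQL store.
      await events.items.create({ ...event, recordedAt: new Date().toISOString() });
    }

Because the two stores sit behind separate functions, either one could later be swapped for a different database without touching the other service's code.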
Using containers with cloud-native apps
Reliable, separated environments using containers
Containers are loosely isolated environments that can run software packages. They’re usually a key component of cloud-native apps, as they provide a reliable, separated environment that works the same on any machine. Containers are often referred to as "Docker containers", named after the most popular tool for creating and managing them.
Each container is self-contained, with its own code, data, and dependencies. One of the strengths of containerization is that you don't have to configure hardware and spend time installing operating systems, virtual machines, and software to host a deployment.
While we can use containers directly, by taking a working software program from our own machine to the cloud, we can also get containers from services. For example, Azure Speech service, which transcribes real-time speech into text, is available as container images that can be deployed directly to your own system. Many Azure services use containers under the hood, so there's a wide range available.
Containers are easy to use with cloud services. They ensure that, once tested, your application works the same on your local machine as on the cloud—giving you a much more reliable, low-maintenance experience. This containerization means you can easily scale out by replicating containers, and that every instance of your application is in an identical environment. Further, you can manage containers with a container orchestrator, such as Kubernetes. Orchestrated containers, with their lightweight nature, can scale out much more cost effectively and nimbly than virtual machines.
Manage containers easily with a Kubernetes service
Kubernetes, often abbreviated as K8s, is a technology that manages multiple containers for you. You can connect containers so your database can talk to a backend, scale resources, and automate application deployment, backups, and maintenance.
One of the key benefits of Kubernetes is the ability to restore applications to the exact instance that has been tested and saved, otherwise known as self-healing. As containers can be saved and replicated, Kubernetes can check on the health of a container and replace it with an original copy if necessary.
Kubernetes can also automatically increase or decrease the number of containers if demand changes. If traffic to a container is high, Kubernetes can load balance and distribute the network traffic so that the deployment is stable.
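To make the scaling and self-healing ideas concrete, here's a minimal sketch of a Kubernetes Deployment for a hypothetical Adatum backend, written as a TypeScript object rather than the usual YAML so it matches the other examples in this article. The name, container image, and probe path are assumptions; the replicas field is what Kubernetes scales, and the liveness probe is what triggers self-healing replacements.

    // Sketch: a Kubernetes Deployment for a hypothetical Adatum backend,
    // expressed as a TypeScript object (the same shape you would normally
    // write as YAML and apply with kubectl). Names and image are assumptions.
    const backendDeployment = {
      apiVersion: "apps/v1",
      kind: "Deployment",
      metadata: { name: "adatum-backend" },
      spec: {
        replicas: 3, // Kubernetes keeps three identical containers running.
        selector: { matchLabels: { app: "adatum-backend" } },
        template: {
          metadata: { labels: { app: "adatum-backend" } },
          spec: {
            containers: [
              {
                name: "backend",
                image: "adatum.azurecr.io/backend:1.0.0", // hypothetical image
                ports: [{ containerPort: 3000 }],
                // Self-healing: if this probe fails, Kubernetes replaces the container.
                livenessProbe: {
                  httpGet: { path: "/healthz", port: 3000 },
                  periodSeconds: 10,
                },
              },
            ],
          },
        },
      },
    };

    // Print the manifest; in practice you would apply the equivalent YAML
    // with kubectl, or submit this object through a Kubernetes client library.
    console.log(JSON.stringify(backendDeployment, null, 2));

A HorizontalPodAutoscaler could then raise or lower the replica count automatically as demand changes.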
Additionally, one of the main benefits of using a Kubernetes service is simplified security configuration management. Many have built-in authentication services, allowing the services to provide compliance offerings for most countries/regions and industries.
Further, when a component is updated, you can automate Kubernetes to create new containers for your deployment, remove existing containers, and adopt all their resources to the new containers. Kubernetes services, such as AKS, simplify container management and can provide massive savings in development time, cost, and security obligations. Continuous integration and continuous delivery (CI/CD) allow Kubernetes services to optimize development pipelines and application deployment.
Designing a cloud-native app
Because cloud-native apps are made up of the components of your choice, you can easily architect a solution that uses technologies you're comfortable with. For example, if Python better suits your data analytics service, but your email service is more suited to using a pre-built solution, the architectural modularity of cloud-native apps significantly simplifies implementation. This modularity even extends across different cloud providers.
For example, many cloud services, such as Azure Database for MySQL, allow you to develop using the open-source versions of technologies that you’re used to, but have Azure take care of administration and deployment responsibilities for you.
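For instance, here's a minimal sketch of connecting to Azure Database for MySQL with the same open-source mysql2 client you'd use against any MySQL server; the server name, credentials, and table are hypothetical.

    // Sketch: standard open-source MySQL client code pointed at an Azure-managed
    // server. Assumes the "mysql2" npm package and hypothetical connection details.
    import mysql from "mysql2/promise";

    async function listInventory() {
      const connection = await mysql.createConnection({
        host: "adatum-inventory.mysql.database.azure.com", // hypothetical server name
        user: process.env.MYSQL_USER,
        password: process.env.MYSQL_PASSWORD,
        database: "inventory",
        ssl: { rejectUnauthorized: true }, // Azure enforces TLS connections by default
      });

      const [rows] = await connection.query("SELECT item_id, quantity FROM items");
      await connection.end();
      return rows;
    }

    listInventory().then((rows) => console.log(rows)).catch(console.error);

The only Azure-specific parts are the host name and credentials; the queries themselves stay plain MySQL.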
Architecting a cloud-native solution for Adatum
In our scenario, we can architect a solution that’s easy for a small team to develop but also scales safely to thousands of devices. Cloud functionality eliminates many development issues arising from a need to connect to large numbers of devices and process data on demand. By using cloud infrastructure, pre-built services can be easily configured to communicate with each other and autoscale as needed.
Later, if necessary, the solution can grow to accommodate new or updated products. In our scenario, if a hotel chain ordering thousands of refrigerators needs extra functionality, you can create an extra service with no downtime for existing customers.
Starting small
To begin, we could use a basic web app as a management interface. A simple backend in the cloud can relay messages from the smart refrigerators to the web app; both can then be containerized and deployed to a Kubernetes cluster, so the number of containers can scale as needed. The following diagram shows this relationship: the Node.js Express box relays messages to our Webapp Next.js box, both of which are deployed from our Kubernetes service.
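As a minimal sketch, the relay backend could look something like the Express service below. The route names, port, and in-memory store are assumptions for illustration only; a real deployment would persist messages to a database.

    // Sketch: a minimal Node.js Express backend that relays refrigerator
    // messages to the management web app. Route names, port, and the in-memory
    // store are assumptions; a real service would persist messages elsewhere.
    import express, { Request, Response } from "express";

    interface FridgeMessage {
      deviceId: string;
      item: string;
      quantity: number;
    }

    const app = express();
    app.use(express.json());

    const latestMessages: FridgeMessage[] = [];

    // Refrigerators (or a gateway) POST their inventory updates here.
    app.post("/messages", (req: Request, res: Response) => {
      latestMessages.push(req.body as FridgeMessage);
      res.status(202).end();
    });

    // The Next.js web app reads the relayed messages from this endpoint.
    app.get("/messages", (_req: Request, res: Response) => {
      res.json(latestMessages);
    });

    app.listen(3000, () => console.log("Relay backend listening on port 3000"));

Packaged into a container image, this is the unit the Kubernetes cluster replicates when traffic grows.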
This solution is easily connected to your database of choice, allowing a scalable, end-to-end service to be up and running quickly.
Growing our application
Smart devices offer a wealth of connectivity and data options. Advancements in the IoT field offer cost-effective options for gathering data and streaming it to the cloud. IoT cloud services for smart devices are easy to connect, allowing you to stream telemetry data such as refrigerator temperatures, power consumption, and water quality.
Developers can use cloud services, such as IoT Hub and Stream Analytics, to develop cloud-native apps with IoT integration. Since much of the groundwork has been done for you, development time can be decreased significantly.
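As a rough sketch of the device side, a refrigerator (or a gateway in front of it) could push telemetry to IoT Hub with the open-source azure-iot-device SDK. The connection string, device ID, and payload fields below are assumptions for illustration.

    // Sketch: a smart refrigerator sending telemetry to Azure IoT Hub over MQTT
    // using the "azure-iot-device" and "azure-iot-device-mqtt" npm packages.
    // The connection string, device id, and payload fields are assumptions.
    import { Client, Message } from "azure-iot-device";
    import { Mqtt } from "azure-iot-device-mqtt";

    // A per-device connection string issued by IoT Hub, kept in configuration.
    const client = Client.fromConnectionString(process.env.DEVICE_CONNECTION_STRING ?? "", Mqtt);

    function sendTelemetry() {
      const payload = {
        deviceId: "fridge-0042", // hypothetical device id
        temperatureC: 3.8,
        powerConsumptionW: 120,
        sentAt: new Date().toISOString(),
      };

      const message = new Message(JSON.stringify(payload));
      client.sendEvent(message, (err) => {
        if (err) {
          console.error("Failed to send telemetry:", err.message);
        } else {
          console.log("Telemetry sent");
        }
      });
    }

    // Send a reading every 60 seconds; downstream, Stream Analytics or another
    // service can pick the events up from IoT Hub.
    setInterval(sendTelemetry, 60_000);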
Because of the loosely coupled nature of cloud-native apps, you can choose a different database solution for telemetry data that's more suitable for streaming data, such as Cosmos DB, rather than a traditional, relational database that might be more suitable for an inventory service.
As the services are separated, your team can also develop and deploy an IoT service with no effect on your existing inventory service, as shown in the following image.
When to use cloud-native apps
Cloud-native apps are architecturally different from more traditional software engineering approaches. Because cloud-native is such a broad category, you can easily create an architecture to serve most needs, such as speed to market, integration of new technologies like machine learning, and rapid adaptation to customer feedback.
Modernizing existing apps
Cloud-native apps aren’t only for new projects. While a retrofitted application might never be 'truly native to the cloud,' many existing applications would benefit from cloud-native thinking, which allows individual features to be released without redeploying the entire system while also increasing reliability.
It’s often cost-effective to cloud-optimize an existing application, as you can take better advantage of fine-grained scalability and improved system resiliency. Many cloud services provide administrative functionality that cloud architects and developers can handle themselves, lowering the need for specialist management.
For example, Azure Database for PostgreSQL has built-in administrative functionality that acts like a database administrator (DBA), managing the underlying operating system and database for you.
Gradual adoption of a cloud-native approach
Partially modernizing an app doesn't necessarily mean a full migration and rearchitecture. Existing applications can evolve toward a cloud-native approach by moving existing architecture to a more modular, service-based architecture and using API-based communication. You can extend and evolve existing applications by adding new services based on cloud-native paradigms.
It usually makes sense to adopt these technologies and approaches incrementally, depending on priorities and user needs.
When not to use cloud-native apps
A cloud-native app might not be a good fit if you already have an existing application that doesn’t provide enough value to invest in modernization. Also, if you have an application with predictable resource demands, an existing data center and existing management infrastructure might do fine.
However, even in these cases, you might still want to consider a hybrid approach—where your on-premises applications can work with your other, cloud-based applications and services.
Using cloud-native apps in industry
Cloud-native processes use automation, such as CI/CD pipelines, allowing developers to focus on developing code instead of the deployment overheads that many traditional systems involve.
Many companies with cloud-native architectures have thousands of independent services deploying hundreds, if not thousands, of times per day. They can instantaneously update small areas of a live, complex application, and individually scale those areas as needed.
Summary
In this module, you learned about cloud-native apps. You also learned about services, containers, Kubernetes, and other related cloud concepts.
Cloud-native apps present a new, modern approach for building systems. If you want to learn more about containers, you can see Introduction to Docker containers. If you want to learn more about how to orchestrate containers, you can see Introduction to Kubernetes and Introduction to Azure Kubernetes Service.
Now that you've reviewed this module, you should be able to:
Describe what cloud-native apps are and how they differ from traditional, monolithic applications.
Explain how containers and Kubernetes services support cloud-native apps.
Evaluate when a cloud-native approach is, and isn't, a good fit for your application.