The Future of Computing is Serverless

As IT leaders map out strategies for the future of their function, cloud remains the leading option for most new applications and systems. In fact, my research finds that over half of organizations will put 50% of their net new workloads in a cloud environment this year, a share that will rise rapidly over the next half decade.

At the same time, cloud costs have continued to rise: 37% of enterprises say their annual cloud spend exceeds $12 million, and 80% report that it exceeds $1.2 million. This growing cost burden has led organizations to seek more streamlined, efficient, and agile onramps for their cloud workloads.

Thus, as part of the discussion of where IT is headed, serverless has steadily risen as one of the primary paths to a maximally agile, cost-effective, and operationally versatile approach to compute.

But misconceptions about serverless abound, so it’s worth looking at how the concept works and what its potential benefits are. It’s important first to understand what makes serverless unique. As most cloud practitioners are aware, the concept is defined as running code or applications on a fully self-contained platform with entirely inbuilt dependencies, so that customers do not need to worry about machine resources or even the location or type of the server.

Serverless frees developers and IT to focus on what needs to be done from a run-time perspective, rather than how. As such, it defers the effort and expense of provisioning functional cloud environments until later in the deployment process. Advanced serverless platforms can even eliminate run-time provisioning entirely across the cloud lifecycle through automation.

Ultimately, the key to the serverless approach is the idea of a software development pattern that has no direct interaction with a server. Somewhat counterintuitively, however, serverless does not mean that applications actually run without a server. On the contrary, a third-party cloud server hosts the application. But the developer, and even the operations team, does not have to understand the details of the server or even where it is. There is no need to manage server hardware and software for hosting the application; the hosting provider is usually entirely responsible for the infrastructure and operational tasks behind the “serverless” resource.
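To make the pattern concrete, here is a minimal sketch of what a serverless function can look like. The handler signature and event shape are illustrative assumptions rather than any particular provider’s API; the point is that the code declares only what to do with an incoming request, while the platform decides where, and on which server, it runs.

```python
import json

def handle_request(event, context):
    """Hypothetical serverless handler.

    The platform invokes this function on demand; the developer never
    provisions or addresses a server. The shapes of `event` (request
    payload) and `context` (run-time metadata) vary by provider and are
    assumed here purely for illustration.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

if __name__ == "__main__":
    # Local smoke test; in production the provider supplies event and context.
    print(handle_request({"name": "serverless"}, context=None))
```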

Serverless is Cloud Made Simple and Scalable

Today, IT customers face enormous challenges as the universe of cloud services becomes all-encompassing. This includes not just compute and storage, but security and delivery from core to edge. Solving those challenges in the most direct and effective way requires tremendous integration and scale. Today’s increasingly sophisticated serverless frameworks can offer that shortest direct path, especially as cloud expands across the edge.

What’s more, serverless offers the potential to avoid long-term strategic challenges like cloud vendor lock-in, by making it less relevant which specific cloud provider one is using.

Finally, serverless is also a natural fit for today’s tightly integrated DevOps and DevSecOps environments, which require rapid iteration between developer and operations teams, and where dynamic run-times and resource footprints shift quickly as code evolves. What the code needs to run is handled transparently and automatically by the provider’s serverless resources, instead of manually by the developer or operations teams.
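As a hedged illustration of that hand-off, the short test below sketches how a team might iterate on the earlier handler purely at the function level: the business logic is exercised locally on every change, while packaging, scaling, and run-time provisioning are left to the provider at deploy time. The module name handler and the pytest-style layout are assumptions, not part of any specific platform.

```python
import json

from handler import handle_request  # the hypothetical handler sketched above

def test_handler_returns_greeting():
    # Developers iterate on function logic locally; no server, container,
    # or run-time environment has to be provisioned to run this check.
    response = handle_request({"name": "devops"}, context=None)

    assert response["statusCode"] == 200
    assert json.loads(response["body"])["message"] == "Hello, devops!"

# Once the test passes, a CI/CD pipeline hands the function to the
# serverless platform, which decides memory, scaling, and placement.
```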

However, despite the significant promise serverless holds across the cloud lifecycle, I often see it treated as a tactical tool for making development environments easier to manage, as opposed to a systematic way of making all cloud operations better and more efficient from inception to end-of-life. That is the real end game of serverless: not just a way of developing in the cloud, but ultimately the leading way of consuming and operating in the cloud.

Serverless Case Example: Akamai with Linode

It’s worth illustrating these concepts with a popular serverless offering from Linode, now available through Akamai, which acquired Linode earlier this year. For its part, Akamai is well known for a series of market-defining innovations, beginning with its content distribution network (CDN) in the late 1990s, followed by fundamental advances in streaming video and app acceleration. The company has now invested in a best-in-class serverless cloud compute business to add to its portfolio, a move it hopes will help it take the next major step toward becoming a leading cloud provider.

However, in an already crowded and hypercompetitive field, Akamai is going to have to swing for the fences. In my analysis, it is seeking to do so by aiming at the sector of the market with the most long-term growth potential. This is where serverless becomes a key lever in its strategy, as the approach is going to take an increasingly large share of cloud services over the next decade, for the reasons outlined above. The key driver: the desire of developers and operations teams to tap the potential of serverless to substantially reduce complexity in their domains while speeding time to value.

From my perspective, serverless is a key way to manage the growing complexity of IT, by exporting it to the other side of the cloud interface. Service providers thus deal with the details of configuring, provisioning, and operating serverless resources on behalf of their clients, meeting whatever SLA is required, while businesses can focus more on achieving their goals.

This is where virtual private server provider Linode comes in, turning the tactical advantages of serverless into a strategic cloud asset. Linode has had a steady growth story and currently serves over 1 million customers with a reported 99.99% uptime SLA across 11 global regions. Akamai is making Linode part of its new Build on Akamai initiative, a growing offering designed to appeal to cloud innovators. With Linode’s integration, Akamai’s vision for the initiative is to deliver on two critical capabilities.

The first notable capability is that Akamai+Linode offers a single IaaS and edge platform: a full continuum of compute, built on the simple, accessible platform Linode has developed, distributed across Akamai’s global network, and integrated with its serverless edge computing technology. The second is a fabric of intelligent app optimization that can seamlessly connect Akamai’s compute offerings with users, devices, the cloud, and the edge. This includes a serverless fabric that works with the latest cloud native concepts like containers, microservices, Infrastructure as Code (IaC), and Kubernetes.

With Linode, Akamai now has a rich cloud-to-edge platform that will make it easier for developers and businesses to build, run, and secure next-gen serverless applications on a unified platform with wide reach, low latency, robust security, and greater resiliency. The combination of Akamai and Linode creates a new type of cloud provider that addresses an emerging need not well met in the market, serving enterprises large and small with high speed, agility, and safety. It creates an end-to-end lifecycle of compute with global reach that can help power and protect the next big shifts in life online, from modernizing traditional IT and helping grow Internet startups to the cutting edge of digital, including Web3, metaverse experiences, and the fast-emerging token-based economies backed by blockchain.

Cloud and edge will be the venue for the next twenty years of digital innovation as the high-growth industries of media, entertainment, technology, ecommerce, financial services, and online games rapidly evolve and transform. Akamai is making the calculated bet that serverless in particular will be a significant and growing part of that world of innovation. Given its proven legacy of successful Internet infrastructure, I believe Akamai will be a player to watch, and it offers a useful example of where the serverless industry is headed.

———

Note: I cited Akamai as a notable example of serverless architecture as part of a research effort I've engaged in. You can explore more information about their serverless capabilities below:

Cloud Computing with Linode

Build (a Serverless) Future on Akamai

Akamai Cloud Computing

