What Does the AWS Lambda Architecture Mean for IoT, Machine Learning, and ... Microsoft and Docker?
Brian Fink
AWS Lambda opened up new architectural patterns for enterprises to explore, enabling more efficiency with leaner code bases. The serverless service promises to reduce administration overhead, streamline application configuration and provide cost savings for a variety of use cases.
In some ways, Lambda is part of the overall evolution of cloud architectures. Developers can focus on code while AWS handles the triggers, provisioning servers and application scaling.
Lambda also touts several business benefits: serverless computing requires less maintenance, reduces costs, provides massive parallelism and minimizes administrative overhead. Developers can use AWS Lambda functions to schedule and scale batch jobs. Commonly, a batch job reads a large amount of data and then processes each row serially. An AWS Lambda architecture makes it possible to spin up a separate function invocation for each item and complete the job quickly. An enterprise can even build apps on an AWS Lambda architecture that handle a billion events per day.
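To make that fan-out pattern concrete, here is a minimal sketch of a coordinator function that invokes a separate worker Lambda asynchronously for each item in a batch. The worker name `process-item` and the shape of the incoming records are assumptions made for illustration, not part of any real deployment.

```python
import json
import boto3

# Lambda client from the AWS SDK for Python (boto3)
lambda_client = boto3.client("lambda")

def coordinator_handler(event, context):
    """Fan out one asynchronous worker invocation per item in the batch.

    Assumes a worker function named 'process-item' exists; the name and
    the event shape are illustrative only.
    """
    records = event.get("records", [])
    for record in records:
        lambda_client.invoke(
            FunctionName="process-item",   # hypothetical worker function
            InvocationType="Event",        # async: don't wait for the result
            Payload=json.dumps(record).encode("utf-8"),
        )
    return {"dispatched": len(records)}
```

Because each invocation runs in its own execution environment, the batch finishes in roughly the time of the slowest single item rather than the sum of all items.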
Enterprises can reduce labor costs by eliminating many of the steps in traditional application deployment. Serverless functions don't require OS management; developers no longer have to think about Ansible, Puppet or Chef. While serverless requires less effort, that doesn't mean technical operations work vanishes entirely.
There's still ops! It's just not SysAdmin work. A serverless architecture scales effortlessly. In traditional systems built on Elastic Compute Cloud, enterprises can use Auto Scaling, but that requires setup work.
A serverless architecture promises shorter lead times for simple applications. In a traditional architecture, developers create the project, code the HTTP handler and application logic, provision a server, set up the server OS, set up security and networking, and then deploy the app.
With an AWS Lambda architecture, developers only need to code the application logic, configure API Gateway and deploy the code; all of this can happen in a few minutes.
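As a rough sketch of how little code that involves, the handler below returns an API Gateway proxy-integration response. The greeting logic and the `name` query parameter are placeholder assumptions, and the function still has to be wired to an API Gateway route when it is deployed.

```python
import json

def handler(event, context):
    """Minimal Lambda handler behind an API Gateway proxy integration.

    The 'name' query parameter and the greeting are illustrative only.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```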
With all the talk about Amazon automating the grocery store (and the accompanying anxiety about the threat of automation), how will serverless architecture change how we scale our architecture and database teams?
A Different Fight
By adding C# support to Lambda, Amazon wants to attract Microsoft shops with massive investments in .NET. It exploited the opportunity presented by the open source version of .NET, .NET Core. Non-Windows developers using Linux and macOS will also be able to develop serverless applications on Lambda. AWS has created tools that integrate with open source .NET so developers can build and deploy Lambda functions from the command line. (I think this is one of the smartest moves by Amazon to beat Microsoft on its home turf.)
A Significant Bet on Lambda Means "Fuck you, Docker"
Amazon has placed a significant bet on Lambda. For any cloud provider, compute and storage are the essential drivers of revenue. Virtual machines are responsible for delivering the compute service in the cloud. When enterprises move complex workloads to the cloud, they start spinning up beefy virtual machines, which directly contribute to the bottom line. That's one of the reasons Amazon, Microsoft, and Google keep introducing more powerful VM types.
With the rise of Docker, the focus has shifted to container images as the fundamental unit of deployment. Though containers run inside a VM, they are becoming the preferred mechanism for exposing compute to the outside world. Containers as a Service (CaaS) delivers an aggregate pool of resources by abstracting the underlying VMs. Today, many cloud providers offer both VMs and container environments as their compute services.
What about IoT?
AWS Greengrass, Amazon's embedded Lambda runtime for connected devices, is another strategic move. The most complicated part of an IoT solution is device management in offline scenarios. Developers find it challenging to write a consistent layer that handles device management and communication seamlessly whether the devices are online or offline. With Greengrass, Amazon brings a subset of Lambda to the hubs and gateways that manage devices locally. The IoT device SDK that runs on the sensor nodes can talk to the Lambda endpoints in the hub. When the hub regains connectivity to the cloud, it seamlessly synchronizes state. Developers write a single codebase for both local and cloud connectivity, which solves the complex problem of offline device and state management.
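As a rough illustration, a local Greengrass function might look something like the sketch below: it reads a sensor value from an incoming message and republishes derived state over a local IoT topic. The topic name and payload fields are assumptions made for the example, and the client usage follows the Python Greengrass Core SDK (`greengrasssdk`) as a sketch rather than a verified deployment.

```python
import json

import greengrasssdk  # Greengrass Core SDK, available inside the Greengrass runtime

# Local 'iot-data' client: publishes stay on the hub even when the cloud is unreachable
iot_client = greengrasssdk.client("iot-data")

def function_handler(event, context):
    """Handle a sensor message locally and publish derived state.

    The topic 'home/thermostat/state' and the 'temperature' field are
    hypothetical names used only for illustration.
    """
    reading = event.get("temperature")
    if reading is None:
        return
    state = {"temperature": reading, "cooling": reading > 24.0}
    iot_client.publish(
        topic="home/thermostat/state",
        payload=json.dumps(state),
    )
```

The same handler code can run unchanged in the cloud, which is exactly the single-codebase promise described above.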
Technically speaking, AWS Greengrass can run on full-blown x86 servers that double up as hubs and gateways. OEMs can bundle Greengrass with their gateways, which will officially mark Amazon’s entry into the IoT appliance world. This technology has the potential to become an alternative to the Fog Computing model advocated by the OpenFog Consortium, led by ARM, Cisco, Dell, Intel, and Microsoft.
What about Machine Learning?
And of course, this goes back to my fascination with Machine Learning.
Accordingly, Amazon needs to offer more powerful machine learning capabilities to its cloud customers. Machine learning is a subset of artificial intelligence that trains computers to recognize patterns in large amounts of data. It enables breakthroughs in image and facial recognition: by understanding images, a computer can "recognize" a person or a cat, and even detect emotions from facial expressions. The same technology underlies computers' ability to translate human speech from one language to another.
As a member of Relus' recruiting team, Brian Fink focuses on driving talent towards opportunity. Eager to stretch the professional capabilities of everyone he works with, he helps startups grow and successfully scale their IT, Recruiting, Big Data, Product, and Executive Leadership teams. An active keynote speaker and commentator, Fink thrives on discovery and building a better recruiting mousetrap. Follow him on Twitter.