Cloud Computing: From Sci-Fi to Reality
(Image: DALL·E 3)

What does a 77-year-old sci-fi short story have to do with today's Cloud Computing and Intel's latest Xeon CPU generation?

Cloud Computing Origins

Like most of computer science, Cloud Computing traces its conceptual origins back a long way. Early related concepts are John McCarthy's 1961 idea of computing as a public utility and J.C.R. Licklider's "Intergalactic Computer Network" of 1962. Yet Science Fiction preempted them both with Murray Leinster's allusion to a network of computers in "A Logic Named Joe", penned in 1946. Amazingly, the story envisions not just computing networks but something resembling multi-modal Artificial Intelligence (AI) and the ethical troubles it entails. Leinster imagined all this only a year after the close of WW2 and over a decade before integrated circuits.

Like AI, Cloud Computing evolved as an idea and theoretical concept, yet for a long time only simple versions became reality. Time-sharing on mainframes in the 1960s was the earliest incarnation. Another notable precursor was Grid Computing, which demonstrated the viability of distributed computing with the famous SETI@home project, analysing radio signals in the search for signs of extraterrestrial intelligence. It ran from 1999 to 2020, with over 5.2 million participants donating largely spare time on their PCs. In 2000, Salesforce commercialised public Cloud Computing with its early Software-as-a-Service CRM offering, breaking new ground by selling computing and software easily accessible via the Internet. Around the same time, virtualisation software like VMware made early private Cloud Computing possible for companies with their own data centres.

Modern Cloud Computing

Amazon Web Services revolutionised Cloud Computing in 2006. I did not use it until 2011, when I moved to the UK and started working in start-ups. It was a perfect match. Start-ups usually have limited capital and time. The ability to pay for resources like EC2 and S3 as operational expenditure (OpEx) and spin them up on demand, especially with the advent of Hadoop and Big Data, was a game changer.

It came with its challenges, of course. For example, the eventual consistency of S3 and the unordered nature of SQS meant data might not be visible for minutes, and messages could arrive with arbitrary delay and in any order - by design. EC2 with handcrafted Hadoop images and early EMR was painfully slow to spin up and debug. We have come a long way since then.
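
As an illustration of what that meant in practice - a minimal sketch of my own, not code from back then - a consumer of a standard SQS queue has to assume messages may arrive late, out of order or more than once. The queue URL and handler below are hypothetical, and a real system would persist the idempotency check rather than keep it in memory:

```python
import boto3

# Minimal sketch: consume a standard SQS queue defensively, because messages
# can arrive delayed, out of order and more than once - by design.
sqs = boto3.client("sqs", region_name="eu-west-1")
queue_url = "https://sqs.eu-west-1.amazonaws.com/123456789012/example-queue"  # hypothetical queue

seen_ids = set()  # toy idempotency check; use a persistent store in practice


def process(body: str) -> None:
    """Placeholder handler; it must tolerate duplicates and arbitrary order."""
    print(body)


while True:
    # Long polling (WaitTimeSeconds) reduces empty responses and request cost
    resp = sqs.receive_message(
        QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for msg in resp.get("Messages", []):
        if msg["MessageId"] not in seen_ids:
            seen_ids.add(msg["MessageId"])
            process(msg["Body"])
        # Delete only after handling, otherwise the message reappears later
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```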

In the early days of Big Data and Hadoop, I utilised the general-purpose M-instance types on AWS, including the first M1 type, powered by Intel Xeon processors and Xen virtualisation. I also used large clusters of T1 burstable types for data mining, buying them as spot instances to make them even more affordable. Notably, the T1 instances ran on Intel Xeon processors too. They were remarkably cost-effective if you could design your architecture with a self-healing recovery strategy and only short spikes of compute.

Accelerating Cloud Computing

But it is not all about cheap and cheerful. On the other end of the spectrum, I ran beefy Databricks clusters in AWS, which required careful profiling of costs. Factors like processing architectures, CPU types, supported built-in accelerated instruction sets, various SSD storage options, inter- and intra-cluster node communication and, importantly, what you process make a huge difference. The instance types recommended at the time were often the fastest option but could miss the most cost-effective one by a wide margin.

The wide choice of instance types - general purpose, memory-optimised, storage-optimised and so on - allows you, with enough effort, to fine-tune your setup. With the advent of ML and AI, needs are evolving beyond Big Data, shifting the emphasis from data movement and general-purpose processing to more specialised acceleration.

This shift to AI is recent, but Murray Leinster envisioned it with his nexus of voice-controlled 'Logics', granting access to the entirety of human knowledge and adeptly answering inquiries tailored to our exact needs. That sounds eerily familiar now that multi-modal AI and LLMs have emerged.

The challenge for computing in general, and Cloud Computing in particular, is adapting to these changing needs. For AI training and inference especially, the compute requirements are immense, consuming enormous amounts of money and energy, with demand becoming insatiable as AI arrives in mainstream products. With it grows the need for efficient, easy-to-use solutions like hardware acceleration.

Hardware acceleration of demanding workloads has a long history in computer science. You may be old enough to remember that the floating-point unit (FPU) was an optional co-processor before it was integrated into CPUs. That integration resulted from the ubiquitous need to process floating-point arithmetic transparently and fast. Hardware acceleration inside the CPU was the obvious answer and became the standard.

AI and Machine Learning's thirst for matrix computation is the ubiquitous floating-point challenge of today. These workloads are demanding and benefit greatly from additional hardware acceleration. As with the FPU, the logical step is to make more acceleration available inside the CPU, promising reduced costs, energy consumption and environmental impact. The Intel Accelerator Engines in the 4th Gen Intel Xeon processor are a great example. Data Streaming Accelerator, In-Memory Analytics Accelerator, QuickAssist Technology and Advanced Matrix Extensions help with tasks like encryption, networking, storage, in-memory analytics and databases. Importantly, they are available in the cloud with AWS today.
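
As a quick, Linux-specific sanity check - my own sketch, not something from the article or Intel's documentation - you can confirm an instance actually exposes AMX by looking for the amx_tile, amx_bf16 and amx_int8 CPU flags:

```python
# Print any AMX-related CPU flags reported by the kernel, e.g. on a 4th Gen
# Xeon (Sapphire Rapids) instance you would expect amx_tile, amx_bf16, amx_int8.
with open("/proc/cpuinfo") as f:
    flags = {token for line in f if line.startswith("flags") for token in line.split()}

print(sorted(flag for flag in flags if flag.startswith("amx")))
```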

Intel's 4th Gen Xeon Advanced Matrix Extensions (AMX) are particularly relevant here. They allow fast, efficient AI processing without costly external accelerators. Leveraging these advancements can be as simple as running PyTorch on an AWS M7i instance. That is a leap forward for efficient and powerful AI and ML Cloud Computing solutions.
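
A minimal sketch of what that can look like, assuming PyTorch 2.x with its default oneDNN backend on an AMX-capable instance such as M7i (the model here is a toy stand-in): running bfloat16 inference under CPU autocast lets oneDNN dispatch the matrix multiplications to AMX automatically where the hardware supports it:

```python
import torch

# Toy model standing in for a real workload; Linear layers map to matrix
# multiplications, which is exactly what AMX accelerates.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 1024),
    torch.nn.ReLU(),
    torch.nn.Linear(1024, 10),
).eval()

x = torch.randn(64, 1024)

# CPU autocast runs the matmul-heavy ops in bfloat16; on AMX-capable CPUs
# oneDNN picks AMX kernels for them without any further code changes.
with torch.inference_mode(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)

print(y.shape, y.dtype)  # expected: torch.Size([64, 10]) torch.bfloat16
```

The same code runs unchanged on older CPUs and simply falls back to non-AMX kernels, which is what makes the in-CPU approach so convenient.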

Conclusion

The prescient genius of Murray Leinster's story is that he imagined artificial intelligence, networked computing and their general availability before integrated circuits, mainframes or personal computers existed. We call Leinster's Logics AI and name them DALL·E, Llama and ChatGPT, not Joe. Still, they and the technology behind them are becoming a utility, just like Cloud Computing. Numerous AI and ML platforms and services are springing up to bring similar capabilities to everyday products and features across all industries. Like Cloud Computing before it, AI is morphing from Science Fiction to reality.


Sponsored by Intel Business #ad #IntelXeon

Want to know more? Go to https://intel.ly/3ttWTI2
