From edge to cloud: The critical role of hardware in AI applications

This blog was authored by Vijay Nagarajan, Vice President, Wireless Connectivity Division.

I tasked #Midjourney, the AI tool that generates art from text, with creating a futuristic egg basket that showcased the concept of being digitally connected. What I saw blew me away. The artwork I received was not only visually stunning but also showed how AI is capable of bringing new ideas to life. I was experiencing first-hand, as a creator, the transformative nature of generative AI with Midjourney, ChatGPT and other tools.

Image by Midjourney, Inc


#AI is quickly becoming ubiquitous. The world has woken up to the power of generative AI, and a whole ecosystem of applications and tools is coming to life. AI is becoming increasingly important across industries, including healthcare, finance, and transportation.

All this has a tremendous impact on the digital value chain and the semiconductor hardware market that cannot be overlooked. The apps and tools have to gather and process data, then deliver results back to the consumer with minimal latency. This also has to happen at scale, given the rapidly emerging ecosystem.

Hardware innovations become imperative to sustain this revolution. As I think about AI growth and its impact on hardware systems and the silicon roadmap, my own journey as a content creator using Midjourney for art or #ChatGPT for editorial help (like for this blog) certainly helps me see the big picture.

So what does it take on the hardware side?

Increased demand for high-performance computing in cloud data centers

AI workloads require specialized processors that can handle complex algorithms and large amounts of data. This has driven the development of processors optimized for AI workloads, such as graphics processing units (GPUs), field-programmable gate arrays (FPGAs) and custom AI silicon.

Memory and storage

The vast amount of data generated by AI workloads requires high-capacity storage solutions that can handle both structured and unstructured data. Solid-state drives (SSDs) and non-volatile memory express (NVMe) enable faster data access and processing.

Wired connectivity as the binding thread

Computing speeds, which loosely track Moore's Law, do not scale with the burgeoning data needs of AI. This means the interconnects between computing units processing in parallel become critical. Wired connectivity is also the linchpin between the computing devices, the GPUs, storage and memory. The rise of AI means that the network is the computer, and connectivity is its lifeline.

Wireless at the edge

It goes without saying that our digital experiences today are wireless. To this end, it is important that our wireless broadband networks are capable of handling high-speed, low-latency data at scale. Cellular technologies and Wi-Fi work in a complementary manner to meet this demand. 5G deployments continue to improve cellular network capacities and provide top speeds. Wi-Fi innovations, together with the newly available 6 GHz band, have culminated in Wi-Fi 7, which directly addresses the issue of edge latency while ensuring top speeds.

Cybersecurity

As the use of AI grows, cybersecurity becomes ever more important to protect against cyber threats and attacks. This requires specialized hardware and software, such as intrusion detection systems and encryption technologies.

At Broadcom, we've always sought to connect everything. That's our mission: a world connected by Broadcom. We are very proud of the fact that 99% of the world's data passes through at least one Broadcom chip. For us, AI hardware needs are a natural extension of what we do every day. Our wired connectivity powers hyperscalers' data centers and operator networks. At the edge, our wireless broadband solutions power your homes and bring data to your hands. Our software solutions provide the layer of security required. We love that AI is disrupting the status quo, and we are proud to provide the critical hardware that enables it.

Before I close, let me also briefly preview some of the innovations we focus on.

Advanced manufacturing processes: These are vital for the production of AI hardware. The use of 7-nanometer and 5-nanometer manufacturing processes creates smaller and more powerful chips. Our chips carry higher data loads and deliver the low latencies needed for AI.

Custom designs: We build innovative data center storage and connectivity solutions that are optimized for specific AI workloads.

Power efficiency: Complex workloads require larger amounts of power, which can lead to increased energy costs, among other things. This is an area of focus for both our wired and wireless chips. For example, on our Wi-Fi chips used in phones, we steadfastly work on radio optimizations and architectural modifications each generation, with an eye toward disruptively lower power consumption.

These are just a few examples of our innovation focus. In a series of follow-up blogs, we hope to delve further into what Broadcom has to offer as AI takes off. We will keep you updated on how AI shapes the software and semiconductor hardware market, and how our innovations are geared to keep pace with the demands of AI applications.
