Artificial Intelligence & the Telecoms Industry

About Subsea Cloud: Wet Data Centers?

Subsea Cloud places data centers subsea and, in doing so, eliminates electrically driven cooling. This brings about a reduction in power consumed (about 40% of total power consumption) and CO2 emission reductions, too.

Strategic Inflection Points

Reductively, a strategic inflection point in business is a moment in the life of a business when its fundamentals are about to change. The change itself could mean growth and opportunity. On the other hand, it could signal a downward trend and the beginning of the end for a company or industry. In any case, these strategic inflection points are disruptive, full-scale changes in the way business is conducted.

Examples abound of industry-centric strategic inflection points that ultimately affected us all: automated teller machines changed banking; storing music digitally on a portable device changed the music industry; creating content to stream has changed, and continues to change, the entertainment industry. For this last example, we are watching the final casualties of this inflection point become terminal patients (the entire cable industry), declining in real time because they did not react properly to the change at the application or system level. As it turns out, in technology, what *can* be done *will* be done. A decisive response is critical.

As an industry, we've reached an inflection point. Like a lot of the change our industry undergoes, what we are experiencing is due to what, how and when other telecom-reliant industries change. Specifically in this case, it’s how companies are now reacting to AI.

There's AI Now & There's AI Later

There are, though, two points of consideration: the first is that AI isn't *really* here yet. The second is that when it does *actually* come around, we will need to have drastically overhauled our current infrastructure capabilities. In other words, widespread disruption due to the proliferation of AI is coming - and we'll have to support it far more competently than we are currently set up for.



AI Today

AI is broadly defined in two categories: artificial narrow intelligence (ANI) and artificial general intelligence (AGI). AGI does not currently exist; the challenge centers on adequately modeling the entire world and all knowledge in a consistent and useful manner. AGI machines would exhibit intelligence close to human intelligence, capable of executing diverse tasks, engaging in abstract thinking, and adapting to novel circumstances. It sounds like a Sisyphean task. I don't doubt we'll get there, but we aren't there yet.

Essentially, the AI of today is enhanced prediction – major progress in statistical techniques, not something that really thinks. It carries out specific tasks. Moreover, today's AI is mainly helpful for those already reliant on prediction in their businesses – roughly 11% of businesses. It has impacted banking and finance, pharmaceuticals, automotive, medical technology, manufacturing, and retail. And it will continue to do so.

However, it's not yet transforming the economy. In short, in comparison to what we aim to do with AI, it's having a fairly modest impact right now. To be sure, we are all seeing some incremental benefits but that's not where the biggest opportunities are for the technology. It's held back by difficulties in establishing cause and effect relationships, which limits the usefulness of AI to places where we can collect relevant data or run a simulation and where the setting is the same all the time (like in games, where the rules don't change and where you can run different simulations/permutations to get to some outcome). This is not close to what AI will be able to achieve. Again, today, it's simply progress in statistical techniques.

AI's future impact will be about the things we can do because we can make better decisions and the things we can do differently by making better decisions. As of today, it has simply lowered the cost of prediction for a given set of rules and statistics, and when the cost of something goes down, we all tend to use more of it.


As an aside, when the general population thinks of AI, they think of popular AI characters: Rachael in Blade Runner, C-3PO and R2-D2 in Star Wars, WALL-E, The Terminator, etc. They also tend to think of ChatGPT and the end of the human workforce.

Taking each in turn: it goes without saying these types of machines (characters) are not here yet (outside of science fiction).

ChatGPT essentially "matches in meaning" by producing a ranked list of words that might follow your prompt. It doesn't simply choose the *best* or most highly ranked word; it chooses based on "temperature" and the words' weights, among other things, so that the output reads more like natural human writing - as Stephen Wolfram explains in depth here.
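
To make that mechanism concrete, here is a minimal, hypothetical sketch of temperature-based sampling. The candidate words and scores are made up, and real models rank tens of thousands of tokens, but the idea - scale the scores by a temperature, convert them to probabilities, then sample rather than always taking the top word - is the one described above.

```python
import math
import random

def sample_next_token(candidate_scores, temperature=0.8):
    """Illustrative only: pick the next word from a model's ranked candidates
    using temperature-scaled softmax sampling. `candidate_scores` is a
    hypothetical dict of word -> raw score."""
    # Scale scores by temperature: low T -> near-greedy, high T -> more random.
    scaled = {tok: score / temperature for tok, score in candidate_scores.items()}
    # Softmax to turn scores into probabilities (subtract max for stability).
    max_s = max(scaled.values())
    exps = {tok: math.exp(s - max_s) for tok, s in scaled.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    # Sample one word according to those probabilities.
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# Toy candidate list for a prompt like "The subsea data center is ..."
candidates = {"efficient": 3.1, "cool": 2.7, "submerged": 2.5, "expensive": 1.2}
print(sample_next_token(candidates, temperature=0.8))
```

Lowering the temperature pushes the choice toward the top-ranked word every time; raising it makes the output more varied (and more error-prone).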

The workforce isn't at major risk, because not all jobs can be reduced to repetitive tasks. That said, if a machine can do a task more cheaply than a human, replacement will follow. But, again, there are few jobs today in which AI can perform all of the tasks that make up a single job. Without meaning to sound flippant, it's mainly discrete tasks that can be moved to machines today (not whole jobs or vocations), leaving part of the job to be performed by a human. In cases like this, where the job is essentially split, the employing company has to pay for both the machine and the human employee, which may act as a short-term deterrent to using the machine at all (at least until more of the job can be broken down into machine-doable tasks). A point of optimism here: the more computers are trained to perform these repetitive tasks (often assigned to entry-level employees), the more roles focused on complex tasks with competitive salaries may emerge.


AI Later

AI has the potential to make decisions in areas where rule-based systems were traditionally employed. AI systems can provide much greater value, not simply by doing what we already do (although more efficiently), but by performing new tasks and solving problems that are not addressed in current applications and systems. As was pointed out in the book 'Power and Prediction', most companies operate on standard operating procedures (SOPs), which reduce cognitive load for those working within a company (or should) and make the company more reliable (or should). At the individual level, most people operate in part on their own rules, morals and standards and adhere to society-based rules (...or should). AI could potentially see some rules disregarded in favor of a decision.

Imagine an AI-driven car approaching a stop sign at a four-way junction, and simultaneously imagine yourself approaching the same junction in a run-of-the-mill car on the road today. The rule today says you stop, even if you can see there are absolutely no people, cars or hazards nearby. Most of us stop because of the rule, not because we couldn't make an informed decision. An AI-driven car may not have that rule imposed on it. AI won't (and shouldn't) just do things the same but better than us. It should do things differently so that *things* are better. That creates value rather than just lowering costs.
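
A toy sketch of that difference, with entirely hypothetical function names and thresholds: a fixed rule ignores the situation, while a prediction-driven decision uses an estimated hazard probability supplied by the vehicle's perception stack.

```python
# Hypothetical, illustrative contrast between a blanket rule and a
# prediction-driven decision at the stop sign described above.

def rule_based_action() -> str:
    # Today's rule: always stop, regardless of conditions.
    return "stop"

def prediction_based_action(p_hazard: float, risk_threshold: float = 0.01) -> str:
    """Decide using a predicted hazard probability instead of a blanket rule.
    `p_hazard` would come from the car's perception/prediction models
    (made up here for illustration)."""
    return "stop" if p_hazard > risk_threshold else "proceed slowly"

print(rule_based_action())                      # always "stop"
print(prediction_based_action(p_hazard=0.001))  # "proceed slowly" when the junction is clearly empty
```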

In another, potentially more consequential example, also lifted in part from 'Power and Prediction', AI could prevent very costly rules. In a future pandemic, a rule with as many knock-on consequences as social distancing might never come about. In the last pandemic, social distancing was often a byproduct of a lack of information – we didn't know how the virus was spreading to begin with and, later, we didn't know who was infected. This often meant everyone was treated as though they were infected. The need to keep social distance meant some large, relied-upon institutions were shut down either in part or completely (education, healthcare, economic, etc.) and other places were altered to be partially operational (one person in the coffee shop at a time). Social distancing had a huge economic effect. In the future, AI might help us make different decisions that create rules for those infected (instead of everyone being treated as if they might be infected) and help us all make smarter, better and different decisions about how to proceed in the most efficient way. It might also more accurately predict how the virus will spread and afford us better decisions around human logistics. AI may create value in a similar global crisis when it can analyze intricate data sets, identify patterns, and generate solutions that may not be apparent through conventional rule-based approaches. It may also thrive in uncertain and dynamic environments – adapting to changing circumstances and making decisions based on real-time information, allowing for more flexibility than rigid rules.

In these and many other scenarios, AI may leverage its ability to learn from data, adapt to new situations, and make complex decisions that go beyond the confines of rule-based systems.

Our Current Infrastructure vs. AI’s Needs

From an infrastructure standpoint, if we can agree the industry is struggling with today's AI, it's really going to suffer when the AI of the future comes around. That AI will be more than an advance in statistical techniques; new systems will be built for it, not just new applications or point solutions. The world could start to look different: instead of retroactively placing AI into our world, we'll not only build new, specific applications for it, but re-imagine whole systems. Our air-cooled, timeworn, sub-optimally placed data centers and low-risk-change approach could not support anything close to AI's future needs.

Back to Our Inflection Point

To me, we as an industry are currently in a sort of balancing act: driving innovation while trying to tackle the limits of our infrastructure. There are, of course, companies out there working on this problem. Most are pushing incremental improvements; some are pushing actual innovation. My position is that we need to rely mainly on innovation to keep the future coming – the inflection point we looked at earlier boils down to the choice between the two. Moreover, the changing landscape of platforms, equipment design, topologies, power density requirements and cooling demands all underscores the pressing need for new architectural designs.

Ultimately, telecoms infrastructure will play a crucial role in the adoption and implementation of artificial intelligence across industries, as well as in incorporating AI more and more into our own industry. Among other things, the use of AI within telecoms will likely start with materially improving the matching of supply and demand for supply chain effectiveness. It should also prove effective at predicting the need for maintenance – not just at the infrastructure level, but for the machines and vessels that help us maintain the infrastructure in the first place. We might even be able to circumvent rules that exist only because decisions are currently too risky due to a lack of certainty or other information – from boilerplate regulations to ineffective and impeding policies (like building for renewables).

Finally, here are some ways in which the telecoms infrastructure will need to adapt for AI at the system, application and point level:

Increased Bandwidth: The deployment of AI systems will require a significant amount of data transfer, which will put a strain on the existing telecoms infrastructure. To handle this increased load, we’ll need to provide more bandwidth to support the data-intensive nature of AI systems and applications.

Low Latency: AI systems require low-latency networks to enable real-time data processing, analysis, and decision-making. This means the telecoms infrastructure will need to deliver high-speed data transfers with minimal delay, which will require upgrades to the existing network infrastructure and will likely include edge and ocean-based data centers (see the propagation-delay sketch after this list).

Edge Computing: The rise of edge computing is becoming an important requirement for AI systems. Edge computing allows for real-time data processing and analysis to occur closer to the source of the data, rather than sending the data back and forth to a central server. This reduces latency and ensures faster response times. We’ll need to provide infrastructure for edge computing to support the growth of AI.

Security: AI systems require secure and reliable data transmission to protect sensitive information. The telecoms infrastructure will need to ensure security protocols are in place to prevent data breaches and unauthorized access. Ironically, we’ll probably use AI-generated and AI-facilitated cybersecurity to do it.

Interoperability: AI systems need to work together seamlessly to deliver the best results. Therefore, the telecoms infrastructure must provide interoperability between different AI systems, applications, and devices, enabling them to share data and work together effectively. (I don’t have much notable to say about this myself, as it’s not currently in my wheelhouse, but I understand its importance.)

Submarine Cables: Our reliance on submarine cables will also continue to increase. They are an equally critical resource in facilitating AI. Interestingly, and as previously noted, both of these complementary infrastructure types (cables and DCs) rely on AI as well as help to facilitate it. With its strengths in machine learning and AI, TEOCO developed PRIDE, a cloud-based, real-time maritime analytics solution designed for monitoring subsea cable threats. Fishing vessels and other maritime activities are tracked, analyzed, and flagged when potential issues are identified. The solution also analyzes data on seismic activity and energy dispersions to predict the likelihood of cable damage.
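
To put the latency point above in perspective, here is a rough propagation-delay sketch. The distances are illustrative assumptions and the figures ignore switching, queuing, and processing delays (which add more), but they show why placing compute closer to users - edge and ocean-based data centers - matters so much for real-time AI workloads carried over long-haul and submarine routes.

```python
# Back-of-the-envelope propagation delay over optical fiber.
# Light travels at roughly 200,000 km/s in fiber (~2/3 of c in vacuum).

SPEED_OF_LIGHT_IN_FIBER_KM_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time in milliseconds."""
    return 2 * distance_km / SPEED_OF_LIGHT_IN_FIBER_KM_S * 1000

# Assumed, illustrative distances.
for label, km in [("edge data center (~50 km)", 50),
                  ("regional data center (~1,500 km)", 1_500),
                  ("transoceanic route (~6,000 km)", 6_000)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
```

Even before any processing, the transoceanic path costs around 60 ms per round trip, while an edge site sits well under a millisecond away.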



Power Density

The average power density of data centers is expected to continue to increase. With rapid growth in data traffic and AI applications driving up power usage and power density in data centers, this increased power demand is putting a strain on existing data center infrastructure. To address the growing energy consumption issue, new data center architectures are focusing their engineering efforts on power density and scalable design.

Our solution, in fact, originated from the commercial sector's need to implement modularity and sustainability in a cohesive manner. On average, the power density in a traditional data center ranges from 5 kW per rack to 7 kW per rack. However, this range has been steadily increasing as AI and ML workloads are deployed more frequently in data centers. The rack density in a Subsea Cloud unit is currently 125 kW (and notably, no one cared about that *at all* circa two years ago).
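
Using the figures above, a quick back-of-the-envelope comparison shows what that density jump means in practice. The 1 MW IT load is an assumed, illustrative number, not a real deployment.

```python
# How many racks a hypothetical 1 MW AI deployment would need at
# traditional vs high-density rack power levels (figures from the text above).

it_load_kw = 1_000  # assumed 1 MW of IT load

for label, kw_per_rack in [("traditional air-cooled rack", 6),
                           ("high-density subsea rack", 125)]:
    racks = -(-it_load_kw // kw_per_rack)  # ceiling division
    print(f"{label} ({kw_per_rack} kW/rack): ~{racks} racks for {it_load_kw} kW")
```

Roughly 167 conventional racks versus 8 high-density ones for the same IT load, with all the floor space, cabling and cooling plant that difference implies.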

Cooling Requirements of AI Data Centers

Artificial intelligence applications and workloads require IT equipment to run at high power densities, which generate a significant amount of heat, leading to an increase in server cooling requirements. Inefficient cooling can result in reduced equipment life, poor computing performance, and greater demand on cooling systems.

Two commonly used cooling methods to address these heightened cooling challenges are liquid cooling and immersion cooling. Hotspots typically start to appear at power densities above 30 kW per rack, which is where these cooling strategies become necessary. At power densities of 60 kW per rack to 80 kW per rack, direct-to-chip liquid cooling becomes more common.
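
A rough heat-balance sketch makes the point. Using Q = ṁ · c_p · ΔT with typical textbook fluid properties and an assumed 10 K coolant temperature rise, removing 30 kW of rack heat with air takes an enormous airflow compared with the modest flow a liquid needs.

```python
# Coolant flow needed to remove 30 kW of rack heat with a 10 K temperature rise,
# using Q = m_dot * c_p * delta_T. Property values are typical approximations.

HEAT_KW = 30.0      # rack heat load from the text above
DELTA_T_K = 10.0    # assumed coolant temperature rise

AIR_CP_KJ_PER_KG_K = 1.006
AIR_DENSITY_KG_M3 = 1.2
WATER_CP_KJ_PER_KG_K = 4.18
WATER_DENSITY_KG_M3 = 1000.0

air_mass_flow = HEAT_KW / (AIR_CP_KJ_PER_KG_K * DELTA_T_K)            # kg/s
air_volume_flow = air_mass_flow / AIR_DENSITY_KG_M3                   # m^3/s

water_mass_flow = HEAT_KW / (WATER_CP_KJ_PER_KG_K * DELTA_T_K)        # kg/s
water_volume_flow_l_s = water_mass_flow / WATER_DENSITY_KG_M3 * 1000  # litres/s

print(f"Air:   ~{air_volume_flow:.1f} m^3/s of airflow")      # ~2.5 m^3/s
print(f"Water: ~{water_volume_flow_l_s:.2f} L/s of liquid")   # ~0.7 L/s
```

Moving several cubic metres of air per second through a single rack is impractical, which is why liquid and immersion approaches take over at these densities.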

Immersion Cooling

Immersion cooling is a method where electronic components are submerged in a non-conductive liquid coolant, like 3M Novec or Fluorinert. The coolant absorbs the heat generated by the components and is circulated to a heat exchanger for cooling before recirculation. Immersion cooling not only cools the CPU but also other components on the printed circuit board (PCB) or motherboard.

Immersion cooling is gaining traction due to its ability to enable higher power density and lower power usage effectiveness (PUE) for data centers that operate high-performance computing (HPC) environments. Unlike direct-to-chip liquid cooling, which cools only the CPU and/or GPU, immersion cooling lowers the temperature of the entire board on which these components are mounted. See here as to why this doesn’t heat up the surrounding water. The cooled liquid is then recirculated.
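
For reference, PUE is simply total facility power divided by IT equipment power, with 1.0 as the ideal. A small sketch with assumed, illustrative numbers shows how removing most of the mechanical cooling overhead moves that ratio.

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# The figures below are assumed for illustration, not measured values.

def pue(it_power_kw: float, cooling_kw: float, other_overhead_kw: float) -> float:
    total = it_power_kw + cooling_kw + other_overhead_kw
    return total / it_power_kw

print(f"Air-cooled example:       PUE = {pue(1000, 400, 100):.2f}")  # ~1.50
print(f"Immersion/subsea example: PUE = {pue(1000, 30, 50):.2f}")    # ~1.08
```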

Overall, the telecoms industry will need to provide a robust and reliable infrastructure that can support the growing demand for AI systems, while also ensuring that data is secure and reliable. This will require significant investment and innovation in network infrastructure, edge computing, security protocols, and interoperability standards. In the not-too-distant future, there will be real, system-level change that will require a lot more from our infrastructure.

Close, Getting Closer?


AI is just now starting its journey toward cheaper, better, and faster predictions that drive strategic business decisions and effect real change in how we decide and what we decide. When prediction done this way actually hits its peak, all industries will transform and the status quo will be disrupted, if not obliterated. The economy will transform, as will many business operations. My position today is that it’s very early days for AI, and everything we are doing now is going to look ridiculously mundane compared to what we’re going to be doing in just a few years. The full potential of AI isn't here, but what we have now is embedded everywhere, touching just about everything we do – from getting a ride to finding out your ETA, navigating the roads, getting loans, or enjoying music and movies. And kudos to us: as an industry, we've already accommodated the progress we see today and we continue to do so, but we are reaching our limits when it comes to the traditional data center and related infrastructure.

As Ajay Agrawal said, we are in the "between times" for AI – between the demonstration of the technology’s capability and the realization of its promise. But we have reached an inflection point that demands each company within the telecoms industry react properly to this new phenomenon. Incremental improvements won’t do it. We have to emerge from our constraints, not in spite of them.

Oren C.

Principal, One One Three Growth Strategies | Founder with 2X exits | Partnership Broker | Corporate Developer | Revenue Growth Strategist | Startup Advisor

1 year ago

We definitely need to reconsider our use of AI in telecoms. There is so much opportunity that isn't being embraced. That being said, it's only natural that time and need will bring more innovation and organizational improvement.

Impressive presentation and discussion, Maxie!

Ryan Correll

Senior Estimator - Estimating Dept IT Manager (Landscape Development Inc.)

1 year ago

Wonderful article, Maxie. I've been using ChatGPT to flesh out project ideas and have it check code for mistakes or optimization. It has been more like an assistant with a notepad instead of me just writing ideas down.

Duane Dunston, Ed.D

Senior Adversarial Engineer at Cloud Range | Mitre Threat Intell Cert | Mitre Adversary Emulation Methodology Cert |TEDx Speaker | Mossé Institute Student

1 year ago

I miss your writing and thought provoking analysis. Great article!
