A Prediction: Highly Optimized Computing Platforms
Two conceptual paradigms, one old and one new, have been churning away in my head, and together they made me see what I think is the future of all computing. I don’t know what to call it, so I just made up a name: Highly Optimized Computing.
Co-Processors
The old paradigm is what I would traditionally term a “co-processor.” Early computers had a hard time doing complex math, especially math involving lots of exponents and decimal points. This is termed “floating-point math” (https://en.wikipedia.org/wiki/Floating-point_arithmetic).
Early PC CPU chips couldn’t handle floating-point math very well. They could do it, but not quickly or efficiently. Regular CPUs essentially had to brute-force many complex math calculations. Imagine calculating 1,000 digits of pi (3.14…) or doing quadratic formula calculations without an easy way to move decimal points around after doing the more basic underlying math.
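To get a feel for why this was slow, here’s a minimal, purely illustrative sketch in Python (nothing like a real FPU’s internals): without dedicated hardware, the CPU has to juggle a mantissa and an exponent through several integer operations just to multiply two decimal numbers, something a math co-processor does in a single instruction.

```python
# Illustrative only: represent a number as value = mantissa * 2**exponent and
# multiply two of them using nothing but integer operations, the way a CPU
# without a floating-point unit has to.

def soft_float_multiply(a_mant, a_exp, b_mant, b_exp, precision=24):
    """Multiply (a_mant * 2**a_exp) by (b_mant * 2**b_exp) in software."""
    mant = a_mant * b_mant        # integer multiply of the mantissas
    exp = a_exp + b_exp           # the exponents simply add
    # renormalize so the mantissa stays within the chosen precision
    while mant.bit_length() > precision:
        mant >>= 1
        exp += 1
    return mant, exp

# 3.25 is 13 * 2**-2 and 0.5 is 1 * 2**-1
mant, exp = soft_float_multiply(13, -2, 1, -1)
print(mant * 2.0 ** exp)   # 1.625 -- the same answer a hardware FPU gives in one instruction
```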
In the 1970s, chip makers created “math co-processors” that could be added to computers. The computer, detecting a floating-point math problem, could hand the calculation off to the more efficient math co-processor, which would do the needed work and then drop the answer back off to the main CPU, where processing would continue alongside everything else.
From the 1970s to the 1990s, the math co-processor was a separate chip. The computer’s motherboard had to have a separate math co-processor “slot,” and the computer’s builder or buyer could separately buy and add the math co-processor if desired. I remember buying math co-processors for hundreds of dollars, plugging them into my motherboards, and feeling as if I was creating some sort of mini-supercomputer. I might as well have personally owned a Cray cracking crypto puzzles, given my emotional commitment.
The Intel 80486 chip (the 486DX) introduced the first mass-produced PC CPU with a built-in math co-processor. No longer did a computer buyer have to buy and install a separate chip. And just like that, math co-processors became part of the main CPU chip and everyone had one.
This is likely what is going to happen with specialized AI and quantum chips. Right now, they are separate computers or chips…or will become separate chips when the technology eventually shrinks down to chip size. Yes, even quantum computers, most of which now take up large rooms full of specialized environmental systems, protection, and teams of people, will eventually just be a chip you plug into your motherboard. Many countries (e.g., China, New Zealand, etc.) and research groups are continually making strides toward putting quantum on a chip.
Many people question whether quantum can be put on a chip. The answer is yes. All computer chips (really, everything in existence) only work because of quantum mechanics. The difference is that quantum chips will process, compute, and store information based on individual quantum particles or their properties (versus larger collections of quantum particles and light). This is to say that today’s chips store information based on charges or collections of electrons or photons, whereas quantum chips will do the same using polarization or other quantum properties. But both types of chips work solely because of quantum mechanics. Neither would be possible without it. So, yes, quantum will be on a chip one day…and I don’t think that day is far out.
So, we’ll have AI-specialized chips, we’ll have quantum-specialized co-processor chips, and eventually those might be moved onto the main CPU chip, if history is any guide. Either way, the specialized functionalities will be a part of your computing device and not some separate computer (well, for small-scale jobs). The computer’s main chip will detect the type of problems that need to be solved and hand off the jobs that need AI to the AI-chip or component and hand off the jobs needing quantum solving to the quantum chips or components.
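In software terms, that hand-off might look something like this hypothetical dispatcher (the job kinds and backend names are my own illustration, not any real scheduler API):

```python
# Hypothetical dispatcher: inspect each job and route it to the best available
# processing unit, falling back to the general-purpose CPU when nothing matches.

from dataclasses import dataclass

@dataclass
class Job:
    kind: str       # "general", "ai", or "quantum" -- illustrative categories
    payload: str

def run_on_cpu(job: Job) -> str:      return f"CPU handled: {job.payload}"
def run_on_ai_chip(job: Job) -> str:  return f"AI chip handled: {job.payload}"
def run_on_quantum(job: Job) -> str:  return f"quantum chip handled: {job.payload}"

BACKENDS = {
    "general": run_on_cpu,
    "ai": run_on_ai_chip,
    "quantum": run_on_quantum,
}

def dispatch(job: Job) -> str:
    return BACKENDS.get(job.kind, run_on_cpu)(job)

print(dispatch(Job("ai", "classify this photo")))
print(dispatch(Job("quantum", "optimize this delivery route")))
```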
I don’t think I’m making a huge leap of faith in making this particular prediction. It’s how computers have progressed…from separate large components to integrated components that fit on chips. Even hard drives that used to fill entire rooms (or file-cabinet-sized enclosures) now fit on a chip. CPUs now contain a considerable amount of storage directly on the chip, including L1 and L2 caches and increasingly permanent on-chip storage arrays.
And this leads to the second, newer paradigm, which has already arrived. We’ve only just seen its first instance, but I’m predicting it’s going to become the way most computers work.
Upscale Offloading Paradigm
Apple recently introduced its new Private Cloud Compute platform. I wrote about it here (https://www.dhirubhai.net/pulse/huge-new-cybersecurity-paradigm-study-understand-roger-grimes-cmgpc). It’s a brand-new paradigm that is likely the future of all computing. Basically, Apple has put new AI functionality on its devices (e.g., iPhones). A mobile phone isn’t a super powerful computer compared to a desktop (memory- and processing-wise), and especially not compared to what is available up in the cloud. So anything AI-related that the local device can handle will be done completely on the device using its built-in capabilities (e.g., processing power, data, storage, etc.).
If the job can’t be done on-device in a timely manner, it will be moved to Apple’s own Private Cloud Compute service. If that’s not good enough, Apple will ask for permission to move it to a much larger-scale cloud product, such as ChatGPT. This on-device-to-vendor-cloud-to-large-scale-cloud progression is likely soon to become the model for all specialized computing. Apple is doing it now; Google, Amazon, and every other vendor will soon follow if they aren’t already doing some form of it.
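To make the escalation pattern concrete, here’s a hypothetical sketch. The tier thresholds and the ask_permission() step are invented for illustration; Apple hasn’t published its Private Cloud Compute logic in this form.

```python
# Hypothetical escalation chain: try the device first, then the vendor's own
# cloud, and only go to a large third-party cloud with the user's permission.

def run_on_device(task):
    return f"on-device answer for {task['name']}" if task["cost"] <= 1 else None

def run_on_vendor_cloud(task):
    return f"vendor-cloud answer for {task['name']}" if task["cost"] <= 5 else None

def run_on_large_cloud(task):
    return f"large-scale-cloud answer for {task['name']}"

def ask_permission(task):
    # in a real system this would be a user prompt; here we simply simulate consent
    return True

def solve(task):
    for tier in (run_on_device, run_on_vendor_cloud):
        answer = tier(task)
        if answer is not None:
            return answer
    if ask_permission(task):
        return run_on_large_cloud(task)
    return "declined: user withheld permission"

print(solve({"name": "summarize an email", "cost": 1}))   # stays on the phone
print(solve({"name": "draft a long report", "cost": 9}))  # escalates all the way up
```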
Offloading computing loads to increasingly powerful processors just makes good sense. It’s the long-predicted consequence of faster networking, the Internet, and clouds. It moves the personal computer into the cloud.
Note: There’s a good chance that in the future, your “personal computer” will simply be an empty vessel that loads your portable OS image to whatever device you happen to be working on at that particular moment. It’s already happening.
Highly Optimized Computing Platform
If you put both paradigms together, what you get is a computing vessel that has different processors for different types of computing jobs (e.g., traditional binary computing, quantum computing, AI, etc.). These different types of jobs will be handled by various chips or co-processors located on the device you are using. And they will be offloaded from the device, to a vendor of your choice, to a larger scale computing cloud platform as needed.
You input the problem that needs to be solved, and the software and/or hardware of your computing device or image will decide what goes where and when, just as with yesteryear’s floating-point math co-processors. When all the co-processors and upscaling involved are finished, the resulting answer will be pushed back down to the device and output like any other computer answer.
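Putting the two paradigms together in one toy example (all of the tiers, capacities, and job kinds below are assumptions, not a real platform):

```python
# Toy model of "Highly Optimized Computing": first pick the right kind of
# processor for the job, then escalate to a bigger instance of it (vendor
# cloud, then large-scale cloud) only when the local chip can't cope.

LOCAL_CAPACITY = {"binary": 10, "ai": 2, "quantum": 1}   # invented numbers
TIERS = ["local co-processor", "vendor cloud", "large-scale cloud"]

def solve(job_kind: str, job_size: int) -> str:
    capacity = LOCAL_CAPACITY[job_kind]
    for tier in TIERS:
        if job_size <= capacity:
            return f"{job_kind} job solved on the {tier}"
        capacity *= 10   # assume each tier handles roughly 10x more than the last
    return f"{job_kind} job is too large for any tier"

print(solve("ai", 15))       # spills over to the vendor cloud
print(solve("quantum", 50))  # goes all the way to the large-scale cloud
```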
If you add both paradigms together, you get a logical computing diagram that looks something like this:
I can see other types of co-processors, vendors, and clouds being added as they develop and offer value, such as specialized crypto-chips, identity assurance chips, privacy-protecting chips, DNA computing, quantum random number generators, etc.
I could be wrong…I’m way out of my field of expertise…but I do tend to see trends early as they develop. I think it’s one of my few superpowers. And this is what I see happening as a very natural, orderly progression of what’s already going on today.
And as much as I love the net result of all this specialized optimization, because it will bring faster, better answers, new inventions, new services, and many other things we cannot even imagine today…I can’t imagine the cybersecurity nightmare that will befall everyone who has to protect all of it!
It’s a quagmire of integrated systems that will require all sorts of new risk management analysis, new management paradigms, and new ways of thinking. Perhaps a company that we probably haven’t heard of today will help make managing all of that easier and become the next Google, the next Microsoft, the next ChatGPT. We’re going to need the help.