The History and Trend of Digital Computing Platforms -- Part I : From Big Irons to Embedded Gadgets

With the invention of transistor computers in the 1950s, the computing trend expanded many-fold, and computers became available and affordable to non-governmental organizations. Back in the 1960s, when the only available computing platform was the mainframe, every company, university, agency, bank, etc. that wanted to use a computer had to purchase one. There was no universal architecture or operating system (OS). Besides some special-purpose programming languages, many software applications were written in machine language. Therefore, the use of computers was limited to large organizations that could afford to dedicate resources (money, space, experts, etc.) to a mainframe machine.

The introduction of mini and mid-range computers (which later became high-end to mid-range servers and workstations) in the 1970s made it possible for mid-size companies to purchase the computing platform they needed without spending a large amount of money on a mainframe. In that era, the adoption of computers accelerated, especially with the help of microprocessors, portable general-purpose operating systems (e.g., UNIX), and better programming languages (e.g., C). However, mainframes did not go extinct; only their market shrank (prior to minicomputers, companies had essentially a single option for a computing platform: the mainframe). Moreover, minicomputers did not gain market share rapidly; it took years for them to establish a large presence in the market. Nevertheless, purchasing a computer was still not an option for individuals.

During the 1980s, another wave of products was launched that changed the direction of the IT world: micro and personal computers. Afterwards, desktop computers became affordable for both small companies and households. As a result, many companies could switch their lightweight workloads from big, expensive machines to desktop computers. Although it took a decade for desktop computers, with the help of the Internet, to become the first choice in many places, they changed the direction of the IT world dramatically: it became possible for individuals to install a computer in their home!

With the booming trend of the World Wide Web (WWW) during the 1990s, the next wave of computing platforms to gain popularity was laptops. As a result, instead of buying or assembling a desktop machine, one might buy a laptop computer to handle day-to-day tasks. Initially, laptops did not replace desktops; in fact, their popularity surged a decade later with the help of easier-to-use operating systems such as Windows XP.

The early 2000s marked the beginning of the era of high-speed mobile internet (3G) and WiFi networks. These mobile networks helped newcomers to the IT world gain popularity: PDAs and smartphones. Since then, smartphones have kept packing more horsepower, larger memory, higher-resolution screens and cameras, etc. They essentially became a replacement for a personal computer in many respects. The first models were not able to compete with laptops; in fact, they had to wait another decade to affect the personal computer market. With a smartphone, it became feasible for a person to ditch their larger devices (i.e., laptop and desktop computers) in many situations, as smartphones can perform many of those tasks efficiently.

What is the innovation of the current decade that can potentially replace many older platforms and devices? Well, you've probably guessed: wearable devices such as smartwatches, smart glasses, and other upcoming wearable gadgets including smart pens, rings, belts, etc. Although, perhaps after 2020, most of our smartphones' functionality may be embedded in our wearable devices, they are not going to replace our smartphones and tablets completely. Much as we still have mainframes today (though with a very limited footprint, e.g., core banking), large high-end UNIX servers (e.g., in telecom companies), high-end desktops (e.g., in many companies and some homes), and laptops (many still use laptops for daily tasks). Each new technology may shrink the market of the previous platforms, but it can never cannibalize them completely. Today, high-end workstations and desktops occupy their own segment of the market. Two decades ago, in order to send an email from home, one had to assemble a complete desktop; now you can use your watch to do the same task.

Then, what will be the next iteration of computing platforms and devices in the coming decade? By miniaturizing wearable devices and providing a direct interface to the human brain, it may be possible that in the 2020s we will have small devices (similar to Bluetooth headsets) that connect directly to our brains. Sure, they will need another decade to replace many other gadgets, but eventually many of our daily tasks could be handled by them. We could think about something and send the command to our tiny device, or we could hear a voice in our head that reads emails to us without using any sound waves. We might see objects without actually using our eyes, and we might send directions to our car without any manual or voice command.

Potentially, what will be the direction of this trend in the 2030s? With increasing computing capability and functionality, it may be possible to embed tiny devices into the human brain to enhance its capabilities; to actually build a superbrain and a superhuman. This kind of technology will probably need years to expand its footprint into human life, but in the end, who doesn't want a super-fast brain, super memory, and superhuman abilities?

Finally, in the 2040s we may have the technology to upload our brains completely to a small super chip and live an immortal life with super capabilities. Who knows, maybe if we make it to the 2050s, we may be able to live forever as superhumans...
