The Library of Alexandria, Big Data and Babel Inferno
Fernando Jimenez Motte
NEUROMORPHIC TECHNOLOGIES Founder & CEO 16K (Twitter @stockfjm). Worldwide expert in Control Systems Engineering, Robotics, Machine Learning, Radars and Electromagnetics
We live on a planet that has witnessed an unprecedented explosion of technology, driven mainly by a revolution in the digital processing of data.
The Royal Library of Alexandria, or Ancient Library of Alexandria (1), was in its time the greatest in the world. Located in the Egyptian city of Alexandria, it is estimated to have been founded at the beginning of the 3rd century B.C. by Ptolemy I Soter and expanded by his son Ptolemy II Philadelphus, eventually holding up to 900,000 manuscripts.
It is estimated that the library held the following numbers of volumes:
200,000 volumes in the time of Ptolemy I
400,000 in the time of Ptolemy II
700,000 in the year 48 B.C., in the time of Julius Caesar
900,000 when Mark Antony offered 200,000 volumes to Cleopatra, brought from the Library of Pergamon.
Today, one of the most disruptive technologies of recent decades, the Internet, has become a Zettabyte information system, where one Zettabyte is equivalent to one billion terabytes.
To keep track of the different scales at which we handle information, consider the following definitions:
One exaflop is equivalent to 1,000,000 teraflops.
One teraflop is equivalent to 1,000 gigaflops.
One gigaflop is equivalent to 1,000,000,000 mathematical operations per second.
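The conversions above are simple powers of ten. A minimal sketch, assuming standard decimal SI prefixes, makes the relationships explicit:

```python
# Orders of magnitude used in the article (decimal SI prefixes).
GIGA = 10**9    # 1 gigaflop  = 1e9 operations per second
TERA = 10**12   # 1 teraflop  = 1,000 gigaflops
EXA = 10**18    # 1 exaflop   = 1,000,000 teraflops
ZETTA = 10**21  # 1 zettabyte = one billion terabytes

print(TERA // GIGA)   # gigaflops in a teraflop -> 1000
print(EXA // TERA)    # teraflops in an exaflop -> 1000000
print(ZETTA // TERA)  # terabytes in a zettabyte -> 1000000000
```

The same arithmetic confirms the earlier claim that one Zettabyte equals one billion terabytes.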
We can appreciate a radical change in scale between the peak holdings of the Library of Alexandria, 900,000 manuscripts, and the data that was processed in the year 2017.
Google processes more than 20 petabytes per day (2), the German climate research center generates 10 petabytes per year, and CERN's Large Hadron Collider produces 60 gigabytes per minute. (3)
CERN scientists must sift through the 30 petabytes of data produced annually to determine whether the collisions exhibit interesting patterns of physical behavior.
The CERN Data Center produces approximately one petabyte of data every day, equivalent to around 210,000 DVDs. The center has 11,000 servers with 100,000 processing nodes, and some 6,000 changes per second are carried out in its databases.
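The DVD comparison is easy to sanity-check. A quick back-of-the-envelope calculation, assuming a standard single-layer DVD capacity of 4.7 GB:

```python
# One petabyte expressed in single-layer DVDs (4.7 GB each, assumed).
PETABYTE = 10**15          # bytes
DVD_CAPACITY = 4.7 * 10**9  # bytes per disc

dvds = round(PETABYTE / DVD_CAPACITY)
print(dvds)  # 212766, i.e. roughly 210,000 DVDs per day
```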
If we observe the global financial transaction system, more than 50 million credit card transactions occur every day in the United States.
Today we think we live in a world full of data, but compared with the future, the amount of data we have today is relatively small.
In the year 2016, the average person generated 650 MB of data per day through the use of PCs, mobile phones and wearables. For 2020, projections put that figure at 1.5 GB of data every day. In 2024, the average home internet subscriber consumed 533.8 GB per month, according to recent data from OpenVault. The number of internet users consuming more than 500 GB keeps increasing every year.
This is an impressive increase of more than 200 percent in less than four years, but it pales next to what we are seeing in the autonomous vehicle revolution.
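The growth factor behind that percentage follows directly from the article's own figures:

```python
# Growth in per-person daily data generation (figures from the article).
mb_per_day_2016 = 650          # MB per day in 2016
mb_per_day_2020 = 1.5 * 1000   # 1.5 GB per day projected for 2020, in MB

growth = mb_per_day_2020 / mb_per_day_2016
print(f"{growth:.2f}x")  # 2.31x, i.e. an increase of more than 200%
```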
Ten million autonomous vehicles were projected to be on the roads by the year 2020 (4). Currently there are more than 30 million driverless cars in the world, a figure expected to grow exponentially as companies like Tesla invest in the development of such vehicles.
Cars connected to the network will send around 25 gigabytes of data to the cloud every hour. Autonomous vehicles will process and monitor data on their performance, speed, component wear and road conditions.
Today, vehicles connected to the Internet of Things (IoT) carry some 40 microprocessors and dozens of sensors that collect telematics and driver-behavior data, which can be analyzed in real time to keep the vehicle's performance, efficiency and safety under control. This also provides vital information for towns and cities regarding traffic volume and road design.
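Combining the article's two figures, 30 million driverless cars and 25 GB uploaded per hour, gives a sense of the aggregate volume. This is only an illustrative estimate; the hours driven per day is an assumption, not a figure from the article:

```python
# Back-of-the-envelope fleet data volume.
GB_PER_HOUR = 25          # per connected car (article's figure)
cars = 30_000_000         # driverless cars worldwide (article's figure)
hours_per_day = 1         # assumed average driving time per car per day

daily_tb = cars * GB_PER_HOUR * hours_per_day / 1000
print(f"{daily_tb:,.0f} TB/day")  # 750,000 TB/day, i.e. 750 petabytes
```

Even at one hour of driving per car per day, the fleet would generate hundreds of petabytes daily, dwarfing the per-day output of CERN or Google cited above.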
What could we do with such a huge amount of data? The answer is the prediction, recognition and classification of behavioral patterns and trends. The questions that we would ask ourselves as a result of the processing, analysis and interpretation of large volumes of data would be:
What can we predict about this phenomenon?
How can we extract knowledge from data to help humans make decisions?
How can we automate decisions based on data analysis?
How can we adapt systems dynamically to enable a better user experience?
PwC estimated that global data reached about 4.4 ZB in 2019, and Statista estimated that 120 ZB would be reached in 2023. IDC, in fact, predicts that global data will grow to 175 ZB by 2025.
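Those projections imply a striking compound annual growth rate, which can be derived from the two endpoint figures:

```python
# Implied compound annual growth rate from the article's projections
# (4.4 ZB in 2019 per PwC, 175 ZB in 2025 per IDC).
zb_2019, zb_2025 = 4.4, 175
years = 2025 - 2019

cagr = (zb_2025 / zb_2019) ** (1 / years) - 1
print(f"{cagr:.0%}")  # roughly 85% growth per year
```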
With the growth of data-driven, machine-to-machine (M2M) business models, from industrial manufacturing to consumer electronics and energy delivery, organizations are running advanced analytics on sensor data to define and project new products, services, business processes and sources of added value. All of these cases are driven by the analysis of petabytes of data acquired from sensors and M2M communications.
In 2012, 20 typical broadband homes generated more traffic than flowed across the entire Internet in 2008. By the end of 2010, half a zettabyte of data had traveled across the Internet, equivalent to the information contained in a library with bookshelves 36 billion miles long (10 times the distance from Earth to Pluto) (6). Some 25 petabytes (10^15 bytes) are created and released onto the Internet every day, 70 times the holdings of the United States Library of Congress.
The Tower of Babel is a key building in the Judeo-Christian tradition, mentioned in the Old Testament. The main interpretations of chapter 11 of Genesis affirm that, with the construction of the tower, men intended to reach Heaven. It is not known exactly when the Tower of Babel was built, but it is estimated to have existed in the period 1792-1750 B.C.
In the story of the Tower of Babel, from the book of Genesis in the Old Testament, God "punishes" humanity for its arrogance and continuous confrontation, exposing man to the confusion of languages. If we look at the current global scenario, many centuries have passed since that time, and yet human nature has not changed in its essence.
In its quest to reach super scale, humanity could be approaching the Babel Inferno scenario, a term that future generations may use in the year 4000 A.D. to refer to the Internet of Things (IoT) and the congested, oversaturated megacities of our era, if our civilization does not extinguish itself first in a thermonuclear holocaust. Let us not forget that in the Big Data scenario, the infrastructure for launching the nuclear missiles of the superpowers, mainly those of the USA, is also interconnected to the computer network.
"There are currently more than 30 million driverless cars in the world. This figure is expected to grow exponentially as companies like Tesla invest money in the development of such vehicles."
References:
Fernando Jiménez Motte Ph.D (c) EE, MSEE, BSEE
CEO of NEUROMORPHIC TECHNOLOGIES NT Robotics, Control Systems, Artificial Intelligence AI
Follow me on Twitter : @stockfjm