The Smart Connected Stampede

Smarter Everyday Devices

The quiet familiarity of our current smart devices is a clear sign of this technology's silent advance. Since we only actually see the devices we interact with, it is important to understand the current scope of the devices we do not see.

Aside from our smartphones (#1 on the list, no doubt about it), and generously adding all wearables plus home and car automation devices, the shortlist of “other” smart devices ranges from the hundreds of thousands to several million.

The “smart” part of these everyday yet somehow invisible devices is not based on artificial intelligence or cleverness. It points to their ability to get connected, both to a command center and among themselves.

Those communication paths are established so the devices can exchange operational status, transmit collected data and, eventually, receive new programming instructions to improve their automated actions. (This is the explicit intention, so far.)

But the number and classes of smart devices, the so-called IoT or “Internet of Things,” are reaching unexpected figures, covering unforeseen areas, and doing so in shorter than predicted times.

The metering industry alone is already reaching the numbers projected for 2022 (130 million gas and electricity IoT meters in LATAM). The EU expected to replace 80% of its utility metering by this year, a plan that, no surprise, was barely affected by the Covid-19 pandemic.

The commoditization of home automation devices, hubs, and controllers has reached DIY status, and smart car automation is following the same path. Some carriers considered home automation a feasible managed-services model; however, the model is uncertain and lacks the traction needed to be regarded as a smart line of business.

Notwithstanding, carriers are providing connectivity and, in some cases, deployment and control-plane transport to large utilities, especially in the electrical market, both for convenience and for symbiosis. Electrical utilities can leverage power line data transmission technologies and negotiate an attractive bargain with telcos and ISPs.

Other utilities are relying on the assumption that first-world residential customers will have a broadband connection that allows a low-traffic VPN from the IoT meter to the utility control center. In exchange, the customer would have access to monitoring and statistics, or some interesting perks.

No matter how we see the future growth of IoT, it is pretty clear that classical computing, networking, and storage solutions (among a hundred other needed pieces) will not scale at the speed and volumes required today, let alone with the projected trillions of new connected devices in the next five years.

The subject is particularly interesting if we consider that in five years or less, some of these devices could be inside our bodies for several beneficial purposes. From a broad range of medical applications to the ultimate bio-electronic non-repudiable identification and location systems, the scenario is not free of controversy and moral judgment. It is, nevertheless, coming soon.

So, aside from the ethical considerations, legal implications, and voluntary or imposed adoption, we can be sure that the traffic and processing workloads required by these devices must be provided in a safe and trustworthy manner. Nobody expects their self-driving car to crash itself, or a patient to receive an incorrect dose from an automated insulin delivery device, because the smart things went offline.

OpenStack Base Architecture

The simplest model behind large cloud-based solutions and automation requires some architectural explanation to grasp the bigger picture of the massive solution needed. Large big-data companies used the OpenStack model extensively before developing their proprietary solutions, but the open-source model is still available and a valid path to follow.

We can synthesize this architecture as a top-to-bottom model, so let’s put our imaginary IoT Control Center Application at the top of it.

Basically, we will need to orchestrate a minimal set of computing, networking, and storage services for our application. These services are orchestrated through a web front end called “Horizon.”

“Horizon” is not the only front end; for simplicity, let’s also mention EC2API (API proxies), which allows Application Programming Interfaces to operate bi-directionally in the solution. This enables security integration.

Up to this point, it may be clear that traditional (physical) servers would not handle our imaginary monstrous workloads very well. Therefore, we would need massive hardware, but virtualized, to exploit the resources better. Virtualization allows for far better utilization of system resources, so we need OpenStack compute services to provision and manage a massive number of virtual machines. Here OpenStack provides “Nova” for VMs. Let’s take it for granted that compute services also cover containers, functions, controls, resiliency, and everything else behind them.
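As a rough sketch of what provisioning through Nova can look like in practice, here is a minimal Python example using the openstacksdk library. The cloud name, image, flavor, and network below are hypothetical placeholders, not taken from any real deployment:

```python
import openstack

# Connect using credentials stored in clouds.yaml; "iot-cloud" is a
# hypothetical entry name used only for this sketch.
conn = openstack.connect(cloud="iot-cloud")

# Resolve a (hypothetical) image, flavor, and network by name.
image = conn.compute.find_image("ubuntu-20.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("iot-net")

# Ask Nova, the compute service, for a new virtual machine.
server = conn.compute.create_server(
    name="iot-control-center-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)

# Block until the VM reaches the ACTIVE state, then report it.
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```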

“Neutron” provides OpenStack Layer 2 and Layer 3 networking. Again, let’s consider DNS, load balancing, redundancy, and everything else included in the SDN-based Neutron service.
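A similarly hedged sketch of Neutron at work, again with openstacksdk and hypothetical names, creates a Layer 2 network and attaches a Layer 3 subnet to it:

```python
import openstack

conn = openstack.connect(cloud="iot-cloud")  # hypothetical cloud entry

# Create an isolated Layer 2 network for the IoT back end.
network = conn.network.create_network(name="iot-net")

# Attach a Layer 3 subnet so VMs on the network get addresses.
subnet = conn.network.create_subnet(
    network_id=network.id,
    name="iot-subnet",
    ip_version=4,
    cidr="10.10.0.0/24",
)
print(network.id, subnet.cidr)
```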

“Cinder” provides OpenStack block storage services; object storage and shared filesystems can also be included (through companion services such as Swift and Manila).
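And a minimal Cinder sketch along the same lines (hypothetical names again) requests a block volume and waits for it to become usable:

```python
import openstack

conn = openstack.connect(cloud="iot-cloud")  # hypothetical cloud entry

# Ask Cinder, the block storage service, for a 50 GB volume
# that could hold, say, collected meter telemetry.
volume = conn.block_storage.create_volume(name="iot-telemetry", size=50)

# Wait until the volume leaves the "creating" state.
volume = conn.block_storage.wait_for_status(volume, status="available")
print(volume.id, volume.status)
```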

There is no such thing as a virtual solution without a physical substrate. OpenStack is no exception, and it can also manage “bare metal” resources (through the “Ironic” service), both branded servers and white-label or generic systems, managing CPUs, memory, controllers, connectivity and, recently, vast arrays of Graphics Processing Units or GPUs, capable of crunching complex mathematical workloads in a fraction of the time required by traditional CPUs.
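For completeness, here is a tiny sketch of inspecting that physical substrate through the bare metal service, assuming an Ironic-enabled cloud and the same hypothetical connection:

```python
import openstack

conn = openstack.connect(cloud="iot-cloud")  # hypothetical cloud entry

# Enumerate the physical machines registered with Ironic and report
# their provisioning and power states.
for node in conn.baremetal.nodes():
    print(node.name, node.provision_state, node.power_state)
```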

The beauty of manipulating all of it from an abstraction perspective rests on the pure software nature of the whole process, which, by the way, includes the no less important shared services. These allow the integration of OpenStack solutions with, so to speak, “external” and “internal” services devoted to specialized tasks, foremost of all security.

Finally, like any other solution, it must be designed, tested, deployed, operated, and maintained. No wonder these tasks can also be performed within the OpenStack solution itself.

Drawing the model as an organization hierarchy, we arrive at something like this (no artistic pretensions here):

[Image: the OpenStack model drawn as an organization hierarchy, from the front end down to the physical substrate.]

With this humble and oversimplified representation, we can grasp some basic premises about how cloud-based IoT solutions work today and will work in the coming years. The key concepts behind it are not necessarily obvious: highly flexible and agile scaling driven by extensive automation, sustained by the development team and the operations team, together currently known as “DevOps.”

Several models are parallel or similar to OpenStack; however, it is still the simplest way to start a new cloud-based service, especially a hybrid one.

There are several challenges ahead. Security is still the main objection to cloud adoption, so several vendors and entities work on this subject (the Cloud Security Alliance is a well-known one). It is interesting to notice that, despite the standards behind cloud services, the services themselves are not necessarily interoperable or transparent. Interoperability is a work in progress, and a standing customer claim and concern.

Meanwhile, R&D keeps pushing even more classes of devices that will undoubtedly challenge our perception and the scale of cloud services in a very short time.

What The Next Industry X.0 Means

I want to recommend the book “Industry X.0: Realizing Digital Value in Industrial Sectors” by Eric Schaeffer, Senior Managing Director at Accenture. While my note is not based on his book, it is a serious and authoritative source on the subject. By the way, Accenture coined the term “Industry X.0,” AFAIK.

My literary version is as follows:

Intuitively, software and firmware engineers are used to managing “release train” models, where new core software and features receive the X.0 denomination. Significant modifications driven by bugs, new features, and similar internal or external causes trigger the “X.n” releases, and minor “sub-n” changes, including cosmetic issues, take the form of “X.n.m” releases. With few variations in nomenclature, this is more or less the simplified general approach to continuous software and firmware development.
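As a toy illustration of that numbering logic (my own sketch, not any vendor’s actual process), the “X.n.m” scheme reduces to a tuple comparison in a few lines of Python:

```python
def parse_version(tag: str) -> tuple:
    """Split an 'X.n.m' release tag into (major, minor, patch) integers."""
    parts = (tag.split(".") + ["0", "0"])[:3]
    return tuple(int(p) for p in parts)

def change_class(old: str, new: str) -> str:
    """Classify an upgrade by which component of the version moved."""
    o, n = parse_version(old), parse_version(new)
    if n[0] > o[0]:
        return "major release: the costly X -> X+1 jump"
    if n[1] > o[1]:
        return "feature release: X.n"
    return "maintenance release: X.n.m"

print(change_class("4.2.1", "5.0.0"))  # major release: the costly X -> X+1 jump
print(change_class("4.2.1", "4.3.0"))  # feature release: X.n
print(change_class("4.2.1", "4.2.2"))  # maintenance release: X.n.m
```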

A jump from version X to X+1 is a significant event, surrounded by a colossal effort driven by the Quality Assurance and Development teams, involved in a close dance of compilations, tracing, testing, and rewrites until the new major release passes the required assurance tests.

Among some crucial considerations, we can mention: backward compatibility, ecosystem interoperation, OS dependencies, hardware compliance for the upgrade, product series that will become obsolete with the new version, existing contracts with upgrade rights that will be affected, assessment of major GUI and CLI changes, ease of the upgrade procedure, recovery of previous configurations, service disruption, standards compliance, regulatory compliance, and an overwhelmingly long list of checks to perform even “before starting” the actual writing and testing of a new major release.

Even so, it is easy to verify that in most, if not all, cases the “Release Notes” include several known bugs or undesirable effects to be aware of before deciding whether to upgrade one device or a large installed base.

Now, this is a predictable process of continuous, accepted change. It has been like this since the first software was ever written.

Now imagine applying such a concept to a vast, entire industry: not only changing the business model radically, but telling the development and operations teams that the model will keep changing continuously from now on. No wonder both teams merged and are known as “DevOps.” It is scary, to say the least.

We have new technologies to manipulate reality through enhanced, extended, mixed, and virtual interfaces, and these have opened a landscape where classical paradigms and changes are unpredictable. Therefore, it seems absurd to foresee something like “Industry 5.0 will follow Industry 4.0,” because the frontiers have vanished with digitalized operations and digital products that can literally “materialize” anywhere. The willingness to accept that “X.0” will be replaced by “(X+1).0” is the sheer acceptance that the coming changes will be disruptive and, implicitly, that there will be no room for adjustments like X.1 or X.2.5 in between.

IoT is a key component, both as an object and as part of new materialized objects: objects that come to exist elsewhere and need to report their existence and status to their X.0 industrial owner. These objects can be inside a human being, report their host’s operational status, trigger an emergency procedure, or even start a defibrillation procedure within their host.

And the common enabler is, again, no surprise, the tiny intelligence embedded in the object itself.

Artificial intelligence is, in this case, the insurance policy against the “disconnected automata” fears, especially when we consider that an X.0 industrial object failure could potentially cause a catastrophic disaster or kill its human host. Predict-and-prevent is the target, just as in AI-assisted security.

Cloud services architectures evolved to keep afloat an entirely virtual control of physical computing, networking, and storage environments. The Industry X.0 disruption will present us with a completely virtual industry, producing in some cases only digital assets that someone else will materialize on demand, as needed and anywhere in the world. Observe that this X.0 industry will profit from these objects without the need for any physical assets. Science fiction? No.

A crude example would be the industry of object design for 3D printing. More sophisticated examples include hydraulic simulation for large dams or climate change modeling with predictive scaled calendars. Is there a limit?

The cocktail of AI, analytical power, enhanced reality, mixed reality, real-time GPU array-based modeling, and “SMAC” (Social, Mobile, Analytics, Cloud), among other emerging and re-emerging technologies, is shaping it all. We can vaguely recognize a significant change as “mixed reality.” But adding the IIoT, the “Industrial Internet of Things,” can blow the entire “Industry 4.0 workshop model” up into an entirely parallel universe where “there is no workshop at all.” Do not forget that virtual security glues it all together. No choice here.

Schaeffer is a master at explaining what capabilities are needed to stay afloat in the coming transformative years. Understanding software, analytics, uPLM (Unified Product Lifecycle Management), agile manufacturing, “as a service” models, and innovation ecosystems is crucial for the digital survival of any industry. A fatal error would be even to consider that the mass extinction will spare your sector, because it will spare none.

Are We Going To Be A Part Of It?

We will probably be part of the next transformation; the tagged names or adjectives do not matter.

The shift from what we know as industry today to the next transformation will polarize many sectors, economies, and geopolitical areas. The strategic or collateral role of each one will depend on an early or late understanding of the transformation itself.

While early adopters will embrace the most advantageous roles and positioning, the slower ones will remain, most of them in denial, in their current models, ending up as the commoditized part of the industrial chain. That translates directly into “loss of added value” and, consequently, loss of market value and share.

The virtual industrial adopters will take over the top of the food chain, leaving current industrials closer to the raw materials than to finished products. The produced objects will no longer depend on traditional factories and globalized competing supply chains, but on more competitive and agile cooperating ecosystems, where location is increasingly meaningless.

Massive automation and robotics will decimate human jobs, at least the mechanically replaceable ones. But AI is already replacing humans in tasks otherwise considered human, such as sales management, call center QA, and even call quality coaching. Fewer jobs and more humans pouring into the world is a bad combination. The World Economic Forum is considering a big reset. Currently, several social experiments are dealing with UBI, or “Universal Basic Income.” UBI is interesting, but there is even more ahead.

Between the world domination conspiracy theories and the new world order (which, for practical purposes, was more related to the decade 1991-2000), there are entirely disruptive initiatives. Some range from robots being granted citizenship (there is at least one robot in this situation, in Saudi Arabia) to the abolition of the “Legal Person,” the “Dissolution of the Corporate Legal Entity,” and the “Virtualized Economy.” Still, none is perceived as more perverse or shocking than “Posthumanism.”

As humanism emerged against medieval authoritarian religion and popular superstition in the European Renaissance, it envisioned human destiny shifting from the hands of whimsical divinities to the reason of men (white men, at that time). Humanist ethics emerged, challenged the old dogmatic ethics, and won. Universal declarations followed.

Posthumanism presupposes that the guidance of humanity’s next destination may not be in human hands. That is: our technology could be the next step in our evolutionary process, and we humans could become a by-product of that evolutionary step. Moreover, humans could become an obstacle or waste for the next evolutionary step. The fossil record has plenty of clear biological examples of this scenario, starting with the extinction of anaerobic organisms when oxygen-dependent organisms evolved and thrived on the planet. That was the oldest recorded mass extinction on Earth. Thousands more extinctions followed, including some human species too.

Transhumanism stands somewhere in the middle of that road, proposing the evolution and enhancement of human capabilities through technology implants.

Imagine AI-driven genetic modifications and some kind of “Cyborg” as the common ancestor of “Robmanity” (2), a new class of robots capable of replacing humans, and of another branch of pure robot generations, unrelated to biological humans except for legal considerations.

It is fascinating to notice that, while technology and science will have virtually no problem reaching the transhumanist ideals in less than a decade, the only barriers are the ethical opposition and the current body of law. But speaking honestly about ethics and laws: both are human conventions, and human conventions eventually change. Segregation was perfectly legal in some US states well into the 1960s, although the US Declaration of Independence states that all men are created equal. Slavery still happens in several places in the world despite the Universal Declaration of Human Rights, and human trafficking and child labor persist in the third world, so maybe human conventions are not as strong as we may think.

And if machines were in charge of the next evolutionary step, they would probably follow cold, purely logical inferences instead of human conventions or ethics.

Perhaps the machines could develop their own ethics and challenge ours, as the Renaissance free thinkers did with the old belief-based medieval superstitions. Will they win? (Notice “they.”)

Conclusions

We must be vigilant and demand that whatever the next transformation is, it be built with the “Three Laws” (1) embedded in its innermost code, protected by non-contradictory and “non-challengeable” logic. Or we are done as a species.

Seriously: the coming transformations require a firm willingness to take counterintuitive actions, such as forgetting everything you know and learning it all again from zero. We were all warned of this by the year 2000. Well, the time is now. (I’m on it.)

Besides the obvious practical advice to study, train, and practice virtual security, it is a good idea to take advantage of the COVID-19 pandemic and revisit the foundational technologies of cloud-based services.

Take a look at the “as a service” models, review UNIX and Linux administration concepts (or take a course), analyze DevOps challenges and solutions, and try to understand Kubernetes (Quantum Mechanics is proper training before attempting that).

Give a nobler use and a second life to that old gaming laptop and build an OpenStack-based private cloud at home. It sounds weird, but it is fun!

Thanks if you reached this far! This text is an open writing exercise. All comments are welcome.

Luis Guembes

Annotations:

(1) The Three Laws:

Introduced in Isaac Asimov’s 1942 short story “Runaround” (included in the 1950 collection I, Robot), the Three Laws, quoted as being from the “Handbook of Robotics, 56th Edition, 2058 A.D.”, are:

First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

(2) “Robmanity” is a neologism pretending to be a contraction of “Robotic Humanity.” As far as Google knows, the term did not exist before these writings, so the definition is entirely arbitrary and accidentally mine.
