How 'Bots' may become a part of our everyday life. Future Tech - AI & IoT

It's 7:30 AM and Arnold wakes to sweet music played by Alexa as his alarm. Alexa had delayed the alarm by an hour so that Arnold gets a full 7 hours of sleep, since he went to bed an hour later than usual; it captured this information from the fitness app connected to his smartwatch. At 7:40 sharp, Alexa turns on the geyser, set to 48 °C, so that by the time Arnold finishes his morning routine, he has hot water adjusted to his preference. At 7:55 sharp, Alexa turns the geyser off once it receives the signal that the hot water has been used and the tap is closed.
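Under the hood, this kind of morning routine is essentially a small event-driven rule. A minimal sketch, assuming hypothetical times and a target of 7 hours of sleep (none of this reflects a real Alexa API):

```python
from datetime import datetime, timedelta

TARGET_SLEEP = timedelta(hours=7)

def adjusted_alarm(bedtime, default_alarm):
    """Delay the alarm if bedtime leaves less than the target sleep,
    mirroring how Alexa shifts Arnold's alarm an hour later."""
    earliest_wake = bedtime + TARGET_SLEEP
    return max(default_alarm, earliest_wake)

# Hypothetical example: slept at 00:30 instead of 23:30, default alarm 06:30.
bedtime = datetime(2024, 1, 15, 0, 30)
default_alarm = datetime(2024, 1, 15, 6, 30)
print(adjusted_alarm(bedtime, default_alarm))  # alarm pushed to 07:30
```

The same pattern (a trigger condition plus a scheduled action) covers the geyser: turn it on at a fixed offset before the expected wake time, and turn it off when the tap-closed signal arrives.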



Image Source - https://www.nakima.es/alexa-nueva-interfaz-de-comunicacion

As Arnold grabs his morning coffee and breakfast, his office voice assistant (Jarvis) recites the latest news and then pulls up his meeting schedule for the day. It further informs him that 5 pending tasks are due today. It points out that with 3 meetings today and 2 critical tasks for the CEO, he will not be able to complete everything within the day, and suggests pushing 3 tasks to the next day. Arnold instructs it to postpone only 2 tasks to tomorrow and delegates 1 to Smith, who immediately gets an email alert for the task assigned to him.


By 8:30, Alexa turns on the Tesla Model X and sets the AC to 23 °C, so the car is cool by the time Arnold gets in at 8:40. The car shows him today's route, adjusted for traffic delays and his meeting plans. Once it gets his confirmation, it switches to self-drive mode, and Arnold sits back and goes through his mailbox, dictating email responses to Jarvis instead of typing, while the car follows the charted route and adjusts to congestion updates from Google Maps.

No alt text provided for this image


Image Source - https://www.vox.com/2016/4/21/11447838/self-driving-cars-challenges-obstacles


One of the tasks requires him to provide a quarterly regional sales forecast. He asks Jarvis to pull the sales data for the last 3 years and instructs it to run a forecast on that data, adjusted for the new regulatory policy and for one month of higher demand from the new product launch in the upcoming festive season. Jarvis runs multiple forecast models and presents the output through descriptive visualization charts, generating insights on the key dependent variables and recommending which regions to target for improved sales.
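Stripped of the richer models, the first step Jarvis would automate looks something like this: a naive baseline projection from historical growth, with multiplicative business adjustments layered on top. A minimal sketch with entirely hypothetical data and adjustment factors:

```python
def forecast_next_quarter(history, adjustments):
    """Project the next quarter from average quarter-over-quarter growth,
    then apply multiplicative business adjustments (e.g. a regulatory
    dampener or a product-launch lift)."""
    growth_rates = [b / a for a, b in zip(history, history[1:])]
    avg_growth = sum(growth_rates) / len(growth_rates)
    baseline = history[-1] * avg_growth
    for factor in adjustments.values():
        baseline *= factor
    return round(baseline, 1)

# Hypothetical: last 8 quarters of regional sales (in $k).
history = [100, 104, 110, 108, 115, 121, 126, 130]

# Hypothetical adjustments: new regulation dampens sales slightly,
# the festive-season product launch lifts demand.
adjustments = {"regulatory_policy": 0.97, "festive_launch": 1.08}

print(forecast_next_quarter(history, adjustments))
```

A real assistant would run several proper time-series models (seasonal decomposition, regression on external drivers) and compare them, but the "adjust the baseline for known business events" step is the part Arnold dictates in plain language.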

Arnold reviews the output, adjusts some recommendations, and asks Jarvis to publish the report in PowerPoint. By the time he reaches the office, his sales forecast report is ready. As he enters, the camera captures Arnold's entry, runs face recognition models, verifies his identity, and grants him entry. It also marks his attendance in the HR system. He gets an alert about the meeting room Jarvis has booked for the sales forecast presentation; the same alert goes to all the participants.
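The verification step at the door typically reduces to comparing a face embedding from the live camera against an enrolled template, with a similarity threshold deciding entry. A toy sketch with hypothetical 4-dimensional embeddings (real systems use embeddings of 128+ dimensions and carefully tuned thresholds):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def grant_entry(live, enrolled, threshold=0.8):
    """Grant entry (and mark attendance) only if the live embedding
    is close enough to the enrolled template."""
    return cosine_similarity(live, enrolled) >= threshold

# Hypothetical embeddings for illustration only.
enrolled = [0.1, 0.9, 0.3, 0.4]       # Arnold's enrolled template
match = [0.12, 0.88, 0.28, 0.41]      # today's capture of Arnold
stranger = [0.9, 0.1, 0.7, 0.05]      # a different person
```

Once `grant_entry` returns true, downstream actions (the HR attendance update, the meeting-room alert) are just API calls triggered by that single event.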

As the meeting starts, Jarvis checks with Arnold whether the meeting should be recorded. Arnold instructs it not to record the meeting but to take notes and capture action items. At the end of the meeting, Jarvis circulates the notes to all invitees and adds the action items to the task lists of the owners identified in the meeting.

Wouldn’t it be amazing if all of that were real for us now? Well, it may not be real yet, but I suspect it will come true sooner than we anticipate or imagine. Parts of it are already around us; it’s just that we have not adjusted to the level of assistance AI can offer, and IoT interconnectivity has not yet reached the fully connected home, car, office, or shop floor. But imagine a time when a corporate Jarvis (official AI assistant) talks to a personal Alexa to get updates about Arnold’s routine, and vice versa. These assistants then adjust and optimize Arnold’s daily and weekly plans, and Arnold can prepare to fight the Terminator.


We have seen something similar in the Avengers films, with Iron Man doing pretty much the same, of course scaled 10 times beyond what I just described.

In another article, I will break down some of these scenarios to give a better understanding of how AI/ML is embedded in this day in the life of Arnold (us). Note that while the complete cycles involve steps of interconnectivity, automation, rules-based guidance, and so on, I will touch on those while explaining the process but will not extend beyond the AI/ML use cases. The other components could be part of separate articles on interconnectivity and embedded automation.

Some of these high-tech use cases are already being discussed, designed, and implemented around us, and I am glad to be part of this level of innovation, driving it at various enterprises.

Note: I used real-life references such as Alexa, Tesla, Google Maps, and Google Home to bring this seamless view of the future into the light of where we are today. By no means do I endorse or oppose these products/brands.
