Technology Trends & Impact of Artificial Intelligence on Healthcare

The age of Artificial Intelligence (AI) is here. Applications leveraging AI can be found in every industry, but the impact AI can have on the healthcare industry is tremendous. We are talking about saving lives by reducing doctor errors while at the same time increasing resource efficiency, resulting in cost savings. As AI progresses from the assisted to the augmented to the autonomous stage, the technology supporting it is likewise moving from the cloud to the edge/fog to the device/end-point. This shift has led to the coining of a new term: embedded AI (e-AI). Some recent products enabling the AI ecosystem are discussed below, along with real-world AI applications for healthcare.

Definition and Levels of AI

AI is the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

The potential for digital platforms and AI to underpin and grow the future of work is unbounded. To better understand the role of AI in this context, it is helpful to think of three levels of intelligent digitalization: assisted, augmented, and autonomous intelligence.

Impact of AI on Healthcare Market by Numbers

Artificial intelligence "is rewiring our modern conception of healthcare delivery," according to a new Accenture report, which shows that an array of clinical AI applications is already well on its way to saving the industry $150 billion over the next 10 years.

AI to Embedded-AI (e-AI) Technology Trends


Certainly, the cloud offers the most compute power and data storage, enabling the largest and most powerful systems. However, when it comes to agility, responsiveness, privacy, and personalization, the cloud looks less attractive. This is where edge computing and shallow learning through adaptation can become extremely effective. "Little" data can have a big impact on a particular individual: think how accurately, and with how little data, a child learns to recognize its mother.

As the Internet of Things (IoT) continues to accelerate and businesses realize its immense benefits, the next breakthrough capability is to enable IoT devices themselves to evolve. In early IoT solutions, most devices simply sent telemetry to and received commands from the cloud, with the logic that found insights in device telemetry residing in the cloud. As billions of devices get connected and send trillions of messages, it makes sense to move some of that cloud intelligence out to the IoT devices themselves. When IoT devices start running cloud intelligence locally, we refer to them as "IoT edge" devices.

Enabling intelligence on IoT edge devices means enabling analytics and insights to happen closer to the source of the data, saving customers money and simplifying their solutions. IoT edge devices range from small-footprint devices (e.g. smaller than a Raspberry Pi) and gateways to industrial machines and autonomous vehicles. Instead of simply generating data and sending it to the cloud, these devices can process and analyze data to gain insights, and then quickly act on them locally and autonomously. For example, a PERS (Personal Emergency Response System) device needs an immediate response to alert an emergency service provider in case of a patient fall, since this could be the difference between life and death. While the benefits of edge intelligence are immense, the challenge it poses is how to develop, deploy, and manage this cloud intelligence on IoT devices in a secure and scalable way.
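The local-decision path described above can be sketched in a few lines. Everything here is illustrative: the 2.5 g spike threshold, the function names, and the callback interface are assumptions for the sketch, not any vendor's actual API.

```python
import statistics

FALL_THRESHOLD_G = 2.5  # hypothetical acceleration spike, in g, suggesting a fall


def should_alert_locally(accel_samples_g):
    """Decide on-device whether to raise an emergency alert.

    Acting locally avoids the round trip to the cloud, which matters for a
    PERS device where seconds can be the difference between life and death.
    """
    peak = max(abs(a) for a in accel_samples_g)
    return peak >= FALL_THRESHOLD_G


def process_reading(accel_samples_g, send_telemetry, raise_alert):
    if should_alert_locally(accel_samples_g):
        raise_alert("possible fall detected")  # immediate, local action
    else:
        # routine data: summarize and send upstream at leisure
        send_telemetry(statistics.mean(accel_samples_g))


# Example: a spike well above normal gravity triggers the local alert path.
alerts, telemetry = [], []
process_reading([1.0, 1.1, 3.2, 0.9], telemetry.append, alerts.append)
print(alerts)  # → ['possible fall detected']
```

The key design point is that the alert decision never waits on network round-trip latency; only non-urgent summaries go to the cloud.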

Seamless Software

Microsoft Azure IoT Edge offers a revolutionary set of capabilities that extends Microsoft's existing IoT gateway offering to simplify IoT further. Azure IoT Edge is a capability spanning cloud and IoT edge devices that makes it easy to securely distribute cloud intelligence locally. It is cross-platform, running on both Windows and Linux, and on devices even smaller than a Raspberry Pi, with as little as 128 MB of memory.
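As a rough illustration of the pattern such edge runtimes enable (filtering telemetry locally and forwarding only what matters to the cloud), here is a minimal sketch. It deliberately uses no real Azure SDK calls; the function names and the 35.0 threshold are hypothetical.

```python
def edge_filter_module(readings, threshold, forward_to_cloud):
    """Drop routine readings at the edge; forward only anomalies upstream.

    This mimics the edge-module idea: logic that once lived in the cloud
    now runs next to the sensor, cutting bandwidth and latency.
    """
    forwarded = 0
    for value in readings:
        if value > threshold:
            forward_to_cloud(value)
            forwarded += 1
    return forwarded


sent = []
count = edge_filter_module([20.1, 20.3, 41.7, 20.2], threshold=35.0,
                           forward_to_cloud=sent.append)
print(count, sent)  # → 1 [41.7]
```

Of the four readings only the anomalous one crosses the wire, which is exactly the cost-saving argument for edge intelligence made above.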

New Breed of Semiconductor Solutions for IoT Edge Devices

For Edge/Fog – Nvidia and Intel

Nvidia is aiming to put laptop gamers on par with their desktop counterparts, and by releasing the GTX 10 Series for notebooks, the company more than hits the mark. The recent rollout of Nvidia's new Pascal-based 10 Series mobile Graphics Processing Units (GPUs) will give laptop users about the same level of gaming eye candy as desktop owners. Thanks to the Pascal architecture, mobile systems equipped with Nvidia's 1060, 1070, and 1080 GPUs show a consistent performance boost over the previous 9 Series chips. In fact, the company touts that the GTX 10 Series is miles ahead of any existing mobile GPU.

Let's start by checking out the specs. The cream of the crop, the GTX 1080 mobile, comes carrying 8 GB of GDDR5X RAM and no less than 2,560 CUDA cores, and it runs at a clock speed of 1,733 MHz. The GTX 1070 mobile has the same amount of RAM, sports 2,048 cores, and runs at 1,645 MHz. Last, but not least, the GeForce GTX 1060 mobile carries 6 GB of GDDR5 RAM and 1,280 CUDA cores, and runs at 1,670 MHz. Those who are not familiar with the technical side of things should know that these figures stand pretty close to the equivalent desktop GPUs released by Nvidia earlier this year.

When comparing laptop and desktop performance, the GPU manufacturer managed to deliver about the same level of performance regardless of platform. Desktops will have a leg up because of better cooling capabilities and power efficiency, but the difference should stay at about 10 percent. To put that in perspective, the pre-Pascal GPU generation delivers performance about 76 percent lower than the new GTX 10 Series.

Intel’s dedicated Movidius Myriad X Vision Processing Unit (VPU) will decrease operational costs and increase efficiency when handling AI workloads. Intel says that its latest VPU is the first industrial system-on-chip (SoC) that utilizes a dedicated neural compute engine for hardware acceleration, efficiently distributing AI workloads across various types of hardware. The company says that the chip is designed specifically to run deep neural network tasks at high speed and low cost, enabling developers to build the "next generation" of deep neural network applications on Windows clients. Microsoft boasted about the AI capabilities in Windows 10 at its Developer Day conference. Machine Learning (ML) was also a focus of the event, with the Redmond giant announcing that with the next feature update to Windows 10, developers will be able to use pre-trained ML models in their Windows apps.
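To make the idea of shipping a pre-trained model to a client concrete, here is a toy sketch of on-device inference: the entire "model" is a set of weights produced elsewhere, and the client only runs the forward pass. The weights, labels, and input below are made up for illustration; a real Windows app would load an actual trained model through a runtime such as Windows ML or ONNX Runtime.

```python
import math

# Hypothetical 2x2 dense layer "trained" offline and shipped to the device.
WEIGHTS = [[0.8, -0.4], [-0.3, 0.9]]
BIASES = [0.1, -0.1]
LABELS = ["normal", "anomalous"]


def predict(features):
    """Single dense layer followed by softmax; the whole model runs locally."""
    logits = [
        sum(w * x for w, x in zip(row, features)) + b
        for row, b in zip(WEIGHTS, BIASES)
    ]
    exps = [math.exp(z) for z in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return LABELS[probs.index(max(probs))], probs


label, probs = predict([1.0, 0.2])
print(label)  # → normal
```

The point of the sketch is the division of labor: training (the expensive part) happens once in the cloud, while inference (the cheap part) runs on every client, which is what dedicated accelerators like the Myriad X speed up.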

For Device – Renesas and STMicroelectronics

Renesas’ "e-AI" technology implements artificial intelligence in embedded devices. As a first step of the e-AI solution, the company introduced a new set of tools that are plug-in compatible with its open-source, Eclipse-based integrated development environment "e2 studio" and let developers implement the results of deep learning in endpoint embedded devices.

STMicroelectronics’ development tool "STM32CubeMX.AI" executes AI inference on its Arm Cortex-M based "STM32" microcontrollers. There is a general perception that AI processing requires power-hungry ICs such as GPUs, but ST demonstrated that AI processing can run even on low-power MCUs. "STM32CubeMX.AI" takes as input a trained CNN (Convolutional Neural Network) developed with well-known AI tools and libraries such as Lasagne, Keras, Caffe, and ConvNetJS. The user first sends the trained CNN to an ST server running "STM32CubeMX.AI", specifying the target microcontroller part number, the relative priority of performance versus power consumption, the IDE to be used, and so on. Software on the server automatically generates binary code optimized to the user's specification and returns it; the user then loads it into the specified IDE and installs it on the target microcontroller together with the rest of the application code.
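A big part of what such MCU deployment tools do is shrink a trained network so it fits in flash and runs on integer hardware. Here is a minimal sketch of symmetric int8 weight quantization, one typical size/speed trade-off; it is illustrative only, not ST's actual algorithm.

```python
def quantize_int8(weights):
    """Map float weights to int8 plus a single scale factor.

    Each weight then occupies 1 byte instead of 4 (float32), a 4x
    reduction in flash footprint, at the cost of some precision.
    """
    scale = max(abs(w) for w in weights) / 127.0
    quantized = [round(w / scale) for w in weights]
    return quantized, scale


def dequantize(quantized, scale):
    """Recover approximate float weights at inference time."""
    return [v * scale for v in quantized]


weights = [0.52, -1.27, 0.003, 0.9]
q, scale = quantize_int8(weights)
print(q)  # → [52, -127, 0, 90]
```

Round-tripping through `dequantize` shows the precision loss is bounded by half a quantization step, which is usually acceptable for CNN inference on an MCU.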

For example, in one demonstration, sound picked up by a general-purpose MEMS microphone was used to estimate the environment in which the microphone was placed: running on the low-power "STM32L4" MCU, based on the Arm Cortex-M4 CPU core, the system could distinguish indoors, outdoors, and inside a car from the background sound. In another demonstration, an AI function judged user actions such as "running", "walking", and "standing" from smartwatch data such as accelerometer readings, and ST showed the whole process of implementing such AI on an STM32F7 using the Arm Cortex-M7.
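The accelerometer demo can be caricatured in a few lines: classify activity from the variance of the acceleration magnitude over a window of samples. The thresholds below are invented for illustration; a real system would learn its decision boundaries from labeled sensor data rather than hand-pick them.

```python
import math


def classify_activity(samples):
    """Classify activity from a window of (x, y, z) accelerometer readings, in g.

    Standing produces a nearly constant magnitude (~1 g of gravity),
    walking a moderate wobble, running large swings. Hypothetical
    variance thresholds separate the three.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < 0.01:
        return "standing"
    if var < 0.5:
        return "walking"
    return "running"


# A perfectly still sensor reads only gravity on one axis.
print(classify_activity([(0.0, 0.0, 1.0)] * 10))  # → standing
```

A model small enough to phrase this way (a feature plus a couple of thresholds, or a tiny decision tree) is exactly the kind of workload that fits comfortably on a Cortex-M class MCU.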

Applications of AI in Healthcare

AI technology is finding many ways to contribute to the healthcare industry. A few of the applications that have gained prominence are listed below:

  • Virtual Doctor - Patients who feel a little unwell or think they need medical advice will dial into a telehealth service and talk to a nurse. Data on their condition and symptoms may be uploaded in real time from a smartphone or smart sensors, and an artificially intelligent system will suggest next steps to the nurse on the line. Smartphones will also be used to regularly send pictures or videos, which a computer will read before recommending how to proceed.
  • Uniform Access - Patients who feel sufficiently unwell will not go to a hospital urgent care department and instead will mostly go to a conveniently located small clinic, probably in a local mall or chain pharmacy. There the patient will be seen by a nurse practitioner who will be able to take into account a patient’s entire medical history by pulling up a universally accessible, privacy protected, electronic health record, or EHR.
  • Virtual Care - Patients with chronic conditions will be cared for at home by visiting nurses and doctors (matched by smart platforms and “Uber”-type technology) who can then call in as frequently as necessary either in person or via telehealth means. People who are not ambulatory at all will also be able to be watched over by AI robots that also provide some basic care in situ.
  • Smart Equipment - Where a hospital is still needed, for say major surgery, these will be making extensive use of technology, much of which will be available in every patient room (or portably deliverable to it) like a mini ICU. AI will feature significantly in these rooms and will be blended with human resources.
  • Interactive Patient Education - In-Patients (in hospitals, surgery centers, clinics, skilled nursing centers, hospices etc.) will have multiple screens around them which can deliver tailored education by AI means and be responsive to patient requests for feedback (by just using their voice as a command).
  • Dynamic Resource Planning - Human medical staffing ratios will be adjusted constantly according to the individual patient’s need as determined by AI risk-monitoring and treatment algorithms and by adjusting according to a continually updated electronic health record.
  • Single EHR - Most orders and notes from doctors will be entered into the EHR through natural language voice recognition software. Each patient will control his or her own EHR, a digital compendium of clinician-generated notes and data with patient-generated information and preferences (all of which will be simply analyzed, charted and displayed as a patient wishes).
  • Smart Alerts - Patient alerts will be calibrated to clearly distinguish life-threatening issues and problems from minor conditions or ignorable symptoms.
  • Assisted Decision Making - Doctors' efforts will be greatly assisted, especially when engaged in differential diagnosis, evidence-based treatment, and precision-medicine practice, by cognitive computing systems like IBM's Watson.
  • Pattern based Decision - Artificial intelligence applied to cloud-based “Big Data” will assist clinicians by comparing and contrasting individual patient’s characteristics with other patients in the database with similar conditions in order to find the best possible diagnoses and solutions.
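The pattern-based approach in the last bullet can be sketched as a nearest-neighbor search over patient records. The records, features, and outcomes below are entirely hypothetical; a real system would normalize features, use far richer data, and weigh clinical relevance, not raw Euclidean distance.

```python
import math

# Hypothetical patient records: (age, BMI, systolic BP) plus an outcome note.
PATIENT_DB = [
    ((54, 31.0, 150), "responded to treatment A"),
    ((62, 29.5, 160), "responded to treatment B"),
    ((45, 24.0, 120), "no intervention needed"),
]


def most_similar_patients(new_patient, k=2):
    """Rank database patients by distance to the new patient's characteristics.

    This is the core of 'compare and contrast with similar patients':
    the clinician sees the outcomes of the k closest historical cases.
    """
    return sorted(PATIENT_DB, key=lambda rec: math.dist(rec[0], new_patient))[:k]


for features, outcome in most_similar_patients((55, 30.0, 152)):
    print(outcome)
```

Even this toy version shows the value proposition: rather than reasoning from population averages, the clinician starts from the historical cases most like the patient in front of them.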

Conclusion

In the coming years the impact of AI on healthcare will become measurable and its end applications very clear. The ecosystem needed to enable new AI-based applications will evolve and become more accessible to developers creating new applications. I think the future of AI in healthcare is bright, and it will make a great impact on the health of our society.
