#16: First-hand experience makes you want better healthcare for all

Afternoon all (or hello, whenever the algorithm puts this in front of you!).

Apologies for the gap last weekend: our family had some more first-hand experience of hospital care, probably the tenth time this year, but this time it was an emergency.

But for the first time, we witnessed some joined-up data helping to progress a case that had been moving slowly and then worryingly peaked out of the blue. A paramedic gathered readings from a small portable device and read them out to a hospital clinician, who called up the most recent routine diagnostics on file and said "bring them in ASAP".

Now, as much as we talk about the digital side of tech, sometimes you need humans to glue it together. Maybe one day the portable machine will join the dots itself and suggest a trip to hospital, mixing the diagnostics with triage rules and availability (let's be honest, care is clearly rationed unless we have limitless money), leaving first responders to deal with the immediate problem and transport. But for once we were quite glad machines weren't in charge.

Onwards.


We'll start with an early "word from our sponsors" this week, as I wanted it at the top of the newsletter. It's not really an advert this time either; it's something more important and independent.

Gary Crawford and Kevin Stewart of Waracle, on stage at Tech Circus's "Product Design Week". Thanks to Simon Hull for the photo.

This week, Gary Crawford and Kevin Stewart from Waracle illuminated a packed-out Tech Circus audience at London Product Week when they presented their in-demand talk, “Designing for run-time intelligence: When software understands the user”.

Before the event, CInO Gary said:

For decades, we’ve crafted digital products that have transformed industries and delighted users. Yet despite their sophistication, these software-enabled creations have lacked true intelligence. They’re just a coded version of the rules and constraints we predefine at design time.
With AI, however, we can design and build experiences that behave more like a human—experiences that problem-solve, learn, adapt and interact with users in ways that suggest empathy and emotional intelligence.

The other half of the dynamic duo, Kev, added:

To build such experiences, we must change our mindset, practices, and processes. We must rethink conventional wisdom around designing and building products and adapt them for run-time digital intelligence.
We think this is why there’s such rapid growth and contribution around the Intelligent Experiences Manifesto. AI is a paradigm shift, and people need guidance. We hope to see more people getting involved and helping define how we build valuable and ethical intelligent experiences.

To dive in further, please have a look at the manifesto; the lads would love to hear your thoughts on it. Comment below or send them a DM!


Passing the time sitting in hospital last week, I was listening to Scott Galloway's podcast noodling on Amazon's advances in health. His view was that where, in the past, "rich families in the Hamptons" would pay $100,000 a year to access the best possible clinicians in a co-operative of sorts, tech and smart business might now offer something similar (if not quite as grand) to the rest of us.

The announcement was a couple of weeks ago now, but Amazon has built upon last year's acquisition of One Medical to offer a sub-$10 monthly subscription that effectively gives you a high-quality virtual GP service. One of Amazon's online testimonials captures why the service would ring a bell with most of us:

I love One Medical. Blew my mind that I spent my entire appointment with my doctor, vs. sitting alone in a cold room for 20 min after a nurse takes vitals, only to see my doc for 1-2 min. Truly a revolutionary experience.

Having used a workplace medical benefit to see a virtual GP for my son this past year, and had a reassuring half-hour conversation as opposed to the usual rushed, triaged one with his NHS surgery, I can see why parents in particular might pay for this - particularly at a knockdown, Amazon-subsidised price. Amazon then announced further partnerships with hospital organisations in the US to expand coverage and embed the One Medical service further.

And it makes you wonder, given Amazon is clearly (from an outsider's point of view) catching up in the AI arms race, how it might leverage its commercial and customer advantage to level up in specific domains where it can move more quickly.

For example, it would come as no surprise to see Amazon taking a more advanced, rapid route in healthcare, given AWS is gaining traction in the space too, turning commodity cloud services to particular benefit.

Even though Amazon is a pretty distributed company, with each division owning its own distinct goals, it would be a fallacy to believe people don't talk to each other where mutual benefit exists. Amazon has been slowly positioning itself in health for years, and it seems more likely to gain traction than Google, for example, which offers many research stories in healthcare but far fewer press releases on the commercial side. Google suffered the same in travel 13 years ago and dropped the thick end of a billion quid to buy a direct connection to the trade. Over time it started to eat various lunches, and you can see Amazon doing the same in health.

In a week when the Alexa team was once again scaled back (with great regret at seeing good ex-colleagues and acquaintances suffer from bad decisions), you'd think wiring up those hundreds of millions of communication devices to that medical care might be a good idea, and something those staff could progress. They do it in the closed environment of care homes already; why not expand out to a subscribed Prime healthcare base?

And, given this is normal in the US, would people in Europe pay for enhanced care? At £10 per month I bet plenty would.


With that out of the road, there were a few things that caught the eye this past fortnight.

Firstly, having worked on a device that tries to detect a diseased foot by listening to audio from it, we were fascinated by another platform that looks into your eye to predict heart conditions.

Toku’s starting point is that there is a strong link between glaucoma and heart-related conditions, so examining a patient’s eye can give a clinician an idea of how that patient’s cardiovascular system is working. Its main product is a non-invasive, AI-powered retina scan and technology platform it calls CLAiR, which can detect cardiovascular risks and related diseases such as stroke and type 2 diabetes.
The platform is groundbreaking in its approach: CLAiR uses AI to “read” tiny signals from the blood vessels captured in the retinal images, and Toku claims that it can calculate heart disease risk, hypertension or high cholesterol in 20 seconds. And because the platform integrates with existing retinal imaging cameras, the diagnostics it measures can feasibly become a part of any routine eye exam.

As someone who has a blood test every three months to track cholesterol, I'd much prefer to point my phone at my eye and get a reading from that. This platform is quite a distance from phone use, but everything starts with baby steps...
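
For the technically curious, the general shape of such a pipeline - image in, a handful of learned features, risk score out - can be sketched in a few lines of Python. To be clear, everything below (the features, weights and numbers) is my own invention for illustration; it is emphatically not how Toku's CLAiR works.

```python
import numpy as np

# Hypothetical retina-to-risk pipeline: extract simple "vessel" features
# from a scan, then feed them to a (here faked) trained risk model.

def extract_vessel_features(retina: np.ndarray) -> np.ndarray:
    """Stand-in features: brightness, contrast and edge density."""
    gradients = np.abs(np.diff(retina.astype(float), axis=0))
    return np.array([retina.mean(), retina.std(), gradients.mean()])

def cardiovascular_risk(features: np.ndarray) -> float:
    """A trained model would live here; we fake it with fixed weights."""
    weights, bias = np.array([0.01, -0.02, 0.03]), -3.0
    return float(1 / (1 + np.exp(-(features @ weights + bias))))  # 0..1

scan = np.random.default_rng(1).integers(0, 256, size=(256, 256))  # fake scan
print(f"risk score: {cardiovascular_risk(extract_vessel_features(scan)):.2f}")
```

The hard part, of course, is the training data and the model rather than the plumbing - which is exactly why integrating with existing retinal imaging cameras is such a smart move.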


Along a similar theme (and you'll notice a few of these stories are close to home this week), Cytovale gained a big investment to advance IntelliSep, its diagnostic tool for sepsis.

Earlier this year we had some experience of a slow sepsis diagnosis and a hospital trip, so we'll keep an eye on this story too:

"Sepsis is a dangerous, fast-moving condition that can result in death if not identified and treated quickly," said Cytovale CEO Ajay Shah. "Our flagship diagnostic tool, IntelliSep, with a blood-to-answer time frame of under 10 minutes, helps healthcare providers recognize sepsis early and make critical, time-sensitive clinical decisions. With the support of our investors, we are now able to expand efforts to get our tool in the hands of more providers so they can address the potential deadly outcomes patients currently face."

Another thing we're hoping to work on in the day job is a finger-prick test that can give instant results for various conditions, using a small test device similar to the ones we all experienced during Covid. If that could be trained to pick up things like sepsis much earlier than a lab test (if you're lucky enough to get one) might do, some of those 330 deaths per 100,000 people might just be avoided.


Elsewhere this week, Answer Digital was in the news with a fascinating project promoting faster diagnoses of conditions within the NHS. The article explains all:

It receives a live stream of medical imaging data which allows clinicians to access near real-time AI analysis in seconds. Following the analysis of the data by the AI technology, results are sent directly to the EPR [Electronic Patient Record] to support clinical decision-making. This is helping to both speed up and improve diagnosis and care across patient pathways including stroke, dementia, heart failure and cancer.
Federated Data platform FLIP allows data from multiple NHS trusts to be used to train new AI models for future clinical use. The patient data is never pooled or shared outside of the originating NHS trust thanks to privacy-preserving Federated Learning technology.
So successful has AIDE’s use been at Kings College Hospital, that it has now been rolled out to a further six NHS trusts, supporting its role as an effective channel for widespread AI deployment in the NHS.

I first came across federated learning in the early chatbot days, when smart speakers were presumed to be "always listening" and the big tech companies had to find ways to collect useful data without collecting it all the time. Gathering data for specific tasks on the devices and training separate contextual models (for example, the ways people ask to control lighting) eventually meant those things didn't need to send audio to the cloud at all; the device can do it all itself.

It seems this technology has finally found a bigger purpose: starting from a baseline model with patient data removed, each site can improve it locally, allowing clinicians and scientists to come up with bigger and better, or more specific, applications.
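
For the curious, the core trick - federated averaging - fits in a few lines. Below is a toy Python sketch with made-up data and my own naming (three stand-in "trusts"); it shows the general technique, not FLIP's actual implementation.

```python
import numpy as np

# Toy federated averaging: each trust trains on its own private data,
# and only the model weights - never the records - are shared.

def train_locally(weights, data, labels, lr=0.1, epochs=5):
    """One trust's local update: plain logistic-regression gradient steps."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-(data @ w)))            # sigmoid
        w -= lr * data.T @ (preds - labels) / len(labels)
    return w

def federated_round(global_weights, trusts):
    """Average the locally trained weights, weighted by dataset size."""
    updates = [train_locally(global_weights, d, y) for d, y in trusts]
    sizes = np.array([len(y) for _, y in trusts], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

rng = np.random.default_rng(0)
# Three hypothetical trusts, each holding its own (here random) data.
trusts = [(rng.normal(size=(100, 4)), rng.integers(0, 2, size=100))
          for _ in range(3)]

weights = np.zeros(4)
for _ in range(10):
    weights = federated_round(weights, trusts)
print(weights)  # a shared model, trained without pooling any records
```

The point is in what never happens: no line moves a patient record between sites; only the weights travel.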


Elsewhere, a paper including input from Debbie Wake explored the possibilities of using AI and sensor data to monitor and treat diabetes. Debbie's venture, MyWay Digital Health Ltd, is already advancing in this area, but the paper takes a deep dive into the possibilities and pitfalls of using AI to go a lot further.

Many companies are starting to capitalise on commoditised glucose monitors, with some (such as Dexcom UK) specialising directly in diabetes monitoring and others using the devices for general health advice and models built around the glucose data - but there is clearly so much more to do:

The potential to integrate lifestyle data, such as physical activity and nutritional intake, enables more sophisticated automated nudge solutions to encourage activity and behaviour change and provide feedback to the individual [16,17,18,19,20]. Further, this data can enhance glucose and insulin dose calculations. The advent of smartphone applications that can predict nutritional macronutrients from photographs [18], phone accelerometers and wearables to collect activity information, and continuous and flash glucose-monitoring systems for high-frequency automatic glucose tracking is improving the ease and volume of home-recorded real-time data delivery to drive AI advice tools for both patients and clinicians.
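
As a flavour of how simple the first rung of that "nudge" ladder could be, here's a toy rule-based sketch over continuous glucose readings. The thresholds, field names and wording are entirely my own and purely illustrative - not clinical guidance, and not from the paper, which is precisely about going far beyond rules like these.

```python
from dataclasses import dataclass
from typing import Optional

# Toy rule-based "nudges" over continuous glucose monitor readings.
# All thresholds are invented for illustration, not clinical guidance.

@dataclass
class Reading:
    mmol_per_l: float        # current glucose level
    minutes_since_meal: int  # time since last logged meal
    steps_last_hour: int     # activity from a phone or wearable

def nudge(r: Reading) -> Optional[str]:
    """Return a gentle suggestion, or nothing - silence is a feature."""
    if r.mmol_per_l < 4.0:
        return "Glucose trending low - consider a fast-acting carb."
    if r.mmol_per_l > 9.0 and r.minutes_since_meal > 120:
        return "Glucose still high well after eating - worth logging the meal."
    if r.mmol_per_l > 7.8 and r.steps_last_hour < 500:
        return "A short walk can help bring post-meal glucose down."
    return None

print(nudge(Reading(mmol_per_l=8.4, minutes_since_meal=45, steps_last_hour=120)))
```

The AI tools the paper describes effectively replace those hand-written thresholds with models learned from the lifestyle and photo data quoted above.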

From recent work we've been doing at Waracle, this type of interaction certainly rings true - and one of the illustrations from the paper gives some food for thought on how far we can still go (I hope they don't mind me sharing this, but it is a public paper):

The possibilities for diabetes care and AI that are still very much to be discovered. Image from the paper noted earlier in this article.

Please do read the paper because it's genuinely fascinating, but here's an extract from the conclusion to give you a flavour:

While AI-based tools alone will be no panacea, their benefits must not be ignored. Such tools can be delivered at low cost and scaled throughout a population or clinical workforce to deliver significant benefit.

Lastly, something has crossed the mind whilst sitting in hospital waiting rooms and car parks quite a bit lately: how poor the connectivity is. Whilst it's a luxury to expect robust free wifi in a hospital, and the cellular signal isn't really the NHS's business or top priority, it makes you wonder how some of the advances talked about in these newsletters will reach people who can't afford a good connection at home or on the move.

As mentioned at the start of this week's scribble, it took humans talking on an old-fashioned phone to sort out an emergency case last weekend. On a previous visit, a prescription couldn't be issued because the ward's computer couldn't get connected, and then it needed replaced entirely.

And it seems like a problem at both ends of the spectrum. If the likes of OpenAI and its customers dominate the rush for silicon to power bigger and better models, how do we ensure social benefit comes first? What's to stop the NHS being priced out of the market? How do we stop advancements being centralised around the all-powerful cloud providers who can wear the cost?

Doug Bierbower posted an interesting piece going into exactly these problems and their effect on healthcare. It's not quite as broad as my moan above, but gets into some of the problems hospitals will face in simply being as connected as patients expect.

Now, consider cutting-edge technology use cases that are still on the horizon. Future healthcare technology might include remote surgery enabled by augmented reality (AR) and virtual reality (VR) or the use of complex artificial intelligence (AI) and machine learning (ML) algorithms to conduct patient diagnoses. Tomorrow’s healthcare providers will use the Internet of Things (IoT) to monitor and maintain diagnostic equipment and other medical devices. These applications require mission-critical, secure, reliable, and ubiquitous connectivity to ensure optimum patient outcomes.

And remember to read the Digital Health Roundup:


A(nother) word from our sponsor, i.e. the company that pays me through the week so I can waste Sundays on this.

Gary and Kev have also co-authored a piece for Waracle's web site on the future of AI and how it redefines the notion of interaction. The premise is going from "clicks" to "connections"; for the rest, you can read it yourself below...


Small print: This newsletter goes out to subscribers and across LinkedIn most Sunday nights around 7:30 pm. Feel free to contact me if you've seen or are creating something interesting in digital health. I work for Waracle, but all opinions and content selections are my own. Anything in which I have a work or personal interest will be declared.

Cover image was generated by Playground AI using the simple prompt "OpenAI's new robot CEO".

Jings, 2,500 words this week - if you got this far then well done. Make yourself a pie:

