Midas touch – the technologies driving the haptics revolution
This is an extract from an article in the publication The Engineer, featuring comments from Fundamental VR's Chris Scattergood, who recently presented the FeelRealVR surgical simulator at the Royal Academy of Engineering's seminar on haptics.
By Andrew Wade 18th September 2017 8:00 am
Haptic technologies are advancing rapidly, in tandem with virtual reality and as standalone devices. Andrew Wade reports.
Our sense of touch is so integral to our existence that it’s difficult to imagine a world without it. Unlike vision and hearing, we can’t easily mask it with blindfolds or earplugs, or dull it as we can our sense of smell by holding our nose. Touch is an omnipresent function that completely envelops us via our skin, providing a layer of protection from our surroundings while at the same time enabling us to interact with them.
For centuries, sensory technology has focused largely on sight and hearing, the twin pillars that form the basis of communication. But the science of haptics, or kinaesthetic communication, is undergoing something of a revolution. The emergence of smartphones and proliferation of touchscreens have brought the technology into the mainstream. Now, the rising popularity of virtual reality (VR) and augmented reality (AR) is fuelling rapid advances, with sectors including healthcare and robotics discovering its potential across a range of applications.
“Our sense of touch is absolutely fundamental,” Dr Alastair Barrow, director of Generic Robotics, told the audience at a recent RAEng haptics seminar. “It’s the first sense to develop in the womb and it’s an ever-present always-on protector.”
Barrow, who has a PhD in cybernetics, co-founded Generic in 2013. He has more than a decade of experience in VR, haptics and robotics, and has collaborated on a number of medical training simulators for different branches of surgery, as well as procedures such as catheterisation and hernia repair.
“It’s no surprise that haptics in healthcare is a huge topic,” he said. “However, the number of applications where haptics is being beneficially used in healthcare right now is very, very small.”
As Barrow’s body of work suggests, surgery simulation is one of those areas. Currently, surgeons learn predominantly through theory, observation, cadaver practice and closely monitored patient trials, in which senior colleagues oversee their work. A refined sense of touch and hand-eye coordination is obviously vital.
“When you’re practising to be a surgeon you need to develop an incredible array of abilities,” said Barrow. “You need to be able to have great academic knowledge, decision making, you need to look good in blue! One really important aspect is this close association of feedback between the sense of touch and dexterous motion…so we can use haptic devices to simulate doing procedures.”
“We’re starting to look at using actual scans of real patients in simulation, so that a surgeon can practise doing a real procedure in simulation before they do it on a real person.”
There’s a pretty clear incentive for innovation here; the more accurate that simulation can become, the more likely it is we wake up from real-life surgery. And while major strides are being made, haptic surgical simulation is not yet commonplace. However, according to Chris Scattergood, co-founder of Fundamental VR, many surgeons are improving their skills using more orthodox technology.
“Surprisingly, a lot of surgeons will actually refine their skills using YouTube,” he said. “If you go to YouTube there’s about 170,000 [surgery] videos on there, and we’ve met senior consultants who have learned an entire procedure by watching YouTube. Once they are then confident that they can perform it, they say they’re confident, and they go and do it.”
Like Barrow, Scattergood is operating at the intersection of VR and haptics, with a particular focus on healthcare. Fundamental VR works with medical device manufacturers, pharmaceutical companies and hospitals in the UK and the US, and is an official development partner for Microsoft’s HoloLens AR device.
The company’s FeelRealVR platform uses headsets such as the HTC Vive and Oculus Rift to simulate surgical environments. Once the user is immersed, haptic feedback mimics incisions, injections and other procedures, while proprietary software maps and calibrates more than 20 different tissue types, such as tight and loose skin, subcutaneous fat, cartilage and bone.
“For each one of those there are different values,” Scattergood explained. “Whether that’s initial resistance, the feel across the top of it, the pop – the amount of pressure you need to put through. We’ve mapped all of those into a system.”
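The idea Scattergood describes can be sketched in code: each tissue type carries its own calibrated parameters, and the haptic device renders a resistive force from them until the "pop" threshold is exceeded. This is a minimal illustrative model only; the class, parameter names and values below are assumptions, not Fundamental VR's actual calibration data or API.

```python
from dataclasses import dataclass

@dataclass
class TissueProfile:
    """Hypothetical per-tissue haptic parameters (names are illustrative)."""
    name: str
    initial_resistance: float  # force ramp (N per mm of stylus displacement)
    surface_friction: float    # the "feel across the top" of the tissue
    pop_threshold: float       # force (N) at which the tissue gives way

# Illustrative values only -- a real simulator calibrates these per tissue.
PROFILES = {
    "loose_skin": TissueProfile("loose_skin", 0.4, 0.2, 1.5),
    "tight_skin": TissueProfile("tight_skin", 0.9, 0.3, 2.5),
    "cartilage":  TissueProfile("cartilage", 2.0, 0.6, 6.0),
}

def feedback_force(tissue: TissueProfile, depth_mm: float, punctured: bool) -> float:
    """Resistive force for a stylus pressed depth_mm into the tissue.

    Before puncture, force ramps with depth up to the pop threshold; once
    the tissue has "popped", only a small residual drag remains.
    """
    if punctured:
        return 0.1 * tissue.surface_friction
    return min(tissue.initial_resistance * depth_mm, tissue.pop_threshold)
```

In a real system this force would be recomputed at haptic rates (typically around 1 kHz) and sent to the device's motors; the sketch only shows how distinct per-tissue values can drive distinct sensations.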
Tracking the incisions
A surgeon overseeing multiple juniors practising on cadavers needs to physically monitor the procedures carried out. But software can not only simulate the feel of various tissues, it can also track exactly what the scalpels and syringes are doing. In combination with VR and haptics, the technology can help sort the Christiaan Barnards from the Dr Nicks.
“For the first time we’ve got measurable feedback that allows us to see how well somebody’s doing and how fast they’re learning,” said Scattergood.
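One common way such measurable feedback is derived is from the tracked position of the instrument tip over time, for example an "economy of motion" score comparing the path actually travelled against the shortest necessary one. The functions below are a hedged sketch of that idea, not Fundamental VR's metric.

```python
import math

def path_length(samples):
    """Total distance travelled by the instrument tip over (x, y, z) samples."""
    return sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))

def economy_of_motion(samples, ideal_length):
    """Crude skill metric: ratio of the ideal path length to the actual one.

    1.0 means the trainee moved no more than necessary; values near 0 mean
    wasted motion. Real simulators combine many such metrics (time taken,
    forces applied, errors made) into an overall assessment.
    """
    actual = path_length(samples)
    return ideal_length / actual if actual > 0 else 0.0
```

Because every movement is logged, the same data can be replayed later, which is what enables the objective, repeatable assessment Scattergood describes.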
That almost primal, instinctive connection to our tactile senses is one of the things that makes haptics such an exciting area of engineering development. It’s a relatively nascent technology, but one that has the potential to resonate with us on a deep level. Whether in combination with VR or in standalone devices, haptics is opening up a world of new sensory possibilities. And we’ve only just begun to scratch the surface.