AI Biofeedback Interactive Movie: discover how viewer emotions guide the narrative direction of a story

This research project is a game changer in cinematic interactive storytelling! Imagine your cognitions and emotions driving the narrative direction of a movie in real time. Advances in artificial intelligence (AI), machine learning, biosensor technology and narrative databases make this possible today!

In 2010 I created E-movie, a biofeedback interactive movie concept which was selected for the final of the global Cisco I-Prize Innovation Competition (2010) from 800 submissions across 147 countries. E-movie restructures the story to match the viewer's cognitive-emotional responses in real time while they watch the movie. The viewer's physiological responses to story elements control the narrative direction, enabling the viewer to act as storyteller, director and editor, and generating a unique personalised story experience every time the viewer watches the same movie. This is because previous viewing experience modifies the viewer's cognitive and emotional states on repeat viewings, each time changing the narrative trajectory to optimise engagement.
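To make the mechanism concrete, here is a minimal, hypothetical sketch of the kind of control loop such a system implies. This is not the E-movie implementation; the branch names, arousal values and selection rule are all invented for illustration, and in a full system the arousal value would come from live biosensor readings rather than a constant.

```python
# Hypothetical sketch of a biofeedback branching loop (not the E-movie code).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Branch:
    clip_id: str
    target_arousal: float               # engagement level the branch is written for
    next_branches: List["Branch"] = field(default_factory=list)

def select_next_branch(current: Branch, viewer_arousal: float) -> Branch:
    """Pick the upcoming branch whose intended arousal best fits the viewer's state."""
    return min(current.next_branches,
               key=lambda b: abs(b.target_arousal - viewer_arousal))

# Illustrative story graph: a calm viewer is routed toward the slower reveal.
calm = Branch("slow_reveal", target_arousal=0.3)
tense = Branch("jump_scare", target_arousal=0.8)
opening = Branch("opening", target_arousal=0.5, next_branches=[calm, tense])

print(select_next_branch(opening, viewer_arousal=0.35).clip_id)  # -> slow_reveal
```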

Although AI/machine learning techniques and biosensor technology were not sufficiently developed in 2010 for E-movie to become a reality, today's AI machine learning capabilities and biosensor innovations make biofeedback interactive storytelling for movies and media possible.

Audience Experience: AI and Biosensor Technology Innovation

In March 2014 Technicolor announced an intelligent platform called 'Empath Analytics', which combines biometrics with AI machine learning to measure and analyse audience responses automatically, with AI machine learning driving the measurement and analysis. Technicolor claimed that 'Empath Analytics' can show filmmakers how viewers respond to specific story elements such as plot points and character development, providing a deep understanding of the audience experience.

According to Lauren Goode at The Verge (19 March 2017), Dolby Laboratories are investigating audience engagement by measuring and analysing different psychophysiological recording techniques such as eye tracking, skin temperature, electrical changes in the skin, and cardiovascular and neurological measurements. In the article, Chief Scientist Poppy Crum said that Dolby are still in the process of understanding viewer engagement using psychophysiological recording techniques and are not at present collaborating with content providers to improve viewer engagement.

More recently though, Shelby Rogers at www.interestingengineering.com reported on 18 April 2018 that Dolby Laboratories have developed sensors combined with artificial intelligence that can read the body's micro-expression responses to our surroundings. In the article Poppy Crum says, 'We broadcast our emotions. We will know more about each other than we ever have.' The sensors track not only micro-facial expressions and pupil dilation but also increases in skin temperature, which often occur when we feel anxious. Dolby's intelligent system recognises cues to hidden feelings and analyses how people are feeling.

On 27 October 2017, Michelle Zamora, Head of Marketing at IBM Research, presented at #digitalks in Sydney, Australia, introducing 'Watson', IBM's artificial intelligence system which generates personalised digital experiences. For example, Zamora showed how AI was used to create a personalised movie trailer for the horror film Morgan (Luke Scott, 2016), itself a story about artificial intelligence.

20th Century Fox approached IBM to see whether AI could analyse the movie and create a trailer automatically. In an interview John Smith, Head of AI Tech for IBM Research, said 'Watson was able to model the scene visually to determine, was the scene scary? Was it tender moments, was there sadness or happiness?' Filmmaker Zef Cota said 'Watson is the tool that's helping arrange the visuals, but it still needs the human element, so I can come in and just supervise the creative aspect', which indicates that 'Watson' could not understand the creative elements of story context or how these relate to human emotion. This was echoed by the film's director Luke Scott, who said 'I don't think AI is any value until it does start to understand and calibrate those emotions itself' rather than relying on the filmmaker.

Although these intelligent systems developed by Technicolor, Dolby Laboratories and IBM are very sophisticated, they have not yet been used to create new forms of cinematic storytelling or film production techniques, or to deliver an authentic personalised viewer experience.

However, this challenge can be addressed by applying AI and biosensor technology to story context cause and effect paradigms. Based on my conceptual framework for E-Movie, this could be achieved by measuring and analysing story context cause and effect paradigms in relation to actors' performance (the fictional character) and dialogue, and how congruent their emotions and actions are with other story elements: characters, stage props, production design, sound and lighting. Combining this framework with my five-year scientific research study into the construction of suspense and audience engagement, it is feasible to develop AI machine learning capability that can determine viewers' predictable responses to story elements in a movie and adjust the narrative direction accordingly, as the sketch below illustrates.
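As a purely illustrative example of the congruence idea (the emotion labels, probabilities and cosine scoring are my assumptions here, not the framework itself), one might compare the emotion distribution read from an actor's performance against the distribution carried by the surrounding staging:

```python
# Speculative sketch: score agreement between two emotion distributions.
# Labels and numbers are invented for illustration only.
EMOTIONS = ["fear", "sadness", "joy", "anger"]

def congruence(element_probs, context_probs):
    """Cosine-style agreement between two emotion distributions (0..1)."""
    dot = sum(a * b for a, b in zip(element_probs, context_probs))
    norm = (sum(a * a for a in element_probs) ** 0.5
            * sum(b * b for b in context_probs) ** 0.5)
    return dot / norm if norm else 0.0

# e.g. a performance read as fearful vs. lighting/sound read as joyful
performance = [0.7, 0.2, 0.05, 0.05]
staging = [0.1, 0.1, 0.7, 0.1]
print(round(congruence(performance, staging), 2))  # low score = incongruent cue
```

A low score would flag a scene where, say, the acting and the production design pull the viewer's emotions in different directions.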

Interactive Movies and Personalisation

The aim of developing a biofeedback interactive movie is to deliver a unique personalised movie experience that exceeds consumer expectations. Interestingly, it has been filmmakers, artists and designers, rather than technologists and data scientists, who have pioneered interactive movies. For example, Kinoautomat (Radúz Činčera, 1967) was the first interactive movie, presented in the Czech Pavilion at Expo 67 in Montreal. Here the movie stopped at specific points and the audience was given choices that altered the narrative direction.

However, it was not until the digital age that interactive films became more popular, enabling viewers to make a conscious choice at specific moments during a movie and change the narrative direction with a mouse click. Late Fragment (Daryl Cloran, Anita Doron and Mateo Guez, 2007) let viewers select the narrative direction based on character decisions at different moments during the movie. Although smartphone apps have recently been used in cinema interactive movie experiences, the mouse click is still common today for interactive shorts and feature films.

In 2005 there was an interesting development in interactive storytelling which did not depend on a mouse click to change the narrative direction. Soft Cinema: Navigating the Database (Lev Manovich and Andreas Kratky, 2005) was an experiment in how database narrative could be transformed into interactive movies. The software edits movies in real time, selecting filmic elements from the database according to rules defined by Manovich and Kratky, along the lines of the sketch below.
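A toy sketch of that rule-based approach might look like this. It is illustrative only, not Manovich and Kratky's actual software; the clip metadata and rules are invented.

```python
# Illustrative rule-driven database editing: tagged clips are filtered by a
# rule set and assembled into a sequence at playback time.
import random

clips = [
    {"id": "c1", "location": "city", "mood": "calm",  "duration": 12},
    {"id": "c2", "location": "city", "mood": "tense", "duration": 8},
    {"id": "c3", "location": "sea",  "mood": "calm",  "duration": 20},
]

def build_sequence(db, rules, total_duration):
    """Select clips satisfying every rule until the target duration is filled."""
    pool = [c for c in db if all(rule(c) for rule in rules)]
    random.shuffle(pool)  # the rules constrain, chance decides the rest
    sequence, elapsed = [], 0
    for clip in pool:
        if elapsed + clip["duration"] <= total_duration:
            sequence.append(clip)
            elapsed += clip["duration"]
    return sequence

# Hypothetical rule set: stay in the city, avoid tense material.
rules = [lambda c: c["location"] == "city", lambda c: c["mood"] == "calm"]
print([c["id"] for c in build_sequence(clips, rules, total_duration=30)])
```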

Filmmaker and researcher Pia Tikka went further than Manovich and Kratky with Enactive Cinema (2005), an 'Eisensteinian montage machine' exploring the concept of a biofeedback interactive film: it uses four screens, tracks viewers' unconscious physiological responses and modifies the movie based on these changes. Although Tikka's art installation was a pioneering development in biofeedback interactive film, she did not create or use an empirically based framework that mapped story context or viewer biodata to specific story cause and effect paradigms, such as narrative features and cinematic storytelling techniques (e.g. cinematography, sound, editing). Nevertheless, Tikka's filmmaking and technical approach demonstrated the potential of biofeedback interactive movies as an authentic personalised viewing experience.

Development Opportunity: Narrative Database Biofeedback Interactive Movie

During the last year I have been reviewing E-Movie, an award-winning narrative database biofeedback interactive movie concept, together with my five-year research study on the construction of suspense and audience engagement. The study defined a framework for measuring suspense and audience engagement against story cause and effect paradigms, including: narrative features, micro-narrative structures, cinematography (camera shot, frame, angle, duration, movement, lighting), sound (diegetic and non-diegetic), stage props, editing (e.g. pacing) and acting performance.

A mixed methods approach was taken, triangulating three data sets: story context and the filmmakers' intentions to engage the audience, recordings of viewer anxiety (electrodermal activity, i.e. emotional sweating) and viewer self-reports contextualising their subjective experience. The research uncovered filmmakers' 'narrative blind spots' that reduce story comprehension and audience engagement, and the data also enabled the development of scientific-creative storytelling options that can resolve these blind spots.
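As an illustration of the physiological strand of this triangulation, skin-conductance responses can be located in an EDA trace and time-stamped for comparison with the filmmaker's intended engagement points and viewers' self-reports. The sampling rate, smoothing window and threshold below are assumptions for the sketch, not values from the study.

```python
# Hedged sketch: locate moments of heightened viewer anxiety in an
# electrodermal activity (EDA) trace. All parameters are illustrative.
import numpy as np
from scipy.signal import find_peaks

def arousal_events(eda, fs=4.0, min_rise=0.05):
    """Return timestamps (seconds) of skin-conductance response peaks."""
    smoothed = np.convolve(eda, np.ones(int(fs)) / int(fs), mode="same")  # ~1 s moving average
    phasic = smoothed - np.median(smoothed)             # crude detrend of the tonic level
    peaks, _ = find_peaks(phasic, prominence=min_rise)  # SCR-like rises
    return peaks / fs

# Hypothetical usage: events at these times would be matched against plot
# points and viewers' self-reports.
trace = np.random.default_rng(0).normal(2.0, 0.02, 2400)  # 10 min at 4 Hz
trace[800:820] += np.linspace(0, 0.3, 20)                 # simulated response
print(arousal_events(trace))
```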

Building on these research findings, I have now created a robust methodology and scientific evidence-based framework for measuring and analysing audience engagement and story context cause and effect paradigms using biosensor technology and AI machine learning, leading to seamless, personalised biofeedback interactive movie experiences.

At present I am keen to talk to AI technology and biosensor companies that have sufficient resources and funding to develop E-movie, or to collaborate on creating novel personalised interactive film and media experiences.

Contact [email protected] now or go to www.receptivecinema.com to find out more about Dr Bound's research, awards and work.

Dr Keith Bound is a narrative design and audience engagement consultant and a pioneer in the science of storytelling. He holds an interdisciplinary Ph.D. (film studies, media psychology, psychophysiology and computer science), an MA in Design & Digital Media and a BA (Hons) in Industrial Design. He is an award-winning designer with a substantial senior management background in innovation, design and the creative industries.

Dr Bound now specialises in designing compelling narratives for movies and commercials, and in leading interdisciplinary research teams of screenwriters, filmmakers, data/AI scientists and analysts, physiologists, neuroscientists and audience researchers to deliver new forms of storytelling that exceed 21st-century consumer expectations and maximise returns.

#ArtificialIntelligence #AI #MachineLearning #Biometrics #Biofeedback #InteractiveMovies #InteractiveStorytelling #NarrativeDatabase #Personalisation #AudienceEngagement #E-Movie #NarrativeBlindSpots #IBMWatson #DolbyLabs #20thCenturyFox #Technicolor #ReceptiveCinema #DrKeithBound

Richard Fleming

I make movies. For entertainment, for marketing, and to train complex skills no one else can. Hire me when you need media that's GUARANTEED to work.

6y

This is great, Keith. Thanks for sharing. Have you followed any of Richard Ramchurn's work? He just released "The Moment," which uses EEG to shift the narrative thread. Thought you might be interested. The film itself looks gorgeous too. Also, not sure if you know about the SCSMI conference, but I just got back from it. Fascinating discussions! Many around Tikka's work.

Very interesting and so helpful, like many of your articles. Thanks for sharing, Keith.

Dimi Nakov

Filmmaker / Futurist / Beneficial AGI Researcher / Mindful Optimist

6y

Amazing. I would love to be part of a project like this.

Shah Hardik

Data Centre | IT Infrastructure | Colocation Service Provider | Global Switch | CloudEdge | Investor | Entrepreneur

6y

You’ve sparked my interest Keith, thanks for sharing.
