Machine learning & Augmented Reality driving immersive experiences
Machine driven data visualization and synchronized video overlays

Imagine a world where machines can assemble content and deliver interactive experiences with little or no human creative input or interaction. While the creative side of machine learning is still a bit of a stretch, the reality is that machines can already follow design rules and apply recipes and templates: pulling social feeds into editorial, synchronizing with cameras that are also servers, and talking to sensors and other data points (statistics, human biometrics, scoring, etc.). These elements layer in as the templates direct, synchronized by time and location with frame accuracy, to create immersive experiences that not so long ago could only be achieved in production trucks and facilities.
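A minimal sketch of what "templates directing frame-accurate data layering" could mean in practice. All names here (Feed, render_overlay, the template slots) are hypothetical illustrations, not any particular product's API: a template maps overlay slots to data feeds, and each video frame is resolved against the most recent data point at that frame's timestamp.

```python
from dataclasses import dataclass, field
from bisect import bisect_right

FPS = 30  # assumed frame rate for this sketch

@dataclass
class DataPoint:
    ts: float      # capture timestamp, seconds
    payload: dict  # e.g. telemetry, biometrics, scoring

@dataclass
class Feed:
    name: str
    points: list = field(default_factory=list)  # kept sorted by ts

    def latest_at(self, ts):
        """Most recent data point at or before ts (frame-accurate lookup)."""
        i = bisect_right([p.ts for p in self.points], ts)
        return self.points[i - 1] if i else None

def render_overlay(template, feeds, frame_index):
    """Resolve a template's data slots against synchronized feeds
    for a given video frame index."""
    ts = frame_index / FPS
    overlay = {}
    for slot, feed_name in template.items():
        point = feeds[feed_name].latest_at(ts)
        if point is not None:
            overlay[slot] = point.payload
    return overlay

# Usage: a telemetry feed plus a scoring feed, overlaid on frame 90 (t = 3.0 s)
telemetry = Feed("telemetry", [DataPoint(0.0, {"speed_kph": 0}),
                               DataPoint(2.5, {"speed_kph": 212})])
scores = Feed("scores", [DataPoint(1.0, {"lap": 1})])
template = {"lower_third": "telemetry", "corner_bug": "scores"}
print(render_overlay(template, {"telemetry": telemetry, "scores": scores}, 90))
# {'lower_third': {'speed_kph': 212}, 'corner_bug': {'lap': 1}}
```

The design rule lives in the template, not in code: swap the template and the same machinery produces a different experience from the same feeds.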

Tier-one events and productions will continue to use expensive, resource-intensive processes and people for the foreseeable future, as the money there is enough to feed the continued proliferation of inefficiencies. For everyone else, the playing field is about to level: when the costs are machine- and cloud-based, you can deliver more production-rich features in an automated environment managed like IT (a utility) rather than like a systems integrator or agency (billable hours).

As consumers hunger for richer content, they are becoming less interested in best-effort, raw social video. Brands are certainly looking for better representation through produced content, so it only makes sense that an intersection is coming where the bar of expectations will continue to rise on both sides. Because the old-school providers are so entrenched in hours of creative labor, the immediate innovations in efficiency are unlikely to come from those production- and people-heavy corners of the industry. The IT industry is focused on Everything as a Service, except creative, so it takes a unique set of capabilities and partnerships to develop this hybrid approach and move us to this new level of creative automation.

We have been focused on the first step: transforming the camera from a stranded lens into an intelligent contributor to the creative workflow and network. With microprocessor and sensor advances driven by smartphone and action-camera development, combined with IoT communication protocols, we can now network these devices into a mesh of synchronized video and data contributors (servers with lenses) feeding harmonized experiences. It is going to be an amazing road ahead as AI applied to software-defined media workflows allows the market to move beyond the bookends of expensive tier-one broadcast and social video, toward market-wide branded, interactive experiences accelerated by machines and automation.
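Meshing cameras into synchronized contributors hinges on agreeing on a common clock. One standard way to do that is NTP-style round-trip sampling against a reference clock; the sketch below assumes that approach (the sample timestamps and function names are illustrative, not a specific protocol implementation). Each camera's local frame timestamps can then be translated into reference time so frames from different lenses line up.

```python
import statistics

def estimate_offset(samples):
    """Estimate a camera's clock offset from the reference clock using
    NTP-style round trips. Each sample is a tuple
    (t_req_sent, t_cam_recv, t_cam_sent, t_resp_recv), all in seconds
    (first and last on the reference clock, middle two on the camera).
    Returns the median offset (camera clock minus reference clock)."""
    offsets = [((t1 - t0) + (t2 - t3)) / 2.0 for t0, t1, t2, t3 in samples]
    return statistics.median(offsets)

def align_frame_ts(cam_ts, offset):
    """Translate a camera-local frame timestamp into reference time."""
    return cam_ts - offset

# Usage: three round-trip samples against one camera, then align a frame
samples = [(0.000, 0.052, 0.053, 0.010),
           (1.000, 1.051, 1.052, 1.011),
           (2.000, 2.050, 2.051, 2.009)]
offset = estimate_offset(samples)  # camera runs ~46 ms ahead of reference
print(round(align_frame_ts(10.500, offset), 3))
# 10.454
```

Taking the median rather than the mean makes the estimate robust to the occasional delayed round trip, which matters on the kind of best-effort IoT links described above.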

What could this all look like? Take a peek into the future by clicking the link below.

Click here for an immersive multi-camera data synchronized experience
