David Kulczar, Senior Product Manager at IBM, talks to nexttv.news about IBM Watson Media at NAB NY 2017. Photo: Matt Welton

IBM’s newly launched Watson Media division had a strong presence at NAB NY 2017 in New York, showcasing its cognitive and artificial intelligence technologies in applications for media and entertainment companies.

We spent a few minutes with David Kulczar at the IBM booth and talked about how IBM Watson Media is bringing these new technologies into everyday media content production.  After a demo of Video Enrichment, IBM Watson Media’s first released product, it was easy to see how artificial and cognitive intelligence will quickly take on a significant role in media content management workflows and bring a sea change to the media and entertainment industry.

IBM Watson Media workflow. Source: IBM

Through Watson’s cognitive and artificial intelligence technologies, producers now have the ability to automate metadata creation for video clips.  In the case of pre-produced media, once a clip is uploaded into Video Enrichment, both the video and its contextual audio are examined, at a rate of two frames per second. Scenes are broken up at logical start and end points in the content and turned into separate searchable elements. David mentioned that the technology does not use a simple camera switch as a scene divider; it works “more like a DVD chapter, but way better” by taking the context of both video and audio into account during the analysis. Audio transcripts of “high level closed-captioning quality” are also extracted and made available with the metadata.
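To make that workflow concrete, here is a minimal sketch in Python of what an upload-and-enrich call might look like. The endpoint, parameter names, and response shape are our illustrative assumptions, not IBM Watson Media’s published API.

```python
# Hypothetical sketch of the Video Enrichment workflow described above.
# The endpoint, field names, and response layout are assumptions, not
# IBM Watson Media's actual API.
import requests

API_BASE = "https://api.example.com/video-enrichment"  # placeholder URL

def enrich_clip(video_path: str, api_key: str) -> dict:
    """Upload a pre-produced clip and return the generated metadata."""
    with open(video_path, "rb") as f:
        resp = requests.post(
            f"{API_BASE}/jobs",
            headers={"Authorization": f"Bearer {api_key}"},
            files={"video": f},
            data={"frames_per_second": 2},  # sampled at two frames per second
        )
    resp.raise_for_status()
    return resp.json()

# A response might carry context-aware scene boundaries ("more like a
# DVD chapter") plus a transcript, e.g.:
# {"scenes": [{"start": 0.0, "end": 42.5, "keywords": ["..."]}],
#  "transcript": [{"start": 0.0, "text": "..."}]}
```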

The technology goes further. A new “concept” data type also becomes available to producers, with searchable scene elements that do not come solely from audio transcription but from intelligent contextual video analysis.  To generate metadata for a particular actor, facial recognition is also available, provided the system is trained beforehand.  New searchable metrics based on emotional analysis are extracted as well, either per scene or in aggregate.
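As a rough illustration of those emotional metrics, the sketch below averages per-scene emotion scores into clip-level aggregates. The scene schema is a guess for illustration; IBM’s actual data model was not shown at the booth.

```python
# Illustrative only: aggregate per-scene emotion scores into clip-level
# metrics, as described above. The scene schema is an assumption.
from collections import defaultdict

scenes = [
    {"id": 1, "emotions": {"joy": 0.72, "sadness": 0.05, "anger": 0.02}},
    {"id": 2, "emotions": {"joy": 0.10, "sadness": 0.61, "anger": 0.12}},
]

def aggregate_emotions(scenes: list) -> dict:
    """Average each emotion score across every scene in a clip."""
    totals = defaultdict(float)
    for scene in scenes:
        for emotion, score in scene["emotions"].items():
            totals[emotion] += score
    return {emotion: total / len(scenes) for emotion, total in totals.items()}

print(aggregate_emotions(scenes))
# {'joy': 0.41, 'sadness': 0.33, 'anger': 0.07}
```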

A completed uplift and emotion analysis. Source: IBM

One of the first uses of this technology came at the 2017 US Open women’s tennis tournament in September, where IBM Watson Media automatically created “cognitive highlights” during matches and made them available to tennis fans while play was still under way.

IBM Watson Media’s US Open Cognitive Highlights product in use during the 2017 US Open women’s tennis tournament. Source: IBM

IBM Watson Media also analyzed the content library of TED.com and extracted varied types of searchable metadata, giving users more ways to search the library, including by concepts, such as “global warming,” that may only have been touched on in a particular presentation.  The application is publicly available at watson.ted.com.
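A concept search over that kind of metadata could be as simple as the toy example below; the talk records and field names are invented here to show the idea, not taken from watson.ted.com.

```python
# Toy sketch of concept-based discovery in the spirit of watson.ted.com.
# The talk records and field names are invented for illustration.
talks = [
    {"title": "Averting the climate crisis",
     "concepts": {"global warming", "energy policy"}},
    {"title": "The power of vulnerability",
     "concepts": {"psychology", "human connection"}},
]

def find_by_concept(talks: list, query: str) -> list:
    """Return talks whose extracted concepts mention the query term."""
    q = query.lower()
    return [t for t in talks if any(q in c.lower() for c in t["concepts"])]

for talk in find_by_concept(talks, "global warming"):
    print(talk["title"])  # -> Averting the climate crisis
```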

IBM Watson Media powers video content discovery for TED, the online learning series. Source: IBM

We asked David whether this technology could eventually automate content creation to the point of completely automated production.  “I think the ‘holy grail’ of this is to serve an audience of one. I can come in and say ‘here’s my mood,’ and the system will dynamically develop something for me.”  Using elements from the 2016 science fiction movie “Morgan,” provided by the studio, and an algorithm based on what they wanted to see in a trailer, Watson automatically produced a usable trailer for the movie.
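IBM has not published the “Morgan” trailer algorithm, but the general idea of ranking scenes against a target mood can be sketched in a few lines. Everything below, including the scoring field and the time budget, is a simplified assumption for illustration.

```python
# Simplified sketch of mood-driven scene selection; NOT the algorithm
# IBM used for the "Morgan" trailer. Scores and fields are assumptions.
def build_cut_list(scenes: list, mood: str, max_duration: float) -> list:
    """Pick the highest-scoring scenes for a mood, within a time budget."""
    ranked = sorted(scenes, key=lambda s: s["emotions"].get(mood, 0.0),
                    reverse=True)
    cut_list, total = [], 0.0
    for scene in ranked:
        length = scene["end"] - scene["start"]
        if total + length <= max_duration:
            cut_list.append(scene)
            total += length
    return sorted(cut_list, key=lambda s: s["start"])  # restore story order

scenes = [
    {"start": 0.0, "end": 8.0, "emotions": {"suspense": 0.9}},
    {"start": 20.0, "end": 31.0, "emotions": {"suspense": 0.4}},
    {"start": 45.0, "end": 52.0, "emotions": {"suspense": 0.8}},
]
print(build_cut_list(scenes, "suspense", max_duration=20.0))
# keeps the two most suspenseful scenes (15 s total) in story order
```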
