MediaFlow is an authoring tool and API developed by the Kinetica project for creating applications for streaming and real-time media, using computation on media and its metadata. These applications let you find, combine, and play media to create new, customized, and novel experiences.
The main idea in MediaFlow is to combine metadata and functional dependencies to create content-savvy media applications. Metadata includes any description of the media content that is at a higher level of abstraction than the media itself. For example, video metadata can include the locations of shot breaks, or the locations and identities of objects within the video; audio metadata can include volume, speech vs. music discrimination, and speaker ID. Functional dependencies refer to a simple programming style in which the main action is to apply functions to inputs. In MediaFlow, you apply functions to compute metadata from media, and to synthesize new media based on metadata.
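The function-application style described above can be sketched as follows. This is an illustrative sketch only, not MediaFlow's actual API: all function names and data representations here are invented for the example, with media modeled as plain Python lists.

```python
# Hypothetical sketch of MediaFlow's style: apply functions to media to
# compute metadata, then apply functions to metadata to synthesize new media.

def volume(samples):
    """Compute a simple loudness metadatum: mean absolute amplitude."""
    return sum(abs(s) for s in samples) / len(samples)

def shot_breaks(frame_diffs, threshold=0.5):
    """Return frame indices where the inter-frame difference exceeds a threshold."""
    return [i for i, d in enumerate(frame_diffs) if d > threshold]

def synthesize_highlights(frames, breaks):
    """Synthesize new media from metadata: keep the frame at each shot break."""
    return [frames[i] for i in breaks if i < len(frames)]

audio = [1, -3, 2, -2]
print(volume(audio))                       # 2.0

cuts = shot_breaks([0.1, 0.9, 0.2, 0.7])
print(cuts)                                # [1, 3]
print(synthesize_highlights(["f0", "f1", "f2", "f3"], cuts))  # ['f1', 'f3']
```

The point of the style is that analysis (media to metadata) and synthesis (metadata to media) are both plain function applications, so they compose freely.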
MediaFlow Key Features
- Provides a simple API for capture, playback, analysis, and synthesis of video and audio.
- Built-in support for multiple media data types (video, audio, MIDI, ...)
- Interrelates different media types (e.g., rumble the video based on the audio)
- Integration of live and stored media processing
  - reuse same functions for live & stored media
  - synchronize live & stored media
- Supports media analysis, synthesis, & content representation
- Extensible object-oriented data type system
  - support "first-class" user-defined data types and functions
  - customizable by adding app-specific media and content representation types & functions
- Can plug in other media processing tools
- Functional programming style
  - all dataflow is through parameters; facilitates parallel execution
  - facilitates visual programming interface
- Incremental data-pull architecture
  - improves efficiency by deferring computation
  - supports rapid prototyping
- Mathematical collection data types: sets, records, ...
  - facilitates data abstraction for content representation
  - supports efficient symbolic computation
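Two of the features above — the functional style (all dataflow through parameters) and the incremental data-pull architecture — can be illustrated together. The sketch below is hypothetical, assuming a stream is modeled as a function from frame index to value; nothing is computed until a consumer pulls a frame, which is how deferred computation improves efficiency.

```python
# Hedged sketch of data-pull lazy streams. Names are illustrative only.

class LazyStream:
    def __init__(self, fn, length):
        self.fn = fn            # index -> frame value, evaluated on demand
        self.length = length
        self._cache = {}        # memoize frames that have been pulled

    def pull(self, i):
        if i not in self._cache:
            self._cache[i] = self.fn(i)
        return self._cache[i]

    def map(self, g):
        # All dataflow is through parameters: mapping builds a new stream
        # whose frames are computed only when pulled downstream.
        return LazyStream(lambda i: g(self.pull(i)), self.length)

calls = []
def expensive_frame(i):
    calls.append(i)             # record which frames were actually computed
    return i * 10

raw = LazyStream(expensive_frame, length=100)
bright = raw.map(lambda f: f + 1)

print(bright.pull(3))   # 31
print(calls)            # [3] -- only frame 3 was ever computed
```

Because each stream transformation is a pure function of its inputs, independent frames could in principle be pulled in parallel, which matches the "facilitates parallel execution" claim above.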
MediaFlow Applications
- AutoBuddy Movie Kit (1996) - Generates a movie from a network video game experience.
- Soundtrack Generation (1997) - Automatic generation of a movie soundtrack from music clips with metadata.
- Godzilla vs. Interval Movie Kit (1997) - Generates a movie of you saving Interval from Godzilla.
- Music Lego (1998) - Assembles songs out of song clips using constraints and musical morphing.
- Ski Montage (1998) - Computationally assembles a montage video from footage captured with a Vertov camera pack.
- LoopJammer (1998) - Interactively records, loops, plays, and combines multiple audio/video clips.
Example Application: Godzilla Movie Kit
- Makes a 2-minute movie in which you save Interval from an attack by Godzilla
- Inputs are 5 short video clips of you
- Functions used in the Godzilla app:
  - detect sounds
  - cutting on dialog
  - cutting on motion
  - motion level and position analysis
  - background subtraction and compositing
  - rumbling video based on audio
  - audio mixing
  - generic stream functions: substream, concatenate
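The two generic stream functions named at the end of the list above are simple enough to sketch. This is an illustrative version, not MediaFlow's implementation; a stream is modeled here as a plain list of frames.

```python
# Hypothetical sketches of the generic stream functions substream and
# concatenate, with streams modeled as lists of frames.

def substream(stream, start, end):
    """Extract the frames in [start, end)."""
    return stream[start:end]

def concatenate(*streams):
    """Join streams end to end on a common time axis."""
    out = []
    for s in streams:
        out.extend(s)
    return out

clip = list(range(10))
print(substream(clip, 2, 5))              # [2, 3, 4]
print(concatenate([1, 2], [3], [4, 5]))   # [1, 2, 3, 4, 5]
```

In a lazy-stream setting like MediaFlow's, these would presumably operate on pull-based streams rather than lists, so that a substream never forces frames outside its window.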
GUI Components
- Modular, configurable components
- Used for rapid application prototyping
- Fundamental components include Players and Timelines
Timelines
- Leverage lazy streams (compute only the frames needed)
- Display any number of stored streams of various types on common time axis
- Simple UI controls (zooming and scrolling in time, vertical resizing)
- Caches stream displays for fast refresh
- Live GUI inputs
  - video, audio, numeric
  - sliders and buttons which correspond to live streams
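The display-caching behavior mentioned above can be sketched as a memo table keyed by the stream and the visible time window. This is an assumption-laden illustration (names and keying scheme invented for the example), not the Timeline component's actual design.

```python
# Sketch of a timeline display cache: rendered views are memoized by
# (stream id, visible window), so scrolling back to a window already seen
# refreshes from cache instead of re-rendering.

class TimelineCache:
    def __init__(self, render_fn):
        self.render_fn = render_fn   # (stream_id, start, end) -> rendered view
        self.cache = {}
        self.renders = 0             # count actual render calls

    def view(self, stream_id, start, end):
        key = (stream_id, start, end)
        if key not in self.cache:
            self.renders += 1
            self.cache[key] = self.render_fn(stream_id, start, end)
        return self.cache[key]

tl = TimelineCache(lambda sid, a, b: f"{sid}[{a}:{b}]")
tl.view("audio", 0, 10)
tl.view("audio", 0, 10)   # cache hit: no second render
print(tl.renders)          # 1
```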
Player
- Supports synchronized playback of multiple live and stored audio and video streams
- Uses data-pull and lazy streams to compute only what is needed on the fly
- Provides a simple UI for play, pause, and rewind
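The Player's data-pull playback loop can be sketched as a master clock that, at each tick, asks every stream for its frame at the current time, so only frames that are actually played get computed. This is a minimal illustration under that assumption, with streams modeled as functions from time to frame; it is not the Player's real interface.

```python
# Sketch of synchronized data-pull playback: one clock, many streams,
# one pull per stream per tick. Names are illustrative only.

def play(streams, times):
    """Pull one frame per stream at each clock tick; return the play log."""
    log = []
    for t in times:
        # Synchronization falls out of pulling every stream at the same t.
        log.append(tuple(s(t) for s in streams))
    return log

video = lambda t: f"v{t}"   # stored video stream (time -> frame)
audio = lambda t: f"a{t}"   # live audio stream (time -> sample block)

print(play([video, audio], [0, 1, 2]))
# [('v0', 'a0'), ('v1', 'a1'), ('v2', 'a2')]
```

Pause and rewind then amount to stopping or rewinding the clock: because frames are pulled by timestamp, replaying a time range just pulls the same timestamps again.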