LIVE is an integrated, multidisciplinary initiative contributing to the IST strategic objectives 'Semantic-based Knowledge and Content Systems' and 'Exploring and bringing to maturity the intelligent content vision'. The project aims to develop semantic-based, user- and content-aware systems that pioneer intelligent, self-describing iTV content by creating an intelligent media framework in which content can find an interested consumer by itself.
LIVE will produce, in real time, a non-linear multi-stream TV broadcast of the 2008 Olympic Games in Beijing that adapts to the interests of the viewers. For this innovative TV experience the multiple incoming video signals and the available archive material will be indexed and structured by semi-automatic metadata extraction tools. The identified video objects will be filtered and visualised for the professional users in the control room of a broadcast station. In addition, feedback from TV consumers arriving over a back-channel mechanism will be analysed by a recommender system. At the intelligent media framework layer, semantic connections are made between user preferences and annotated video material. These results are fed into the control room to guide the production process. The experiments and the development of an integrated prototype will be carried out at ORF in Austria.
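The matching step described above, connecting back-channel consumer preferences with annotated live streams, can be sketched roughly as follows. This is an illustrative sketch only: the class names, label sets, and scoring by annotation overlap are assumptions for exposition, not the LIVE framework's actual design.

```python
from dataclasses import dataclass

# Hypothetical structures; names are illustrative, not from the LIVE project.
@dataclass
class VideoSegment:
    stream_id: str
    annotations: set  # semantic labels produced by the metadata extraction tools

@dataclass
class ConsumerProfile:
    interests: set    # preference labels gathered over the back channel

def recommend(profile, segments, top_k=2):
    """Rank live segments by overlap between their annotations and the
    consumer's interests, returning the best-matching streams."""
    scored = [(len(profile.interests & s.annotations), s) for s in segments]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [s for score, s in scored[:top_k] if score > 0]

segments = [
    VideoSegment("cam-1", {"swimming", "final", "beijing"}),
    VideoSegment("cam-2", {"athletics", "100m", "heats"}),
    VideoSegment("cam-3", {"swimming", "interview"}),
]
viewer = ConsumerProfile({"swimming", "final"})
print([s.stream_id for s in recommend(viewer, segments)])  # ['cam-1', 'cam-3']
```

In the project's setting, the ranked result would be fed back to the control room to guide which streams are offered to that viewer, rather than being applied client-side.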
The LIVE project promotes a new, third market segment in the digital interactive television sector that does not exist today: intelligent television programming and services. This means creating non-linear, multi-stream, real-time content formats around major media events that adapt to the interests of the consumer. To achieve this, classical AV-oriented media needs to be enriched with sophisticated metadata up to intelligent semantics; archive material and live streams have to be properly linked together in real time; and TV consumer feedback has to be taken into account for appropriate programme adjustments.
Technically, we will develop a knowledge kit and a toolkit for an intelligent live content production process, including dynamic human annotation and automated real-time annotation, that addresses the key question: how can a video be turned into an intelligent multimedia object that reacts to the needs of the interested consumer?
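One way to picture such an intelligent multimedia object is a video enriched with time-stamped labels from both annotation paths, human editors and automatic extractors, that can then react to a consumer's stated needs. The class and method names below are assumptions made for illustration, not the LIVE toolkit's actual API.

```python
import time

# Illustrative sketch only; names are hypothetical, not from the LIVE toolkit.
class IntelligentMediaObject:
    """A video carrying its own semantic description, merged from
    dynamic human annotation and automated real-time annotation."""

    def __init__(self, video_uri):
        self.video_uri = video_uri
        self.annotations = []  # list of (timestamp, source, label) tuples

    def annotate(self, label, source, timestamp=None):
        """Record a label from a human editor or an automatic extractor."""
        self.annotations.append((timestamp or time.time(), source, label))

    def reacts_to(self, interests):
        """True if any of the object's annotations matches the
        consumer's interests, i.e. the content 'finds' its viewer."""
        return any(label in interests for _, _, label in self.annotations)

clip = IntelligentMediaObject("rtsp://example.org/olympics/cam-1")
clip.annotate("swimming", source="speech-extractor")
clip.annotate("world-record", source="editor")
print(clip.reacts_to({"world-record"}))  # True
```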
- Project duration is 45 months commencing January 2006.
- Coordinator is Fraunhofer IAIS, St. Augustin, Germany.
- The overall budget is approximately 11.3 million euro.
- Nine partners from five European countries.
Project Management
Project Manager: Jobst Loeffler, Fraunhofer IAIS
Technical Coordinator: Dr.-Ing. Joachim Köhler, Fraunhofer IAIS