Idiap researchers involved in the organisation of the next International Conference on Multimodal Interaction (ICMI2011).

Five of our researchers are involved in the organisation of the next International Conference on Multimodal Interaction (ICMI 2011), which will take place November 14-18, 2011, in Alicante, Spain (http://www.acm.org/icmi/2011): Herve Bourlard (General Chair, Advisory Board), Daniel Gatica-Perez (Program Chair), Jean-Marc Odobez (Area Chair), Alessandro Vinciarelli (Area Chair), and Andrei Popescu Belis (Advisory Board).

This year, the International Conference on Multimodal Interfaces (ICMI) and the Workshop on Machine Learning for Multimodal Interaction (MLMI) are combined to form the new ICMI, which continues to be the premier international forum where multimodal signal processing and multimedia human-computer interaction are presented and discussed. The conference will focus on theoretical and empirical foundations, varied component technologies, and combined multimodal processing techniques that define the field of multimodal interaction analysis, interface design, and system development. ICMI 2011 will feature a single-track main conference which includes:
* keynote speakers
* technical full and short papers (including oral and poster presentations)
* special sessions
* demonstrations
* exhibits and doctoral spotlight papers
The main conference will be held November 14-16, 2011, and will be followed by two days of workshops.

TOPICS OF INTEREST include but are not limited to:
* Multimodal and multimedia interactive processing: multimodal fusion, multimodal output generation, multimodal interactive discourse and dialogue modeling, machine learning methods for multimodal interaction.
* Multimodal input and output interfaces: gaze and vision-based interfaces, speech and conversational interfaces, pen-based and haptic interfaces, virtual/augmented reality interfaces, biometric interfaces, adaptive multimodal interfaces, natural user interfaces, authoring techniques, architectures.
* Multimodal and interactive applications: mobile and ubiquitous interfaces, meeting analysis and meeting spaces, interfaces to media content and entertainment, human-robot interfaces and interaction, audio/speech and vision interfaces for gaming, multimodal interaction issues in telepresence, vehicular applications and navigational aids, interfaces for intelligent environments, universal access and assistive computing, multimodal indexing, structuring and summarization.
* Human interaction analysis and modeling: modeling and analysis of multimodal human-human communication, audio-visual perception of human interaction, analysis and modeling of verbal and nonverbal interaction, cognitive modeling.
* Multimodal and interactive data, evaluation, and standards: evaluation techniques and methodologies, annotation and browsing of multimodal and interactive data, standards for multimodal interactive interfaces.
* Core enabling technologies: pattern recognition, machine learning, computer vision, speech recognition, gesture recognition.

