The Multimodal Interaction Activity is a W3C initiative that aims to provide the means (mostly XML-based) to support multimodal interaction scenarios on the Web.

The activity was launched in 2002. The Multimodal Interaction Working Group has already produced:
  • the Multimodal Interaction Framework, which provides a general framework for multimodal interaction and identifies the kinds of markup languages being considered;
  • a set of use cases;
  • a set of Core Requirements, which describes the fundamental requirements to be addressed by future specifications.

The devices under consideration include mobile phones, automotive telematics systems, and PCs connected to the Web.

Current work

The following XML specifications (currently advanced Working Drafts) already address various parts of the Core Requirements:
  • EMMA (Extensible Multi-Modal Annotations): a data exchange format for the interface between input processors and interaction management systems. It will define the means for recognizers to annotate application-specific data with information such as confidence scores, time stamps, input mode (e.g. keystrokes, speech, or pen), alternative recognition hypotheses, and partial recognition results.
  • InkML: an XML language for digital ink traces — an XML data exchange format for ink entered with an electronic pen or stylus as part of a multimodal system.
  • Multimodal Architecture: a loosely coupled architecture for the Multimodal Interaction Framework that focuses on......
  • ...
