Old 05-05-2011, 08:30 PM   #1
buisness9127
Scientific Commons: Identifying the Addressee in Human-Human-Robot Interactions

Identifying the Addressee in Human-Human-Robot Interactions Based on Head Pose and Speech (2008), Michael Katzenmaier
Abstract: In this work we investigate the power of acoustic and visual cues, and their combination, to identify the addressee in a human-human-robot interaction. Based on eighteen audio-visual recordings of two humans and a (simulated) robot, we discriminate the interaction between the two humans from the interaction of one human with the robot. The paper compares the results of three approaches. The first approach uses purely acoustic cues to find the addressee; low-level, feature-based cues as well as higher-level cues are examined. In the second approach we test whether the human's head pose is a suitable cue. Our results show that visually estimated head pose is the more reliable cue for identifying the addressee in the human-human-robot interaction. In the third approach we combine the acoustic and visual cues, which results in significant improvements.
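To make the cue-combination idea concrete, here is a minimal sketch of late fusion of an acoustic score and a head-pose score for deciding whether an utterance is addressed to the robot. All function names, inputs, the fusion weight, and the threshold are hypothetical illustrations, not the authors' actual models or parameters.

[code]
# Hypothetical sketch: late fusion of acoustic and visual (head pose) cues
# for addressee detection (robot vs. the other human). Scores, weights and
# thresholds are illustrative assumptions, not the paper's method.

def acoustic_score(utterance_features):
    """Placeholder: probability that the speech is robot-directed,
    e.g. derived from prosodic/lexical cues (assumed model)."""
    return utterance_features.get("p_robot_acoustic", 0.5)

def head_pose_score(head_pose_deg, robot_direction_deg=0.0, tolerance_deg=30.0):
    """Placeholder: score in [0, 1] that the speaker faces the robot,
    based on the estimated horizontal head pose (assumed heuristic)."""
    deviation = abs(head_pose_deg - robot_direction_deg)
    return max(0.0, 1.0 - deviation / tolerance_deg)

def addressee_is_robot(utterance_features, head_pose_deg,
                       visual_weight=0.7, threshold=0.5):
    """Weighted late fusion of the two cue scores; the higher visual weight
    mirrors the finding that head pose is the more reliable single cue."""
    fused = (visual_weight * head_pose_score(head_pose_deg)
             + (1.0 - visual_weight) * acoustic_score(utterance_features))
    return fused >= threshold

# Example: speaker looks roughly toward the robot and the speech sounds robot-directed.
print(addressee_is_robot({"p_robot_acoustic": 0.8}, head_pose_deg=10.0))
[/code]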
Publication details
Download / Source / Contributors: CiteSeerX Archive, CiteSeerX - Scientific Literature Digital Library and Search Engine (United States)
Keywords: attentive interfaces, focus of attention, head pose estimation
Type: text
Language: English
Links: 10.1.1.6.1719, 10.1.1.28.8271