Human Computer Confluence


A. Ferscha, S. Resmerita, C. Holzmann - Human Computer Confluence - Proceedings of the 9th ERCIM Workshop on User Interfaces for All (UI4All): Interaction Platforms and Techniques for Ambient Intelligence, Königswinter, Germany, 2006, pp. 14-27


Pervasive Computing postulates the invisible integration of technology into everyday objects in such a way that these objects turn into smart things. Not only single objects of this kind, but whole landscapes of them are supposed to represent the interface between the "physical world" of atoms and the "digital world" of bits. Interaction between humans and such landscapes of technology-rich artifacts tends to be confluent rather than occurring on a per-device basis. To address this confluence between humans and computing landscapes, we study human gesticulation and the manipulation of graspable and movable everyday artifacts as a potentially effective means of interacting with the physical environment. Specifically, we consider gestures in the general sense of a movement or a state (posture) of the human body, as well as a movement or state of any physical object resulting from human manipulation. Further, based on the tangible user interface paradigm, we propose employing intuitive tangible universal controls that translate physical motions into actions for controlling landscapes of smart things. Such intuitive "everyday" gestures have been collected in a series of user tests, yielding a catalogue of generic body and artifact gesture dynamics. We present a systematic approach to selecting and steering using tangible artifacts, associating a flip movement with service selection and a turn movement with parameter steering. An implementation of this approach in a general software framework and several experiments with various fully functional artifacts and devices are described.
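The flip/turn association described above can be illustrated with a minimal sketch. This is not the paper's implementation: the service names, the one-parameter-per-service model, and the turn-to-step scaling are all illustrative assumptions, intended only to show how a flip movement could cycle through services while a turn movement steers the selected service's parameter.

```python
class TangibleControl:
    """Illustrative tangible universal control (not the paper's framework):
    a flip gesture selects the next service in the landscape, and a turn
    gesture steers the currently selected service's parameter."""

    def __init__(self, services):
        self.services = services                 # e.g. ["light", "blinds", "hifi"]
        self.index = 0                           # currently selected service
        self.params = {s: 0.0 for s in services} # one scalar parameter per service

    def flip(self):
        # Flip movement -> service selection: cycle to the next service.
        self.index = (self.index + 1) % len(self.services)
        return self.services[self.index]

    def turn(self, degrees):
        # Turn movement -> parameter steering: adjust the selected
        # service's parameter proportionally to the rotation angle
        # (scaling of 1 unit per 10 degrees is an assumption).
        s = self.services[self.index]
        self.params[s] += degrees / 10.0
        return self.params[s]


ctl = TangibleControl(["light", "blinds", "hifi"])
selected = ctl.flip()   # flip selects the next service: "blinds"
value = ctl.turn(90)    # a 90-degree turn steers its parameter to 9.0
```

The point of the sketch is the separation of concerns: the same two generic gesture dynamics work unchanged regardless of which service in the landscape is currently bound to the artifact.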