TY - GEN
T1 - The camera-driven interactive table
AU - Schwarz, Christopher
AU - Da Vitoria Lobo, Niels
PY - 2007
Y1 - 2007
N2 - One emerging application of computing technology is that of interactive rooms and furniture. For instance, Interactive Tables allow for a workspace that is intuitive, natural, and conducive to creating multi-user collaborative work environments. Although several interactive table prototypes have been developed, we have engineered a method using a computer vision system instead of touch screen technology, which allows increased flexibility to the end user because of its ability to ignore or even make use of objects placed upon the table and its decreased likelihood of accidental input. In this work we present a method for implementing such a camera-driven interactive table with a ceiling-mounted camera and demonstrate some of its potential uses. The vision system makes use of a novel hand detection and segmentation technique designed to be tolerant of any level of background complexity on the display and any reasonable range of indoor lighting conditions, thus allowing the highest level of freedom to the end user. It searches the results of multi-scale line and curve finding systems to locate thimble-shaped finger models, marking them as candidate fingers and performing a set of geometric and texture-based tests on each to remove false positives. Finally, it groups finger detections that are similar to each other in location and appearance, while allowing the reintroduction of weak candidates that are supported by strong neighbors, into hand detections with finger and palm locations. Results demonstrate the system's ability to extract enough information from images of hands in very complex backgrounds to allow for finger and palm placement recognition.
AB - One emerging application of computing technology is that of interactive rooms and furniture. For instance, Interactive Tables allow for a workspace that is intuitive, natural, and conducive to creating multi-user collaborative work environments. Although several interactive table prototypes have been developed, we have engineered a method using a computer vision system instead of touch screen technology, which allows increased flexibility to the end user because of its ability to ignore or even make use of objects placed upon the table and its decreased likelihood of accidental input. In this work we present a method for implementing such a camera-driven interactive table with a ceiling-mounted camera and demonstrate some of its potential uses. The vision system makes use of a novel hand detection and segmentation technique designed to be tolerant of any level of background complexity on the display and any reasonable range of indoor lighting conditions, thus allowing the highest level of freedom to the end user. It searches the results of multi-scale line and curve finding systems to locate thimble-shaped finger models, marking them as candidate fingers and performing a set of geometric and texture-based tests on each to remove false positives. Finally, it groups finger detections that are similar to each other in location and appearance, while allowing the reintroduction of weak candidates that are supported by strong neighbors, into hand detections with finger and palm locations. Results demonstrate the system's ability to extract enough information from images of hands in very complex backgrounds to allow for finger and palm placement recognition.
UR - http://www.scopus.com/inward/record.url?scp=34547210626&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=34547210626&partnerID=8YFLogxK
U2 - 10.1109/WACV.2007.58
DO - 10.1109/WACV.2007.58
M3 - Conference contribution
AN - SCOPUS:34547210626
SN - 0769527949
SN - 9780769527949
T3 - Proceedings - IEEE Workshop on Applications of Computer Vision, WACV 2007
BT - Proceedings - IEEE Workshop on Applications of Computer Vision, WACV 2007
T2 - 7th IEEE Workshop on Applications of Computer Vision, WACV 2007
Y2 - 21 February 2007 through 22 February 2007
ER -