We introduce a novel approach to the cultural heritage experience: using ego-vision embedded devices, we develop a system that offers a more natural and entertaining way of accessing museum knowledge. Our method is based on distributed self-gesture and artwork recognition, and requires neither fixed cameras nor RFID sensors. We propose the use of dense trajectories sampled around the hand region to perform self-gesture recognition, understanding the way a user naturally interacts with an artwork, and demonstrate that our approach can benefit from distributed training. We test our algorithms on publicly available datasets and extend our experiments to both virtual and real museum scenarios, where our method shows robustness when challenged with real-world data. Furthermore, we run an extensive performance analysis on our ARM-based wearable device.