AI Interactive Menu
Real-time tracking and projection mapping extend touch capabilities beyond screens and displays.
Using only a webcam and a projector, we experimented with adding a layer of interaction to any surface, regardless of its material, texture, or color.
In the AI Interactive Menu prototype, our gesture visualizer recognizes finger touch points on a menu projected onto a black cloth. We used an open-source machine learning framework together with the MediaPipe Handpose and Fingerpose models to detect and track finger positions in real time. Once items are selected, the user gives a thumbs-up to complete the order.
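The core interaction logic can be sketched without the model itself: Handpose reports 21 [x, y, z] keypoints per hand, and the application only needs to check which projected menu region the index fingertip lands in, plus a gesture check for the thumbs-up. The sketch below is a minimal stand-in, assuming the standard Handpose keypoint indices; the menu layout, function names, and the simple thumbs-up heuristic (which approximates what the Fingerpose estimator does) are illustrative assumptions, not the prototype's actual code.

```javascript
// Keypoint indices in the MediaPipe Handpose output (21 landmarks per hand).
const THUMB_TIP = 4;
const INDEX_TIP = 8;

// Hypothetical projected menu: axis-aligned regions in camera coordinates.
// A real deployment would calibrate these against the projector's output.
const menuItems = [
  { name: 'coffee', x: 0, y: 0,  w: 100, h: 60 },
  { name: 'tea',    x: 0, y: 70, w: 100, h: 60 },
];

// Return the menu item under the index fingertip, or null if none.
function hitTestMenu(landmarks, items) {
  const [x, y] = landmarks[INDEX_TIP];
  return (
    items.find(
      (it) => x >= it.x && x <= it.x + it.w && y >= it.y && y <= it.y + it.h
    ) || null
  );
}

// Crude thumbs-up heuristic standing in for the Fingerpose estimator:
// the thumb tip sits clearly above (smaller y than) every other fingertip.
function isThumbsUp(landmarks) {
  const thumbY = landmarks[THUMB_TIP][1];
  const otherTips = [8, 12, 16, 20]; // index, middle, ring, pinky tips
  return otherTips.every((i) => thumbY < landmarks[i][1] - 40);
}
```

In the real pipeline, the landmark array passed to these functions would come from the Handpose model's per-frame predictions on the webcam feed, and the selected item would be highlighted by re-rendering the projected menu.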
The prototype can be customized in both content and scale to bring life to restaurants, hotels, family entertainment centers, airports, visitor centers, theme parks, and more. It offers a flexible and powerful solution for any space or venue without requiring a large investment in hardware.