1. Onsite audience interaction - movement detection
Using millimetre-wave sensing technology, we track the movements of the onsite audience and map their positions onto virtual avatars, allowing the onsite audience to seamlessly become part of the virtual world.
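
As a rough illustration of this mapping, the sketch below converts person tracks from a millimetre-wave sensor into avatar positions in the virtual scene. The RadarTrack shape, the scale factor, and the smoothing constant are assumptions for illustration, not the sensor's actual SDK.

```typescript
// Minimal sketch (not the project's actual code): mapping person tracks from a
// millimetre-wave sensor onto avatar positions in the virtual scene.
// RadarTrack, WORLD_SCALE, and SMOOTHING are assumed names/values.

interface RadarTrack {
  id: number; // persistent track ID assigned by the radar's people tracker
  x: number;  // metres across the stage, relative to the sensor
  y: number;  // metres of depth from the sensor
}

interface AvatarPose {
  id: number;
  worldX: number; // virtual-world units
  worldZ: number;
}

const WORLD_SCALE = 2.0; // virtual-world units per physical metre (assumed)
const SMOOTHING = 0.3;   // exponential-smoothing factor to damp radar jitter

const avatars = new Map<number, AvatarPose>();

// Call once per radar frame; returns the poses to apply to the avatars.
function updateAvatars(tracks: RadarTrack[]): AvatarPose[] {
  for (const t of tracks) {
    const target: AvatarPose = { id: t.id, worldX: t.x * WORLD_SCALE, worldZ: t.y * WORLD_SCALE };
    const prev = avatars.get(t.id);
    if (!prev) {
      avatars.set(t.id, target);
    } else {
      // Blend toward the new measurement so avatars move smoothly.
      prev.worldX += SMOOTHING * (target.worldX - prev.worldX);
      prev.worldZ += SMOOTHING * (target.worldZ - prev.worldZ);
    }
  }
  return [...avatars.values()];
}
```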

Actual integration of movement detection.

Demonstration of movement detection.
2. Performance interaction - music visualization
Using sensors from the startup HC MUSIC CO. together with capacitive sensors, we detect the sounds produced by instruments such as the guitar and cajon. These inputs are then transformed into visual elements, creating an immersive music-visualization environment.
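
The sketch below shows one way such sensor events could drive the visuals, turning a hit's amplitude into the size and lifetime of a particle burst. The SensorHit interface and the colour palette are assumptions for illustration, not the HC MUSIC CO. API.

```typescript
// Minimal sketch (assumed interfaces, not the HC MUSIC CO. SDK): turning
// instrument-sensor hits into parameters for a visual effect.

type Instrument = 'guitar' | 'cajon';

interface SensorHit {
  instrument: Instrument;
  amplitude: number;   // 0..1, how hard the string or drumhead was struck
  timestamp: number;   // ms
}

interface VisualEvent {
  color: string;
  size: number;        // particle-burst radius in scene units
  lifetimeMs: number;
}

// Each instrument gets its own colour so the two performers stay
// visually distinguishable in the shared scene.
const palette: Record<Instrument, string> = {
  guitar: '#4fc3f7',
  cajon: '#ff8a65',
};

function hitToVisual(hit: SensorHit): VisualEvent {
  return {
    color: palette[hit.instrument],
    size: 0.5 + 2.0 * hit.amplitude,        // louder hits -> bigger bursts
    lifetimeMs: 300 + 700 * hit.amplitude,  // louder hits linger longer
  };
}

// Example: a capacitive-sensor tap on the cajon.
const burst = hitToVisual({ instrument: 'cajon', amplitude: 0.8, timestamp: Date.now() });
console.log(burst); // { color: '#ff8a65', size: 2.1, lifetimeMs: 860 }
```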

Actual integration of music visualization.

Demonstration of music visualization.
3. Online audience interaction - sending emojis and messages
We integrated Audience, a 360° streaming platform, into our system, enabling the online audience to join the virtual experience from their phones, computers, or VR headsets. They can also interact by sending emojis and messages to each other.
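
A minimal sketch of how such an emoji and message channel could work over a WebSocket relay is shown below. The endpoint URL and the message schema are placeholders for illustration; they are not Audience's actual API.

```typescript
// Minimal sketch of an emoji/message relay between online viewers.
// The endpoint and AudienceMessage schema are assumptions for illustration.

interface AudienceMessage {
  kind: 'emoji' | 'text';
  payload: string;   // e.g. "🎉" or a short chat line
  sender: string;    // display name chosen by the viewer
  sentAt: number;    // Unix time in ms
}

const socket = new WebSocket('wss://example.com/audience-relay'); // assumed endpoint

function sendEmoji(sender: string, emoji: string): void {
  const msg: AudienceMessage = { kind: 'emoji', payload: emoji, sender, sentAt: Date.now() };
  socket.send(JSON.stringify(msg));
}

// Incoming messages from other viewers are parsed and handed to the renderer,
// which floats emojis over the 360° stream or appends text to the chat overlay.
socket.addEventListener('message', (event) => {
  const msg = JSON.parse(event.data as string) as AudienceMessage;
  if (msg.kind === 'emoji') {
    console.log(`${msg.sender} reacted with ${msg.payload}`);
  } else {
    console.log(`${msg.sender}: ${msg.payload}`);
  }
});

// Example usage once the connection is open.
socket.addEventListener('open', () => sendEmoji('viewer42', '🎉'));
```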

Actual integration of sending emojis.

Demonstration of sending emojis.