Abstract
The XR Jam project presents an integrated system designed to enhance musical interaction and spatial awareness in Extended Reality (XR) environments. Leveraging AI-driven Virtual Musical Agents (AIVMs), the platform elevates traditional musical experiences. Built on an XR framework, the system pushes conventional boundaries by enabling novel forms of interaction, such as object manipulation and social drumming with AIVMs and remote players. Furthermore, avatar-based interfaces help players overcome social inhibitions, offering a unique, uninhibited interactive experience. Pilot studies indicated that haptic feedback can be used effectively for spatial localization: while it enhanced spatial orientation in specific scenarios, the front and back directions remained difficult to distinguish. The system promotes musical interaction, social engagement, and spatial orientation, though synchronization and player experience need further refinement. Overall, this research opens new avenues for immersive musical experiences through a fusion of AI, XR, and haptic technologies.
Authors |
Torin Hopkins, Suibi Che Chuan Weng, Rishi Vanukuru, Sasha Novack, Chad Tobin, Emma Wenzel, Amy Banić, Mark D. Gross, Ellen Yi-Luen
Journal Info
Institute of Electrical and Electronics Engineers | 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR)
Publication Date
1/17/2024
ISSN
Not listed
Type
article |
Open Access
closed
DOI
https://doi.org/10.1109/aixvr59861.2024.00021
Keywords
Music Information Retrieval (Score: 0.580833), Audio Event Detection (Score: 0.554289), Environmental Sound Recognition (Score: 0.54833), Network Coding (Score: 0.533891), Acoustic Ecology (Score: 0.517311)