LyricBathe

This research project explores the creation of interactive lyric experiences using Mixed Reality (MR) technology.

We are developing LyricBathe, an MR system designed to deepen engagement with song lyrics through immersive experiences. LyricBathe offers an interactive music experience that lets users appreciate lyrics more deeply in their everyday music-listening environments. Using an optical see-through head-mounted display (HoloLens 2), the system synchronizes lyrics with the progression of a song, rendering each character as an interactive object in the physical space. By engaging with lyrics through body movements and interactions with the surrounding environment, users can experience a new dimension of "lyric bathing."
In LyricBathe, MR technology is used to render lyrics as objects within the physical space, allowing users to experience lyrics spatially. Beyond simply viewing lyrics, users can engage in physical interactions such as "touching the lyrics with their hands," "lyrics emerging from their bodies," and "bathing themselves in a shower of lyrics," enabling a new way to experience and enjoy song lyrics.
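The synchronization described above can be sketched in simplified form: each lyric character carries an onset time, and on every frame the system spawns the characters whose onsets have just been passed. This is a minimal Python sketch under assumed data structures (the actual system runs on HoloLens 2, and the per-character timing format is our assumption, not the project's published API):

```python
from dataclasses import dataclass

@dataclass
class LyricChar:
    char: str     # the character to render as an object in space
    onset: float  # seconds into the song when it should appear

def chars_to_spawn(lyrics, prev_time, now):
    """Return the characters whose onset falls in (prev_time, now],
    i.e. those that should be spawned on this frame."""
    return [c for c in lyrics if prev_time < c.onset <= now]

# Usage: the playback clock advanced from 0.6 s to 1.0 s on this frame.
lyrics = [LyricChar("H", 0.5), LyricChar("i", 0.7), LyricChar("!", 1.2)]
spawned = chars_to_spawn(lyrics, 0.6, 1.0)  # only "i" (onset 0.7) qualifies
```

Checking against a half-open interval keeps each character spawned exactly once even when frame times vary.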
To investigate how users direct their attention during the lyric experiences LyricBathe creates, we analyzed gaze patterns. Using the eye-tracking functionality of HoloLens 2, we measured users' gaze points during music experiences across eight visual interactivity modes. Gaze points were categorized by the objects users focused on, enabling an in-depth analysis of gaze behavior during music interaction.
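The categorization step can be illustrated with a small sketch: each gaze sample records which object the user fixated, a mapping assigns each object to a category, and dwell shares are aggregated per category. The sample format, object names, and category labels here are illustrative assumptions, not the study's actual coding scheme:

```python
from collections import Counter

def gaze_category_shares(samples, category_of):
    """Aggregate gaze samples (one per frame) into the fraction of
    samples spent on each object category."""
    counts = Counter(category_of(target) for _, target in samples)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

# Usage: samples are (timestamp, gazed_object_name) pairs.
samples = [(0.0, "char_A"), (0.1, "wall"), (0.2, "char_B"), (0.3, "char_B")]
category = lambda name: "lyric" if name.startswith("char_") else "environment"
shares = gaze_category_shares(samples, category)  # lyric: 0.75, environment: 0.25
```

With uniformly sampled frames, per-category sample counts are proportional to dwell time, so the shares approximate the fraction of time spent attending to each category.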
Through iterative development, we have implemented 24 different visual interactivity modes. From these modes, we extracted 12 design primitives that define lyric representation in the MR environment, identifying key elements such as spatial arrangement, timing of text display, and types of user interactions.
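One way to picture how primitives define a mode is as a combination of choices along the dimensions named above (spatial arrangement, timing of text display, type of user interaction). The enum values below are hypothetical placeholders, not the 12 primitives actually extracted in the project:

```python
from dataclasses import dataclass
from enum import Enum

class Arrangement(Enum):   # hypothetical spatial-arrangement options
    AROUND_USER = "around_user"
    ON_SURFACES = "on_surfaces"

class Timing(Enum):        # hypothetical display-timing options
    ON_VOCAL_ONSET = "on_vocal_onset"
    CONSTANT_LEAD = "constant_lead"

class Interaction(Enum):   # hypothetical interaction-type options
    TOUCH = "touch"
    EMIT_FROM_BODY = "emit_from_body"
    SHOWER = "shower"

@dataclass(frozen=True)
class Mode:
    """A visual interactivity mode described as one choice per primitive dimension."""
    arrangement: Arrangement
    timing: Timing
    interaction: Interaction

# Usage: a "lyric shower" mode as one point in the design space.
shower_mode = Mode(Arrangement.AROUND_USER, Timing.ON_VOCAL_ONSET, Interaction.SHOWER)
```

Framing modes this way makes the design space enumerable: new modes are combinations of primitive choices rather than one-off implementations.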

Presentation
