Episodes

  • TAFFC 2025 Music Emotion: Are We There Yet? A Brief Survey of Music Emotion Prediction Datasets, Models and Outstanding Challenges
    Jan 20 2026

    Music has long been known to evoke powerful emotions, but can machines truly understand and predict these emotional responses? This survey paper takes stock of the field of music emotion recognition (MER), examining the datasets, computational models, and persistent challenges that shape this research area. The authors review how emotion is represented—from categorical labels to dimensional models like valence-arousal—and analyze the most widely used datasets including the Million Song Dataset and MediaEval benchmarks. They trace the evolution from traditional machine learning approaches using hand-crafted audio features to modern deep learning architectures. Despite significant progress, the paper identifies fundamental challenges: the subjective nature of emotional responses to music, the difficulty of obtaining reliable ground truth labels, and the gap between controlled laboratory studies and real-world listening contexts.
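The dimensional valence-arousal model mentioned above can be made concrete with a small sketch: a point in the valence-arousal plane is mapped to a coarse categorical label. The quadrant names below are a common Russell-circumplex simplification for illustration, not labels taken from the survey.

```python
def quadrant_label(valence: float, arousal: float) -> str:
    """Map a point in the valence-arousal plane to a coarse categorical
    label (Russell-style quadrants). Both inputs are assumed to be
    normalized to [-1, 1]."""
    if valence >= 0 and arousal >= 0:
        return "happy/excited"   # high valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry/tense"     # low valence, high arousal
    if valence < 0:
        return "sad/depressed"   # low valence, low arousal
    return "calm/content"        # high valence, low arousal

print(quadrant_label(0.7, 0.6))    # happy/excited
print(quadrant_label(-0.5, -0.4))  # sad/depressed
```

This is why dimensional and categorical representations are interchangeable only in one direction: a quadrant label discards the fine-grained position that regression models on valence-arousal try to predict.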

    Jaeyong Kang and Dorien Herremans. 2025. Are We There Yet? A Brief Survey of Music Emotion Prediction Datasets, Models and Outstanding Challenges. IEEE Transactions on Affective Computing, vol. 16, no. 4, 2025. https://doi.org/10.1109/TAFFC.2025.3583505

    12 min
  • UIST 2025 Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch
    Jan 16 2026

    Virtual body extensions like wings or tails offer exciting new experiences in VR, but using them naturally—especially parts you can't see, like a tail—requires proprioceptive feedback to sense position and force without relying on vision. This paper introduces the "Imaginary Joint," a novel approach that uses skin-stretch feedback at the interface between your body and a virtual extension. A wearable device stretches and compresses skin on both sides of the waist to convey joint angle and torque from a virtual tail. The system simultaneously communicates both rotation and force by superimposing skin deformations. In controlled experiments, skin-stretch feedback significantly outperformed vibrotactile feedback in perceptual accuracy, sense of embodiment, and naturalness—with participants reporting the sensation felt remarkably like having an actual tail.
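One way to picture how a single pair of actuators can carry both joint angle and torque is as a superposition of two channels: a differential component (one side of the waist stretched, the other compressed) for rotation, plus a common-mode offset for force. The encoding and gains below are a hypothetical sketch, not the paper's actual mapping or calibration.

```python
def actuator_displacements(joint_angle_deg: float, torque_nm: float,
                           angle_gain: float = 0.05, torque_gain: float = 0.8):
    """Hypothetical superposition of two skin-stretch channels.

    Joint angle -> differential component (left and right sides of the
    waist move in opposite directions); torque -> common-mode component
    added to both sides. Returns (left_mm, right_mm). The gains are
    illustrative placeholders, not measured calibration values.
    """
    differential = angle_gain * joint_angle_deg
    common = torque_gain * torque_nm
    left = common + differential
    right = common - differential
    return left, right

# 20 degrees of tail rotation plus 0.5 Nm of torque:
left, right = actuator_displacements(joint_angle_deg=20.0, torque_nm=0.5)
print(left, right)
```

The appeal of this decomposition is that the two signals remain separable at the skin: the wearer reads rotation from the asymmetry between the two sides and force from their shared offset.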

    Shuto Takashita, Jürgen Steimle, and Masahiko Inami. 2025. Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch. In The 38th Annual ACM Symposium on User Interface Software and Technology (UIST '25), September 28–October 01, 2025, Busan, Republic of Korea. ACM, New York, NY, USA, 15 pages. https://doi.org/10.1145/3746059.3747800

    14 min
  • UIST 2025 eTactileKit: An Open-Source Toolkit for Electro-Tactile Haptic Design
    Jan 7 2026

    Electro-tactile interfaces—which deliver tactile sensations through electrical stimulation of skin nerves—offer unique advantages like fast response times, thin flexible form factors, and the ability to simulate textures, softness, and even coldness. But designing them has been notoriously difficult, requiring deep electronics expertise and custom hardware. eTactileKit changes that. This open-source toolkit provides end-to-end support: modular hardware that scales from 8 to 128+ electrodes, design tools for creating 2D and 3D electrode layouts, a Processing-based pattern creator with visual simulation, a GUI for real-time testing and calibration, and APIs for Python and Unity. A three-week study with both novice and experienced designers showed the toolkit significantly lowered the barrier to entry while improving design workflows—enabling rapid prototyping of applications from VR haptic buttons to 3D-printed interactive toys.
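To make the idea of a pattern creator concrete, a spatiotemporal electro-tactile pattern can be represented as a sequence of frames, each assigning a per-electrode intensity to a grid. The data structure and `sweep_pattern` helper below are a hypothetical illustration of the concept, not eTactileKit's actual API.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One time step of a pattern: how long it lasts and the intensity
    (0.0-1.0) driven on each electrode in a rows x cols grid."""
    duration_ms: int
    intensities: list

def sweep_pattern(rows: int, cols: int, duration_ms: int = 50):
    """Build a left-to-right sweep: each frame activates one column of
    electrodes at full intensity while all others stay off."""
    frames = []
    for active_col in range(cols):
        grid = [[1.0 if c == active_col else 0.0 for c in range(cols)]
                for r in range(rows)]
        frames.append(Frame(duration_ms, grid))
    return frames

pattern = sweep_pattern(rows=4, cols=8)
print(len(pattern))  # 8 frames, one per column
```

A design tool then only needs to rasterize such frames to the stimulation hardware at the right rate, which is what lets the same pattern description scale from an 8-electrode array to 128+.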

    Praneeth Bimsara Perera, Ravindu Madhushan Pushpakumara, Hiroyuki Kajimoto, Arata Jingu, Jürgen Steimle, and Anusha Withana. 2025. eTactileKit: A Toolkit for Design Exploration and Rapid Prototyping of Electro-Tactile Interfaces. In The 38th Annual ACM Symposium on User Interface Software and Technology (UIST '25). Association for Computing Machinery, New York, NY, USA, 17 pages. https://doi.org/10.1145/3746059.3747796

    13 min
  • VRST 2025 Self-Confrontation in VR: How Seeing Yourself Shapes Motor Skill Reflection
    Jan 5 2026

    What happens when you watch yourself perform in VR—and then have to critique that performance? This study explored self-reflection in motor skill learning using a Karate training task. Participants were embodied as a "trainer" avatar and asked to give verbal feedback on a "trainee"—which was either their own 3D-scanned appearance or a stranger, performing either their own recorded movements or an expert's. The results revealed a psychological "role conflict": participants felt split between being the evaluator and being evaluated. Seeing their own appearance triggered deeper, more emotional reflection, while recognizing their own movements created bodily connection even in a stranger's avatar. The findings suggest VR embodiment isn't binary but multi-faceted, with implications for training and therapy.

    Dennis Dietz, Samuel Benjamin Rogers, Julian Rasch, Sophia Sakel, Nadine Wagener, Andreas Martin Butz, and Matthias Hoppe. 2025. The 2×2 of Being Me and You: How the Combination of Self and Other Avatars and Movements Alters How We Reflect on Ourselves in VR. In Proceedings of the 31st ACM Symposium on Virtual Reality Software and Technology (VRST '25). Association for Computing Machinery, New York, NY, USA, 11 pages. https://doi.org/10.1145/3756884.3765986

    14 min
  • AHs 2025 Pseudo-Heartbeat Feedback for Meditation Support
    Jan 2 2026

    Can fake heartbeat sounds trick your body into relaxing? This system generates pseudo-heartbeat audio at rates slightly slower than your actual heart rate to induce calm and support meditation. Using a contactless radar sensor to detect real heartbeats, it creates slower auditory feedback (10-30% below actual BPM). Tested with 120 participants at SIGGRAPH Asia 2024, results showed that hearing slower heartbeats made people feel their heart rate was decreasing—even when they knew the sounds were manipulated. The findings suggest potential for non-pharmacological treatment of insomnia through enhanced interoceptive awareness.
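The slowed-feedback mapping described above is simple arithmetic: scale the measured heart rate down by 10-30% and play one heartbeat sound per resulting inter-beat interval. A minimal sketch, with the function names being my own:

```python
def pseudo_heartbeat_bpm(measured_bpm: float, slowdown: float = 0.2) -> float:
    """Feedback rate 10-30% below the measured heart rate, matching the
    range given in the episode summary."""
    if not 0.10 <= slowdown <= 0.30:
        raise ValueError("slowdown outside the 10-30% range")
    return measured_bpm * (1.0 - slowdown)

def interbeat_interval_s(bpm: float) -> float:
    """Seconds between successive pseudo-heartbeat sounds."""
    return 60.0 / bpm

# A measured 75 BPM, slowed by 20%, yields 60 BPM feedback:
feedback = pseudo_heartbeat_bpm(75.0, slowdown=0.2)
print(feedback, interbeat_interval_s(feedback))  # 60.0 1.0
```

In the real system the measured rate comes from the contactless radar sensor and is updated continuously, so the playback interval tracks the listener's actual heart rate from below.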

    Akari Shimabukuro, Seioh Ezaki, and Keiichi Zempo. 2025. Meditation Support System Utilizing Pseudo-Heartbeat Auditory Feedback to Enhance Cardiac Interoceptive Awareness. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 4 pages. https://doi.org/10.1145/3745900.3746096

    12 min
  • AHs 2025 Purrfect Pitch: Learning Musical Intervals through Audio-Haptic Feedback
    Dec 29 2025

    Learning to identify musical pitch intervals usually requires tedious rote practice. Purrfect Pitch offers a new approach: a wearable haptic vest that translates sound into touch. When users hear two musical notes, they simultaneously feel vibrations at corresponding vertical positions on their back—leveraging our natural "high/low" pitch metaphor. In a study with 18 participants, those using the audio-haptic system identified intervals 20% more accurately and 1.67 seconds faster than audio-only learners. However, the performance boost didn't persist after removing the haptic feedback, suggesting the vest enhances task performance but doesn't accelerate long-term skill acquisition.
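The "high/low" pitch metaphor amounts to a monotone mapping from pitch to vertical position on the back. The sketch below maps two MIDI notes to vibration rows, one row per semitone; the row count and base note are illustrative assumptions, not the vest's actual layout.

```python
def note_rows(midi_low: int, midi_high: int, n_rows: int = 12,
              base_note: int = 60):
    """Hypothetical mapping from two MIDI notes to vertical vibration
    rows on a haptic vest: row 0 at the bottom of the back, higher
    pitches further up, one row per semitone above `base_note` (C4)."""
    def to_row(note: int) -> int:
        row = note - base_note
        if not 0 <= row < n_rows:
            raise ValueError("note outside the vest's vertical range")
        return row
    return to_row(midi_low), to_row(midi_high)

# A perfect fifth (C4 -> G4) is felt as motors 7 rows apart:
print(note_rows(60, 67))  # (0, 7)
```

Under this mapping the interval becomes a spatial distance on the skin, which is the redundancy the study exploited: the same interval is heard and felt at the same time.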

    Sam Chin, Cathy Mengying Fang, Nikhil Singh, Ibrahim Ibrahim, Joe Paradiso, and Pattie Maes. 2025. Purrfect Pitch: Exploring Pitch Interval Learning through an Audio-Haptic Interface. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 12 pages. https://doi.org/10.1145/3745900.3746079

    14 min
  • AHs 2025 GazeLLM: Multimodal LLMs incorporating Human Visual Attention
    Dec 27 2025

    Processing high-resolution video with AI requires massive computational resources. GazeLLM offers an elegant solution inspired by human vision: use eye-tracking to focus only on what matters. By cropping first-person video to a small region around the user's gaze point, the system reduces pixel input to just one-tenth while achieving task comprehension equal to or better than full-resolution video. User evaluations across six real-world activities—cooking, bike repair, first aid, and sports—showed that gaze-focused video produces higher quality task descriptions than both full videos and center-cropped alternatives.
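The gaze-cropping step can be sketched as pure geometry: take a window centered on the gaze point whose area is one-tenth of the frame, clamped to the frame borders. This is an illustrative sketch of the idea, not the paper's implementation.

```python
def gaze_crop_bounds(frame_w: int, frame_h: int, gaze_x: int, gaze_y: int,
                     area_fraction: float = 0.1):
    """Return (left, top, right, bottom) of a crop window centered on
    the gaze point, covering `area_fraction` of the frame's pixels
    (one-tenth by default, as in the episode summary) and clamped so it
    stays inside the frame. Aspect ratio follows the frame."""
    scale = area_fraction ** 0.5  # shrink each side by sqrt(fraction)
    crop_w, crop_h = int(frame_w * scale), int(frame_h * scale)
    left = min(max(gaze_x - crop_w // 2, 0), frame_w - crop_w)
    top = min(max(gaze_y - crop_h // 2, 0), frame_h - crop_h)
    return left, top, left + crop_w, top + crop_h

# 1920x1080 frame with gaze near the top-left corner: the window
# clamps to the frame edge instead of running off it.
print(gaze_crop_bounds(1920, 1080, gaze_x=100, gaze_y=50))
```

Feeding only this window to a multimodal LLM is what cuts the pixel budget by roughly 10x while keeping the region the wearer was actually attending to.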

    Jun Rekimoto. 2025. GazeLLM: Multimodal LLMs incorporating Human Visual Attention. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 10 pages. https://doi.org/10.1145/3745900.3746075

    12 min
  • AHs 2025 Cuddle-Fish: Flying Robots That Are Safe, Huggable Companions
    Dec 13 2025

    Traditional quadrotor drones pose safety concerns with their spinning blades. Cuddle-Fish takes a different approach: a helium-filled soft robot with bio-inspired flapping wings that's safe enough to touch, hug, and interact with physically. In testing with 24 participants, people spontaneously engaged in affective behaviors like patting, stroking, and even hugging the robot. Users reported positive emotional responses and felt safe during interactions, with some participants touching the robot to their cheeks, demonstrating trust and comfort.

    Mingyang Xu, Jiayi Shao, Yulan Ju, Ximing Shen, Qingyuan Gao, Weijen Chen, Qing Zhang, Yun Suen Pai, Giulia Barbareschi, Matthias Hoppe, Kouta Minamizawa, and Kai Kunze. 2025. Cuddle-Fish: Exploring a Soft Floating Robot with Flapping Wings for Physical Interactions. In Proceedings of the Augmented Humans International Conference 2025 (AHs '25). Association for Computing Machinery, New York, NY, USA, 14 pages. https://doi.org/10.1145/3745900.3746080

    14 min