HCI Deep Dives

By: Kai Kunze

About this title

HCI Deep Dives is your go-to podcast for exploring the latest trends, research, and innovations in Human Computer Interaction (HCI). AI-generated using the latest publications in the field, each episode dives into in-depth discussions on topics like wearable computing, augmented perception, cognitive augmentation, and digitalized emotions. Whether you’re a researcher, practitioner, or just curious about the intersection of technology and human senses, this podcast offers thought-provoking insights and ideas to keep you at the forefront of HCI. Copyright 2024 All rights reserved.
  • TAFFC 2025 Music Emotion: Are We There Yet? A Brief Survey of Music Emotion Prediction Datasets, Models and Outstanding Challenges
    Jan 20 2026

    Music has long been known to evoke powerful emotions, but can machines truly understand and predict these emotional responses? This survey paper takes stock of the field of music emotion recognition (MER), examining the datasets, computational models, and persistent challenges that shape this research area. The authors review how emotion is represented—from categorical labels to dimensional models like valence-arousal—and analyze the most widely used datasets including the Million Song Dataset and MediaEval benchmarks. They trace the evolution from traditional machine learning approaches using hand-crafted audio features to modern deep learning architectures. Despite significant progress, the paper identifies fundamental challenges: the subjective nature of emotional responses to music, the difficulty of obtaining reliable ground truth labels, and the gap between controlled laboratory studies and real-world listening contexts.

    Jaeyong Kang and Dorien Herremans. 2025. Are We There Yet? A Brief Survey of Music Emotion Prediction Datasets, Models and Outstanding Challenges. IEEE Transactions on Affective Computing, vol. 16, no. 4, 2025. https://doi.org/10.1109/TAFFC.2025.3583505

    12 min
  • UIST 2025 Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch
    Jan 16 2026

    Virtual body extensions like wings or tails offer exciting new experiences in VR, but using them naturally—especially parts you can't see, like a tail—requires proprioceptive feedback to sense position and force without relying on vision. This paper introduces the "Imaginary Joint," a novel approach that uses skin-stretch feedback at the interface between your body and a virtual extension. A wearable device stretches and compresses skin on both sides of the waist to convey joint angle and torque from a virtual tail. The system simultaneously communicates both rotation and force by superimposing skin deformations. In controlled experiments, skin-stretch feedback significantly outperformed vibrotactile feedback in perceptual accuracy, sense of embodiment, and naturalness—with participants reporting the sensation felt remarkably like having an actual tail.

    Shuto Takashita, Jürgen Steimle, and Masahiko Inami. 2025. Imaginary Joint: Proprioceptive Feedback for Virtual Body Extensions via Skin Stretch. In The 38th Annual ACM Symposium on User Interface Software and Technology (UIST '25), September 28–October 01, 2025, Busan, Republic of Korea. ACM, New York, NY, USA, 15 pages. https://doi.org/10.1145/3746059.3747800

    14 min
  • UIST 2025 eTactileKit: An Open-Source Toolkit for Electro-Tactile Haptic Design
    Jan 7 2026

    Electro-tactile interfaces—which deliver tactile sensations through electrical stimulation of skin nerves—offer unique advantages like fast response times, thin flexible form factors, and the ability to simulate textures, softness, and even coldness. But designing them has been notoriously difficult, requiring deep electronics expertise and custom hardware. eTactileKit changes that. This open-source toolkit provides end-to-end support: modular hardware that scales from 8 to 128+ electrodes, design tools for creating 2D and 3D electrode layouts, a Processing-based pattern creator with visual simulation, a GUI for real-time testing and calibration, and APIs for Python and Unity. A three-week study with both novice and experienced designers showed the toolkit significantly lowered the barrier to entry while improving design workflows—enabling rapid prototyping of applications from VR haptic buttons to 3D-printed interactive toys.

    Praneeth Bimsara Perera, Ravindu Madhushan Pushpakumara, Hiroyuki Kajimoto, Arata Jingu, Jürgen Steimle, and Anusha Withana. 2025. eTactileKit: A Toolkit for Design Exploration and Rapid Prototyping of Electro-Tactile Interfaces. In The 38th Annual ACM Symposium on User Interface Software and Technology (UIST '25). Association for Computing Machinery, New York, NY, USA, 17 pages. https://doi.org/10.1145/3746059.3747796

    13 min
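
The valence-arousal representation mentioned in the music emotion survey episode above can be sketched in a few lines: discrete emotion labels are placed as points on a 2D plane (valence = pleasantness, arousal = intensity), and a continuous model prediction is mapped to the nearest label. The coordinates and label set below are illustrative assumptions for this sketch, not values from the paper.

```python
import math

# Hypothetical label positions in [-1, 1] x [-1, 1] valence-arousal space
# (illustrative only; real datasets annotate these empirically).
EMOTION_COORDS = {
    "happy":   ( 0.8,  0.6),
    "angry":   (-0.7,  0.8),
    "sad":     (-0.7, -0.5),
    "relaxed": ( 0.6, -0.6),
}

def nearest_emotion(valence: float, arousal: float) -> str:
    """Map a continuous valence-arousal prediction to the closest discrete label."""
    return min(
        EMOTION_COORDS,
        key=lambda e: math.dist((valence, arousal), EMOTION_COORDS[e]),
    )

# A high-valence, high-arousal prediction lands near "happy".
print(nearest_emotion(0.7, 0.5))
```

This nearest-neighbor mapping also illustrates one challenge the survey raises: categorical and dimensional representations are lossy translations of each other, since distinct emotions can sit close together in the 2D plane.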