Music. We use it to express, to understand, and to remember. Many of us aspire to engage with it more deeply, yet few of us receive effective training. How can machine learning help humans learn music? How do we unlock the potential of a multimodal, high-throughput human-computer interface for music education? My work aims to revolutionize the way music is taught in the information age. The smart tutoring system our team has been developing combines haptic gloves, real-time feedback, and student-centric computing to create an unconventional yet highly effective way to learn music.
Publications
- Chin, Daniel and Gus Xia. "A Computer-aided Multimodal Music Learning System with Curriculum: A Pilot Study." Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2022.
- Li, Michael Yue, Daniel Chin, Charles Puelz, and Pejman Sanaei. "Simulating liquid-gas interfaces and moving contact lines with the immersed boundary method." Physics of Fluids 34, 053323, 2022. https://doi.org/10.1063/5.0086452.
- Chin, Daniel and Gus Xia. "Measuring a Six-hole Recorder Flute's Response to Breath Pressure Variations and Fitting a Model." arXiv preprint arXiv:2107.08727, 2021.
- Chin, Daniel, Ian Zhang, and Gus Xia. "Hyper-hybrid Flute: Simulating and Augmenting How Breath Affects Octave and Microtone." Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2021.
- Chin, Daniel, Yian Zhang, Tianyu Zhang, Jake Zhao, and Gus Xia. "Interactive Rainbow Score: A Visual-centered Multimodal Flute Tutoring System." Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2020.
- Zhang, Yian, Yinmiao Li, Daniel Chin, and Gus Xia. "Adaptive Multimodal Music Learning via Interactive-haptic Instrument." Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2019.
Research Interests
HCI