Moi! This is Yi-Chi!


I’m a PhD student at Aalto University researching Computational Interaction as part of the User Interfaces Group, led by Prof. Antti Oulasvirta. Before this, I received my bachelor’s and master’s degrees in Information Management at National Taiwan University. My interests lie in designing and building physical interfaces using computational methods such as Bayesian data analysis, optimization, reinforcement learning, and meta-learning. My ultimate goal is to bring HCI one step closer to optimized touch and haptic interactions.

My current research focuses on assisting UI design via multi-objective Bayesian optimization and on modelling how humans interact with physical interfaces using meta-reinforcement learning. I have published in peer-reviewed venues including CHI, UIST, and SIGGRAPH, among the ACM’s top venues for human-computer interaction.

I also lecture on various topics at Aalto University, including input and sensing, Bayesian decoders, Bayesian data analysis, and deep learning. I have been invited to review for multiple ACM conferences and IEEE journals.


Selected Publications


Button Simulation and Design via FDVV Models [CHI’20, 10-page Paper]

Designing a push-button with the desired sensation and performance is challenging because the mechanical construction must have the right response characteristics. In this paper, we extend typical force-displacement (FD) modelling to include vibration (V) and velocity-dependence (V) characteristics. The resulting FDVV models better capture the tactile characteristics of buttons, increasing both the range of buttons that can be simulated and the perceived realism relative to FD models. The paper also demonstrates methods for obtaining these models, editing them, and simulating buttons accordingly. Our approach enables the analysis, prototyping, and optimization of buttons, and supports exploring designs that would be hard to implement mechanically.

In Proc. CHI’20 // [Project Page], [Paper], [30s Video], [Full Video].
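
For the curious, here is a minimal sketch of what an FDVV-style button model can look like in code. The function, the constants, and the specific forms of the velocity and vibration terms are illustrative assumptions of mine, not the models from the paper:

```python
import numpy as np

def fdvv_force(displacement_mm, velocity_mm_s, t_s, fd_curve,
               velocity_gain=0.02, vib_amplitude_n=0.05, vib_freq_hz=250.0):
    """Toy FDVV-style button force: a static FD curve plus a
    velocity-dependent term and a decaying vibration burst.
    All constants here are made-up placeholders."""
    # Static FD response: interpolate force from a measured FD curve.
    static_force = np.interp(displacement_mm,
                             fd_curve["disp_mm"], fd_curve["force_n"])
    # Velocity dependence: faster presses feel slightly stiffer (linear model).
    velocity_force = velocity_gain * velocity_mm_s
    # Vibration component: a short decaying burst, e.g. around buckling.
    vibration = (vib_amplitude_n * np.exp(-50.0 * t_s)
                 * np.sin(2 * np.pi * vib_freq_hz * t_s))
    return static_force + velocity_force + vibration

# Example: a toy FD curve for a tactile button with a force drop at buckling.
fd_curve = {"disp_mm": [0.0, 0.5, 1.0, 1.5, 2.0],
            "force_n": [0.0, 1.2, 2.0, 1.1, 2.5]}
print(fdvv_force(displacement_mm=1.2, velocity_mm_s=8.0, t_s=0.004,
                 fd_curve=fd_curve))
```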



Dwell+: Multi-Level Mode Selection Using Vibrotactile Cues [UIST’17, 10-page Paper]

This paper presents Dwell+, a method that boosts the effectiveness of typical dwell selection by augmenting the passive dwell duration with active haptic ticks, which rapidly advance through a sequence of modes that the user perceives through the skin. Dwell+ thus enables multi-level dwell selection: to select a mode from a button, users dwell-touch the button until the desired mode is haptically prompted, then release. We demonstrate applications of Dwell+ across different interfaces, ranging from vibration-enabled touchscreens to non-vibrating interfaces.

In Proc. UIST’17 // [Project Page], [Paper], [30s Video], [Full Video].
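
As an illustration of the interaction logic, below is a minimal sketch of a Dwell+-style selection loop. The mode names, tick interval, and haptics call are placeholder assumptions, not the paper’s implementation:

```python
import time

MODES = ["copy", "paste", "delete"]   # hypothetical modes, for illustration only
TICK_INTERVAL_S = 0.25                # assumed interval between haptic ticks

def play_haptic_tick(mode):
    # Placeholder for driving the vibrotactile actuator.
    print(f"tick -> {mode}")

def dwell_plus_select(is_touching):
    """Advance through modes with haptic ticks while the finger stays down;
    the mode prompted last before release is the selection."""
    selected, mode_index = None, -1
    touch_start = time.time()
    while is_touching():
        elapsed = time.time() - touch_start
        next_index = int(elapsed // TICK_INTERVAL_S)
        if next_index != mode_index and next_index < len(MODES):
            mode_index = next_index
            selected = MODES[mode_index]
            play_haptic_tick(selected)
        time.sleep(0.01)
    return selected

# Example: simulate a touch held for ~0.6 s, which lands on the third mode.
touch_end = time.time() + 0.6
print(dwell_plus_select(lambda: time.time() < touch_end))
```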



Outside-In: Visualizing Out-of-Sight Regions-of-Interest in a 360° Video Using Spatial Picture-in-Picture Previews [UIST’17, 9-page Paper]

We propose Outside-In, a visualization technique that re-introduces off-screen ROIs into the main screen as spatial picture-in-picture (PIP) previews. The geometry of the preview windows further encodes each ROI’s relative direction to the main screen view, allowing for effective navigation.

In Proc. UIST’17 // [Project Page], [Paper], [Video].
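
As a rough illustration of the geometry involved, the toy sketch below anchors a preview at the screen border in the direction of an off-screen ROI. The placement rule and field-of-view value are my own simplification (yaw only), not the paper’s actual layout method:

```python
def pip_anchor(roi_yaw_deg, view_yaw_deg, screen_w, screen_h, fov_deg=90.0):
    """Anchor an off-screen ROI's picture-in-picture preview at the screen
    border in the ROI's direction relative to the current view (yaw only)."""
    # Signed angular offset of the ROI from the view centre, wrapped to (-180, 180].
    offset = (roi_yaw_deg - view_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(offset) <= fov_deg / 2:
        return None  # ROI is on-screen; no preview needed.
    # Place the preview at the left or right border, matching the offset's sign.
    x = screen_w - 1 if offset > 0 else 0
    y = screen_h // 2
    return (x, y, offset)

# Example: an ROI 150° to the right of the current view lands at the right border.
print(pip_anchor(roi_yaw_deg=150.0, view_yaw_deg=0.0, screen_w=1920, screen_h=1080))
```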



EdgeVib: Effective Alphanumeric Character Output Using a Wrist-Worn Tactile Display [UIST’16, 6-page Paper]

Yi-Chi Liao, Yi-Ling Chen, Jo-Yu Lo, Rong-Hao Liang, Liwei Chan, Bing-Yu Chen

“Transferring rich spatiotemporal tactile messages while retaining high recognition rates” has been a major challenge in the development of tactile displays. We present EdgeVib, a set of multistroke alphanumeric vibrotactile patterns based on EdgeWrite. Learning these patterns takes a period comparable to learning Graffiti (about 15 minutes), while recognition rates reach 85.9% for letters and 88.6% for digits.

In Proc. UIST’16 // [Project Page], [Paper], [Video].


ThirdHand: Wearing a Robotic Arm to Experience Rich Force Feedback [SIGGRAPH Asia’15 Emerging Technology]

Yi-Chi Liao, Shun-Yao Yang, Rong-Hao Liang, Liwei Chan, Bing-Yu Chen

ThirdHand is a wearable robotic arm that provides 5-DOF force feedback to enrich the mobile gaming experience. Compared to traditional environment-mounted force-feedback devices such as the Phantom, ThirdHand offers higher mobility thanks to its wearable form. Compared to muscle-propelled and gyro-effect solutions, our approach enables more accurate control with stronger forces.

In Proc. SIGGRAPH Asia’15 Emerging Technology // [Project Page], [Paper], [Video].


Paper Reviews

IEEE Haptics Symposium: 2020
IEEE Transactions on Haptics: 2019
ACM CHI: 2016-2020
ACM MobileHCI: 2017-2020
ACM TEI: 2017-2018
ACM UbiComp/ISWC: 2017
Augmented Human: 2016-2017


Other Professional Activities

In 2020, I gave a lecture on Bayesian statistics and its applications in the User Research course at Aalto University (taught by Dr. Aurélien Nioche). I also gave a lecture on deep learning in the Computational User Interface Design course at Aalto University (taught by Prof. Antti Oulasvirta).

In 2019, I gave a lecture introducing probabilistic decoding in the Computational User Interface Design course at Aalto University (taught by Prof. Antti Oulasvirta). In another course, Engineering for Humans at Aalto University (taught by Prof. Antti Oulasvirta), I talked about the input sensing pipeline and data processing.

From 2014 to 2016, I was a teaching assistant for Introduction to HCI (taught by Prof. Rong-Hao Liang and Prof. Bing-Yu Chen) and Computer Architecture (taught by Prof. Bing-Yu Chen) at National Taiwan University.

I was also a student volunteer at SIGGRAPH Asia 2016 and helped organize OpenHCI’15 and OpenHCI’16, the largest HCI workshops in Taiwan. In addition, I organized another workshop, HoCuIn’17, to introduce research-oriented HCI to graduate students in Taiwan.