Moi! This is Yi-Chi!


Hi, I’m a Ph.D. student working on Computational Interaction in the User Interfaces research group at Aalto University, led by Prof. Antti Oulasvirta. Before that, I received my bachelor’s (B.B.A.) and master’s (M.B.A.) degrees from National Taiwan University. As a believer in computational methods, my research focuses on building future touch and haptic interfaces through modeling, control, optimization, simulation, and machine learning.

My previous work has been presented at top-tier venues in the Human-Computer Interaction field (CHI, UIST, SIGGRAPH Asia E-Tech). These projects include Button Simulation and Design, a novel model of button tactility with an end-to-end simulation and design pipeline; Dwell+, which augments traditional dwell-touch with short vibration ticks; Outside-In, a visualization technique that re-introduces off-screen objects into the main screen as picture-in-picture (PIP) previews; EdgeVib, which uses a 2x2 vibrotactile array to transfer alphanumeric characters; and ThirdHand, a wearable robotic arm that generates rich force feedback for the user.


Selected Publications


Dwell+: Multi-Level Mode Selection Using Vibrotactile Cues [UIST’17 Paper]

This paper presents Dwell+, a method that boosts the effectiveness of typical dwell selection by augmenting the passive dwell duration with active haptic ticks, which rapidly step the current mode forward through the user’s skin sensation. Dwell+ thus enables multi-level dwell selection: to select a mode from a button, users dwell-touch the button until the desired mode is haptically prompted. We demonstrate applications of Dwell+ across different interfaces, ranging from vibration-enabled touchscreens to non-vibrating interfaces.
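To make the interaction concrete, here is a minimal Python sketch of such a selection loop, assuming a release-to-select behaviour; the names (touch_is_down, vibrate, tick_interval) are illustrative placeholders rather than the paper’s actual implementation:

    import time

    def dwell_plus_select(modes, touch_is_down, vibrate, tick_interval=0.35):
        """Step through `modes` with one haptic tick per interval while the
        finger stays down; the mode active at release is the selection."""
        index = 0
        last_tick = time.monotonic()
        while touch_is_down():
            now = time.monotonic()
            if now - last_tick >= tick_interval:
                index = (index + 1) % len(modes)  # advance to the next mode
                vibrate()                         # haptic tick marks the switch
                last_tick = now
            time.sleep(0.005)                     # avoid busy-waiting
        return modes[index]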

In Proc. UIST’17, 12-page paper // [Project Page], [Paper], [Short Video], [Full Video].



Outside-In: Visualizing Out-of-Sight Regions-of-Interest in a 360 Video Using Spatial Picture-in-Picture Previews [UIST’17 Paper]

We propose Outside-In, a visualization technique which re-introduces off-screen ROIs into the main screen as spatial picture-in-picture (PIP) previews. The geometry of the preview windows further encodes the ROIs’ relative directions to the main screen view, allowing for effective navigation.
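As a rough illustration of how a preview’s placement can encode an off-screen ROI’s direction, here is a simplified Python sketch that maps the yaw offset between the ROI and the current view to a position near the screen border; the function and parameter names are hypothetical, and this is not the paper’s actual projection model:

    import math

    def pip_position(roi_yaw_deg, view_yaw_deg, screen_w, screen_h, margin=48):
        """Anchor a picture-in-picture preview toward an off-screen ROI.

        The signed yaw offset (wrapped to -180..180 degrees) is treated as a
        direction from the screen center, and the preview is placed on an
        ellipse just inside the screen border along that direction.
        """
        offset = (roi_yaw_deg - view_yaw_deg + 180.0) % 360.0 - 180.0
        angle = math.radians(offset)
        cx, cy = screen_w / 2.0, screen_h / 2.0
        x = cx + (cx - margin) * math.sin(angle)  # ROI to the right -> right edge
        y = cy - (cy - margin) * math.cos(angle)  # ROI behind the view -> bottom edge
        return x, y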

In Proc. UIST’17, 10-page paper // [Project Page], [Paper], [Video].



EdgeVib: Effective Alphanumeric Character Output Using a Wrist-Worn Tactile Display [UIST’16 Paper]

Yi-Chi Liao, Yi-Ling Chen, Jo-Yu Lo, Rong-Hao Liang, Liwei Chan, Bing-Yu Chen

“Transferring rich spatiotemporal tactile messages while retaining high recognition rates” has been a major challenge in the development of tactile displays. We present EdgeVib, a set of multistroke alphanumeric patterns based on EdgeWrite. Learning these patterns takes a period comparable to learning Graffiti (15 min), while the recognition rates reach 85.9% and 88.6% for alphabet and digits, respectively.
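To illustrate the idea of multistroke vibrotactile patterns on a 2x2 array, here is a hypothetical Python sketch; the actuator numbering, the example patterns, and the timing values are my own illustrative assumptions, not the actual EdgeVib pattern set:

    import time

    # Corner actuators of the 2x2 wrist-worn array (illustrative numbering):
    # 0 = upper-left, 1 = upper-right, 2 = lower-left, 3 = lower-right.
    PATTERNS = {
        # Each character maps to a list of strokes; each stroke vibrates a
        # sequence of corners one after another (EdgeWrite-style corner paths).
        "L": [[0, 2, 3]],       # one stroke: down the left edge, then right
        "T": [[0, 1], [1, 3]],  # two strokes: across the top, then down
    }

    def play(character, vibrate, pulse=0.12, inter_stroke_pause=0.3):
        """Render a character stroke by stroke; `vibrate(i, duration)`
        drives actuator i for `duration` seconds."""
        for stroke in PATTERNS[character]:
            for actuator in stroke:
                vibrate(actuator, pulse)
                time.sleep(pulse)
            time.sleep(inter_stroke_pause)  # a pause separates the strokes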

In Proc. UIST’16, 7-page paper // [Project Page], [Paper], [Video].


ThirdHand: Wearing a Robotic Arm to Experience Rich Force Feedback [SIGGRAPH Asia’15 Emerging Technologies]

Yi-Chi Liao, Shun-Yao Yang, Rong-Hao Liang, Liwei Chan, Bing-Yu Chen

ThirdHand is a wearable robotic arm that provides 5-DOF force feedback to enrich the mobile gaming experience. Compared to traditional environment-mounted force-feedback devices such as the Phantom, ThirdHand offers higher mobility thanks to its wearable form. Also, compared to muscle-propelled and gyro-effect solutions, our approach enables more accurate control with stronger forces.

In Proc. SIGGRAPH Asia’15 Emerging Technologies // [Project Page], [Paper], [Video].


Paper Reviewing

IEEE Haptics Symposium: 2020
IEEE Transactions on Haptics: 2019
ACM CHI: 2016-2020
ACM MobileHCI: 2017-2020
ACM TEI: 2017-2018
ACM UbiComp/ISWC: 2017
Augmented Human: 2016-2017


Other Professional Activities

In 2019, I gave a guest lecture in the Computational User Interface Design course at Aalto University (taught by Prof. Antti Oulasvirta), in which I talked about Probabilistic Decoding. In another course, Engineering for Humans, also at Aalto University and taught by Prof. Antti Oulasvirta, I gave a lecture on the Input Sensing Pipeline and Data Processing.

From 2014 to 2016, I was a teaching assistant for Introduction to HCI (lectured by Prof. Rong-Hao Liang and Prof. Bing-Yu Chen) and Computer Architecture (lectured by Prof. Bing-Yu Chen) at National Taiwan University.

I was also a student volunteer at SIGGRAPH Asia 2016 and organized the biggest HCI workshop in Taiwan, OpenHCI’15 and OpenHCI’16. In addition, I organized another workshop, HoCuIn’17, to introduce research-oriented HCI to graduate students in Taiwan.