I am Linping YUAN (袁林萍), currently a research assistant professor at the Department of Computer Science and Engineering (CSE), The Hong Kong University of Science and Technology (HKUST). I received my Ph.D. from the VisLab at CSE, HKUST, supervised by Prof. Huamin Qu. Before that, I obtained my bachelor's degree in Software Engineering from Xi'an Jiaotong University (XJTU) in 2019.

My research interests lie at the intersection of virtual/augmented reality (VR/AR), human-computer interaction (HCI), and data visualization (VIS). I design and develop algorithms, interactive tools, and visual analytics systems to facilitate the creation of various artifacts, including 2D visualizations and infographics, AR data videos, and VR animated stories. Specifically, my research 1) provides computational creativity support by mining design practices from large datasets with deep learning techniques, and 2) adopts a data-driven approach to help creators understand and improve the way audiences interact with their artifacts.

Feel free to drop me an email if you are interested in my research or would like to explore research collaboration :D

Latest News

I am serving as an Area Chair of the Computational Interaction Subcommittee for CHI 2025. Thanks for the invitation!
I obtained my Ph.D. degree and joined CSE of HKUST as a research assistant professor.
Two papers have been accepted by UIST 2024, in which we propose an MR system for animation prototyping and a proactive chatbot to help visually impaired people reminisce with a photo collection.
Our paper VisTellAR: Embedding Data Visualization to Short-form Videos Using Mobile Augmented Reality has been accepted by IEEE TVCG. We propose a tool for creating AR data videos!
Our paper Generating Virtual Reality Stroke Gesture Data from Out-of-Distribution Desktop Stroke Gesture Data has been accepted by IEEE VR 2024! It is an important milestone in my journey to becoming an independent researcher.
Our paper VirtuWander: Enhancing Multi-modal Interaction for Virtual Tour Guidance through Large Language Models has been conditionally accepted by ACM CHI 2024. It is an interesting exploration of how LLMs can enhance VR experiences.
I am very excited to start a new journey at the Intelligent Interactive Systems Group at the University of Cambridge as a visiting student, supervised by Prof. Per Ola Kristensson.
The exhibition China in Maps - 500 Years of Evolving Images has launched at HKUST! Glad to be an AR developer on this project.
VR data stories are a promising and engaging storytelling form. Want to know how to design an effective and interactive VR story? Check out our recent IJHCS paper From Reader to Experiencer!
Our research proposal Augmenting Situated Visualizations with Tangible User Interfaces was awarded over 1M HKD by the General Research Fund of the Research Grants Council (RGC) of Hong Kong.

Generating Virtual Reality Stroke Gesture Data from Out-of-Distribution Desktop Stroke Gesture Data

IEEE Conference on Virtual Reality and 3D User Interfaces (2024)

Exploring Interactions with Printed Data Visualizations in Augmented Reality

IEEE Transactions on Visualization and Computer Graphics (2022)
Honorable Mention Award

InfoColorizer: Interactive Recommendation of Color Palettes for Infographics

Linping Yuan, Ziqi Zhou, Jian Zhao, Yiqiu Guo, Fan Du, Huamin Qu
IEEE Transactions on Visualization and Computer Graphics (2021)

Deep Colormap Extraction from Visualizations

Linping Yuan, Wei Zeng, Siwei Fu, Zhiliang Zeng, Haotian Li, Chi-Wing Fu, Huamin Qu
IEEE Transactions on Visualization and Computer Graphics (2021)

