I am Linping YUAN (袁林萍), currently a research assistant professor at the Department of Computer Science and Engineering (CSE), The Hong Kong University of Science and Technology (HKUST). I received my Ph.D. from the HKUST VisLab in the Department of CSE at HKUST, supervised by Prof. Huamin Qu. Before that, I obtained my bachelor’s degree in Software Engineering from Xi’an Jiaotong University (XJTU) in 2019.
My research interests lie at the intersection of virtual/augmented reality (VR/AR), human-computer interaction (HCI), and data visualization (VIS). I design and develop algorithms, interactive tools, and visual analytics systems to facilitate the creation of various artifacts, including 2D visualizations and infographics, AR data videos, and VR animated stories. Specifically, my research 1) provides computational creativity support by mining design practices from large datasets with deep learning techniques, and 2) adopts data-driven approaches to help creators understand and improve how audiences interact with their artifacts.
Feel free to drop me an email if you are interested in my research or would like to explore research collaboration :D
Latest News
I am serving as an Area Chair of the Computational Interaction Subcommittee for CHI 2025. Thanks for the invitation!
I obtained my Ph.D. degree and joined the Department of CSE at HKUST as a research assistant professor.
Two papers have been accepted by UIST 2024: one proposes an MR system for animation prototyping, and the other a proactive chatbot that helps visually impaired people reminisce with a photo collection.
Our paper VisTellAR: Embedding Data Visualization to Short-form Videos Using Mobile Augmented Reality has been accepted by IEEE TVCG. We propose a tool to create AR data videos!
Our paper Generating Virtual Reality Stroke Gesture Data from Out-of-Distribution Desktop Stroke Gesture Data has been accepted by IEEE VR 2024! It is an important milestone in my journey to becoming an independent researcher.
Our paper VirtuWander: Enhancing Multi-modal Interaction for Virtual Tour Guidance through Large Language Models has been conditionally accepted by ACM CHI 2024. It is an interesting exploration of how LLMs can enhance VR experiences.
I am very excited to start a new journey at the Intelligent Interactive Systems Group at the University of Cambridge as a visiting student, supervised by Prof. Per Ola Kristensson.
The exhibition China in Maps - 500 Years of Evolving Images has been launched at HKUST! Glad to be an AR developer on this amazing project.
VR data stories are a promising and interesting storytelling form. Want to know how to design an effective and interactive VR story? Check out our recent IJHCS paper From Reader to Experiencer!
Our research proposal Augmenting Situated Visualizations with Tangible User Interfaces was awarded over 1M HKD by the General Research Fund of the Research Grants Council (RGC) of Hong Kong.
Featured Publications
Featured Projects
ViePIE aims to promote a sustainable lifestyle through AR gamification and digital twins. We enable individuals to perceive the environmental implications of their measurable activities, immerse themselves in the impacts of climate change, and empower both individuals and organizations to make climate-smart choices.
AR Ricci Map is an AR mobile app that blends digital maps with a physical map. It shows different versions of ancient maps and transforms 2D maps into 3D globes, providing exhibition visitors with a new and engaging way to understand these historical maps.
Immersive Walk is an app that celebrates the HKUST 30th Anniversary by allowing students to explore HKUST’s 30-year history in Augmented Reality. This is one step toward our goal of building AR/VR-enhanced HKUST campuses and enriching the living experience of students and staff.