Publication
Recent Projects

Exploring Familiarity and Knowledgeability in Conversational Virtual Agents

This study contributes to the field of conversational human-agent interaction in VR by providing empirical evidence on how adapting both familiarity and knowledgeability of virtual agents can significantly enhance user experience...


Embodied Conversational Agents in Extended Reality: A Systematic Review

Published in: IEEE Access 2025

Our work identified the gap between existing reviews and current trends in extended reality (XR) embodied conversational agents (ECAs). Starting from 1,717 related papers published between January 2014 and June 2024, we narrowed the selection to 23 papers using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework...


The Effects of Depth of Knowledge of a Virtual Agent

We explored the impact of depth of knowledge on conversational agents and human perceptions in a virtual reality (VR) environment. We designed experimental conditions with low, medium, and high depths of knowledge in the domain of game development and tested them among 27 game development students...


Avoiding Virtual Characters: The Effects of Proximity and Gesture

Our study revealed that 1) the proximity factor impacted how our participants rated their co-presence and behavioral interdependence, as well as whether they decided to pass through or around the virtual characters, and 2) the gesture factor impacted how participants rated their behavioral interdependence, emotional reactivity, and perceived politeness, and also affected their walking duration, trajectory length, and speed...


Holographic Sign Language Interpreters

Published in: ACM SIGGRAPH 2022 Educator's Forum

We describe the implementation of a prototype system of 3D holographic sign language interpreters. The signing avatars, observed through wearable Mixed Reality (MR) smartglasses (e.g., Microsoft HoloLens), translate speech to Signing Exact English (SEE) in real time...
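At its core, the translation step maps recognized words onto prerecorded sign animations. The sketch below is a hypothetical illustration of that mapping only (the sign library, clip names, and fingerspelling fallback are our assumptions, not the prototype's actual code, which runs engine-side on the HoloLens):

```python
# Hypothetical sketch of speech-to-SEE clip selection.
# Clip names and the tiny sign library are illustrative assumptions.

SEE_SIGN_LIBRARY = {
    "hello": "clip_hello",
    "class": "clip_class",
    "start": "clip_start",
}
FINGERSPELL_CLIP = "clip_fingerspell_{letter}"


def words_to_sign_clips(words: list[str]) -> list[str]:
    """Map recognized words to sign animation clips.

    Words in the library get their dedicated clip; unknown words fall
    back to fingerspelling one letter at a time, as SEE systems often do.
    """
    clips: list[str] = []
    for word in words:
        if word in SEE_SIGN_LIBRARY:
            clips.append(SEE_SIGN_LIBRARY[word])
        else:
            clips.extend(FINGERSPELL_CLIP.format(letter=c) for c in word)
    return clips
```

In a real pipeline, the resulting clip sequence would be queued onto the avatar's animation controller as the speech recognizer emits words.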


Holographic sign language avatar interpreter: A user interaction study in a mixed reality classroom

Published in: Computer Animation and Virtual Worlds (CASA 2022)

We explored user interactions with a holographic sign language interpreter in a mixed reality (MR) classroom for deaf and hard-of-hearing students... Our study examined user interaction with the MR system, aiming to provide design guidelines for digital MR sign language interpreters...


Embodiment for the Difference: A VR Experience of Bipolar Disorder

Published in: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)

The objective of this project is to simulate the symptoms of bipolar disorder through a virtual reality application. We aim to provide an experience of how people live with bipolar disorder...

Projects

Co-solving a Jigsaw Puzzle with an LLM-powered Virtual Agent

2025 - Purdue Virtual Reality Lab

Intelligence and knowledgeability are sometimes treated interchangeably in virtual agents, yet they shape interaction in different ways. We disentangled these traits and tested how each drives human perceptions and interaction in virtual reality (VR)...


LLM-powered Virtual Museum Tour Guide

2024 - Purdue Virtual Reality Lab

We explored how responsiveness (i.e., the ability to answer questions) and awareness (i.e., the ability to navigate toward the user in the virtual environment) of a virtual agent acting as a tour guide impact study participants in a virtual museum.


LLM-powered Virtual Professor

2024 Ongoing Research - Purdue Virtual Reality Lab

This series of projects used LLMs and the ChatGPT API to support natural conversations between users and virtual agents. In our studies, participants conversed with virtual professors built with different depths of knowledge in their field of study, and we evaluated the participants' perceptions. Our research aims to explore how humans perceive embodied agents when interacting with them through natural speech in VR.
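One simple way to realize different depths of knowledge with a chat API is to vary the system prompt per condition. The sketch below is a hypothetical illustration, not the study's actual code: the prompt wording, condition names, and `build_messages` helper are our assumptions, following the OpenAI-style chat message format.

```python
# Hypothetical sketch: depth-of-knowledge conditions as system prompts.
# Prompt wording and condition names are illustrative, not the study's code.

DEPTH_PROMPTS = {
    "low": (
        "You are a professor with only surface-level knowledge of {field}. "
        "Give brief, general answers and admit uncertainty on details."
    ),
    "medium": (
        "You are a professor with working knowledge of {field}. "
        "Explain core concepts accurately but avoid advanced detail."
    ),
    "high": (
        "You are an expert professor in {field}. "
        "Give precise, in-depth answers with examples and caveats."
    ),
}


def build_messages(depth: str, field: str, user_utterance: str) -> list[dict]:
    """Assemble a chat-completion message list for one conversational turn."""
    system = DEPTH_PROMPTS[depth].format(field=field)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_utterance},
    ]


# The message list would then be sent to a chat API, e.g.:
#   client.chat.completions.create(model="gpt-4o", messages=messages)
# and the reply fed to the embodied agent's text-to-speech and lip sync.
```

Keeping the condition entirely in the system prompt lets the same agent embodiment be reused across all three experimental groups.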


Holographic ASL Interpreter in Mixed Reality Classroom

2020 - Purdue IdeaLab & Virtual Reality Lab

This project aims to develop a prototype of 3D holographic American Sign Language (ASL) interpreters that increases learning efficacy for deaf students. Alongside ongoing research on digitizing ASL interpretation, we conducted qualitative usability tests to observe how users interact with the MR application.


Flame in the Forest - Unreal Oculus game

2021 Game Development Group Project

This is a VR game built in Unreal Engine. The Oculus VR plugin and VR input bindings were added so the application runs through Oculus Link on an Oculus Quest device.

Team members: Syed Tanzim, Brantly McCord, Joshua Heller, Angel Lam, Metali Mangal.


Hand Tracking Animation Trigger in Oculus Quest

2020 Personal Research POC

I built a hand gesture identifier using Oculus plugins, plus an animation controller that triggers different character animation states based on the detected gesture. This is a fundamental POC for hand-gesture-driven character animation within a VR environment...
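The core idea, once gestures are identified, is a small mapping from gesture to animation state with a transition trigger on change. The sketch below illustrates only that mapping logic; the actual POC ran inside the engine via Oculus hand-tracking plugins, and the gesture names and animation states here are hypothetical:

```python
# Illustrative sketch of gesture-to-animation-state mapping.
# Gesture labels and animation state names are hypothetical, not the POC's code.

GESTURE_TO_STATE = {
    "index_pinch": "Wave",
    "fist": "Crouch",
    "open_palm": "Idle",
}


class AnimationController:
    """Switches the character's animation state when the detected gesture changes."""

    def __init__(self) -> None:
        self.state = "Idle"

    def on_gesture(self, gesture: str) -> str:
        # Unrecognized gestures leave the current state untouched.
        new_state = GESTURE_TO_STATE.get(gesture, self.state)
        if new_state != self.state:
            self.state = new_state  # here the engine would crossfade to the new clip
        return self.state
```

Triggering only on state *change* (rather than every frame the gesture is held) prevents the animation from restarting continuously while a pose is maintained.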


Multi-user Virtual Interior Design Application on Oculus Quest

2019 Research POC

The goal is to build a multiplayer virtual reality game on Oculus Quest, powered by Photon Engine, that allows players to decorate a virtual room together in real time...


Character Humor vs. Learning Outcome in Pedagogical Games

2019-2020 Research POC - Purdue IdeaLab

The objective of this project is to build two pedagogical games in Unity, one version in which the game character performs humorous movements and another in which it performs normal movements, and to compare the learning outcomes between the two groups...

Graphics Programming & Computational Photography
Others
UXUI