Zhongyuan Yu
# Research Highlights
[CHI '23] PEARL: Physical Environment based Augmented Reality Lenses
PEARL is a mixed-reality approach for the in-situ analysis of human movement data. Because the physical environment shapes human motion and behavior, the analysis of such motion benefits from including the environment directly in the analytical process. We present methods for exploring movement data in relation to surrounding regions of interest, such as objects, furniture, and architectural elements. [Project Page] [Paper PDF] [Code]
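To make the idea concrete, here is a minimal Python sketch (with entirely hypothetical names, not PEARL's implementation) of one basic analysis such lenses enable: measuring how long a tracked trajectory dwells inside a region of interest around a physical object.

```python
# Minimal sketch, not the PEARL implementation: relating a movement
# trajectory to a physical region of interest. All names are hypothetical.
import numpy as np

def dwell_time_in_roi(positions, timestamps, roi_min, roi_max):
    """Total time a tracked person spends inside an axis-aligned ROI.

    positions:  (N, 3) array of tracked positions in room coordinates
    timestamps: (N,) array of sample times in seconds
    roi_min/roi_max: corners of the box around e.g. a piece of furniture
    """
    inside = np.all((positions >= roi_min) & (positions <= roi_max), axis=1)
    # Sum the durations of intervals whose start point lies inside the ROI.
    durations = np.diff(timestamps)
    return float(durations[inside[:-1]].sum())

# Example: a short trajectory passing through a 1 m-wide region around a table.
pos = np.array([[0.0, 1.6, 0.0], [0.5, 1.6, 0.5], [0.6, 1.6, 0.6], [2.0, 1.6, 2.0]])
t = np.array([0.0, 1.0, 2.0, 3.0])
print(dwell_time_in_roi(pos, t, np.array([0.3, 0.0, 0.3]), np.array([1.3, 2.0, 1.3])))  # 2.0
```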
[VRST '23] Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos
In Dynascape, we explore techniques for capturing and immersively playing back real-world dynamic scenes such as a busy street or a crowded marketplace. Using cutting-edge mobile sensors, we record a dynamic scene as a single portable, spatially tracked RGB-D video: an informative representation of the captured scene that stores camera poses and depth frames alongside traditional color frames. Within Dynascape, we developed a suite of tools for the immersive exploration of such videos in mixed reality, with support for both editing and compositing. For editing, we propose immersive widgets for spatial, temporal, and appearance adjustments; for compositing, we provide functions that let users arrange multiple tracked RGB-D videos in space. [Paper] [Code]
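As a rough illustration of the representation described above, the following Python sketch models one frame of a spatially tracked RGB-D video and unprojects it into a world-space point cloud. It assumes a standard pinhole camera model; the class and field names are illustrative, not Dynascape's actual data format.

```python
# Minimal sketch of a spatially tracked RGB-D frame and its reprojection
# into world space. Names and layout are assumptions, not Dynascape's format.
from dataclasses import dataclass
import numpy as np

@dataclass
class TrackedRGBDFrame:
    color: np.ndarray       # (H, W, 3) uint8 color image
    depth: np.ndarray       # (H, W) float32 depth in meters
    pose: np.ndarray        # (4, 4) camera-to-world transform from mobile tracking
    intrinsics: np.ndarray  # (3, 3) pinhole camera matrix K

def frame_to_world_points(frame: TrackedRGBDFrame) -> np.ndarray:
    """Unproject every depth pixel into a world-space point cloud."""
    h, w = frame.depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    fx, fy = frame.intrinsics[0, 0], frame.intrinsics[1, 1]
    cx, cy = frame.intrinsics[0, 2], frame.intrinsics[1, 2]
    z = frame.depth
    # Back-project pixel coordinates into camera space ...
    pts_cam = np.stack([(u - cx) * z / fx, (v - cy) * z / fy, z], axis=-1)
    # ... then transform into world space using the tracked camera pose.
    pts_h = np.concatenate([pts_cam.reshape(-1, 3), np.ones((h * w, 1))], axis=1)
    return (frame.pose @ pts_h.T).T[:, :3]
```

Under this view, playback amounts to rendering such per-frame clouds in sequence, and compositing to expressing several videos' pose streams in one shared coordinate frame.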
[NICOint '24] An Immersive Method for Extracting Structural Information from Unorganized Point Clouds
In this paper, we introduce immersive methods that enhance the visual perception of point clouds, together with a complete point cloud structuring pipeline that extracts both geometry and connectivity information. The pipeline combines a collection of novel algorithms that bring human expertise into traditional point cloud processing through a VR interface. We implemented an interactive prototype and achieved strong results on existing public datasets. [Paper] [Code]
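To hint at what connectivity extraction can look like in its simplest form, here is a small Python sketch (not the algorithm from the paper) that links points within a fixed radius and groups them into connected components, which could then serve as candidate structural elements for a user to refine in VR.

```python
# Minimal sketch of a connectivity-extraction step, not the paper's method:
# points closer than `radius` are considered connected, and connected
# components become candidate structural elements.
import numpy as np
from scipy.spatial import cKDTree

def connected_components(points: np.ndarray, radius: float) -> np.ndarray:
    """Label each point with the id of its radius-connected component."""
    tree = cKDTree(points)
    parent = np.arange(len(points))  # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, j in tree.query_pairs(radius):  # all point pairs within `radius`
        parent[find(i)] = find(j)

    return np.array([find(i) for i in range(len(points))])

# Two well-separated clusters should yield two component labels.
pts = np.vstack([np.random.rand(100, 3), np.random.rand(100, 3) + 5.0])
labels = connected_components(pts, radius=0.5)
print(len(np.unique(labels)))  # expected: 2
```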
# Supervision Topics
- Immersive Authoring for Virtual Content Management
- Immersive Analytics and Data Exploration
- MUMR System Management
(More information is available under Theses and Research Projects. Please do not hesitate to contact me directly via email.)
# Publication List
2024
- An Immersive Labeling Method for Large Point Clouds. Computers & Graphics, vol. 124, article 104101, Nov 2024. (journal article)
- Moving Experiences: Towards a Conceptual Framework for Performative Artefacts in Museums. Oct 2024. (conference paper)
- An Immersive Method for Extracting Structural Information from Unorganized Point Clouds. 2024 Nicograph International (NicoInt), 2024. (conference contribution)
2023
- ViewR: Architectural-Scale Multi-User Mixed Reality with Mobile Head-Mounted Displays. IEEE Transactions on Visualization and Computer Graphics, vol. 30, no. 8 (2024), pp. 5609–5622, first published 7 Aug 2023. (journal article)
- PEARL: Physical Environment based Augmented Reality Lenses for In-Situ Human Movement Analysis. CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 19 Apr 2023. (conference contribution)
- An Immersive Labeling Method for Large Point Clouds. Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW 2023), IEEE, Mar 2023. (conference contribution)
- Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos. VRST '23, pp. 1–12, 2023. (conference paper)