Zhongyuan Yu
#Research Highlights
[CHI '23] PEARL: Physical Environment based Augmented Reality Lenses
PEARL is a mixed-reality approach for the in-situ analysis of human movement data. Because the physical environment shapes human motion and behavior, the analysis of such motion benefits from including the environment directly in the analytical process. We present methods for exploring movement data in relation to surrounding regions of interest, such as objects, furniture, and architectural elements. [Project Page][Paper PDF][Code]
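As a rough illustration of the selection step such an environment-based lens performs, the sketch below filters a movement trajectory down to the samples that fall inside a spherical region of interest. This is a minimal NumPy sketch, not PEARL's implementation; the function and variable names are hypothetical.

```python
import numpy as np

def points_near_region(trajectory, region_center, radius):
    """Return the trajectory samples within `radius` of a spherical region of interest."""
    trajectory = np.asarray(trajectory, dtype=float)   # (N, 3) positions over time
    center = np.asarray(region_center, dtype=float)    # (3,) region center
    distances = np.linalg.norm(trajectory - center, axis=1)
    return trajectory[distances <= radius]

# Hypothetical usage: keep only the movement samples recorded near a table.
walk = np.random.rand(1000, 3) * 5.0                   # fake 3D trajectory in a 5 m room
near_table = points_near_region(walk, region_center=[2.0, 0.0, 1.5], radius=0.75)
```

In PEARL itself, the regions of interest are derived from the physical environment (objects, furniture, architectural elements) rather than hand-specified spheres.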
[VRST '23] Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos
In Dynascape, we explore technological possibilities for capturing and immersively playing back real-world dynamic scenes such as a busy street or a crowded marketplace. Using capturing techniques based on cutting-edge mobile sensors, we record a dynamic scene into a single portable, spatially tracked RGB-D video. This video is designed as an informative representation of the captured scene, containing camera movements and depth frames alongside traditional color frames. Within Dynascape, we developed a suite of tools for exploring spatially tracked RGB-D videos immersively in mixed reality, with support for both editing and compositing. For editing, we propose immersive widgets for spatial, temporal, and appearance adjustments; for compositing, we provide functions that let users arrange multiple tracked RGB-D videos together in space. [Paper][Code]
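To make "spatially tracked RGB-D video" concrete as a data structure, here is a minimal sketch of one frame record. This is an assumption for illustration, not Dynascape's actual file format, and all names are hypothetical.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TrackedRGBDFrame:
    """One sample of a (hypothetical) spatially tracked RGB-D video."""
    timestamp: float      # capture time in seconds
    color: np.ndarray     # (H, W, 3) uint8 color image
    depth: np.ndarray     # (H, W) float32 depth in meters
    pose: np.ndarray      # (4, 4) camera-to-world transform from mobile tracking

def frame_at(frames, t):
    """Nearest-neighbor seek: pick the frame closest to playback time t."""
    return min(frames, key=lambda f: abs(f.timestamp - t))
```

Bundling the camera pose with each frame is what allows playback to place the captured scene back into world space, and it makes spatial, temporal, and appearance edits addressable per frame.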
[NICOint '24] An Immersive Method for Extracting Structural Information from Unorganized Point Clouds
In this paper, we introduce immersive methods that enhance the visual perception of point clouds, together with a complete point cloud structuring pipeline that extracts both geometry and connectivity information. The pipeline combines traditional point cloud processing algorithms with human expertise contributed through a VR interface. We implemented an interactive prototype and achieved strong results on existing public datasets. [Paper][Code]
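As a toy illustration of what "connectivity information" can mean for an unorganized point cloud, the sketch below builds a graph whose nodes are segment centroids and whose edges connect segments that nearly touch. This is a hypothetical NumPy sketch, not the paper's algorithm, which additionally relies on human input through the VR interface.

```python
import numpy as np
from itertools import combinations

def structure_graph(points, labels, touch_dist=0.05):
    """Toy structure extraction: nodes are segment centroids, edges connect
    segments whose closest points lie within `touch_dist` of each other."""
    points = np.asarray(points, dtype=float)   # (N, 3) point cloud
    labels = np.asarray(labels)                # (N,) segment id per point
    segments = {l: points[labels == l] for l in np.unique(labels)}
    nodes = {l: seg.mean(axis=0) for l, seg in segments.items()}
    edges = []
    for a, b in combinations(segments, 2):
        # Brute-force closest pair between two segments (fine for a toy example).
        dists = np.linalg.norm(segments[a][:, None, :] - segments[b][None, :, :], axis=2)
        if dists.min() <= touch_dist:
            edges.append((a, b))
    return nodes, edges
```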
#Supervision Topics
- Immersive Authoring for Virtual Content Management
- Immersive Analytics and Data Exploration
- MUMR System Management
(More information is available under Theses and Research Projects. Please do not hesitate to contact me directly via email.)
#Publication List
2024
- An Immersive Labeling Method for Large Point Clouds, Nov. 2024, in: Computers and Graphics, 124, 104101, 12 pp. Journal article.
- Moving Experiences — Towards a Conceptual Framework for Performative Artefacts in Museums, Oct. 2024. Conference paper.
- An Immersive Method for Extracting Structural Information from Unorganized Point Clouds, 2024, in: 2024 Nicograph International (NicoInt). Conference proceedings contribution.
2023
- ViewR: Architectural-Scale Multi-User Mixed Reality with Mobile Head-Mounted Displays, 7 Aug. 2023, in: IEEE Transactions on Visualization and Computer Graphics, 30(8) (2024), pp. 5609–5622, 14 pp. Journal article.
- PEARL: Physical Environment based Augmented Reality Lenses for In-Situ Human Movement Analysis, 19 April 2023, in: CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. Conference proceedings contribution.
- An Immersive Labeling Method for Large Point Clouds, Mar. 2023, in: Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW 2023), IEEE, 2 pp. Conference proceedings contribution.
- Dynascape: Immersive Authoring of Real-World Dynamic Scenes with Spatially Tracked RGB-D Videos, 2023, pp. 1–12. Conference paper.