Immersive 3D Medical Visualization in Virtual Reality using Stereoscopic Volumetric Path Tracing

Use this link to cite
http://hdl.handle.net/2183/40004
Collections
- Investigación (FIC)
Title
Immersive 3D Medical Visualization in Virtual Reality using Stereoscopic Volumetric Path Tracing
Date
2024
Citation
Taibo, J., & Iglesias-Guitian, J. A. (2024). Immersive 3D Medical Visualization in Virtual Reality using Stereoscopic Volumetric Path Tracing. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 1044-1053. doi: 10.1109/VR58804.2024.00123
Abstract
Scientific visualizations using physically-based lighting models play a crucial role in enhancing both image quality and realism. In the domain of medical visualization, this trend has gained significant traction under the term cinematic rendering (CR). It enables the creation of 3D photorealistic reconstructions from medical data, offering great potential for aiding healthcare professionals in the analysis and study of volumetric datasets. However, the adoption of such advanced rendering techniques for immersive virtual reality (VR) faces two main limitations related to their high computational demands. First, these techniques are frequently used to produce pre-recorded videos and offline content, thereby restricting interactivity to predefined volume appearance and lighting settings. Second, when deployed in head-tracked VR environments, they can induce cyber-sickness symptoms due to the disturbing flicker caused by noisy Monte Carlo renderings. Consequently, the scope for meaningful interactive operations is constrained in this modality, in contrast with the versatile capabilities of classical direct volume rendering (DVR). In this work, we introduce an immersive 3D medical visualization system capable of producing photorealistic and fully interactive stereoscopic visualizations on head-mounted display (HMD) devices. Our approach extends previous linear regression denoising to enable real-time stereoscopic cinematic rendering within AR/VR settings. We demonstrate the capabilities of the resulting VR system, such as interactive rendering, appearance editing, and transfer function editing.
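To make the denoising idea referenced in the abstract more concrete, the following is a minimal, hypothetical sketch of linear-regression denoising applied to a noisy Monte Carlo frame: each image tile fits a small ridge-regularized linear model mapping auxiliary features (e.g., depth, volume gradient, albedo) to the noisy radiance, and the model's prediction is used as the denoised color. The function name, tile size, feature choice, and regularization term are illustrative assumptions; this is not the authors' implementation, which targets real-time stereoscopic rendering on the GPU.

```python
import numpy as np

def regression_denoise(noisy, features, tile=16, eps=1e-3):
    """Per-tile linear regression denoiser (illustrative sketch, not the paper's code).

    noisy    : (H, W, 3) noisy Monte Carlo radiance
    features : (H, W, F) auxiliary features (e.g., depth, gradient, albedo)
    Returns a denoised (H, W, 3) image.
    """
    H, W, _ = noisy.shape
    out = np.empty_like(noisy)
    for y0 in range(0, H, tile):
        for x0 in range(0, W, tile):
            ys = slice(y0, min(y0 + tile, H))
            xs = slice(x0, min(x0 + tile, W))
            # Build the design matrix from the tile's features plus a bias column.
            X = features[ys, xs].reshape(-1, features.shape[-1])
            X = np.concatenate([np.ones((X.shape[0], 1)), X], axis=1)
            Y = noisy[ys, xs].reshape(-1, 3)
            # Ridge-regularized least squares: beta = (X^T X + eps I)^-1 X^T Y
            A = X.T @ X + eps * np.eye(X.shape[1])
            beta = np.linalg.solve(A, X.T @ Y)
            # The fitted prediction replaces the noisy radiance in this tile.
            out[ys, xs] = (X @ beta).reshape(noisy[ys, xs].shape)
    return out

# Example usage with random data standing in for a rendered frame:
# denoised = regression_denoise(np.random.rand(128, 128, 3),
#                               np.random.rand(128, 128, 6))
```

In a stereoscopic real-time setting such as the one described, a denoiser along these lines would need to run per eye and per frame within the HMD frame budget; the tile-based formulation here is only meant to convey the regression principle.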
Keywords
Solid modeling
Three-dimensional displays
Stereo image processing
Lighting
Data visualization
Transfer functions
Virtual reality
Computing methodologies
Ray tracing
Human-centered computing
Graphical user interfaces
Rights
© 20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.