How does this work?

Real-world problems, such as the fracture of objects with complex geometries made of heterogeneous materials or phase-separation problems, must often be modeled in a three-dimensional setting. Presenting three-dimensional results in an insightful way in scientific publications or presentations is challenging, because they are inevitably reduced to two-dimensional images or videos with viewpoints chosen by the authors. This is why we use augmented reality (AR) to make three-dimensional results more accessible and allow them to be explored from user-defined viewpoints.

To do so, we employ a pipeline based on the paper ‘A brief note on building augmented reality models for scientific visualization’ by Mathur et al., which relies entirely on open-source software. The numerical results are postprocessed in ParaView, where we automated the creation of a surface representation with Python scripts. This surface is then further processed in Blender, from which an AR model is exported. Finally, the models are hosted on this website built with Sphinx, where they can either be viewed interactively in the browser or in AR on any recent mobile device by clicking the button in the lower right corner of a viewer. For this, we rely on the model-viewer package.
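As a rough illustration of the ParaView step, the sketch below uses the `paraview.simple` Python module to load a result file, extract its outer surface, and export the scene for import into Blender. The file names, the X3D exchange format, and the overall structure are assumptions for illustration only and are not taken from our actual scripts.

```python
# Hypothetical sketch of the automated ParaView step; file names and formats are assumptions.
from paraview.simple import (OpenDataFile, ExtractSurface, Show, Render,
                             GetActiveViewOrCreate, ExportView)

# Load the simulation output (e.g. a VTK file written by the solver).
reader = OpenDataFile('results.vtu')

# Reduce the volumetric data to a surface representation suitable for an AR model.
surface = ExtractSurface(Input=reader)

# Display the surface and export the rendered scene so it can be imported into Blender.
view = GetActiveViewOrCreate('RenderView')
Show(surface, view)
Render(view)
ExportView('scene.x3d', view=view)
```

In Blender, the exported scene can then be converted into a format that model-viewer can display in the browser and in AR. A minimal sketch, assuming the X3D importer add-on is enabled and glTF is used as the output format, could look like this:

```python
import bpy

# Import the surface exported from ParaView and write a binary glTF (.glb) file
# for the web-based AR viewer.
bpy.ops.import_scene.x3d(filepath='scene.x3d')
bpy.ops.export_scene.gltf(filepath='model.glb')
```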

For more information, see the presentation slides from our talk at the GAMM Phase-Field Workshop in February 2024 as well as the GitLab repository with all scripts used to create this website and the AR models.