Modeling symmetric positive definite (SPD) material properties, such as thermal conductivity, under uncertainty often leads to a substantial computational burden. Neural networks can help mitigate these costs, yet standard architectures do not inherently preserve SPD properties. To address this, we propose novel neural network layers that map the space of SPD matrices to a linear space using logarithmic maps. We evaluate the performance of these networks by comparing different mapping strategies based on validation losses and uncertainty propagation. Our approach is applied to a steady-state heat conduction problem in a patched cube with anisotropic and uncertain thermal conductivity, modeled as a spatially homogeneous, tensor-valued random variable. The results indicate that the logarithmic mapping of tensor eigenvalues significantly improves learning performance, highlighting its utility in handling tensor data in neural networks. Furthermore, the formulation facilitates the separation of strength and orientation information.
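The core idea described above, mapping SPD matrices to a linear space by taking logarithms of their (strictly positive) eigenvalues, can be sketched as follows. This is an illustrative example of the standard matrix log/exp maps, not the authors' implementation; the function names and the sample conductivity tensor are hypothetical.

```python
import numpy as np

def spd_log_map(K):
    """Map an SPD matrix to the linear space of symmetric matrices.

    Eigenvalues of an SPD matrix are strictly positive, so their logarithm
    is well defined; the eigenvectors (orientation) are left untouched,
    which separates strength from orientation information.
    """
    eigvals, eigvecs = np.linalg.eigh(K)  # symmetric eigendecomposition
    return eigvecs @ np.diag(np.log(eigvals)) @ eigvecs.T

def spd_exp_map(S):
    """Inverse map: the matrix exponential of a symmetric matrix is SPD."""
    eigvals, eigvecs = np.linalg.eigh(S)
    return eigvecs @ np.diag(np.exp(eigvals)) @ eigvecs.T

# Hypothetical anisotropic thermal conductivity tensor (SPD)
K = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.5, 0.0],
              [0.0, 0.0, 3.0]])

S = spd_log_map(K)        # symmetric, unconstrained: usable as network output
K_back = spd_exp_map(S)   # round trip recovers the SPD tensor
print(np.allclose(K, K_back))  # prints True
```

A network can thus predict an unconstrained symmetric matrix `S` in the log domain, and the exponential map guarantees that the reconstructed tensor is SPD by construction.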
Published on 28/10/24
Submitted on 28/10/24
Volume Recent Advances in Accelerated Simulations for Solid, Fluid and Coupled Problems: Implementations and Applications, 2024
DOI: 10.23967/eccomas.2024.259
Licence: CC BY-NC-SA