Abstract

Modeling symmetric positive definite (SPD) material properties, such as thermal conductivity, under uncertainty often leads to a substantial computational burden. Neural networks can help mitigate these costs, yet standard architectures do not inherently preserve SPD properties. To address this, we propose novel neural network layers that map the space of SPD matrices to a linear space using logarithmic maps. We evaluate the performance of these networks by comparing different mapping strategies based on validation losses and uncertainty propagation. Our approach is applied to a steady-state heat conduction problem in a patched cube with anisotropic and uncertain thermal conductivity, modeled as a spatially homogeneous, tensor-valued random variable. The results indicate that the logarithmic mapping of tensor eigenvalues significantly improves learning performance, highlighting its utility in handling tensor data in neural networks. Furthermore, the formulation facilitates the separation of strength and orientational information.
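As a rough illustration of the kind of mapping described above (not the authors' actual layer implementation, which is detailed only in the full paper), the sketch below applies the matrix logarithm to an SPD conductivity tensor through its eigendecomposition. The function names and the example tensor values are purely hypothetical.

    import numpy as np

    def spd_log_map(C):
        """Map an SPD matrix into the linear space of symmetric matrices
        by taking the logarithm of its eigenvalues (C = Q diag(lam) Q^T)."""
        lam, Q = np.linalg.eigh(C)
        return Q @ np.diag(np.log(lam)) @ Q.T

    def spd_exp_map(S):
        """Inverse map: exponentiate the eigenvalues to recover an SPD matrix."""
        lam, Q = np.linalg.eigh(S)
        return Q @ np.diag(np.exp(lam)) @ Q.T

    # Illustrative anisotropic thermal conductivity tensor (values made up).
    K = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.5, 0.1],
                  [0.0, 0.1, 1.0]])

    L = spd_log_map(K)       # lives in a linear space, usable by standard NN layers
    K_rec = spd_exp_map(L)   # reconstruction is SPD by construction
    assert np.allclose(K, K_rec)

In such a formulation the eigenvalue logarithms carry the strength of the conductivity while the eigenvectors carry its orientation, which corresponds to the separation of strength and orientational information mentioned in the abstract.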


Document information

Published on 28/10/24
Submitted on 28/10/24

Volume: Recent Advances in Accelerated Simulations for Solid, Fluid and Coupled Problems: Implementations and Applications, 2024
DOI: 10.23967/eccomas.2024.259
Licence: CC BY-NC-SA
