Tensor representations allow compact storage and efficient manipulation of multi-dimensional data. Building on these representations, tensor methods construct low-rank subspaces for the solution of multi-dimensional and multi-parametric models. However, tensor methods cannot always be implemented efficiently, especially when dealing with non-linear models. In this paper, we discuss the importance of achieving a tensor representation of the model itself for the efficiency of tensor-based algorithms. We investigate the adequacy of interpolation rather than projection-based approaches as a means to enforce such a tensor representation, and propose the use of cross approximations for models of moderate dimension. Finally, we analyze the linearization of tensor problems and propose several strategies for constructing the tensor subspace. This is a post-peer-review, pre-copyedit version of an article published in the Journal of Scientific Computing.
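To make the idea of cross approximation concrete, the following is a minimal sketch of a matrix cross (skeleton) approximation, the two-dimensional analogue of the tensor cross approximations mentioned in the abstract. The function name, the greedy full-pivoting strategy, and the tolerance are illustrative assumptions, not the algorithm of the paper.

import numpy as np

def cross_approximation(A, rank):
    # Approximate A ~ A[:, cols] @ inv(A[rows, cols]) @ A[rows, :]
    # using greedy full pivoting on the residual (illustrative sketch).
    R = A.astype(float).copy()   # residual matrix
    rows, cols = [], []
    for _ in range(rank):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
        if abs(R[i, j]) < 1e-14:  # residual is numerically zero: stop early
            break
        # Rank-1 update: subtract the "cross" through the pivot (i, j)
        R -= np.outer(R[:, j], R[i, :]) / R[i, j]
        rows.append(i)
        cols.append(j)
    C = A[:, cols]                # selected columns
    U = A[np.ix_(rows, cols)]     # intersection (core) block
    Rm = A[rows, :]               # selected rows
    return C @ np.linalg.solve(U, Rm)

# Usage: a rank-3 matrix is recovered (to machine precision) from 3 rows and 3 columns.
A = np.random.rand(100, 3) @ np.random.rand(3, 80)
A_hat = cross_approximation(A, rank=3)
print(np.linalg.norm(A - A_hat) / np.linalg.norm(A))

The appeal of this construction, and of its higher-dimensional counterparts, is that only a few rows and columns of the full array need to be evaluated, which is what makes cross approximations attractive for enforcing a tensor representation of the model itself.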
Published on 01/01/2019
DOI: 10.1007/s10915-019-00917-2
Licence: CC BY-NC-SA