7–11 Apr 2025
Lecture and Conference Centre
Europe/Warsaw timezone

Mixed-Precision Parallel Tensor Train Operations

8 Apr 2025, 09:10
20m
Room 0.21

Speaker

Eda Oktay

Description

Tensor Train (TT) decomposition is a widely used low-rank tensor factorization technique known for its memory efficiency and scalability for high-dimensional data. Its advantages have motivated the development of various TT-based methods and applications across fields such as chemistry, quantum physics, and machine learning. However, constructing low-rank tensors and performing operations in TT arithmetic can be computationally intensive, owing to the cost of tensor construction and the complexity of the underlying numerical operations. To address these issues, high-performance computing (HPC) techniques such as parallelism and mixed-precision arithmetic have become essential tools for enhancing computational efficiency and reducing memory and communication requirements in (multi)linear algebra. In this talk, we discuss recent advances in HPC for TT arithmetic and introduce parallel TT operations using mixed-precision arithmetic. We then explore the potential of these developments to improve large-scale tensor computations and discuss their implications for future applications in scientific computing.
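
For readers unfamiliar with the format: a TT decomposition writes a d-way tensor X as a chain of three-way cores, X(i1, ..., id) = G1(i1) G2(i2) ... Gd(id), where each Gk(ik) is an r_{k-1} x r_k matrix and the r_k are the TT ranks. The sketch below is a minimal, serial NumPy illustration of this idea, not the speaker's implementation: it builds cores with the standard TT-SVD and models mixed precision only by storing each core in a chosen floating-point format. The function names, the dtypes parameter, and the per-core precision scheme are assumptions for illustration; the parallel aspects of the talk are not shown.

import numpy as np

def tt_svd(tensor, max_rank, dtypes=None):
    # Sketch of the classical TT-SVD: factor a d-way array into d TT cores
    # via sequential truncated SVDs. "Mixed precision" is modeled here only
    # by storing each core in a (possibly reduced) floating-point format;
    # all names and parameters are illustrative, not the speaker's API.
    dims = tensor.shape
    d = len(dims)
    dtypes = dtypes or [tensor.dtype] * d
    cores, r_prev = [], 1
    unfolding = tensor.reshape(dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r).astype(dtypes[k]))
        unfolding = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(unfolding.reshape(r_prev, dims[-1], 1).astype(dtypes[-1]))
    return cores

def tt_to_full(cores):
    # Contract the TT cores back into a full array, accumulating in float64
    # so that only the storage format of the cores is in low precision.
    result = cores[0].astype(np.float64)
    for core in cores[1:]:
        result = np.tensordot(result, core.astype(np.float64), axes=(-1, 0))
    return result.squeeze(axis=(0, -1))

# Usage: a smooth (numerically low-rank) tensor compresses well even when
# the middle core is stored in half precision.
i, j, k = np.ogrid[:8, :9, :10]
X = 1.0 / (1.0 + i + j + k)
cores = tt_svd(X, max_rank=5, dtypes=[np.float32, np.float16, np.float32])
rel_err = np.linalg.norm(tt_to_full(cores) - X) / np.linalg.norm(X)
print(f"relative error: {rel_err:.2e}")

In this toy setting, the storage precision of each core bounds the attainable accuracy (roughly 1e-3 for a float16 core), which is the trade-off mixed-precision TT schemes exploit: cheaper storage and communication where the data tolerate it, higher precision where it matters.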

Presentation materials

There are no materials yet.