What's new in ITensors.jl

07/30/2021, 1:30 PM – 2:00 PM UTC
Purple

Abstract:

Tensor networks encapsulate a large class of low-rank decompositions of very high-order (potentially infinite-order) tensors. They have found a wide variety of uses in physics, chemistry, and data science. ITensors.jl is a high-performance Julia library with a unique memory-independent interface that makes it easy to use and develop tensor network algorithms. In this talk, I will give an overview of the library and the new features that have been added since its release last year.
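
As a rough illustration (not part of the original abstract), the sketch below shows what the index-based, memory-independent interface looks like, assuming the public ITensors.jl API (Index, randomITensor, dag, and * for contraction); exact names may vary between versions. Tensors are addressed by their indices, so contractions do not depend on how the data is ordered in memory.

```julia
using ITensors

# Indices carry an identity, a dimension, and optional tags.
i = Index(2, "i")
j = Index(3, "j")
k = Index(4, "k")

A = randomITensor(i, j)
B = randomITensor(k, j)   # shares the index j with A

C = A * B                 # contracts automatically over the shared index j
@show inds(C)             # remaining indices: i and k, independent of storage order
```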

Description:

At JuliaCon 2019, we gave an early preview of ITensors.jl, a ground-up pure Julia rewrite of ITensor, a high-performance C++ library for using and developing tensor network algorithms. ITensors.jl v0.1 was officially released in May 2020. Since then, there has been a lot of development of the library itself as well as a variety of spinoff libraries, such as ITensorsGPU.jl, which adds a GPU backend for tensor operations; ITensorsVisualization.jl for visualizing tensor networks; PastaQ.jl for using tensor networks to simulate and analyze quantum computers; ITensorGaussianMPS.jl for creating tensor networks of noninteracting quantum systems; and more experimental libraries like ITensorsGrad.jl, which adds automatic differentiation support, and ITensorInfiniteMPS.jl for working with infinite tensor networks. In addition, many advanced features have been added to ITensors.jl and its underlying sparse tensor library, NDTensors.jl, such as multithreaded block-sparse tensor contractions, alternative dense contraction backends like TBLIS, contraction sequence optimization, and more. In this talk, I plan to give an overview of the current libraries and capabilities, as well as lay out a roadmap for where the Julia ITensor ecosystem is heading.
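
To make the block-sparse features mentioned above more concrete, here is a minimal sketch (based on the public ITensors.jl documentation, not code from the talk): indices that carry quantum numbers (QN) produce block-sparse ITensors, and a global switch, assumed here to be ITensors.enable_threaded_blocksparse() as described in the multithreading documentation, turns on multithreaded block-sparse contraction.

```julia
using ITensors

# Quantum-number (QN) conserving indices make the resulting ITensors block sparse:
# only symmetry-allowed blocks are stored and contracted.
i = Index([QN("Sz", 0) => 2, QN("Sz", 1) => 3], "i")
j = Index([QN("Sz", 0) => 2, QN("Sz", 1) => 3], "j")

A = randomITensor(i', dag(i))   # random block-sparse "matrix" with zero total flux
B = randomITensor(i, dag(j))

# Opt into multithreaded block-sparse contraction; start Julia with several
# threads (e.g. `julia -t 4`) for this setting to have an effect.
ITensors.enable_threaded_blocksparse()

C = A * B            # contracts over the i / dag(i) pair, block by block
@show inds(C)        # remaining indices: i' and dag(j)
```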

Platinum sponsors

Julia Computing

Gold sponsors

Relational AI

Silver sponsors

Invenia Labs
Conning
Pumas AI
QuEra Computing Inc.
King Abdullah University of Science and Technology
DataChef.co
Jeffrey Sarnoff

Media partners

Packt Publishing

Fiscal Sponsor

NumFOCUS