Optimized einsum can significantly reduce the overall execution time of einsum-like expressions (e.g., np.einsum, dask.array.einsum, pytorch.einsum, tensorflow.einsum) by optimizing the expression's contraction order and dispatching many operations to canonical BLAS, cuBLAS, or other specialized routines. Optimized einsum is agnostic to the backend and can handle NumPy, Dask, PyTorch, TensorFlow, CuPy, Sparse, Theano, JAX, and Autograd arrays, as well as potentially any library which conforms to a standard API. See the documentation for more information.
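A minimal sketch of how the library is typically used, based on the `opt_einsum` API (`contract` and `contract_path`); the array names and shapes below are purely illustrative:

```python
import numpy as np
import opt_einsum as oe

# Illustrative operands for a chained contraction.
a = np.random.rand(50, 60)
b = np.random.rand(60, 70)
c = np.random.rand(70, 80)

# contract() mirrors np.einsum, but picks an efficient pairwise
# contraction order and dispatches contractions to BLAS where possible.
result = oe.contract('ij,jk,kl->il', a, b, c)

# contract_path() reports the chosen order and estimated cost
# without performing the computation.
path, info = oe.contract_path('ij,jk,kl->il', a, b, c)
print(info)
```

Because the dispatch is backend-agnostic, the same `contract` call can accept, for example, Dask or CuPy arrays in place of the NumPy arrays shown here.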
Package Version | Update ID | Released | Package Hub Version | Platforms | Subpackages
---|---|---|---|---|---
3.1.0-bp156.3.1 | GA Release | 2023-07-22 | 15 SP6 | |
3.1.0-bp155.2.12 | GA Release | 2023-05-22 | 15 SP5 | |
3.1.0-bp154.1.30 | GA Release | 2022-05-09 | 15 SP4 | |
3.1.0-bp153.1.14 | GA Release | 2021-03-06 | 15 SP3 | |
3.1.0-bp152.2.1 | GA Release | 2020-05-18 | 15 SP2 | |