To-do list for splitting off `NDTensors.BlockSparseArrays` as a separate registered package BlockSparseArrays.jl:

- Remove the `AnyAbstractBlockSparseArray` type union in favor of an `@derive` macro, similar to `Moshi.@derive`, the Rust derive attribute for implementing traits, and `ArrayLayouts.@layoutmatrix` and related macros in ArrayLayouts.jl. This would automatically define `getindex`, `map!`, etc. as `blocksparse_getindex`, `blocksparse_map!`, etc. on a specified type or wrapper.
- Assess which NDTensors.jl sub-modules `BlockSparseArrays` depends on, and either remove those dependencies or determine what we need to do to split those libraries off into packages as well. For example:
  - `GradedAxes` is being used for things like `dual` and to provide some generic block axis slicing functionality that works for both graded and non-graded unit ranges.
  - `TypeParameterAccessors` for generically accessing type parameters; in particular, it uses functionality for generically getting the type of the parent of a wrapper type. We've been planning to split that off for a while, though I think there are still some type instability issues and interface questions to decide on, so I'm not sure how comfortable I am doing that right now.
  - `SparseArraysBase` is a major dependency, so we will have to release that first; see the SparseArraysBase.jl release to-do list, SparseArraysBase.jl#1.
  - `BroadcastMapConversion`, which converts broadcast calls to map calls (heavily inspired by the broadcasting code logic in Strided.jl). That library is also used in other sub-modules of NDTensors.jl, such as `BlockSparseArrays` and `NamedDimsArrays`.
  - `NestedPermutedDimsArrays` will be used as the output of `blocks(::PermutedDimsArray)`.
  - `GradedAxes` and `TensorAlgebra`, for compatibility with those libraries.
- `BlockSparseArrays`, particularly in light of any changes we decide to make to `SparseArraysBase`, which are being discussed in the SparseArraysBase.jl release to-do list, SparseArraysBase.jl#1.
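The `@derive` idea in the first item can be sketched with a small forwarding macro. Everything below is illustrative: the stand-in type `MyBlockArray`, the macro name `@derive_blocksparse`, and the `blocksparse_*` function bodies are assumptions for the sketch, not the package's actual API.

```julia
# Hypothetical "implementation" functions, playing the role of
# `blocksparse_getindex`, `blocksparse_setindex!`, etc.
blocksparse_getindex(a, I::Int...) = getfield(a, :data)[I...]
blocksparse_setindex!(a, v, I::Int...) = (getfield(a, :data)[I...] = v; a)

# The macro just emits forwarding method definitions for a given type,
# instead of dispatching via a type union like `AnyAbstractBlockSparseArray`.
macro derive_blocksparse(T)
  esc(quote
    Base.getindex(a::$T, I::Int...) = blocksparse_getindex(a, I...)
    Base.setindex!(a::$T, v, I::Int...) = blocksparse_setindex!(a, v, I...)
  end)
end

# A stand-in array type to derive the interface on.
struct MyBlockArray{T,N} <: AbstractArray{T,N}
  data::Array{T,N}
end
Base.size(a::MyBlockArray) = size(a.data)

@derive_blocksparse MyBlockArray

a = MyBlockArray(zeros(2, 2))
a[1, 2] = 3.0
println(a[1, 2])  # 3.0
```

A real `@derive` would presumably take the list of functions (or an interface name) as an argument and also cover wrapper types, but the generated code would have this forwarding shape.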
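The broadcast-to-map conversion mentioned for `BroadcastMapConversion` can be illustrated in a few lines, assuming same-shape array arguments. `map_from_broadcast` is a hypothetical helper name; the real package also has to handle scalar arguments, shape mismatches, and broadcast styles that this sketch ignores.

```julia
using Base.Broadcast: broadcasted, Broadcasted

# Lower a lazy broadcast expression to a `map` call, recursively
# materializing nested broadcast arguments first.
function map_from_broadcast(bc::Broadcasted)
  args = map(arg -> arg isa Broadcasted ? map_from_broadcast(arg) : arg, bc.args)
  return map(bc.f, args...)
end

a = [1, 2, 3]
b = [10, 20, 30]
bc = broadcasted(+, a, broadcasted(*, b, b))  # lazy representation of a .+ b .* b
println(map_from_broadcast(bc))  # [101, 402, 903]
```

Lowering to `map` is useful here because a block sparse array can implement `map` by operating on stored blocks only (for functions that preserve zeros), rather than implementing the full broadcasting machinery.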