diff --git a/docs/source/community/index.md b/docs/source/community/index.md
index 403f2ca9..269520ff 100644
--- a/docs/source/community/index.md
+++ b/docs/source/community/index.md
@@ -1,6 +1,6 @@
 # Community

-Contributions to movement are absolutely encouraged, whether to fix a bug,
+Contributions to `movement` are absolutely encouraged, whether to fix a bug,
 develop a new feature, or improve the documentation.
 To help you get started, we have prepared a statement on the project's
 [mission and scope](target-mission), a [roadmap](target-roadmaps) outlining our current priorities, and a detailed [contributing guide](target-contributing).
diff --git a/docs/source/community/mission-scope.md b/docs/source/community/mission-scope.md
index 4cb76453..224d2d30 100644
--- a/docs/source/community/mission-scope.md
+++ b/docs/source/community/mission-scope.md
@@ -3,7 +3,7 @@

 ## Mission

-``movement`` aims to **facilitate the study of animal behaviour**
+`movement` aims to **facilitate the study of animal behaviour**
 by providing a suite of **Python tools to analyse body movements**
 across space and time.
@@ -40,13 +40,13 @@ packages specialized for this purpose, which will be compatible with
 ## Design principles

-movement is committed to:
+`movement` is committed to:

 - __Ease of installation and use__. We aim for a cross-platform installation and are mindful of dependencies that may compromise this goal.
 - __User accessibility__, catering to varying coding expertise by offering both a GUI and a Python API.
 - __Comprehensive documentation__, enriched with tutorials and examples.
 - __Robustness and maintainability__ through high test coverage.
 - __Scientific accuracy and reproducibility__ by validating inputs and outputs.
 - __Performance and responsiveness__, especially for large datasets, using parallel processing where appropriate.
-- __Modularity and flexibility__. We envision movement as a platform for new tools and analyses, offering users the building blocks to craft their own workflows.
+- __Modularity and flexibility__. We envision `movement` as a platform for new tools and analyses, offering users the building blocks to craft their own workflows.

 Some of these principles are shared with, and were inspired by, napari's [Mission and Values](napari:community/mission_and_values) statement.
diff --git a/docs/source/community/roadmaps.md b/docs/source/community/roadmaps.md
index 78b1bd67..569a61db 100644
--- a/docs/source/community/roadmaps.md
+++ b/docs/source/community/roadmaps.md
@@ -1,23 +1,26 @@
 (target-roadmaps)=
 # Roadmaps

-The roadmap outlines **current development priorities** and aims to **guide core developers** and to **encourage community contributions**. It is a living document and will be updated as the project evolves.
+This page outlines **current development priorities** and aims to **guide core developers** and to **encourage community contributions**. It is a living document and will be updated as the project evolves.

-The roadmap is **not meant to limit** movement features, as we are open to suggestions and contributions. Join our [Zulip chat](movement-zulip:) to share your ideas. We will take community demand and feedback into account when planning future releases.
+The roadmaps are **not meant to limit** `movement` features, as we are open to suggestions and contributions. Join our [Zulip chat](movement-zulip:) to share your ideas. We will take community demand and feedback into account when planning future releases.

 ## Long-term vision

 The following features are being considered for the first stable version `v1.0`.

-- __Import/Export pose tracks from/to diverse formats__. We aim to interoperate with leading tools for animal pose estimation and behaviour classification, and to enable conversions between their formats.
-- __Standardise the representation of pose tracks__. We represent pose tracks as [xarray data structures](xarray:user-guide/data-structures.html) to allow for labelled dimensions and performant processing.
-- __Interactively visualise pose tracks__. We are considering [napari](napari:) as a visualisation and GUI framework.
-- __Clean pose tracks__, including, but not limited to, handling of missing values, filtering, smoothing, and resampling.
-- __Derive kinematic variables__ like velocity, acceleration, joint angles, etc., focusing on those prevalent in neuroscience.
-- __Integrate spatial data about the animal's environment__ for combined analysis with pose tracks. This covers regions of interest (ROIs) such as the arena in which the animal is moving and the location of objects within it.
+- __Import/Export motion tracks from/to diverse formats__. We aim to interoperate with leading tools for animal tracking and behaviour classification, and to enable conversions between their formats.
+- __Standardise the representation of motion tracks__. We represent tracks as [xarray data structures](xarray:user-guide/data-structures.html) to allow for labelled dimensions and performant processing.
+- __Interactively visualise motion tracks__. We are experimenting with [napari](napari:) as a visualisation and GUI framework.
+- __Clean motion tracks__, including, but not limited to, handling of missing values, filtering, smoothing, and resampling.
+- __Derive kinematic variables__ like velocity, acceleration, joint angles, etc., focusing on those prevalent in neuroscience and ethology.
+- __Integrate spatial data about the animal's environment__ for combined analysis with motion tracks. This covers regions of interest (ROIs) such as the arena in which the animal is moving and the location of objects within it.
 - __Define and transform coordinate systems__. Coordinates can be relative to the camera, environment, or the animal itself (egocentric).
+- __Provide common metrics for specialised applications__. These applications could include gait analysis, pupillometry, spatial
+navigation, social interactions, etc.
+- __Integrate with neurophysiological data analysis tools__. We eventually aim to facilitate combined analysis of motion and neural data.

 ## Short-term milestone - `v0.1`

-We plan to release version `v0.1` of movement in early 2024, providing a minimal set of features to demonstrate the project's potential and to gather feedback from users. At minimum, it should include:
+We plan to release version `v0.1` of `movement` in early 2025, providing a minimal set of features to demonstrate the project's potential and to gather feedback from users. At minimum, it should include:

 - [x] Ability to import pose tracks from [DeepLabCut](dlc:), [SLEAP](sleap:) and [LightningPose](lp:) into a common `xarray.Dataset` structure.
 - [x] At least one function for cleaning the pose tracks.
@@ -25,4 +28,4 @@ We plan to release version `v0.1` of movement in early 2024, providing a minimal
 - [x] Public website with [documentation](target-movement).
 - [x] Package released on [PyPI](https://pypi.org/project/movement/).
 - [x] Package released on [conda-forge](https://anaconda.org/conda-forge/movement).
-- [ ] Ability to visualise pose tracks using [napari](napari:). We aim to represent pose tracks via napari's [Points](napari:howtos/layers/points) and [Tracks](napari:howtos/layers/tracks) layers and overlay them on video frames.
+- [ ] Ability to visualise pose tracks using [napari](napari:). We aim to represent pose tracks as napari [layers](napari:howtos/layers/index.html), overlaid on video frames.
diff --git a/docs/source/index.md b/docs/source/index.md
index 5f11a3af..96e83afa 100644
--- a/docs/source/index.md
+++ b/docs/source/index.md
@@ -17,7 +17,7 @@ Installation, first steps and key concepts.
 :link: examples/index
 :link-type: doc

-A gallery of examples using movement.
+A gallery of examples using `movement`.
 :::

 :::{grid-item-card} {fas}`comments;sd-text-primary` Join the movement
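
To make the roadmap item on standardising motion tracks as xarray data structures (and the `v0.1` milestone of importing pose tracks into a common `xarray.Dataset`) more concrete, the sketch below builds a toy pose-track dataset with labelled dimensions. It is purely illustrative: the dimension and variable names (`time`, `individuals`, `keypoints`, `space`, `position`, `confidence`) are assumptions made here, not necessarily `movement`'s actual schema.

```python
# Purely illustrative sketch of a pose-track dataset with labelled dimensions,
# in the spirit of the "standardise the representation of motion tracks" item.
# All dimension and variable names below are assumptions, not movement's schema.
import numpy as np
import xarray as xr

n_frames, n_individuals, n_keypoints = 100, 2, 3
rng = np.random.default_rng(seed=42)

ds = xr.Dataset(
    data_vars={
        "position": (
            ("time", "individuals", "keypoints", "space"),
            rng.normal(size=(n_frames, n_individuals, n_keypoints, 2)),
        ),
        "confidence": (
            ("time", "individuals", "keypoints"),
            rng.uniform(size=(n_frames, n_individuals, n_keypoints)),
        ),
    },
    coords={
        "time": np.arange(n_frames) / 30,  # seconds, assuming 30 fps
        "individuals": ["mouse_0", "mouse_1"],
        "keypoints": ["snout", "centre", "tail_base"],
        "space": ["x", "y"],
    },
)

# Labelled dimensions allow selection by name rather than positional index:
snout_xy = ds["position"].sel(individuals="mouse_0", keypoints="snout")
print(snout_xy.dims)  # ('time', 'space')
```

The benefit of the labelled layout is that downstream operations (cleaning, kinematics, plotting) can refer to `time`, `keypoints`, or `space` by name, regardless of how many individuals or keypoints a particular tracking tool exports.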
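The "Derive kinematic variables" item can likewise be illustrated with plain xarray operations on such a labelled array. This is a minimal sketch assuming a made-up 2D position track sampled at 30 fps; `movement`'s own kinematics functions may be named and organised differently.

```python
# Purely illustrative sketch: deriving velocity and speed from a 2D position
# track with plain xarray operations. The trajectory and sampling rate are
# made up, and movement's own kinematics API may differ.
import numpy as np
import xarray as xr

time = np.arange(0, 5, 1 / 30)  # 5 seconds, assuming 30 fps
position = xr.DataArray(
    np.stack([3 * time, 4 * time], axis=-1),  # straight-line motion in 2D
    dims=("time", "space"),
    coords={"time": time, "space": ["x", "y"]},
)

velocity = position.differentiate("time")        # finite differences along time
speed = np.sqrt((velocity**2).sum(dim="space"))  # vector norm over x and y

print(float(speed.mean()))  # ~5.0: constant velocity of (3, 4) units per second
```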