Merge pull request #295 from pynapple-org/dev
Bumping 0.6.6.
gviejo authored May 28, 2024
2 parents 8ac6e2d + 4577b45 commit 3190fb1
Showing 7 changed files with 28 additions and 9 deletions.
docs/HISTORY.md: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ Around 2016-2017, Luke Sjulson started *TSToolbox2*, still in Matlab and which i
In 2018, Francesco started neuroseries, a Python package built on Pandas. It was quickly adopted in Adrien's lab, especially by Guillaume Viejo, a postdoc in the lab. Gradually, the majority of the lab was using it and new functions were constantly added.
In 2021, Guillaume and other trainees in Adrien's lab decided to fork from neuroseries and started *pynapple*. The core of pynapple is largely built upon neuroseries. Some of the original changes to TSToolbox made by Luke were included in this package, especially the *time_support* property of all ts/tsd objects.

-0.6.6 (Soon)
+0.6.6 (2024-05-28)
------------------

- Full lazy-loading for NWB file.
docs/api_guide/tutorial_pynapple_nwb.py: 4 additions & 0 deletions
@@ -49,9 +49,13 @@
print(data['z'])

# %%
#
# Internally, the `NWBClass` has replaced the pointer to the data with the actual data.
#
+# While it looks like pynapple has loaded the data, in fact it did not. By default, calling the NWB object will return an HDF5 dataset.
+# !!! warning
+#
+# New in `0.6.6`

print(type(z.values))

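The tutorial hunk above documents the lazy-loading behavior this release completes: indexing a pynapple `NWBFile` keeps the values as an on-disk `h5py.Dataset` until a computation forces a conversion. A minimal sketch of that behavior, assuming an NWB file on disk containing a time series stored under the key `'z'` (the file name below is a placeholder, not part of this commit):

```python
import pynapple as nap

# Placeholder path; any NWB file with a continuous time series named 'z' would do.
data = nap.load_file("my_session.nwb")

z = data["z"]           # pynapple wraps the on-disk dataset instead of reading it
print(type(z.values))   # expected: <class 'h5py._hl.dataset.Dataset'>, i.e. not loaded yet

z.count(0.1)            # a computation like this triggers conversion to an in-memory array
```

The conversion-triggering call mirrors what the updated tests below exercise.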
docs/pynajax.md: 5 additions & 0 deletions
@@ -1,5 +1,10 @@
### Motivation


+!!! warning

+    New in `0.6.6`

Multiple python packages exist for high-performance computing. Internally, pynapple makes extensive use of [numba](https://numba.pydata.org/) for accelerating some functions. Numba is a stable package that provides speed gains with minimal installation issues when running on CPUs.

Another high-performance toolbox for numerical analysis is
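For context on the new pynajax page, a short sketch of how the jax backend is switched on. The `set_backend` call follows the pynajax documentation; the toy time series below is purely illustrative and not part of this commit:

```python
import numpy as np
import pynapple as nap

# Switch pynapple's compute backend from the default (numba/NumPy) to jax.
# Requires the optional pynajax package to be installed (`pip install pynajax`).
nap.nap_config.set_backend("jax")

# Illustrative data: a 10 Hz signal restricted to the first 50 seconds.
tsd = nap.Tsd(t=np.arange(0, 100, 0.1), d=np.random.randn(1000))
ep = nap.IntervalSet(start=0, end=50)
print(tsd.restrict(ep))
```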
pynapple/__init__.py: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-__version__ = "0.6.5"
+__version__ = "0.6.6"
from .core import (
    IntervalSet,
    Ts,
pyproject.toml: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "pynapple"
-version = "0.6.5"
+version = "0.6.6"
description = "PYthon Neural Analysis Package Pour Laboratoires d’Excellence"
readme = "README.md"
authors = [{ name = "Guillaume Viejo", email = "[email protected]" }]
setup.py: 2 additions & 2 deletions
@@ -59,8 +59,8 @@
    test_suite='tests',
    tests_require=test_requirements,
    url='https://github.com/pynapple-org/pynapple',
-    version='v0.6.5',
+    version='v0.6.6',
    zip_safe=False,
    long_description_content_type='text/markdown',
-    download_url='https://github.com/pynapple-org/pynapple/archive/refs/tags/v0.6.5.tar.gz'
+    download_url='https://github.com/pynapple-org/pynapple/archive/refs/tags/v0.6.6.tar.gz'
)
tests/test_lazy_loading.py: 14 additions & 4 deletions
@@ -13,6 +13,8 @@
import pynapple as nap




@pytest.mark.parametrize(
    "time, data, expectation",
    [
@@ -224,11 +226,14 @@ def test_lazy_load_nwb_no_warnings(data):
        nwbfile.add_acquisition(time_series)
        nwb = nap.NWBFile(nwbfile)

-        with warnings.catch_warnings():
-            warnings.simplefilter("error")
+        with warnings.catch_warnings(record=True) as w:
            tsd = nwb["TimeSeries"]
            tsd.count(0.1)
            assert isinstance(tsd.d, h5py.Dataset)

+        if len(w):
+            if not str(w[0].message).startswith("Converting 'd' to"):
+                raise RuntimeError

    finally:
        if file_path.exists():
@@ -242,8 +247,7 @@ def test_tsgroup_no_warnings():
            file_path = Path(f'data_{k}.h5')
            with h5py.File(file_path, 'w') as f:
                f.create_dataset('spks', data=np.sort(np.random.uniform(0, 10, size=20)))
-        with warnings.catch_warnings():
-            warnings.simplefilter("error")
+        with warnings.catch_warnings(record=True) as w:

            nwbfile = mock_NWBFile()

@@ -255,9 +259,15 @@ nwb = nap.NWBFile(nwbfile)
            nwb = nap.NWBFile(nwbfile)
            tsgroup = nwb["units"]
            tsgroup.count(0.1)

+        if len(w):
+            if not str(w[0].message).startswith("Converting 'd' to"):
+                raise RuntimeError


    finally:
        for k in range(n_units):
            file_path = Path(f'data_{k}.h5')
            if file_path.exists():
                file_path.unlink()

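The test changes above replace `warnings.simplefilter("error")`, which turned every warning into an exception, with `warnings.catch_warnings(record=True)`, which collects warnings so the test can fail only on unexpected ones while tolerating the known "Converting 'd' to ..." conversion message. A standalone sketch of that pattern using only the standard library (the emitted warning text here is illustrative):

```python
import warnings


def compute_with_expected_warning():
    # Stand-in for a pynapple call that may emit a known conversion warning.
    warnings.warn("Converting 'd' to numpy.array", UserWarning)
    return 42


with warnings.catch_warnings(record=True) as w:
    warnings.simplefilter("always")  # ensure every warning is recorded, none filtered out
    result = compute_with_expected_warning()

# Fail only on warnings that are not the expected conversion message.
for warning in w:
    if not str(warning.message).startswith("Converting 'd' to"):
        raise RuntimeError(f"unexpected warning: {warning.message}")

print(result)
```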