Provide dedicated "Cannot load NWB 1.0" Exception/Error instead of an arbitrary crash #1077
One more example where I was told that the file is probably NWB 1.0:

```
(git)smaug:/mnt/btrfs/datasets/datalad/crawl/crcns/ssc-7[master]git
$> dandi ls data/L4E_whole_cell/Exp_2015-09-05_001_0001-0162.nwb
Traceback (most recent call last):
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/bin/dandi", line 11, in <module>
    load_entry_point('dandi', 'console_scripts', 'dandi')()
  File "/usr/lib/python3/dist-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/lib/python3/dist-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/lib/python3/dist-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/yoh/proj/dandi/dandi-cli/dandi/cli/command.py", line 169, in ls
    rec.update(get_metadata_pyout(path))
  File "/home/yoh/proj/dandi/dandi-cli/dandi/cli/command.py", line 61, in get_metadata_pyout
    meta = get_metadata(path)
  File "/home/yoh/proj/dandi/dandi-cli/dandi/pynwb_utils.py", line 30, in get_metadata
    nwb = io.read()
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/backends/hdf5/h5tools.py", line 293, in read
    return call_docval_func(super(HDF5IO, self).read, kwargs)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/utils.py", line 327, in call_docval_func
    return func(*fargs, **fkwargs)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/utils.py", line 438, in func_call
    return func(self, **parsed['args'])
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/backends/io.py", line 31, in read
    f_builder = self.read_builder()
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/utils.py", line 438, in func_call
    return func(self, **parsed['args'])
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/backends/hdf5/h5tools.py", line 308, in read_builder
    f_builder = self.__read_group(self.__file, ROOT_NAME, ignore=ignore)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/backends/hdf5/h5tools.py", line 385, in __read_group
    builder = read_method(sub_h5obj)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/backends/hdf5/h5tools.py", line 385, in __read_group
    builder = read_method(sub_h5obj)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/backends/hdf5/h5tools.py", line 385, in __read_group
    builder = read_method(sub_h5obj)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/backends/hdf5/h5tools.py", line 393, in __read_group
    ret = GroupBuilder(name, **kwargs)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/utils.py", line 438, in func_call
    return func(self, **parsed['args'])
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/build/builders.py", line 179, in __init__
    self.set_dataset(dataset)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/utils.py", line 438, in func_call
    return func(self, **parsed['args'])
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/build/builders.py", line 289, in set_dataset
    self.__set_builder(builder, GroupBuilder.__dataset)
  File "/home/yoh/proj/dandi/dandi-cli/venvs/dev3/lib/python3.7/site-packages/hdmf/build/builders.py", line 259, in __set_builder
    (name, self.obj_type[name], self.name, obj_type))
KeyError: "'electrode_name' already exists as attributes in membrane_potential, cannot set as datasets"
```
aha -- nwb-schema talks about it, and my sample NWB 2.0 file does carry a core/2.0.2 group:

```
$> h5ls -S -r mouse1_fni16_150818_001_ch2-PnevPanResults-170808-180842.nwb/specifications/
/core                    Group
/core/2.0.2              Group
/core/2.0.2/namespace    Dataset {SCALAR}
/core/2.0.2/nwb.base     Dataset {SCALAR}
/core/2.0.2/nwb.behavior Dataset {SCALAR}
/core/2.0.2/nwb.ecephys  Dataset {SCALAR}
/core/2.0.2/nwb.epoch    Dataset {SCALAR}
/core/2.0.2/nwb.file     Dataset {SCALAR}
/core/2.0.2/nwb.icephys  Dataset {SCALAR}
/core/2.0.2/nwb.image    Dataset {SCALAR}
/core/2.0.2/nwb.misc     Dataset {SCALAR}
/core/2.0.2/nwb.ogen     Dataset {SCALAR}
/core/2.0.2/nwb.ophys    Dataset {SCALAR}
/core/2.0.2/nwb.retinotopy Dataset {SCALAR}
```

while a sample file in the ssc-7 dataset has none:

```
$> h5ls -S -r data/L4E_whole_cell/Exp_2015-09-05_001_0001-0162.nwb/specifications/
specifications/**NOT FOUND**
```

From glancing over docs/storage/source/storage_hdf5.rst it is not 100% clear to me whether that is a feature of NWB:N specifically or of NWB in general, or when it was introduced. Also, https://github.com/NeurodataWithoutBorders/pynwb/blob/dev/README.rst starts with "It provides a high-level API for efficiently working with Neurodata stored in the NWB format." -- it does not seem to limit itself to NWB 2.0 or above. If NWB versions prior to 2.0 are not supposed to be supported, that should be clarified in README.rst, IMHO. Unfortunately I am still learning the NWB ecosystem, so I cannot really propose any definitive solution/PR here without guidance ;)
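A pre-flight version check along these lines could be sketched as follows. The function names are hypothetical, and the heuristics (the presence of a `/specifications` group, and `nwb_version` stored as a root attribute in 2.x versus a root dataset in 1.x) are assumptions drawn from the listings above, not a documented API:

```python
def parse_nwb_major_version(version):
    """Parse the major version out of strings like 'NWB-1.0.6' or '2.0.2'."""
    if isinstance(version, bytes):
        version = version.decode()
    version = str(version)
    if version.startswith("NWB-"):  # NWB 1.x style version strings
        version = version[len("NWB-"):]
    return int(version.split(".")[0])

def guess_nwb_major_version(path):
    """Best-effort guess at a file's NWB major version, before pynwb reads it.

    Assumed heuristics: NWB 2.x files carry a root-level 'nwb_version'
    attribute (and, with cached specs, a /specifications group), while
    NWB 1.x files store 'nwb_version' as a root-level dataset instead.
    """
    import h5py  # imported here so the string parsing above stays dependency-free

    with h5py.File(path, "r") as f:
        if "nwb_version" in f.attrs:
            return parse_nwb_major_version(f.attrs["nwb_version"])
        if "nwb_version" in f:
            return parse_nwb_major_version(f["nwb_version"][()])
        if "specifications" in f:
            return 2
        return None  # no recognizable version marker
```

A caller could then refuse files where this returns 1 (or None) with a clear message instead of handing them to the builder machinery.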
@yarikoptic can we take this over to slack? I'm happy to help you sort these issues out and open this file in pynwb. It looks like this file is in 2.0 after all, so not relevant to this issue.
and we did (in jitsi), summary: me
I think I have run into the now-closed #1051 while trying to read an example file from exp2nwb: https://github.com/NeurodataWithoutBorders/exp2nwb/blob/master/example/Example.nwb?raw=true

If that is the best way to determine that the file is NWB 1.0 rather than 2.0, please consider catching it around

```
h5tools.py", line 246, in read
```

and re-raising a dedicated/descriptive exception. If there is a better way (I hope there is), please add that sensing even before trying to load an incompatible .nwb file.
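A minimal sketch of the re-raising idea (the exception class and wrapper names are hypothetical; pynwb/hdmf define no such dedicated error at the time of writing) might convert the opaque `KeyError` escaping from the builder code into something descriptive:

```python
class UnsupportedNWBVersionError(Exception):
    """Hypothetical dedicated error for files this reader cannot interpret,
    e.g. files written in the NWB 1.x layout."""

def read_nwb_checked(io):
    """Call read() on an already-open NWBHDF5IO-like object.

    Converts the KeyError raised deep in the hdmf builders (as seen in the
    traceback above) into a dedicated, descriptive exception, keeping the
    original error attached as the cause for debugging.
    """
    try:
        return io.read()
    except KeyError as exc:
        raise UnsupportedNWBVersionError(
            "Could not interpret this file; it may be an NWB 1.x file, "
            "which this reader does not support"
        ) from exc
```

The `raise ... from exc` chaining preserves the full original traceback, so no debugging information is lost while the user-facing message becomes actionable.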