[5.5.3]: HDF5 reader does not account for collection header size when reading GlobalHeap #1165
Labels: bug
Versions impacted by the bug
v5.x
What went wrong?
We noticed that when reading String labels from an HDF5 file using the default `H5iospNew`, one of the labels had an incorrect value of `Bad HeapObject.dataSize=id=16, refCount=0, dataSize=503533375848452, dataPos=116958`.

After looking into the code, the issue seems to be in `ucar.nc2.internal.iosp.hdf5.H5objects.GlobalHeap` (some code has been removed for clarity): `countBytes` does not account for the 16 bytes of the collection header that we already read in. As a result, it reads an extra global heap object past the end of the heap. Unfortunately, if the extra object (probably just garbage?) has an id that happens to match the id of an existing valid object, the valid entry will be overwritten in the map. An extra check in another piece of code turns this invalid object into "Bad HeapObject.dataSize..." because its `o.dataSize` is too large (as it's just random bytes), which is what we see in the returned String label.

Our current workaround is to initialize `int countBytes = 16` instead of `0`, and it seems to have fixed the issue. It might take me some time to create an example file because this happens pretty rarely, but let me know if this is the correct approach.

Relevant stack trace
No response
Relevant log messages
No response
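To make the accounting problem described above concrete, here is a minimal sketch of the failure mode. This is not the actual `H5objects.GlobalHeap` source (the real snippet was elided from this report); `objectsRead`, `collectionSize`, and `objectSize` are illustrative stand-ins, and only the 16-byte collection header size and the `countBytes` starting value come from the report:

```java
public class GlobalHeapSketch {
    // Per the report: 16 bytes of collection header are consumed
    // before the object-reading loop begins.
    static final int COLLECTION_HEADER_SIZE = 16;

    /**
     * Counts how many whole objects the loop would read from a
     * collection of the given total size. With countBytes starting
     * at 0, the loop's byte budget is the full collection size even
     * though the 16 header bytes were already consumed, so it reads
     * one extra (garbage) object past the end of the heap.
     */
    static int objectsRead(long collectionSize, long objectSize, boolean countHeader) {
        long countBytes = countHeader ? COLLECTION_HEADER_SIZE : 0;
        int n = 0;
        while (countBytes + objectSize <= collectionSize) {
            countBytes += objectSize;
            n++;
        }
        return n;
    }

    public static void main(String[] args) {
        long collectionSize = 64, objectSize = 16;
        // Only (64 - 16) / 16 = 3 objects fit after the header,
        // but the buggy accounting reads 4.
        System.out.println(objectsRead(collectionSize, objectSize, false)); // 4
        System.out.println(objectsRead(collectionSize, objectSize, true));  // 3
    }
}
```

The fourth object in the buggy case is read from bytes beyond the heap, which is why a garbage entry with a colliding id can silently replace a valid one in the map.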
If you have an example file that you can share, please attach it to this issue.
If so, may we include it in our test datasets to help ensure the bug does not return once fixed?
Note: the test datasets are publicly accessible without restriction.
N/A