As raised by @luojing1211 by e-mail: once one has set up a pipeline, how does one actually get it to write out its results? And perhaps also write out intermediate steps?
One possibility, at least for the final step, is to create a memory-mapped output and then use that for the out argument in read. E.g., currently, the following works:
```python
from numpy.lib.format import open_memmap
...
stack = Stack(square, nbin, phase)
mm = open_memmap('stack.npy', 'w+', dtype='f4', shape=stack.shape)
stack.read(out=mm)
```
FITS files can similarly be opened with memmap=True, so this should work reasonably transparently. All we would then need is some functions that create the right output files and fill in the headers.
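To illustrate the pattern above end-to-end, here is a minimal sketch using only NumPy: the pipeline step is replaced by a hypothetical `toy_read` stand-in (in the real case this would be `stack.read(out=mm)`), and the memory-mapped file is created up front so the pipeline writes straight to disk.

```python
import os
import tempfile
import numpy as np
from numpy.lib.format import open_memmap


def toy_read(out):
    """Stand-in for a pipeline step that fills its `out` argument.

    In the real pipeline this would be e.g. ``stack.read(out=mm)``.
    """
    out[...] = np.arange(out.size, dtype=out.dtype).reshape(out.shape)
    return out


path = os.path.join(tempfile.mkdtemp(), 'stack.npy')
# Create the memory-mapped output file with the final dtype and shape...
mm = open_memmap(path, mode='w+', dtype='f4', shape=(4, 3))
# ...let the "pipeline" write directly into it...
toy_read(mm)
mm.flush()
del mm
# ...and verify the result landed on disk as a regular .npy file.
loaded = np.load(path)
print(loaded.shape, loaded.dtype)
```

Because `open_memmap` writes a standard `.npy` header, the output is a normal NumPy file that any downstream consumer can load.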
Simple idea: write as with a numpy handle, but keep the handle attributes (start_time, etc.). Maybe something simpler than FITS? Can we use np.savez? (It saves several arrays in a single file.)
Though we want to use a memmap for the main data array, which is tricky. We would instead need to mmap the whole .npz output (np.savez(<some-mmap-handle>, data=??, start_time=...)) and then np.load(<another-mmap-handle>) - not sure, though, that one can avoid writing some empty initial data to disk...
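The metadata half of the np.savez idea does work: one can bundle the data array together with handle attributes in a single `.npz` and restore both on load. A minimal sketch (`start_time` is just an illustrative attribute name; an in-memory buffer stands in for the file):

```python
import io
import numpy as np

# Bundle the data array with a handle attribute in one .npz archive.
buf = io.BytesIO()
data = np.arange(6.0).reshape(2, 3)
np.savez(buf, data=data,
         start_time=np.datetime64('2018-01-01T00:00:00'))

# Restore both the array and the attribute.
buf.seek(0)
with np.load(buf) as npz:
    restored = npz['data']
    # Scalars come back as 0-d arrays; [()] unwraps them.
    start_time = npz['start_time'][()]

print(restored.shape, start_time)
```

The memmap half is indeed the sticking point: np.load's mmap_mode does not apply to arrays stored inside a zip archive, which is exactly the difficulty noted above.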
See also a complete pipeline at https://gist.github.com/mhvk/65944fcc8477c7a40ef10a896bb90a56