For large data files it will be necessary to chunk the data on input and output. Otherwise the output is a single large data file, which is awkward for analysis users. For the same reason it would be useful to be able to reconstruct only a chosen range of spills.
So the proposal is: add spill-start and spill-end parameters to InputCppDAQData, and add a wrapper around execute_against_data that splits the run into chunks of e.g. 1000 spills and reconstructs each chunk separately.
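A minimal sketch of what the chunking wrapper could look like. This is illustrative only: the helper names (`spill_chunks`, `process_in_chunks`) and the `reconstruct(start, end)` callback are hypothetical; in practice the callback would configure InputCppDAQData with the proposed spill-start/spill-end parameters and invoke execute_against_data on that range, writing one output file per chunk.

```python
def spill_chunks(first_spill, last_spill, chunk_size=1000):
    """Yield (start, end) half-open spill ranges covering
    [first_spill, last_spill) in steps of chunk_size."""
    for start in range(first_spill, last_spill, chunk_size):
        yield (start, min(start + chunk_size, last_spill))


def process_in_chunks(first_spill, last_spill, reconstruct, chunk_size=1000):
    """Run reconstruct(start, end) once per chunk and collect the results.

    reconstruct is a hypothetical callback standing in for a call to
    execute_against_data with the input module limited to that spill range.
    """
    results = []
    for start, end in spill_chunks(first_spill, last_spill, chunk_size):
        results.append(reconstruct(start, end))
    return results
```

Reconstructing a user-requested range of spills then falls out for free: call `process_in_chunks` with the requested first and last spill instead of the full run.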