I am generating multiple jobs with Natural remote job entry, using an input dataset that contains a list of 300 objects. Each object is processed in its own job, and the data is written to a single dataset on tape.
I want each job to write to the same dataset. Do I need to insert time delays between each job submission to ensure the dataset has closed before the next job opens it? Alternatively, if I write to a DASD dataset with DISP=(MOD,CATLG), do I have to know the total size to allocate at the time of creation? The amount of data varies with the time of year, so the total size is hard to estimate.
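For reference, the kind of DD statement I have in mind for the DASD case looks roughly like this (the dataset name, unit, and space values are just placeholders); my question is whether the primary allocation here has to cover the eventual total size, or whether secondary extents are enough:

//* DD for the shared output dataset - names and sizes are placeholders
//OUTDD    DD  DSN=MY.OUTPUT.DATA,
//             DISP=(MOD,CATLG),
//             UNIT=SYSDA,
//             SPACE=(CYL,(50,25),RLSE)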
Mel