Hi mates,
We run DFSORT (V1R10 and V1R5) in some big installations and make heavy use of Hipersorting.
With Hipersorting (HIPRMAX=OPTIMAL, DSPSIZE=0, MOSIZE=0) I have noticed that, in many cases, DFSORT dynamically allocates work data sets that it never uses because the sort runs 100% in memory; this happens in sorts larger than about 425/430 MB.
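
For reference, a minimal sketch of the runtime setup (assuming the options are passed through a DFSPARM DD and that the dynamic work data set allocation comes from an installation default such as DYNAUTO=YES; this is a simplified illustration, not our exact JCL):

//DFSPARM DD *
* Hipersorting options described above; dynamic allocation of the
* SORTWK data sets still happens via the installation default
  OPTION HIPRMAX=OPTIMAL,DSPSIZE=0,MOSIZE=0
/*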
This happens in every DFSORT version we have used for a long time (at least up to V1R10). It isn't a huge problem by itself, but in some production sysplexes we need to sort close to 100 TB daily, and the total work space allocated (and never used) amounts to millions of tracks...
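
Just to put a rough number on that (assuming about 56,664 bytes per 3390 track and treating the unused allocation as a plain byte count, which is of course a simplification): every 1 TB of work space that is allocated but never touched is roughly 10^12 / 56,664, i.e. around 17-18 million tracks, so against the ~100 TB we sort per day the waste adds up very quickly.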
Is it possible to reduce or avoid this (obviously without using Memory Objects)?