IBM's flagship sort product DFSORT for sorting, merging, copying, data manipulation and reporting. Includes ICETOOL and ICEGENER

check a huge dataset

Postby samb01 » Fri Dec 14, 2018 6:31 pm

Hello,

I have a dataset with 9 932 248 and 1 294 125 tracks.
I just want to check the integrity of the dataset.
The dataset is unreadable because of this message:


IEC020I 001-3,XJOB,S30,SYSUT1-0024,2664,PXUMPY,
IEC020I X.DATASET                                                
IEC020I NON-ACCEPTABLE ERROR            
 


So before doing the copy, I'd like to check the dataset.
I did it with ICETOOL:


//S1   EXEC PGM=ICETOOL                                
//TOOLMSG DD SYSOUT=*                                  
//DFSMSG  DD SYSOUT=*                                  
//IN      DD DSN=X.DATASET,DISP=SHR      
//TOOLIN  DD *                                          
COUNT FROM(IN)                                          
/*                                                      
 


The ICETOOL step abended, which means the dataset is indeed not correct, but it took too much elapsed time:

TOTAL CPU TIME= .75 TOTAL ELAPSED TIME= 22.87

I'd like to know if there is a program that could check a huge dataset quickly (faster than PGM=ICETOOL).
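
A possible alternative to time against ICETOOL (a sketch only, assuming DFSORT is invoked as PGM=SORT at this site; PGM=ICEMAN is another common alias) is a straight DFSORT copy to a DUMMY output, which reads every record but writes nothing:

//S2       EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=X.DATASET,DISP=SHR
//* ADD DCB INFO TO SORTOUT IF DFSORT CANNOT DERIVE OUTPUT ATTRIBUTES
//SORTOUT  DD DUMMY
//SYSIN    DD *
  OPTION COPY
/*

Like the ICETOOL COUNT step, this still has to read every track, so the elapsed time is bounded by I/O rather than by the program.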

Thanks for your help.
samb01
 
Posts: 431
Joined: Mon Nov 16, 2009 7:24 pm
Has thanked: 1 time
Been thanked: 0 time

Re: check a huge dataset

Postby enrico-sorichetti » Fri Dec 14, 2018 6:57 pm

That's not an error that the application program can solve by itself;
let your storage support handle it.
cheers
enrico
When I tell somebody to RTFM or STFW I usually have the page open in another tab/window of my browser,
so that I am sure that the information requested can be reached with a very small effort
enrico-sorichetti
Global moderator
 
Posts: 3006
Joined: Fri Apr 18, 2008 11:25 pm
Has thanked: 0 time
Been thanked: 165 times

Re: check a huge dataset

Postby Robert Sample » Fri Dec 14, 2018 7:30 pm

but it took too much elapsed time
Elapsed time is impacted by a large number of factors (such as the size of the data set, the number of address spaces -- batch jobs, TSO users, started tasks and possibly OMVS processes -- executing in the LPAR, their relative priorities compared to yours, how busy the CPU is, the WLM (Workload Manager) policy, contention for the channel and for the device, and so forth). Your site support group can assist in optimizing elapsed time, but in general there's not too much you can do to change it. And if your data set has millions of tracks, it is not going to execute in a short amount of elapsed time no matter what you do -- it takes a certain amount of time to read those millions of tracks and that time is an irreducible minimum! Also note that elapsed time can be drastically impacted by these factors -- I've seen jobs that normally run in 3 minutes take as much as 5 hours when submitted while the CPU is running 100% busy.
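As a rough illustration of that irreducible minimum (assuming 3390 geometry at about 56 KB per track and an illustrative sustained read rate of 100 MB/s; both numbers are assumptions, not measurements from this system):

1 294 125 tracks x ~56 KB/track = roughly 72 GB to read
72 GB / 100 MB/s = about 720 seconds, or roughly 12 minutes

So on those assumptions, something on the order of ten minutes of pure read time is the floor for this data set, before any contention or scheduling delay is added.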
Robert Sample
Global moderator
 
Posts: 3720
Joined: Sat Dec 19, 2009 8:32 pm
Location: Dubuque, Iowa, USA
Has thanked: 1 time
Been thanked: 279 times

Re: check a huge dataset

Postby samb01 » Fri Dec 14, 2018 7:54 pm

Hello,

I found a really simple program that can easily check the dataset: SYSGENER with a DUMMY output.
It abends if the dataset is unreadable, and it takes only about 1 minute of elapsed time:

TOTAL CPU TIME= .32 TOTAL ELAPSED TIME= 1.00



//SYSGEN01 EXEC PGM=SYSGENER                                    
//SYSPRINT DD SYSOUT=*                                          
//SYSUT1   DD DISP=SHR,DSN=X.DATASET            
//SYSUT2   DD DUMMY,DCB=(RECFM=VB,LRECL=27994,BLKSIZE=27998)    
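
For comparison, the same check written with ICEGENER (the DFSORT-supplied IEBGENER replacement this board covers) would look like the sketch below; this assumes ICEGENER is installed under that name at your site:

//ICEGEN01 EXEC PGM=ICEGENER
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//SYSUT1   DD DISP=SHR,DSN=X.DATASET
//SYSUT2   DD DUMMY,DCB=(RECFM=VB,LRECL=27994,BLKSIZE=27998)

When ICEGENER can hand the copy over to DFSORT, it generally uses less CPU than IEBGENER for the same read-everything pass.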

 
samb01
 
Posts: 431
Joined: Mon Nov 16, 2009 7:24 pm
Has thanked: 1 time
Been thanked: 0 time

