Hello, I have been searching but could not find whether it is possible to handle this:
Data looks like
2014-01-01 12:12:12,data,data,,,,data,data,,,data,,2014-01-01 12:12:22,data,data,,,,data,data,data,data,data,,2014-01-01 12:12:32,data,data,,,,data,data,,,data,,<continues to EOF>da
ta,data,,2014-01-01 12:12:12,data,data,,,,data,data,,,data,,
until the maximum record length (VB) is reached, at which point the data wraps. So in essence it's one giant record: a 50 GB record.
Each logical record begins with a timestamp, followed by comma-delimited data values or nulls.
Can I read this efficiently and split it into separate records, like this?
2014-01-01 12:12:12,data,data,,data,,data,data,,,data,,<CRLF>
2014-01-01 12:12:22,data,data,,,,data,data,data,data,data,,<CRLF>
2014-01-01 12:12:32,data,data,,,,data,data,,,data,,<CRLF>
...
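One way to do this without loading the whole 50 GB into memory is to stream the file in chunks and split on the timestamp pattern, carrying any partial record over to the next chunk. A minimal sketch in Python, assuming every logical record starts with a `YYYY-MM-DD HH:MM:SS` timestamp (the function name `records` and the chunk size are my own choices, not from your data spec):

```python
import io
import re

# Assumes each logical record begins with "YYYY-MM-DD HH:MM:SS".
TS = re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def records(stream, chunk_size=1 << 20):
    """Yield one logical record per timestamp, reading in fixed-size
    chunks so the entire file never has to fit in memory."""
    buf = ""
    for chunk in iter(lambda: stream.read(chunk_size), ""):
        buf += chunk
        starts = [m.start() for m in TS.finditer(buf)]
        # Emit every complete record (one timestamp up to the next);
        # keep the last, possibly partial, record in the buffer.
        for a, b in zip(starts, starts[1:]):
            yield buf[a:b]
        if starts:
            buf = buf[starts[-1]:]
    if buf:
        yield buf  # final record at EOF

# Usage: write each record out with the line feed you are arguing for.
# with open("dump.txt") as src, open("split.txt", "w") as dst:
#     for rec in records(src):
#         dst.write(rec + "\r\n")
```

Because the buffer always keeps everything from the last matched timestamp onward, a timestamp (or a value) split across a chunk boundary is reassembled before it is emitted, which handles the wrapping you describe.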
This is some sort of incoming internet data dump, and we're arguing for line feeds to be added at the source.