Memory errors on record copies with long fields

mhajduk

Member
I'm transferring records with large LONG fields (photos). The program seems to have a memory leak: at around 20,000 to 25,000 records it returns an out-of-memory error. The program is simply two datasets, looping through one and appending and posting the new data to the other. The long field uses the Assign() function to transfer the LONG data. Is there some free-memory statement that has to be performed after the Assign, or is the dataset buffering all the appends in memory and running out at around 20,000 or so records? Each record is around 30K to 60K. Do you have any suggestions as to how to prevent this error?
 
If you are using a TOracleDataSet for this, you will have considerable memory overhead, because the dataset buffers every record it fetches and appends. For batch processing you should consider using a TOracleQuery instead: read the data through TOracleQuery 1, and execute plain insert statements through TOracleQuery 2. This keeps memory usage constant, regardless of the number of records, and minimizes processing overhead.
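
For illustration, here is a minimal Delphi sketch of that pattern. The table and column names (photos, photos_copy, id, photo), the MainSession session object, and the otLongRaw variable type (assuming a LONG RAW photo column) are placeholders; adjust them to your schema.

  // Requires the Oracle unit from Direct Oracle Access;
  // MainSession is assumed to be an existing, connected TOracleSession.
  procedure CopyPhotos;
  var
    Src, Dst: TOracleQuery;
  begin
    Src := TOracleQuery.Create(nil);
    Dst := TOracleQuery.Create(nil);
    try
      Src.Session := MainSession;
      Dst.Session := MainSession;
      // Query 1 reads the source rows
      Src.SQL.Text := 'select id, photo from photos';
      // Query 2 executes a plain insert with bind variables
      Dst.SQL.Text := 'insert into photos_copy (id, photo) values (:id, :photo)';
      Dst.DeclareVariable('id', otInteger);
      Dst.DeclareVariable('photo', otLongRaw);
      Src.Execute;
      while not Src.Eof do
      begin
        // Copy the current row's values into the bind variables, then insert
        Dst.SetVariable('id', Src.Field('id'));
        Dst.SetVariable('photo', Src.Field('photo'));
        Dst.Execute;
        Src.Next;
      end;
      MainSession.Commit;
    finally
      Dst.Free;
      Src.Free;
    end;
  end;

Because each insert goes straight to the server through bind variables, nothing accumulates on the client no matter how many records you process.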

------------------
Marco Kalter
Allround Automations
 
If you're going to do such a simple loop, it might be easier to write a PL/SQL procedure on the server (try DBMS_LOB.COPY).
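
A minimal server-side sketch along those lines follows. Note that DBMS_LOB.COPY works on LOB locators, so this assumes the photo columns are (or have been migrated to) BLOBs rather than LONG RAWs; the photos/photos_copy table names are the same placeholders as above.

  declare
    l_dst blob;
  begin
    for r in (select id, photo from photos) loop
      -- Insert an empty BLOB and get its locator back
      insert into photos_copy (id, photo)
      values (r.id, empty_blob())
      returning photo into l_dst;
      -- Copy the source LOB into the new row's LOB
      -- (the amount passed to DBMS_LOB.COPY must be at least 1)
      if r.photo is not null and dbms_lob.getlength(r.photo) > 0 then
        dbms_lob.copy(l_dst, r.photo, dbms_lob.getlength(r.photo));
      end if;
    end loop;
    commit;
  end;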
 
I agree that TOracleQuery works great and is a lot faster, but how do you add the data for the long field? The long fields are photos between 20K and 50K.
 
You can simply assign the field values (TOracleQuery.Field) to the corresponding variables in the insert statement (TOracleQuery.SetVariable). It doesn't matter if the value is 50K long.
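
For example (a hedged fragment reusing the placeholder names from the sketch above; otLongRaw is assumed for a LONG RAW photo column, and otLong would be the type for a character LONG):

  // Declare the bind variable with a long-capable type, then assign
  // the field value; a 50K value is not a problem.
  Dst.DeclareVariable('photo', otLongRaw);
  Dst.SetVariable('photo', Src.Field('photo'));
  Dst.Execute;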

DavidH is right though: you should do this on the server in PL/SQL!

------------------
Marco Kalter
Allround Automations
 