Following my earlier memory problem when copying large BLOB photo data with TOracleDataSet, I decided to run a test. My program opens a query and scrolls through 300 records, reading each record into memory. I then close the dataset, re-query from the last record forward, and read another 300 records. I can only repeat this for about 23,000 records before I get an out-of-memory error. That does not seem right; I should be able to keep doing this indefinitely. Is TOracleDataSet not releasing all previously used memory when you close the dataset and reopen it?
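For reference, the test loop is roughly the following (a simplified sketch, not the exact code; the dataset, bind-variable, field, and helper names are placeholders, and it assumes a standard TDataSet-style Open/Close/Next interface on the TOracleDataSet component):

```pascal
// Simplified sketch of the test loop; PhotoData, LAST_ID, ID and
// LoadRecordIntoMemory are placeholder names, not the real program's.
LastKey := 0;
repeat
  // SQL is assumed to be: SELECT ... WHERE ID > :LAST_ID ORDER BY ID
  PhotoData.SetVariable('LAST_ID', LastKey);
  PhotoData.Open;
  Count := 0;
  while (not PhotoData.Eof) and (Count < 300) do
  begin
    LoadRecordIntoMemory(PhotoData);  // reads the BLOB photo field into memory
    LastKey := PhotoData.FieldByName('ID').AsInteger;
    Inc(Count);
    PhotoData.Next;
  end;
  Done := PhotoData.Eof;
  PhotoData.Close;  // expected to release the memory used by this batch
until Done;
```

After each Close the memory from the previous 300-record batch should be freed, so the process's memory use ought to stay roughly flat instead of growing until it runs out around 23,000 records.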