TOracleDataset - Huge memory consumption

ireber

Member
Hello,
We have a problem with TOracleDataSet and huge memory consumption (sometimes resulting in an "Out of memory" error).

We analyzed the code and found that:
- memory is allocated based on the declared field size, even if the actual data is only a few characters long (?),
- the number of records is estimated, as in TRecordDataList.AllocateBlock:

  // Calculate the actual number of bytes for the block
  Size := (MaxRecords * DataSet.FRecBufSize) + (SizeOf(TRecordBlock) - SizeOf(TByteArray));

So when loading many records (> 150,000) with many fields (> 50), the result is huge memory consumption (> 9 GB).
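(As a rough sanity check, assuming those figures: 9 GB / 150,000 records is about 60 KB per record, or roughly 1.2 KB per field with 50 fields, which fits buffers being sized for the full declared field length, e.g. large VARCHAR2 columns, rather than for the actual content.)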

We cannot use the UniDirectional property because this TOracleDataSet is linked to a Grid / PivotGrid / Chart.
These components sometimes need to load the entire dataset (grouping, aggregation, etc.),
and if the user scrolls to the last record, the dataset is also fully loaded.

Do you have any suggestions to work around this problem?
Do you plan to change this behavior?

Thank you
 
You could use a TOracleQuery or a UniDirectional TOracleDataSet and write the data to a memory dataset that is more memory-efficient than the TOracleDataSet. The memory dataset can then be connected to the grid/chart components.
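
For example, a minimal sketch of the second variant, assuming a TClientDataSet as the memory dataset, the usual DOA unit names, and that the UniDirectional property can simply be set before Open. The CopyToMemoryDataSet procedure is only an illustration, not part of the library:

uses
  DB, DBClient, OracleData;

{ Copy all records from a unidirectional TOracleDataSet into a TClientDataSet.
  The source is fetched once, row by row, so no large record cache is built up. }
procedure CopyToMemoryDataSet(Source: TOracleDataSet; Dest: TClientDataSet);
var
  i: Integer;
begin
  Source.UniDirectional := True;   // fetch-only, do not keep previous records
  Source.Open;

  // Recreate the source structure in the memory dataset
  Dest.Close;
  Dest.FieldDefs.Assign(Source.FieldDefs);
  Dest.CreateDataSet;

  Dest.DisableControls;
  try
    while not Source.Eof do
    begin
      Dest.Append;
      for i := 0 to Source.FieldCount - 1 do
        Dest.Fields[i].Value := Source.Fields[i].Value;
      Dest.Post;
      Source.Next;
    end;
  finally
    Dest.EnableControls;
  end;

  Source.Close;
  Dest.First;   // Dest can now be connected to the grid/chart via a TDataSource
end;

The copy loop only relies on standard TDataSet methods, so a TFDMemTable or another in-memory table could be used the same way. Whether this actually reduces memory depends on how the chosen memory dataset stores string fields internally, so it is worth measuring with your real data.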
 