Array DML: Type otString has high memory usage

tbrh66

Member
Hello to all!

I am using Array DML to insert large CSV files into my database.
My test file has 50,000 lines.

After creating the variant arrays for each column, the memory usage of the whole program is around 100 MB. Then I declare and set the variables in the TOracleQuery. With the types otFloat and otDate, each variable adds around 1 MB to the used memory. However, each variable of type otString raises the memory usage by around 200 MB.

I don't understand why variables of type otString need so much memory. This is a serious problem because my production files are much larger than 50,000 lines and I am running out of memory.

Thank you for your help.
Tobias
 
The variables in the query are set like this (each column of the CSV file gets one variable):

Code:
with OracleQuery do
begin
  SQL.Text := MyQuery;
  DeleteVariables;

  for Column in Columns do
  begin
    with Column do
    begin
      // Float
      if DataType = ftFloat then
        DeclareAndSet(FieldName, otFloat, MyVariantArray)
      // Date
      else if DataType = ftDateTime then
        DeclareAndSet(FieldName, otDate, MyVariantArray)
      // String
      else
        DeclareAndSet(FieldName, otString, MyVariantArray);

      VarClear(MyVariantArray);
    end;
  end;
  Execute;
end;

I monitor the memory usage while debugging, and as I said: the otFloat and otDate variables are fine; only the otString variables use around 200 MB of memory each (the whole CSV file is only 7 MB).

The variant arrays I set as variables are created like this:
(After creation they get locked. Then I fill an auxiliary array of Variant (called DataPreparationArray) with one column of the CSV file, move this array to the appropriate locked variant array and unlock the variant array.)

Code:
type
  TArrayOfVariant = array of variant;
  PArrayOfVariant = ^TArrayOfVariant;
var
  MyVariantArray: Variant;
  PLockedVariantArray: PArrayOfVariant;
  DataPreparationArray: TArrayOfVariant;
  idx: Integer;

MyVariantArray := VarArrayCreate([0, RecordCount - 1], varVariant);
// VarArrayLock returns a pointer to the raw element data of the array.
PLockedVariantArray := VarArrayLock(MyVariantArray);

// The auxiliary array must be sized before it is indexed.
SetLength(DataPreparationArray, RecordCount);
idx := 0;
while not Csv.Eof do
begin
  DataPreparationArray[idx] := Csv.Value;
  Csv.Next;
  Inc(idx);
end;

// Move copies the Variant records bitwise and does not update reference
// counts, so source and destination now share the same string data.
Move(DataPreparationArray[0], PLockedVariantArray^,
  Length(DataPreparationArray) * SizeOf(Variant));
// Zero the source so its finalization does not release the strings that
// the locked array now owns.
FillChar(DataPreparationArray[0],
  Length(DataPreparationArray) * SizeOf(Variant), 0);
VarArrayUnlock(MyVariantArray);
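(A note on the Move: because it copies the Variant records bitwise, an element-by-element assignment through the locked data pointer is a safer alternative; normal Variant assignment then maintains the reference counts and no zeroing of the source is needed. A minimal sketch, assuming the same declarations as above:)

Code:
var
  i: Integer;
  PElement: PVariant;

// VarArrayLock returns a pointer to the first element of the array data.
PElement := VarArrayLock(MyVariantArray);
for i := 0 to High(DataPreparationArray) do
begin
  PElement^ := DataPreparationArray[i]; // proper Variant assignment
  Inc(PElement);                        // advances by SizeOf(Variant)
end;
VarArrayUnlock(MyVariantArray);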

The CSV file looks like this:

Code:
506497	19510120	152078641	26	814	20151218	5600000	C	CAC	-99.99	-99.99	14.93	152	2783	0	0.1631634	0.05378711	0.0001569419	460.4151	-33.54516	0	E	0
506497	19510120	152078643	26	814	20151218	6400000	C	CAC	-99.99	-99.99	2.35	0	1100	0	0.1721192	0.009963988	0.0000362581	112.2076	-9.18166	0	E	0
506497	19510120	152078644	26	814	20151218	6400000	P	CAC	-99.99	-99.99	2083.71	0	1100	0	0.1916868	-0.9501087	0.0000567821	195.7003	-152.3044	0	E	0

Thank you very much for your help.
Kind regards
Tobias
 
It may be a good idea to insert the data with arrays of limited size, for example up to 1000 records. This way you have a limited, fixed memory overhead, independent of the file size, and you can still achieve maximum performance.
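A minimal sketch of such a chunked load, assuming a hypothetical FillColumnArrays helper that fills one variant array per column with at most ChunkSize rows and returns the number of rows it actually read (ColumnArrays, Column.OraType and Column.Index are likewise stand-ins; the DOA calls are the same ones used above):

Code:
const
  ChunkSize = 1000; // upper bound for one Array DML round trip
var
  RowsRead: Integer;

while not Csv.Eof do
begin
  // Hypothetical helper: fills ColumnArrays with up to ChunkSize rows
  // from the CSV file and returns the actual number of rows read.
  RowsRead := FillColumnArrays(Csv, ChunkSize, ColumnArrays);
  if RowsRead = 0 then
    Break;

  OracleQuery.DeleteVariables;
  for Column in Columns do
    OracleQuery.DeclareAndSet(Column.FieldName, Column.OraType,
      ColumnArrays[Column.Index]);

  OracleQuery.Execute; // one round trip per chunk
end;

The peak memory usage is then roughly ChunkSize times the per-row buffer size, independent of how many lines the file has.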
 
Okay, thank you. That's what I am actually doing now. I was just wondering why string variables need so much memory while other variables do not.
 