FieldSize vs CharSize for VARCHAR2(100 CHAR)

Ralf

We are converting our database from a single-byte character set to a multibyte UTF database.

For this reason the definition of VARCHAR2 string columns must change. To hold, for example, 100 characters, the definition changes from:
VARCHAR2(100) to VARCHAR2(100 CHAR).

Without the CHAR qualifier the column can hold only 100 bytes, not 100 multibyte characters.
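To see why this matters, note that in a UTF-8 character set such as AL32UTF8 one character can occupy up to 4 bytes. A quick sketch (Python, used here only to demonstrate UTF-8 byte lengths, not DOA itself):

```python
ascii_name = "A" * 100    # 100 one-byte characters
umlaut_name = "ä" * 100   # 100 two-byte characters
emoji_name = "😀" * 100    # 100 four-byte characters (worst case)

for s in (ascii_name, umlaut_name, emoji_name):
    print(len(s), "chars ->", len(s.encode("utf-8")), "bytes")
# 100 chars -> 100 bytes
# 100 chars -> 200 bytes
# 100 chars -> 400 bytes
```

So a plain VARCHAR2(100) would overflow as soon as the stored text contains enough multibyte characters, while VARCHAR2(100 CHAR) always accepts 100 characters.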

In our OracleSession we set (for some historic reason) a MaxStringFieldSize of 255. But this setting no longer matches the data size of a VARCHAR2(100 CHAR): when checking against MaxStringFieldSize, the byte size is used, not the character size!

Is this a bug??
 
In our example:

Field x was VARCHAR2(100)
and is now VARCHAR2(100 CHAR).

The field size was 100 and is now 400 (4 bytes * 100 characters).
 
I changed the method CreateFieldDef for strings in the following way:

procedure TOracleDataSet.CreateFieldDef(FieldInfo: TOracleFieldInfo);
...
if (msfSize > 0) and (FieldSize > msfSize)

In the code above I replaced FieldSize with:

Query.FieldCharSize(FieldInfo.FieldIndex)

Is that a correct solution?
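A minimal model of the size check (in Python, with hypothetical names mirroring the Delphi ones; this is a sketch of the logic, not the actual DOA code) shows why comparing the byte size against MaxStringFieldSize misfires for VARCHAR2(100 CHAR), and what the proposed change does:

```python
MAX_BYTES_PER_CHAR = 4  # worst case per character in AL32UTF8

def exceeds_limit(declared_chars: int, max_string_field_size: int,
                  use_char_size: bool) -> bool:
    """Model of the check in CreateFieldDef (hypothetical names).

    use_char_size=False mirrors the current behaviour (byte size compared);
    use_char_size=True mirrors the proposed FieldCharSize-based fix.
    """
    byte_size = declared_chars * MAX_BYTES_PER_CHAR
    size = declared_chars if use_char_size else byte_size
    return max_string_field_size > 0 and size > max_string_field_size

# VARCHAR2(100 CHAR) checked against MaxStringFieldSize = 255:
print(exceeds_limit(100, 255, use_char_size=False))  # True: 400 bytes > 255, field wrongly capped
print(exceeds_limit(100, 255, use_char_size=True))   # False: 100 chars fit within 255
```

Under these assumptions, comparing the character size restores the pre-conversion behaviour for this column; whether the rest of CreateFieldDef (buffer allocation, FieldDef size) also needs the byte size is a separate question.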
 