Hi there,
I have similar code to what you have written there, but the saving and loading are placed in two different procedures instead of one.
In the first procedure I have:
with qrySomething do
begin
  // export the BLOB column to a file
  TBlobField(FieldByName('FILE_DATA')).SaveToFile(filename);
end;
and in the other procedure I have:
with qrySomething do
begin
  Insert;
  // load the file into the BLOB column, then post the new record
  TBlobField(FieldByName('FILE_DATA')).LoadFromFile(filename);
  Post;
end;
---
The binary data column in the Oracle 7.3 table is also LONG RAW, and I have used TBlobField.BlobSize to verify the file size.
I have noticed that when the blob field loads the binary data from the file, the data size is actually correct; it is after the data is posted to the database and then retrieved that the binary data gets corrupted (i.e. characters 0x00 to 0x20 go missing).
We tried to duplicate the code with the BDE and found that exactly the same code with a BDE dataset can save the blob field to the database (after Post) without corruption.
Moreover, once the data has been saved to the database via the BDE, DOA can retrieve the binary data and SaveToFile it correctly.
So I believe it has something to do with posting to the database through DOA.
Have you tried opening the binary data and checking for any 0x00 to 0x20 characters? In a 1 or 2 MB file the retrieved data is usually around 1 to 2 KB smaller. It would be best to manually generate a binary file containing every byte value from 0x00 to 0x20 and test the code against that.
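A quick way to build such a test file (a minimal sketch; the procedure name and the TFileStream approach are my own suggestion, not from your code):

with qrySomething do // not needed here; plain Delphi follows

procedure WriteLowByteTestFile(const FileName: string; Repeats: Integer);
var
  FS: TFileStream;
  B: Byte;
  I: Integer;
begin
  FS := TFileStream.Create(FileName, fmCreate);
  try
    // write every byte value from $00 to $20, repeated, so any byte
    // that gets dropped on Post shows up as a size difference
    for I := 1 to Repeats do
      for B := $00 to $20 do
        FS.WriteBuffer(B, 1);
  finally
    FS.Free;
  end;
end;

Since $00..$20 is 33 distinct values, the file should be exactly Repeats * 33 bytes; after LoadFromFile/Post, re-fetch the record, SaveToFile, and compare the two sizes (or BlobSize) to see exactly how many bytes DOA dropped.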
Thanks.
- Howard Roy.