[odb-users] Streaming BLOBs

Boris Kolpackov boris at codesynthesis.com
Fri Jun 8 10:08:10 EDT 2012


Hi Phil,

philly.dilly at gmail.com <philly.dilly at gmail.com> writes:

> The use case I had in mind was for backing up user provided documents of  
> unknown size which could get as big as several hundred megabytes.

So I guess you will have something like an std::fstream as a member
in your persistent class? That could work, provided you open the stream
for reading before calling persist() or update() and open it for
writing before calling load().
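
Roughly, the usage pattern I have in mind would look something like
this (completely untested; the document class, member names, and the
member-level BLOB pragma are just for illustration, and it assumes the
custom fstream-to-BLOB mapping discussed below):

#include <string>
#include <fstream>

#include <odb/core.hxx>
#include <odb/database.hxx>
#include <odb/transaction.hxx>

#pragma db object
class document
{
public:
  #pragma db id auto
  unsigned long id_;

  std::string name_;

  #pragma db type("BLOB")
  std::fstream content_;   // Relies on a custom fstream-to-BLOB mapping.
};

void
backup (odb::database& db, document& d, const std::string& path)
{
  // Open for reading before persist()/update(): ODB reads the file
  // contents when it builds the BLOB image.
  //
  d.content_.open (path.c_str (), std::ios::in | std::ios::binary);

  odb::transaction t (db.begin ());
  db.persist (d);
  t.commit ();
}

void
restore (odb::database& db, unsigned long id, const std::string& path)
{
  // Open for writing before load(): ODB writes the BLOB contents to
  // the file when it sets the member value.
  //
  document d;
  d.content_.open (path.c_str (), std::ios::out | std::ios::binary);

  odb::transaction t (db.begin ());
  db.load (id, d);
  t.commit ();
}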

You can add support for the fstream-to-BLOB mapping in exactly the same
way as for std::wstring, which we discussed in another thread (for
those reading just this thread, see the 'mapping' example for details).
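
For reference, here is a rough, untested sketch of what such a
specialization could look like for SQLite, modeled on traits.hxx from
the 'mapping' example (the const_cast and the seek/read logic are my
assumptions about how one might shuttle the stream contents through
the image buffer):

#include <fstream>
#include <cstddef>   // std::size_t

#include <odb/sqlite/traits.hxx>

namespace odb
{
  namespace sqlite
  {
    template <>
    class value_traits<std::fstream, id_blob>
    {
    public:
      typedef std::fstream value_type;
      typedef std::fstream query_type;
      typedef details::buffer image_type;

      // Loading: write the BLOB image into the stream, which the
      // application has opened for writing beforehand.
      //
      static void
      set_value (std::fstream& v,
                 const details::buffer& b,
                 std::size_t n,
                 bool is_null)
      {
        if (!is_null)
          v.write (b.data (), static_cast<std::streamsize> (n));
      }

      // Persisting/updating: read the stream, which the application
      // has opened for reading beforehand, into the BLOB image.
      //
      static void
      set_image (details::buffer& b,
                 std::size_t& n,
                 bool& is_null,
                 const std::fstream& v)
      {
        is_null = false;

        // Reading from an fstream requires non-const access.
        //
        std::fstream& s (const_cast<std::fstream&> (v));

        s.seekg (0, std::ios::end);
        n = static_cast<std::size_t> (s.tellg ());
        s.seekg (0, std::ios::beg);

        if (n > b.capacity ())
          b.capacity (n);

        s.read (b.data (), static_cast<std::streamsize> (n));
      }
    };
  }
}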

Maybe we will add built-in support for this in the future, but at the
moment this feels a bit too "new" to me (especially the fact that one
needs to open the file before performing database operations). Perhaps
you can give us some feedback on how this works out for you. 


> I know this is an edge case scenario since a database (especially a
> lightweight one like SQLite) isn't really the place to save such
> big documents!

The SQLite BLOB streaming mechanism (sqlite3_blob_open(), etc.) actually
seems like it was made for this kind of use case. You can still store
such BLOBs in SQLite even with the current mechanism used by ODB (i.e.,
copy the whole thing into a memory buffer and pass it to SQLite). But
I think it is a good idea to support streaming as an alternative. Added
to my TODO.
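
For those not familiar with it, plain SQLite incremental BLOB I/O
(outside of ODB) looks roughly like this; the table and column names
are made up for illustration:

#include <cstdio>
#include <sqlite3.h>

int
copy_blob_to_file (sqlite3* db, sqlite3_int64 rowid, const char* path)
{
  sqlite3_blob* blob (0);

  // Open the "content" column of the given row in table "document"
  // for read-only incremental access.
  //
  int r (sqlite3_blob_open (db, "main", "document", "content",
                            rowid, 0 /* read-only */, &blob));
  if (r != SQLITE_OK)
    return r;

  std::FILE* f (std::fopen (path, "wb"));
  if (f == 0)
  {
    sqlite3_blob_close (blob);
    return SQLITE_ERROR;
  }

  int size (sqlite3_blob_bytes (blob));
  char buf[64 * 1024];

  // Copy the BLOB in chunks instead of loading it into memory at once.
  //
  for (int off (0); off < size && r == SQLITE_OK; )
  {
    int n (size - off < static_cast<int> (sizeof (buf))
           ? size - off
           : static_cast<int> (sizeof (buf)));

    r = sqlite3_blob_read (blob, buf, n, off);

    if (r == SQLITE_OK)
    {
      std::fwrite (buf, 1, static_cast<std::size_t> (n), f);
      off += n;
    }
  }

  std::fclose (f);
  sqlite3_blob_close (blob);
  return r;
}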

Boris


