Bulk Writes in Badger

I’m testing the latest version of badger and ran into ErrTxnTooBig when trying to write values in bulk. The test code seems to commit and create new transactions when the error occurs. I’m not sure I understand the design — could you provide example code? Thanks!

When you get this error, you should commit the current txn and start a new one. ErrTxnTooBig means your transaction has grown too large, and the mutation that triggered the error was not added to it — so you need to retry that mutation in the new transaction. See the godocs.

Ok, that’s what I thought. I don’t see any details in the godocs though. It might be a good idea to add an example. Thanks.

Yeah, good point. @deepak – let’s add an example.

Would it make sense to add high-level methods to simplify the common cases? For example:

func (db *DB) Update(kv KV) error
func (db *DB) UpdateBulk(kvs []KV) error

type KV interface {
    Key() []byte
    Value() []byte
}

This hides all the details from the end user.

All updates go through transactions, and a single transaction can’t grow beyond what fits into a single memtable.

I think the fix is simple enough that it doesn’t warrant another function. Update wouldn’t be the right method here; it’s better to use the transaction directly, so you can commit as soon as you get this error and create a new one.

txn := db.NewTransaction(true)
for _, e := range updates {
  if err := txn.Set(e.Key, e.Value); err == badger.ErrTxnTooBig {
    // Commit what we have, then retry the failed Set in a fresh txn.
    _ = txn.Commit()
    txn = db.NewTransaction(true)
    _ = txn.Set(e.Key, e.Value)
  }
}
_ = txn.Commit()
