How to sync bulk writes when running Badger as a server without closing the DB?

Hi there,

I’m not sure if what I’m looking to do is even possible, but I was hoping to get some advice here. I’m writing an Echo server backed by Badger. The idea was to have just two endpoints: one for getting values from the DB, and another for performing bulk inserts of KV pairs in batches.
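
For context, here is roughly how the server is set up (heavily simplified; the paths, route names, and handler names are just placeholders, not my real code):

```go
package main

import (
	"log"

	badger "github.com/dgraph-io/badger/v3"
	"github.com/labstack/echo/v4"
)

// Global DB handle, opened once and kept open for the lifetime of the server.
var db *badger.DB

// getValue reads a single value for the key in the URL.
func getValue(c echo.Context) error {
	key := c.Param("key")
	var val []byte
	err := db.View(func(txn *badger.Txn) error {
		item, err := txn.Get([]byte(key))
		if err != nil {
			return err
		}
		val, err = item.ValueCopy(nil)
		return err
	})
	if err != nil {
		return echo.NewHTTPError(404, err.Error())
	}
	return c.Blob(200, "application/octet-stream", val)
}

func main() {
	var err error
	db, err = badger.Open(badger.DefaultOptions("/tmp/badger"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close() // only runs when the server shuts down

	e := echo.New()
	e.GET("/kv/:key", getValue)
	// The bulk insert route is registered here too; its handler is sketched below.
	e.Logger.Fatal(e.Start(":8080"))
}
```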

I’m running into issues with the bulk insert endpoint. The insert function uses WriteBatch, but because I’m using a global DB handle (for performance reasons) that isn’t closed until the server shuts down, the values don’t seem to get written to disk at all; everything stays in RAM until I stop the server. I can’t find anything in the documentation that suggests how to handle this, and it seems the LOCK file pretty much prevents any chance of the data being persisted until I call db.Close(). I have tried adding a custom Close and Open step at the start of each call to the bulk insert endpoint, but that doesn’t feel very safe or elegant. I think this user might have been describing a similar issue. Does anyone have experience with something similar?
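
For reference, this is roughly what the bulk insert handler looks like (again simplified; the request shape and JSON field names are placeholders, and it assumes the global `db` and imports from the setup snippet above, registered with something like `e.POST("/kv/bulk", bulkInsert)`):

```go
// Pair is a stand-in for whatever the real request body looks like.
type Pair struct {
	Key   string `json:"key"`
	Value string `json:"value"`
}

// bulkInsert accepts a JSON array of KV pairs and writes them with a WriteBatch.
func bulkInsert(c echo.Context) error {
	var pairs []Pair
	if err := c.Bind(&pairs); err != nil {
		return echo.NewHTTPError(400, err.Error())
	}

	wb := db.NewWriteBatch()
	defer wb.Cancel()

	for _, p := range pairs {
		if err := wb.Set([]byte(p.Key), []byte(p.Value)); err != nil {
			return echo.NewHTTPError(500, err.Error())
		}
	}

	// Flush commits the batch, but from what I can tell the data only ends up
	// on disk once the server shuts down and db.Close() runs.
	if err := wb.Flush(); err != nil {
		return echo.NewHTTPError(500, err.Error())
	}
	return c.NoContent(204)
}
```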

The most important endpoint for my use case is the one for fetching values from the DB; the bulk insert endpoint is mostly a convenience. The data doesn’t change very often, so it would be acceptable to do big writes to the DB offline every couple of weeks and simply replace the old DB with the new one. So if what I’m describing isn’t possible, that’s okay too.

Thanks!