I'm on an IoT project where I need data persistence. I started using BadgerDB and it's working fine. The one problem I'm facing is that the data files on disk keep growing.
Since this is for data persistence, I have to write to disk on every new value of a tag, so I have option.SyncWrites set to true.
I'm testing with only a few tags, but the file size keeps growing. I'm only interested in the latest value of each tag, with no need for any history, so I set option.NumVersionsToKeep = 1.
I was expecting each write to simply update the existing [key, value] pair, which should not increase the file size when the number of [key, value] pairs stays the same.
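To make the setup concrete, this is roughly what I have (the data directory, the tag name, and the badger/v4 import are placeholders; the options are the ones mentioned above):

```go
package main

import (
	"log"

	badger "github.com/dgraph-io/badger/v4" // assuming the v4 module; v2/v3 look the same
)

func main() {
	// Open with the options described above: sync every write to disk,
	// and keep only the latest version of each key.
	opts := badger.DefaultOptions("/tmp/tags") // placeholder data directory
	opts.SyncWrites = true
	opts.NumVersionsToKeep = 1

	db, err := badger.Open(opts)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// On every new reading, overwrite the tag's key with its latest value.
	err = db.Update(func(txn *badger.Txn) error {
		return txn.Set([]byte("tag/temperature"), []byte("23.5")) // placeholder tag and value
	})
	if err != nil {
		log.Fatal(err)
	}
}
```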
The value log is append-only, so new values are appended rather than overwritten in place. You can configure the value log file size, and you'll also want to run value log GC periodically so that stale entries are reclaimed.
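A minimal sketch of both points, assuming the builder-style options and a background goroutine for GC (the 16 MB file size, 5-minute interval, and 0.5 discard ratio are example values, not recommendations):

```go
package main

import (
	"log"
	"time"

	badger "github.com/dgraph-io/badger/v4" // assuming the v4 module
)

func main() {
	// Smaller value log files become candidates for GC sooner (the default size is ~1 GB).
	opts := badger.DefaultOptions("/tmp/tags"). // placeholder data directory
		WithSyncWrites(true).
		WithNumVersionsToKeep(1).
		WithValueLogFileSize(16 << 20) // 16 MB per .vlog file (example value)

	db, err := badger.Open(opts)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Value log GC is not run automatically; the application has to call it.
	// RunValueLogGC rewrites at most one log file per successful call and
	// returns badger.ErrNoRewrite when nothing was reclaimed, so it is
	// usually called in a loop.
	go func() {
		ticker := time.NewTicker(5 * time.Minute) // example interval
		defer ticker.Stop()
		for range ticker.C {
			for db.RunValueLogGC(0.5) == nil { // 0.5 = example discard ratio
			}
		}
	}()

	// ... normal reads and writes happen here ...
	select {} // placeholder to keep the sketch alive
}
```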
I changed the value log file size. Now BadgerDB keeps creating more and more files. I'm running the GC every 10 minutes and it cannot keep up with the creation of data files: some files do get deleted, but fewer than are being created.
I'm still only interested in the latest value for each key; a very small file would be enough for my application.
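For anyone who wants to watch the same trend, a throwaway count of Badger's on-disk files is enough: Badger names its LSM table files *.sst and its value log segments *.vlog (the directory path below is a placeholder):

```go
package main

import (
	"fmt"
	"log"
	"path/filepath"
)

func main() {
	dir := "/tmp/tags" // placeholder: the Badger data directory

	// LSM table files
	ssts, err := filepath.Glob(filepath.Join(dir, "*.sst"))
	if err != nil {
		log.Fatal(err)
	}

	// value log segments
	vlogs, err := filepath.Glob(filepath.Join(dir, "*.vlog"))
	if err != nil {
		log.Fatal(err)
	}

	fmt.Printf("%d .sst files, %d .vlog files\n", len(ssts), len(vlogs))
}
```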