Ritsu
October 13, 2021, 12:49pm
What version of Go are you using (go version)?
$ go version
go version go1.16.6 darwin/amd64
What operating system are you using?
macOS Big Sur 11.6
What version of Badger are you using?
v3.2103.1
Does this issue reproduce with the latest master?
Yes
Steps to Reproduce the issue
Create a persistent database with a large memtable size.
What Badger options were set?
badger.DefaultOptions(cfg.Path).
	WithLoggingLevel(badger.WARNING).
	WithMemTableSize(64 << 25)
What did you do?
When creating a database with a large enough memtable size (more than 64 << 24), badger crashes. I believe this is related to an integer overflow on this line: https://github.com/dgraph-io/badger/blob/master/memtable.go#L569 .
What did you expect to see?
Program working.
What did you see instead?
ERROR: Received err: File /db/data/00001.mem already exists. Cleaning up...
Additional questions
Are these large memtable sizes supported in general?
It’s crashing because of the 2*db.opt.MemTableSize in the following code:
	return mt, z.NewFile
}
mt.wal = &logFile{
	fid:      uint32(fid),
	path:     filepath,
	registry: db.registry,
	writeAt:  vlogHeaderSize,
	opt:      db.opt,
}
lerr := mt.wal.open(filepath, flags, 2*db.opt.MemTableSize)
if lerr != z.NewFile && lerr != nil {
	return nil, y.Wrapf(lerr, "While opening memtable: %s", filepath)
}
// Have a callback set to delete WAL when skiplist reference count goes down to zero. That is,
// when it gets flushed to L0.
s.OnClose = func() {
	if err := mt.wal.Delete(); err != nil {
		db.opt.Errorf("while deleting file: %s, err: %v", filepath, err)
	}
}
I don’t know why it’s doubling the memtable size here.
Memtable size has a hard limit of maxUint32 (around 4 GB; see the skl.go file). With the 2*MemTableSize factor above, the effective maximum memtable size is just under 2 GB:
(64 << 25)*2 is greater than maxUint32 and will crash badger.
(64 << 24)*2 is less than maxUint32, which seems to be the current limit for memtable size.