Crash with memtable sizes over 64 << 24

What version of Go are you using (go version)?

$ go version
go version go1.16.6 darwin/amd64

What operating system are you using?

macOS Big Sur 11.6

What version of Badger are you using?

v3.2103.1

Does this issue reproduce with the latest master?

Yes

Steps to Reproduce the issue

Open a persistent database with a memtable size larger than 64 << 24.

What Badger options were set?

badger.DefaultOptions(cfg.Path).
		WithLoggingLevel(badger.WARNING).
		WithMemTableSize(64 << 25)
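
For completeness, here is the snippet above embedded in a minimal runnable program (a sketch: /tmp/badger-repro is a placeholder path, and the import path assumes the v3 module matching the version reported above):

package main

import (
	"log"

	badger "github.com/dgraph-io/badger/v3"
)

func main() {
	opts := badger.DefaultOptions("/tmp/badger-repro").
		WithLoggingLevel(badger.WARNING).
		WithMemTableSize(64 << 25) // 2 GiB fails; 64 << 24 (1 GiB) works

	// With the larger memtable size, opening the DB fails
	// (see the error reported below).
	db, err := badger.Open(opts)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
}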

What did you do?

When creating a database with a large enough memtable size (more than 64 << 24), Badger crashes. I believe this is caused by an integer overflow on this line: https://github.com/dgraph-io/badger/blob/master/memtable.go#L569.

What did you expect to see?

The database opening and working normally.

What did you see instead?

ERROR: Received err: File /db/data/00001.mem already exists. Cleaning up...

Additional questions

Are these large memtable sizes supported in general?

It’s crashing because of the line linked above (memtable.go#L569), which multiplies the memtable size by 2.

I don’t know why it uses 2 * memtable size. The memtable size has a hard limit of maxUint32 (around 4 GB; see the skl.go file). With the 2 * memtableSize code, the effective maximum memtable size is therefore just under 2 GB.

(64 << 25) * 2 = 2^32, which is greater than maxUint32, so it will crash Badger.
(64 << 24) * 2 = 2^31, which is less than maxUint32, so 64 << 24 seems to be the current effective limit for the memtable size.
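
To make the arithmetic concrete, here is a self-contained sketch of the overflow (the uint32 conversion illustrates the maxUint32 limit described above; it is not Badger’s exact code):

package main

import (
	"fmt"
	"math"
)

func main() {
	works := int64(64 << 24)   // 1 GiB memtable size
	crashes := int64(64 << 25) // 2 GiB memtable size

	fmt.Println(2*works <= math.MaxUint32)   // true:  2 GiB fits in a uint32
	fmt.Println(2*crashes <= math.MaxUint32) // false: 4 GiB does not

	// Truncating the doubled size to uint32 wraps around to 0.
	fmt.Println(uint32(2 * crashes)) // 0
}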