Alpha crashes in 20.0.7 when loading 130K+ records

Hi Team, when I upload a large dataset of around 130K+ records, Dgraph's Alpha node crashes with the error below, and whenever I restart it, it gives the same error and won't start.
Here is the error:

I0316 12:05:09.015316    3505 server_state.go:75] Setting Badger Compression Level: 3
I0316 12:05:09.015322    3505 server_state.go:84] Setting Badger table load option: mmap
I0316 12:05:09.015327    3505 server_state.go:96] Setting Badger value log load option: mmap
I0316 12:05:09.015333    3505 server_state.go:164] Opening postings BadgerDB with options: {Dir:/dgraph_data/p ValueDir:/dgraph_data/p SyncWrites:false TableLoadingMode:2 ValueLogLoadingMode:2 NumVersionsToKeep:2147483647 ReadOnly:false Truncate:true Logger:0x28325d0 Compression:2 InMemory:false MaxTableSize:67108864 LevelSizeMultiplier:10 MaxLevels:7 ValueThreshold:1024 NumMemtables:5 BlockSize:4096 BloomFalsePositive:0.01 KeepL0InMemory:true MaxCacheSize:1073741824 MaxBfCacheSize:0 LoadBloomsOnOpen:false NumLevelZeroTables:5 NumLevelZeroTablesStall:10 LevelOneSize:268435456 ValueLogFileSize:1073741823 ValueLogMaxEntries:1000000 NumCompactors:2 CompactL0OnClose:true LogRotatesToFlush:2 ZSTDCompressionLevel:3 VerifyValueChecksum:false EncryptionKey:[] EncryptionKeyRotationDuration:240h0m0s BypassLockGuard:false ChecksumVerificationMode:0 KeepBlockIndicesInCache:true KeepBlocksInCache:true managedTxns:false maxBatchCount:0 maxBatchSize:0}
I0316 12:05:09.171358    3505 log.go:34] All 19 tables opened in 139ms
[Sentry] 2021/03/16 12:05:09 ModuleIntegration wasn't able to extract modules: module integration failed
[Sentry] 2021/03/16 12:05:09 Sending fatal event [79ba8971b4c54c358d1a784eaef02c4b] to o318308.ingest.sentry.io project: 1805390
2021/03/16 12:05:09 Inter: Biggest(j-1)
00000000  00 00 28 65 71 75 69 70  6d 65 6e 74 2e 73 65 72  |..(equipment.ser|
00000010  76 65 72 2e 73 65 72 76  65 72 5f 70 72 6f 63 65  |ver.server_proce|
00000020  73 73 6f 72 73 4e 75 6d  62 65 72 00 00 00 00 00  |ssorsNumber.....|
00000030  00 03 27 1a ff ff ff ff  ff fc 7f cb              |..'.........|

 vs Smallest(j):
00000000  00 00 25 65 71 75 69 70  6d 65 6e 74 2e 73 65 72  |..%equipment.ser|
00000010  76 65 72 2e 73 65 72 76  65 72 5f 68 6f 73 74 69  |ver.server_hosti|
00000020  6e 67 45 6e 74 69 74 79  00 00 00 00 00 00 03 5f  |ngEntity......._|
00000030  57 ff ff ff ff ff fc 7f  cb                       |W........|

: level=1 j=12 numTables=14
github.com/dgraph-io/badger/v2.(*levelHandler).validate
        /go/pkg/mod/github.com/dgraph-io/badger/v2@v2.0.1-rc1.0.20200528205344-e7b6e76f96e8/util.go:55
github.com/dgraph-io/badger/v2.(*levelsController).validate
        /go/pkg/mod/github.com/dgraph-io/badger/v2@v2.0.1-rc1.0.20200528205344-e7b6e76f96e8/util.go:33
github.com/dgraph-io/badger/v2.newLevelsController
        /go/pkg/mod/github.com/dgraph-io/badger/v2@v2.0.1-rc1.0.20200528205344-e7b6e76f96e8/levels.go:191
github.com/dgraph-io/badger/v2.Open
        /go/pkg/mod/github.com/dgraph-io/badger/v2@v2.0.1-rc1.0.20200528205344-e7b6e76f96e8/db.go:348
github.com/dgraph-io/badger/v2.OpenManaged
 
Error while creating badger KV posting store
github.com/dgraph-io/dgraph/x.Checkf
        /ext-go/1/src/github.com/dgraph-io/dgraph/x/error.go:51
github.com/dgraph-io/dgraph/worker.(*ServerState).initStorage
        /ext-go/1/src/github.com/dgraph-io/dgraph/worker/server_state.go:168

It looks like your Badger store may have been corrupted. 130K records should be quite fast to import - can you delete your p directory and try again?
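
In case it helps, here is a minimal sketch of that cleanup in Go, assuming the postings directory is /dgraph_data/p as shown in the Dir/ValueDir values of your log (adjust if your Alpha's --postings flag points elsewhere). Stop the Alpha process first, remove the directory, then restart Alpha and re-run the import:

```go
// Minimal sketch: remove the corrupted postings ("p") directory so Alpha can
// rebuild it on the next start. Stop the Alpha process before running this.
package main

import (
	"log"
	"os"
)

func main() {
	// Path taken from the log above; change it if your setup differs.
	postingsDir := "/dgraph_data/p"

	if err := os.RemoveAll(postingsDir); err != nil {
		log.Fatalf("failed to remove %s: %v", postingsDir, err)
	}
	log.Printf("removed %s; restart dgraph alpha and re-run the import", postingsDir)
}
```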