I export data with `curl localhost:8080/admin/export` and load it with the bulk loader, but I found that the exported data fails to decompress and to bulk load.
The export completes, and the log is as follows:
I0414 20:50:53.194582 47261 log.go:34] Export Sent 1372402501 keys
I0414 20:50:53.197762 47261 export.go:581] Export DONE for group 1 at timestamp 111490001.
I0414 20:50:53.197784 47261 export.go:600] Export request: group_id:1 read_ts:111490001 unix_ts:1586860875 format:"rdf" OK.
But decompressing the exported *.gz file fails with the following error:
invalid compressed data--format violated
Bulk loading also fails, with the following error:
2020/04/14 16:16:37 flate: corrupt input before offset 3005180
github.com/dgraph-io/dgraph/x.Check
/tmp/go/src/github.com/dgraph-io/dgraph/x/error.go:42
github.com/dgraph-io/dgraph/dgraph/cmd/bulk.(*loader).mapStage.func2
/tmp/go/src/github.com/dgraph-io/dgraph/dgraph/cmd/bulk/loader.go:207
runtime.goexit
/usr/local/go/src/runtime/asm_amd64.s:1357
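To check whether the exported file itself is truncated or corrupted, independently of the bulk loader, I can stream it through Go's compress/gzip and see where decoding stops. This is just a minimal sketch; the file name `g01.rdf.gz` is a placeholder for the actual export file:

```go
// gzcheck.go: report how far a gzip stream can be decoded before it breaks.
package main

import (
	"compress/gzip"
	"fmt"
	"io"
	"log"
	"os"
)

func main() {
	// Placeholder path; replace with the real exported file.
	f, err := os.Open("g01.rdf.gz")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	zr, err := gzip.NewReader(f)
	if err != nil {
		log.Fatalf("not a valid gzip header: %v", err)
	}
	defer zr.Close()

	// Discard the decompressed stream while counting bytes, so any error
	// shows how much data was readable before the stream broke.
	n, err := io.Copy(io.Discard, zr)
	if err != nil {
		log.Fatalf("stream broke after %d decompressed bytes: %v", n, err)
	}
	fmt.Printf("gzip stream is intact, %d decompressed bytes\n", n)
}
```

Running this against the 10 GB export should show roughly how many bytes decompress cleanly before the corruption that gunzip and the bulk loader both hit.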
Why does this happen? When I test with a small amount of data, export and bulk load work fine, but they do not work for the large dataset here. The total gzip file size is 10 GB.
Looking forward to your help, thank you!