Hi, when I use "$ dgraph bulk -f ownthink.rdf -s ownthink.schema --map_shards=8 --reduce_shards=2 --http localhost:8000 --zero=localhost:5080" to load the data, and my data is only 43 MB, I get the following error:
fatal error: runtime: out of memory
Hey, we have made some changes to the bulk loader recently. Can you please try this on a recent master build and confirm whether it still goes out of memory?
If it does, can you please share a dataset that reproduces this?
My dataset is extracted from the public ownthink dataset.
Here is the ownthink dataset: https://github.com/ownthink/KnowledgeGraphData
My dataset has 61,235 vertices and 831,069 edges.
It looks like this:
Could you try using the bulk loader from the master branch to see if the issue persists? Also, could you please share the machine configuration you used for the task?
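In the meantime, it may also help to reduce the bulk loader's memory footprint by lowering its concurrency and buffer sizes. A hedged sketch of the command (the `--num_go_routines` and `--mapoutput_mb` flags exist in recent Dgraph releases, but please verify their names and defaults with `dgraph bulk --help` for your version):

```shell
# Same load as before, but with fewer map goroutines and a smaller
# map-output buffer, trading speed for lower peak memory usage.
# Flag names are assumptions for your Dgraph version; check
# `dgraph bulk --help` before running.
dgraph bulk -f ownthink.rdf -s ownthink.schema \
  --map_shards=8 --reduce_shards=2 \
  --http localhost:8000 --zero=localhost:5080 \
  --num_go_routines=2 \
  --mapoutput_mb=32
```

If the out-of-memory error goes away with these settings but returns at the defaults, that would also be useful information for narrowing down where the memory is being spent.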