Load data to Dgraph: out of memory

Hi, when I use "$ dgraph bulk -f ownthink.rdf -s ownthink.schema --map_shards=8 --reduce_shards=2 --http localhost:8000 --zero=localhost:5080" to load the data, I get the following error, even though my dataset is only 43 MB:

fatal error: runtime: out of memory



How do I solve it?

Hey, we have made some changes to the bulk loader recently. Can you please try this on recent master and confirm whether it still goes out of memory?
If yes, can you please share the dataset that reproduces this?
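
In the meantime, lowering the loader's concurrency usually reduces peak memory. Here is a sketch of a lower-memory invocation; the `--num_go_routines` and `--mapoutput_mb` flags are assumptions based on recent releases, so please verify against `dgraph bulk --help` for your version:

```
# Fewer shards, fewer goroutines, and a smaller map output buffer
# trade speed for a lower peak memory footprint.
dgraph bulk -f ownthink.rdf -s ownthink.schema \
  --map_shards=1 --reduce_shards=1 \
  --num_go_routines=1 --mapoutput_mb=32 \
  --http localhost:8000 --zero=localhost:5080
```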

My dataset is extracted from the general-purpose ownthink dataset, available here:
https://github.com/ownthink/KnowledgeGraphData
My dataset has 61,235 vertices and 831,069 edges. It looks like this:

Hello @qqqq,

I tried to reproduce the issue using bulk loading with a similar dataset generated using a Python script. I was not able to reproduce it on v1.1.0 or on master.

Could you try using the bulk loader from the master branch to see if the issue persists? Also, can you please share the machine configuration used for the task?
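
For reference, here is a minimal sketch of the kind of generator script used, matching the shape of your dataset; the predicate names and literal values are made up for illustration, this is not the actual script:

```python
import random

# Hypothetical generator: writes N-Quad triples shaped like the
# reported dataset (~61k blank-node subjects, ~830k edges).
NUM_NODES = 61235
NUM_EDGES = 831069
PREDICATES = ["name", "link", "desc"]  # made-up predicate names

with open("synthetic.rdf", "w") as out:
    for i in range(NUM_EDGES):
        subj = f"_:node{random.randrange(NUM_NODES)}"
        pred = random.choice(PREDICATES)
        if pred == "link":
            # Entity-to-entity edge: object is another blank node.
            obj = f"_:node{random.randrange(NUM_NODES)}"
        else:
            # Attribute edge: object is a quoted literal.
            obj = f'"value {i}"'
        out.write(f"{subj} <{pred}> {obj} .\n")
```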

Hello! I also downloaded the ownthink data. May I ask how you defined the schema? How do you distinguish whether a value in the CSV is an attribute or an entity?

@purist180 Anything enclosed in double quotes represents an attribute; a structure like _:XXX represents a blank node, i.e. an entity.
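
For example (hypothetical triples, not taken from the actual dataset):

```
_:liubei <name> "刘备" .           # quoted object: a literal attribute
_:liubei <friend_of> _:guanyu .    # _:XXX object: another entity (blank node)
```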

Thanks! What I actually wanted to ask is how to convert the ownthink data into this Dgraph format.


The raw CSV provided by this dataset looks like this. How do you tell whether the object is a literal value or an entity?
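
A minimal conversion sketch, assuming the CSV columns are entity, attribute, value, and using the heuristic that a value which also appears as an entity becomes an edge (the file names and the heuristic are assumptions, not a confirmed pipeline):

```python
import csv

def node_id(name):
    # Blank-node labels must be plain identifiers; hex-encode the name.
    return "_:" + name.encode("utf-8").hex()

# ownthink-style CSV: each row is entity, attribute, value.
with open("ownthink.csv", newline="", encoding="utf-8") as f:
    rows = [row for row in csv.reader(f) if len(row) == 3]

entities = {row[0] for row in rows}

with open("ownthink.rdf", "w", encoding="utf-8") as out:
    for entity, attr, value in rows:
        subj = node_id(entity)
        if value in entities:
            # Object is itself an entity: emit an edge between blank nodes.
            out.write(f"{subj} <{attr}> {node_id(value)} .\n")
        else:
            # Object is a plain value: emit a quoted literal attribute.
            escaped = value.replace("\\", "\\\\").replace('"', '\\"')
            out.write(f'{subj} <{attr}> "{escaped}" .\n')
```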
