Today I tried to bulk load data using Docker, but I failed.
Does anybody know the right instructions?
Thanks!
Try this.
Hello~
First of all, thank you for answering my question~
I tried this, but maybe I don't really understand how it's used, so I have a few questions.
In my understanding, the bulk loader imports data in batches into the Dgraph cluster via a command, on the premise that the Dgraph Zero service is already running.
But that doesn't seem to be the case.
Also, in this method, where does my data source come in?
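For context, this is roughly what I expected to run, based on my reading of the docs (the flags, ports, and paths below are my guesses, not something I know to be right):

```sh
# Start Zero first (it coordinates the cluster), then point the bulk
# loader at it; -f is the RDF data source, -s the schema file.
# Paths and ports are guesses; adjust to your setup.
dgraph zero --my=localhost:5080 &

dgraph bulk \
  -f /data/mydata.rdf.gz \
  -s /data/mydata.schema \
  --zero=localhost:5080
```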
@liuhx this page has good instructions: https://tour.dgraph.io/moredata/1/
Note that it only works if you are running Dgraph using these commands: https://tour.dgraph.io/intro/2/
The important part is that your ~/dgraph folder should be mounted in the Dgraph container; otherwise you'd need to adjust the steps to match your setup.
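A sketch of the idea (check the tour page for the exact command and ports; the file names here are placeholders):

```sh
# -v maps ~/dgraph on the host to /dgraph inside the container, so
# files you drop into ~/dgraph are visible to the bulk loader.
docker run -it -p 5080:5080 -p 8080:8080 -p 9080:9080 \
  -v ~/dgraph:/dgraph --name dgraph dgraph/dgraph dgraph zero

# Run the bulk loader inside that container against files under /dgraph
docker exec -it dgraph \
  dgraph bulk -f /dgraph/mydata.rdf.gz -s /dgraph/mydata.schema \
  --zero=localhost:5080
```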
Wow!
I had browsed that page several times, but I skipped this instruction because I wasn't using Docker at the time.
I will try it! Thanks!
Yes, that is the case: the script will run Dgraph Zero, wait for the bulk load to start and finish, and then exec Dgraph Server.
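In rough pseudoshell, the flow the script implements looks like this (my paraphrase, not the actual script; `out/0/p` is the bulk loader's default output directory):

```sh
dgraph zero &                           # 1. start Zero

dgraph bulk -f 1million.rdf.gz -s 1million.schema \
  --zero=localhost:5080                 # 2. bulk load; blocks until done

dgraph server --zero=localhost:5080 \
  -p out/0/p                            # 3. serve the bulk loader's output
```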
Well, you can analyze the scripts to understand what's going on. Basically, if you run "docker-compose up", it will already run a bulk load automatically with the "1million.rdf.gz" data set.
To change the script as needed, just run "sh ./cook-a-bulk.sh" and follow the instructions.
If you need to use your own RDF, delete "1million.schema" and "1million.rdf.gz" in the "service" folder, then put in your own files under the same names (I will still add an option to rename the files).
Basically, the first docker-compose run already does the bulk load; a sketch of the file swap is below.
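Something like this, assuming the script looks for those exact names in the "service" folder (the `~/my-data` paths are placeholders for wherever your files live):

```sh
# Swap in your own data, keeping the file names the script expects
rm service/1million.schema service/1million.rdf.gz
cp ~/my-data/my.schema  service/1million.schema
cp ~/my-data/my.rdf.gz  service/1million.rdf.gz

# The first docker-compose run then performs the bulk load
docker-compose up
```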
Cheers.
Thanks!
Your answer is a great help to me~
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.