Data export when using Docker

I am currently testing a backup strategy before going to a limited production release. I set up a small cluster via Docker Swarm. Unfortunately, I am not able to trigger an export of all data. Doing it remotely via an SSH tunnel does not work, which is kinda sad :wink: Next I tried to SSH into one of the host machines, but that does not work either. And trying to run a command in one of the Dgraph Docker images fails as well, as the images do not include the "curl" command line tool.

Any ideas?

Try creating a webhook that triggers a command on the Dgraph instance via `docker exec`, making it export to a local volume. Share that volume with another instance of your choice, as fits your strategy. Then have the webhook script (Bash) check for a successful export result and, if it succeeds, copy the files Dgraph generated and send them over the tunnel. See the sketch below.
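
Here is a minimal sketch of that idea in Bash. The container name, host volume path, remote destination, and HTTP port (8080 is the server's default) are all placeholders, and it assumes an image that ships curl and that the /admin/export response contains "Success" when the export completes:

#!/usr/bin/env bash
# Sketch: trigger a Dgraph export, then ship the result off-host over SSH.
# Adjust the placeholders below to your swarm setup.
set -euo pipefail

CONTAINER="dgraph_server_1"          # hypothetical service container name
EXPORT_DIR="/data/dgraph/export"     # host volume backing the server's export directory
REMOTE="backup@backup-host:/backups/dgraph/"

# Ask the Dgraph server to export its data (curl runs inside the container).
RESULT=$(docker exec "$CONTAINER" curl -s localhost:8080/admin/export)

# Only copy the files if the export reported success.
if echo "$RESULT" | grep -q "Success"; then
    rsync -az -e ssh "$EXPORT_DIR"/ "$REMOTE"
else
    echo "Export failed: $RESULT" >&2
    exit 1
fi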

That's just one idea.

I also had a feature idea that could be useful in a case like yours, if it were implemented.


I am going to add curl to the Docker images. It should be part of the nightly release (dgraph/dgraph:master) and also of the next release. Once you have curl, you can initiate an export from any of the nodes in the cluster (from within a container running the Dgraph server):

docker exec -it <container_id> bash
curl localhost:<dgraph_http_port>/admin/export
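
Once curl is in the image, the two steps can also be combined into a single non-interactive command, which is handy for scheduled jobs. The container ID and port below are the same placeholders as above, and the /dgraph/export path assumes the image's default working directory:

# Trigger an export without an interactive shell; the files land in the
# server's export directory inside the container (or its mounted volume).
docker exec <container_id> curl -s localhost:<dgraph_http_port>/admin/export

# If the export directory is not volume-mounted, copy it out of the container:
docker cp <container_id>:/dgraph/export ./dgraph-export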

We have an open issue to add an IP range whitelist from which exports can be initiated. We'll also add support later for doing backups to cloud providers.


@pawan That's awesome. I read about the whitelist feature request. It would be perfect for our setup, as the Dgraph cluster is only accessible from within our network.

