How to Use Live Loader in a more convenient way?

In my application, I use Docker Swarm to deploy Dgraph services. When I want to insert data into Dgraph, I have to copy all the RDF files into the Docker container and run the command inside it.
Is there a more convenient way to run the live loader, such as using curl or HTTP requests, so I can add data from any client machine?

If you temporarily expose the gRPC ports, you can do a remote live load. You could also use port forwarding, proxying, or a VPN for that.

Can you explain specifically how to expose the gRPC ports?

Will the Dgraph service provide an API for a remote live loader?

The live loader runs over gRPC, so there is no reason to copy your data into the container to load it. Expose the appropriate ports and run the loader from your workstation:

dgraph live -f <path-to-gzipped-RDF-or-JSON-file> -s <path-to-schema-file> -a <dgraph-alpha-address:grpc_port> -z <dgraph-zero-address:grpc_port>

There are several ways; it depends on the provider you are using. In AWS, for example, you open the port via a security group; GCP has an equivalent firewall rule.

Not sure what you mean. There’s no separate API for live load; gRPC itself is a remote procedure call, which is similar to an API.
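To illustrate: Alpha’s gRPC endpoint (9080 by default) is the same one the live loader talks to, so any Dgraph client can mutate data remotely once that port is reachable. A minimal sketch with pydgraph — the address and the `nquad` helper are assumptions for illustration, not part of any Dgraph API:

```python
def nquad(subject, predicate, obj):
    # Hypothetical helper: format a triple as one N-Quad line
    # with a blank-node subject and a string literal object.
    return f'_:{subject} <{predicate}> "{obj}" .'

def mutate_remote(alpha_addr, triples):
    # pydgraph speaks the same gRPC protocol the live loader uses,
    # so this works from any machine that can reach the Alpha port.
    import pydgraph  # imported here so the helper above stays testable offline
    stub = pydgraph.DgraphClientStub(alpha_addr)
    client = pydgraph.DgraphClient(stub)
    txn = client.txn()
    try:
        txn.mutate(set_nquads="\n".join(nquad(*t) for t in triples))
        txn.commit()
    finally:
        txn.discard()  # no-op if the commit succeeded
        stub.close()
```

You would call it as, say, `mutate_remote("my-alpha.example.com:9080", [("alice", "name", "Alice")])` (a hypothetical address) from any client machine.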

The steps you need are:

  1. Install Dgraph on your machine (the one where the RDF files are).
  2. Expose the gRPC ports in your cloud provider (not ideal for security, as you can forget to close them).
  3. Run the live loader on your machine, pointing it at the exposed ports.

The safest option would be a VPN into your cloud.

In my application, I get the RDF data from RESTful APIs or a stream. I wonder whether a Dgraph client such as pydgraph or dgraph4j can support the live loader itself, so I don’t need to save a copy of the data on the machine.

I had to write a shell script to run the live loader. Would it be possible for the Dgraph clients (like pydgraph or dgraph4j) to support the live loader directly?

The live loader is just a program created for data ingestion; it uses Dgo (the Go client) to connect to Dgraph. You can create your own live loader program with any client, though.
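For example, a minimal live loader of your own in Python with pydgraph could read an RDF file and commit it in batches. A sketch under stated assumptions — the file path, batch size, and Alpha address are placeholders, and this skips features the real loader has (xid dedup, retries, progress reporting):

```python
def batch_nquads(lines, batch_size=1000):
    """Group N-Quad lines into batches, skipping blank lines."""
    batch = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        batch.append(line)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

def load_file(path, alpha_addr="localhost:9080"):
    import pydgraph  # imported lazily so batch_nquads stays testable offline
    stub = pydgraph.DgraphClientStub(alpha_addr)
    client = pydgraph.DgraphClient(stub)
    try:
        with open(path) as f:
            for batch in batch_nquads(f):
                # commit_now=True commits each batch in a single round trip
                client.txn().mutate(set_nquads="\n".join(batch), commit_now=True)
    finally:
        stub.close()
```

Called as `load_file("data.rdf", "my-alpha.example.com:9080")` (hypothetical path and address), this pulls data straight from wherever your client runs, with no copy inside the container.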

Try implementing a bridge between Kafka or RabbitMQ and Dgraph. That way you can create a “kind of stream connection” to your cluster.

A remote live load fits that use case.

No. The live loader is a program that uses a Dgraph client; the other clients have no special relation to the live loader.