Hi, if I want to mutate 1000 fields on 1000 nodes, can I just do that with a single DQL mutation? Or is that not possible? Or is it possible, but I might run into trouble?
What’s actually the maximum? Can I mutate 100000 fields with a single mutation?
thanks a lot
BTW: is that also possible with GraphQL? Can I run that many mutations through GraphQL as well?
This is an interesting question. I want to import my RoamResearch graph into a Dgraph instance, and a DQL transaction seems like the best approach. Technically I could programmatically build the set JSON object to pass to the DQL mutation, but I’m wondering whether I should insert the entire graph in one giant mutation or batch it.
There is no hard maximum; it depends on the available resources, the configuration used to build the cluster, and how you send the data. What we recommend is transactions of around 5k objects on average. But if you have more resources, you can push that higher.
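To make the batching idea concrete, here is a minimal sketch of splitting a large import into ~5k-object set mutations before sending them. The node shape and batch size are assumptions for illustration; the actual send call (via a client like pydgraph or the HTTP `/mutate` endpoint) is left as a comment.

```python
import json

def build_batches(nodes, batch_size=5000):
    """Yield JSON strings, each a set-mutation payload of at most batch_size nodes."""
    for i in range(0, len(nodes), batch_size):
        yield json.dumps({"set": nodes[i:i + batch_size]})

# Hypothetical example: 12,000 Roam-style blocks become 3 mutations.
nodes = [{"uid": f"_:n{i}", "title": f"block {i}"} for i in range(12000)]
batches = list(build_batches(nodes))
print(len(batches))  # 3

# Each batch would then be sent as its own transaction, e.g.:
# txn = client.txn(); txn.mutate(set_obj=json.loads(batch)["set"]); txn.commit()
```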
Only if you push a very low-resource cluster too hard.
In theory, yes: Dgraph is a horizontally scalable system.
Nope, GraphQL is a strictly typed language. You could mix DQL with GraphQL, though. Maybe an upsert mutation (not sure if we already have that in GraphQL) could mutate several objects; I’m not sure.
In theory you can do whatever you could with DQL, but the GraphQL spec is what limits you.
Use the Live Loader. It is made for controlled loads.
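For reference, a typical Live Loader invocation might look like the following. The addresses and tuning values here are assumptions; adjust them for your own cluster.

```shell
# Hypothetical invocation: load an RDF export in controlled batches.
# --alpha / --zero point at your cluster; --batch (N-Quads per request)
# and --conc (concurrent requests) throttle the load.
dgraph live \
  --files data.rdf.gz \
  --alpha localhost:9080 \
  --zero localhost:5080 \
  --batch 1000 \
  --conc 10
```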
I don’t recommend huge batches; split the data into small batches and control the rate. You can use a normal client, but you have to create a balancing strategy. Never fire everything at a single Alpha; always spread the writes evenly between them. That’s my personal recommendation, so you can get the best out of your cluster.
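The balancing strategy above can be as simple as round-robin over your Alpha endpoints, so no single Alpha takes all the writes. The addresses below are hypothetical placeholders.

```python
from itertools import cycle

# Hypothetical Alpha addresses; replace with your own cluster's endpoints.
ALPHAS = ["alpha1:9080", "alpha2:9080", "alpha3:9080"]
_next_alpha = cycle(ALPHAS)

def pick_alpha():
    """Return the next Alpha to send a batch to, rotating evenly."""
    return next(_next_alpha)

# Ten batches land on the three Alphas in even rotation.
targets = [pick_alpha() for _ in range(10)]
print(targets[:4])  # ['alpha1:9080', 'alpha2:9080', 'alpha3:9080', 'alpha1:9080']
```

In practice each picked address would back a client connection (or a pool of them), and you would send one batch per pick.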
This is great, thanks. How would I go about figuring out what the max load is for my setup, beyond just seeing whether it times out or not?
It depends; pick whatever symptom bothers you first, e.g. latency increasing or Alphas going OOM.