Limit to triple mutations?

Hi, I'm trying to insert about 60,000 triples using a mutation { set { ... } } call, but the POST request responds with the rather unhelpful {"code":"ErrorInvalidRequest","message":"Invalid mutation."}

Is there a limit to the number of triples a mutation set accepts, or am I running into some other issue (and how could I debug that)?

If you have 60K triples, you’re better off loading them using our bulk loader.

Otherwise, if you want to do this via query mutations, you should break them down into smaller chunks and send more queries. Maybe try 1000 mutations per query.
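The chunking advice above can be sketched as a small shell script. Everything here is an assumption for illustration: a file named triples.rdf with one N-Quad triple per line (generated synthetically below), chunks of 1000 triples, and the localhost:8080/query endpoint from this thread. Each chunk gets its own mutation { set { ... } } envelope, since a bare slice of triples is not a valid mutation on its own.

```shell
# Sample input (assumption): one triple per line in N-Quad form.
seq 1 2500 | awk '{print "<s" $1 "> <p> \"o\" ."}' > triples.rdf

# Split into chunks of 1000 triples each (-l splits by line count).
split -l 1000 triples.rdf chunk_

# Wrap each chunk in its own mutation { set { ... } } envelope,
# so every chunk is a complete, standalone mutation.
for c in chunk_??; do
  { echo 'mutation {'
    echo '  set {'
    cat "$c"
    echo '  }'
    echo '}'; } > "$c.mut"
  # curl localhost:8080/query -XPOST --data-binary @"$c.mut"
done
```

The curl line is commented out so the sketch runs without a server; uncommenting it posts each wrapped chunk in turn.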

Also, can you maybe post a sample of how your query looks?

Thank you for your reply. My mutations look like this, taken from the documentation:

curl localhost:8080/query -XPOST --data-binary @file

with file content like

mutation {
  set {
    ...
  }
}

I split the file into smaller chunks with the command

split -n l/4 file

The invocation works now (I have not yet imported all chunks), but now I get this error reply:

{"code":"Error","message":"context deadline exceeded"}

The triples seem to be accepted nonetheless and imported successfully.
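For reference, the splitting step above can be sanity-checked with a self-contained demo (the 10-line sample file is made up for illustration). split -n l/4 divides the file into four pieces by size without breaking any line in half:

```shell
# Make a 10-line sample file and split it into 4 line-aligned pieces.
seq 1 10 > file
split -n l/4 file
# The pieces are named xaa, xab, xac, xad; no line is split across files.
wc -l xa*
```

Note that if the original file begins with mutation { set { and ends with the closing braces, only the first piece keeps the opening and only the last keeps the close, so the middle pieces need re-wrapping before they are valid mutations on their own.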


  • Despite this error message, are the triples successfully imported IF the Dgraph logs contain no errors?
  • Is there a switch to specify how long the connection is allowed to stay open before the deadline kicks in?

Thank you

We don’t let a query run for more than a minute – this is hardcoded. I’d recommend lowering the number of mutations sent per query so you don’t exceed the deadline.

When the deadline is exceeded, the unapplied mutations can be abandoned arbitrarily, and it is hard to tell which ones weren’t applied. Essentially, the system tries to dispose of the query as quickly as it can, so it’s better to stay within the time limits.
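Since a deadline-exceeded reply leaves it unclear which mutations were applied, one pragmatic approach is to inspect each response and re-send the affected chunk in smaller pieces. A minimal sketch, keyed on the literal response strings seen in this thread (the classify helper name and the last sample string are made up for illustration):

```shell
# Classify a Dgraph JSON response string into a coarse outcome.
classify() {
  case "$1" in
    *'context deadline exceeded'*) echo retry   ;;  # re-send in smaller pieces
    *'ErrorInvalidRequest'*)       echo invalid ;;  # fix the mutation itself
    *)                             echo ok      ;;
  esac
}

classify '{"code":"Error","message":"context deadline exceeded"}'    # retry
classify '{"code":"ErrorInvalidRequest","message":"Invalid mutation."}'  # invalid
classify 'some other response'                                       # ok
```

A batching loop could call classify on the body returned by each curl POST and halve the chunk size whenever it sees retry.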

OK, I decided not to fight the documentation and went the bulk-load route instead. Very fast indeed!

