Without transactions with dgo

hi,

I realised the example shown in the dgo library uses transactions:

txn := dgraphClient.NewTxn()
defer txn.Discard(ctx)

Is there a way I can query or mutate without a transaction?
I mean all mutations should be applied automatically, and if an error occurs, the mutations that already ran should remain applied.

Please advise.

Thank you.

There’s a CommitNow field in api.Mutation, which you can set.

See here:
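For illustration, a minimal sketch of setting CommitNow with dgo (the import paths assume dgo v2 and a local Alpha at localhost:9080; the exact response type returned by Mutate differs between dgo versions):

package main

import (
	"context"
	"log"

	"github.com/dgraph-io/dgo/v2"
	"github.com/dgraph-io/dgo/v2/protos/api"
	"google.golang.org/grpc"
)

func main() {
	// Connect to a Dgraph Alpha (the address is an assumption for this sketch).
	conn, err := grpc.Dial("localhost:9080", grpc.WithInsecure())
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	dg := dgo.NewDgraphClient(api.NewDgraphClient(conn))

	ctx := context.Background()
	mu := &api.Mutation{
		SetNquads: []byte(`_:alice <name> "Alice" .`),
		CommitNow: true, // commit as part of this Mutate call, no explicit txn.Commit
	}
	if _, err := dg.NewTxn().Mutate(ctx, mu); err != nil {
		log.Fatal(err)
	}
}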

In the dgo library, if I set CommitNow to true and then loop Mutate for another object, it reports that the transaction has already been committed.

Do you mean I have to keep spawning a new transaction for every Mutate call?

In my opinion, that isn't practical.
I'm also wondering: what if I have a billion records to update?
If I were to commit only at the end, will committing a billion records slow things down?

There’s no way to do that.

All Dgraph operations on all clients depend on transactions. You can create one gigantic transaction that you commit at the very end, but you have to know how to do that. In general, it is better to commit immediately.

There are no issues with that. You have the option to commit now or whenever you want; you are free to pick an approach. You just need to know what you're doing.

So don't use CommitNow; read the transaction documentation and you'll understand how to use transactions without it: Get started with Dgraph
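Roughly, a sketch of that pattern (assuming dg is a *dgo.Dgraph client and ctx a context.Context, as in the snippet above):

// Several mutations in one transaction, committed once at the end.
txn := dg.NewTxn()
defer txn.Discard(ctx) // no-op once Commit succeeds

for _, name := range []string{"Alice", "Bob", "Carol"} {
	mu := &api.Mutation{
		// CommitNow is left unset so the txn stays open for the next Mutate.
		SetNquads: []byte(`_:p <name> "` + name + `" .`),
	}
	if _, err := txn.Mutate(ctx, mu); err != nil {
		log.Fatal(err)
	}
}

// Commit everything in one go.
if err := txn.Commit(ctx); err != nil {
	log.Fatal(err)
}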

Cheers.

Thank you @MichelDiz.
I guess it's safe to commit at the end even though there are many mutations.
Will this affect memory or disk usage if it involves millions of records?

To correct what @MichelDiz said, you don't want a txn with a billion mutations in it. Each txn needs to be a reasonable chunk size. You can look at how we're doing things in the live loader. But, say, 1,000 RDFs per txn is about the right size.

Yes, you'd need to commit each txn; there's no way around that. If you want good performance when loading a billion things, switch to the bulk loader. Dgraph live loading isn't designed for that.
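A rough sketch of that batching idea (the chunk size, the nquads slice, and the variable names here are illustrative, not taken from the live loader's actual code):

const batchSize = 1000 // roughly the suggested number of RDFs per txn

var batch []byte
count := 0
for _, nq := range nquads { // nquads is assumed to be a []string of RDF triples
	batch = append(batch, nq+"\n"...)
	count++
	if count == batchSize {
		mu := &api.Mutation{SetNquads: batch, CommitNow: true}
		if _, err := dg.NewTxn().Mutate(ctx, mu); err != nil {
			log.Fatal(err)
		}
		batch, count = nil, 0
	}
}
// Flush whatever is left in a final, smaller transaction.
if count > 0 {
	mu := &api.Mutation{SetNquads: batch, CommitNow: true}
	if _, err := dg.NewTxn().Mutate(ctx, mu); err != nil {
		log.Fatal(err)
	}
}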


Thank you for clarifying.
Alright, about a thousand mutations per transaction should be good. It's an update on existing records to fix a missing predicate on old records.
