How to do a large mutate transaction without killing other database connections

How can I do a large import transaction without slowing down or killing other connections to dgraph? When I do an import of about 20k rows of data from a CSV (about 10 columns wide), dgraph becomes unqueryable during that time period, so other people on the site can’t access data. How can I set up dgraph so it doesn’t freeze everything while that is loading in?

Live loader has a “number of concurrent connections” parameter, which defaults to 10. Perhaps you could try live loader with fewer concurrent connections.
https://dgraph.io/docs/deploy/#live-loader
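
For illustration, a live loader invocation with reduced concurrency might look like the following. The exact flag name and defaults can vary between Dgraph versions, so treat this as a sketch and check `dgraph live --help` against the linked docs:

```shell
# Load an RDF file with a single concurrent request instead of the default 10.
# --alpha/--zero point at your cluster; --conc (assumed flag name) limits
# concurrent requests so other connections stay responsive.
dgraph live -f data.rdf.gz \
  --alpha localhost:9080 \
  --zero localhost:5080 \
  --conc 1
```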

I’m doing this with the dgo library, not the live loader, so it’s all in one transaction.

When using a client, we make smaller batches (5k?) and load them serially. This takes more time but keeps the load on the server manageable. You could also check active mutations and queries directly on dgraph via its activity metrics and throttle the client at runtime. To see the metrics, you can hit your endpoint as below.
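
The batching approach above can be sketched in Go. This is a minimal, hypothetical example: the `Record` type and `chunk` helper are made up for illustration, and the dgo transaction calls are shown only as comments since they need a live cluster. The idea is simply to commit many small transactions instead of one 20k-row transaction:

```go
package main

import "fmt"

// Record stands in for one parsed CSV row (hypothetical type).
type Record struct {
	UID  string
	Name string
}

// chunk splits rows into batches of at most n records, so each batch
// can be committed in its own transaction rather than all at once.
func chunk(rows []Record, n int) [][]Record {
	var batches [][]Record
	for n < len(rows) {
		batches = append(batches, rows[:n])
		rows = rows[n:]
	}
	return append(batches, rows)
}

func main() {
	rows := make([]Record, 20000) // stand-in for the parsed CSV

	for _, batch := range chunk(rows, 5000) {
		// With dgo, each batch would be its own short-lived transaction,
		// roughly (assuming the usual dgo client setup):
		//   txn := dg.NewTxn()
		//   _, err := txn.Mutate(ctx, &api.Mutation{SetJson: payload, CommitNow: true})
		// Small commits keep the server responsive for other connections.
		fmt.Printf("would commit batch of %d rows\n", len(batch))
	}
}
```

Adding a short `time.Sleep` between batches is a simple way to throttle further if the server is still struggling.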

curl http://<IP>:<HTTP_PORT>/debug/vars

https://dgraph.io/docs//howto/retrieving-debug-information/#metrics-information
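
To throttle at runtime, the client could poll `/debug/vars` between batches and pause when the server looks busy. A rough sketch in Go follows; the endpoint serves expvar-style JSON, but the specific metric name used here (`dgraph_pending_queries_total`) is an assumption and should be checked against the metrics your Dgraph version actually exposes:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// shouldThrottle decodes the expvar-style JSON body served at
// /debug/vars and reports whether a pending-query gauge exceeds limit.
// The metric name is an assumption; verify it against your deployment.
func shouldThrottle(body []byte, limit float64) (bool, error) {
	var vars map[string]json.RawMessage
	if err := json.Unmarshal(body, &vars); err != nil {
		return false, err
	}
	raw, ok := vars["dgraph_pending_queries_total"]
	if !ok {
		return false, nil // metric absent: don't throttle
	}
	var pending float64
	if err := json.Unmarshal(raw, &pending); err != nil {
		return false, err
	}
	return pending > limit, nil
}

func main() {
	// Sample payload standing in for a real response fetched with
	// http.Get("http://<IP>:<HTTP_PORT>/debug/vars").
	sample := []byte(`{"dgraph_pending_queries_total": 42}`)

	busy, err := shouldThrottle(sample, 10)
	if err != nil {
		panic(err)
	}
	// When busy is true, the import loop could sleep before the next batch.
	fmt.Println("throttle:", busy)
}
```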