Can I count 200k nodes quickly, within 1-2 seconds? Counting users who like sushi

Hi! Say I have 1 million user nodes, and there are only three types of favorite food: sushi, pizza, spaghetti.

type User {
  userID: ID!
  name: String!
  favoritefood: [Food] @hasInverse(field: isliked)
}

type Food {
  id: ID!
  name: String!
  isliked: [User]
}

query {
  queryFood(filter: { name: { eq: "Sushi" } }) {
    islikedAggregate {
      count
    }
  }
}

Now I want to count how many users like sushi, e.g. 200k users. Can I do that? I need the result quickly, within a few seconds. Is that possible? Does anyone have experience with that?
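For reference, a hedged sketch of what the equivalent DQL query might look like. The predicate names `Food.name` and `Food.isliked` assume Dgraph's default GraphQL-to-DQL field mapping, and the `eq` filter assumes an appropriate index exists on the name predicate; adjust both to your actual schema:

```graphql
{
  sushi(func: type(Food)) @filter(eq(Food.name, "Sushi")) {
    # count() over a predicate returns the length of the
    # posting list without expanding the linked User nodes
    count(Food.isliked)
  }
}
```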

I remember Michael and Amaster Senseis told me "Aggregations in Dgraph are quick and no problem" or something like that; but I forgot in what context you said that, so I'm asking now to be sure.

I don’t have a million or even 200K nodes of a single type to test with, but in theory this is grabbing a posting list and then just counting that list. (Maybe it also does a type filter in DQL for completeness.) Golang should have no problem counting to 200K and should do it very quickly. Since the data is already in a list, it is not doing a full table scan or a join; it is just counting an existing list.


The question is, how long is "quick"? :smiley: Are we talking about seconds, minutes, or even milliseconds?

I think what we are trying to say is that it's on the milliseconds scale, unless you have terrible disks, extremely rapid ingestion into this predicate, or other extenuating circumstances. At the same time… just try it.