Why would a `count` query exceed the response size limitation?

When I run a query to get a counter, it always exceeds the response size limitation, even though the actual response is only a few KB in size.

The query looks like this:

$ curl -H 'Content-Type: application/dql' 127.0.0.1:8080/query -d '{
  a(func: type(A)) @filter(eq(field1, value1)) {
    a as ~A
  }
  b(func: uid(a)) {
    B @filter(eq(field2, value2)) {
      b as ~B
    }
  }
  end(func: uid(b)) {
    count(uid)
  }
}' | jq
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   327  100   178  100   149      6      5  0:00:29  0:00:26  0:00:03    37
{
  "errors": [
    {
      "message": "while running ToJson: estimated response size: 16358122716 is bigger than threshold: 4294967296",
      "extensions": {
        "code": "ErrorInvalidRequest"
      }
    }
  ],
  "data": null
}
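For scale, the byte counts in the error message can be converted to GiB (a quick sanity check on my side, not output from Dgraph): the estimate is roughly 15.2 GiB against a threshold of exactly 4 GiB, which suggests the estimate is derived from the intermediate uid fan-out rather than from the final JSON that is actually returned.

```shell
# Convert the byte counts from the error message into GiB.
awk 'BEGIN {
  printf "estimated: %.1f GiB\n", 16358122716 / 2^30
  printf "threshold: %.1f GiB\n", 4294967296 / 2^30
}'
```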

The relationship between types A/B/C looks like this:
[image: relationship diagram of types A, B, and C]

A normal response includes only the count field, for example:

{
  "data": {
    "a": [],
    "b": [],
    "end": [
      {
        "count": 54321
      }
    ]
  },
  "extensions": {
    "server_latency": {
      "parsing_ns": 100531,
      "processing_ns": 30463511,
      "encoding_ns": 61335626,
      "assign_timestamp_ns": 1027941,
      "total_ns": 93051248
    },
    "txn": {
      "start_ts": 1853380,
      "hash": "28ca9250632810a407cebff6735ab545da6da8a33d44aadb418639213e4350e6"
    }
  }
}
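One thing that might help (a sketch only, using the same placeholder field names as above; I have not verified it against this dataset): declare the intermediate blocks as `var` blocks, so that `a` and `b` are computed but never serialized into the response. If the size estimate accounts for the intermediate blocks, this could bring it under the threshold:

```
{
  var(func: type(A)) @filter(eq(field1, value1)) {
    a as ~A
  }
  var(func: uid(a)) {
    B @filter(eq(field2, value2)) {
      b as ~B
    }
  }
  end(func: uid(b)) {
    count(uid)
  }
}
```

This also avoids reusing `a` and `b` as both block names and variable names, which the original query does.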

So why does the response size exceed the limitation?

This looks like it might be a bug. Let me investigate.

Will it be fixed? If so, when?