What data analytics tool can be used for dgraph data?

Hi @Abishek_Nair… welcome.

Hmm, not exactly sure what you’re asking. As far as Dgraph metrics are concerned, Dgraph has a metrics reporting mechanism; check out: https://dgraph.io/docs/deploy/metrics/#sidebar

If you’re looking for some sort of Analytics/BI functionality inside Dgraph, that’s not a focus for Dgraph. But of course it can be used to support those sorts of systems.

I need to see metrics for my product/user data, like sign up/usage trends over a period of time. What tools are available to get that?

If you store your signup data and usage trends in Dgraph, then the tool to use is either DQL or GraphQL. I’ve done this in a project before, blending functional graph data with analytical information.
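As a sketch of what that looks like: assuming a hypothetical `Signup` type whose `createdAt` field has a `@search` index, Dgraph’s auto-generated GraphQL API exposes aggregate queries you could call for a trend count. The type and field names here are illustrative, not from your schema:

```python
import json
from datetime import datetime, timedelta

# Hypothetical schema: a Signup type with a createdAt DateTime field that
# carries a @search index, so the generated API accepts a `ge` filter.
def build_signup_trend_query(days):
    """Build a GraphQL request body counting signups in the last `days` days."""
    since = (datetime.utcnow() - timedelta(days=days)).isoformat() + "Z"
    query = """
    query RecentSignups($since: DateTime!) {
      aggregateSignup(filter: { createdAt: { ge: $since } }) {
        count
      }
    }
    """
    return {"query": query, "variables": {"since": since}}

payload = build_signup_trend_query(30)
print(json.dumps(payload["variables"]))
```

You would POST that body as JSON to Dgraph’s `/graphql` endpoint; the shape of the response depends on your actual schema.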

I am looking for a product like Power BI that I can plug into Dgraph data to build reports and data visualizations, rather than spending resources building the whole thing using DQL or GraphQL.

Hi Abishek. Yes, you can consume Dgraph data in many different ways for analytics.

One is to connect directly to Dgraph using a visual tool/builder, which sounds like what you want to do. GraphQL makes it more flexible and performant to connect a BI tool on JSON data (vs REST, which overfetches and is slow to build and maintain, or SQL, which is inappropriate for graph use cases). See this video: How to build an interactive GraphQL GUI, Dashboard, or Admin Client in minutes - YouTube from DronaHQ, and their related overview: Integrations - DronaHQ Low Code App Development Platform.

Here’s a blog on how to do something similar in Qlik: Solved: Load data from GraphQL API with REST data connecto... - Qlik Community - 1558473

These are a couple of examples, but any BI tool that can consume JSON over HTTP can consume data from Dgraph. The HTTP calls must specify a GraphQL or JSON request payload as documented here: https://dgraph.io/docs/clients/raw-http/ .
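To give a feel for what that consumption step can look like, here is a small Python sketch (the query name `q` and the field names are illustrative, not fixed by Dgraph) that flattens the JSON body returned by Dgraph’s `/query` endpoint into rows a BI tool can ingest as a table:

```python
import json

# A Dgraph /query response wraps results under "data", keyed by the name
# you gave the query block. Flatten each node into a row of scalar fields.
def flatten_dgraph_response(body, query_name="q"):
    nodes = json.loads(body)["data"][query_name]
    rows = []
    for node in nodes:
        # Keep only scalar values; nested edges would need their own pass.
        rows.append({k: v for k, v in node.items()
                     if not isinstance(v, (list, dict))})
    return rows

sample = '{"data": {"q": [{"uid": "0x1", "name": "Ada", "signedUpAt": "2021-06-01T00:00:00Z"}]}}'
print(flatten_dgraph_response(sample))
# → [{'uid': '0x1', 'name': 'Ada', 'signedUpAt': '2021-06-01T00:00:00Z'}]
```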

Another option is to stream data from Dgraph → Kafka since most tools will consume JSON from Kafka for analytics.
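The consumer wiring (kafka-python, the Confluent client, etc.) is out of scope here; this Python sketch only shows the per-message step of decoding a JSON message value into a flat analytics record. The event shape is hypothetical; check the schema your producer actually emits:

```python
import json

# Sketch: turn one Kafka message value (bytes of JSON) into a flat record
# for an analytics sink. Field names below are illustrative assumptions.
def message_to_record(value):
    event = json.loads(value.decode("utf-8"))
    return {
        "uid": event.get("uid"),
        "type": event.get("type"),
        "ts": event.get("ts"),
    }

msg = b'{"uid": "0x2a", "type": "signup", "ts": "2021-06-01T12:00:00Z"}'
print(message_to_record(msg))
# → {'uid': '0x2a', 'type': 'signup', 'ts': '2021-06-01T12:00:00Z'}
```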

Realistically, many companies still use batch for analytics, though, so you can orchestrate a data pipeline to periodically export recent Dgraph data as JSON or RDF and store or process it for BI/reporting. See: https://dgraph.io/docs/enterprise-features/change-data-capture/.
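For the batch route, the transform step can be as small as this Python sketch (field names are illustrative), turning a list of exported JSON nodes into CSV that nearly any BI tool can load:

```python
import csv
import io

# Sketch: convert a batch of exported JSON nodes to CSV for a BI tool.
def json_nodes_to_csv(nodes):
    buf = io.StringIO()
    # Union of keys across nodes, sorted for a stable column order.
    fields = sorted({k for node in nodes for k in node})
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(nodes)
    return buf.getvalue()

nodes = [{"uid": "0x1", "name": "Ada"}, {"uid": "0x2", "name": "Grace"}]
print(json_nodes_to_csv(nodes))
```

A scheduler (cron, Airflow, or similar) would run the export and this transform periodically, then drop the CSV where the reporting tool picks it up.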

As noted above, GraphQL makes it simple enough to wire up a JavaScript chart dashboard that you might consider building one where you normally would not. Some basic dev knowledge is needed, but no REST services, OpenAPI specs, or middle tiers are required.

Finally, there is deep analytics on graphs (populating GNNs, exploratory network views, and so on), where Dgraph really shines. See the Knights Analytics presentation here: Alex Ridden - Building a Centralized Knowledge Graph to Power Your Analytics - Dgraph Day 2021 - YouTube .

I know that’s a lot, but the best approach depends on your use case. I’m very interested in how people need to consume data for analytics, so it would help everyone if you could let us know your use case and exact situation.