If I have 30 GB of data from the data warehouse that I need to insert into the Dgraph schema, how much storage do I need to configure for Dgraph?
And what is the underlying principle for calculating this?
Thanks
Team, please guide: how much space is required to store 30 GB of data?
The indexes and the nature of the source data, including its format and graph topology, will affect the answer. The best approach is to insert a reasonable subset of the data with a representative set of indexes enabled via the schema (roughly the expected number of text indexes, value indexes, and reverse edges), then measure the ratio of input data size to on-disk size. That ratio can then be used to predict the total disk usage.
Always keep some free space, since LSM compaction (typical of modern databases) requires spare room to work.
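The arithmetic behind that extrapolation is simple. Here is a minimal sketch, assuming you loaded a small sample and measured the resulting size of Dgraph's data directory (e.g. with `du -sh`); all the numbers below are placeholders, not benchmarks:

```python
# Rough disk-sizing sketch based on a sample load (all numbers are placeholders).

SAMPLE_INPUT_GB = 1.0        # size of the sample extracted from the warehouse
SAMPLE_ON_DISK_GB = 2.5      # measured size of Dgraph's data directory after loading it
TOTAL_INPUT_GB = 30.0        # full data set to be loaded
COMPACTION_HEADROOM = 1.5    # extra free space for LSM compaction; adjust to taste

ratio = SAMPLE_ON_DISK_GB / SAMPLE_INPUT_GB               # on-disk bytes per input byte
estimated_data_gb = TOTAL_INPUT_GB * ratio                # predicted steady-state disk usage
provisioned_gb = estimated_data_gb * COMPACTION_HEADROOM  # what to actually provision

print(f"expansion ratio:     {ratio:.2f}x")
print(f"estimated data size: {estimated_data_gb:.0f} GB")
print(f"suggested provision: {provisioned_gb:.0f} GB")
```

The expansion ratio depends heavily on how many indexes and reverse edges the schema declares, so re-measure it whenever the schema changes significantly.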
Hi @Damon
What RAM configuration is required for the 30 GB of data, and what about the rest of the hardware configuration (CPU, etc.)?
As noted above, with any database the requirements for indexes, caches, working memory, and so on vary with the data structure, indexing choices, query complexity, and workload, so they must be tested and adjusted based on observations.
I would probably start with 32 GB of RAM per machine on 3 machines, with 4 or 8 cores (vCPUs) each, and see how it behaves. You can then adjust up or down from there. Hopefully you are using a cloud service or Kubernetes, which makes it easy to change resources based on testing.
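To make that starting point concrete, here is a small planning sketch that encodes the suggested initial cluster (3 machines, 32 GB RAM, 4 or 8 vCPUs each) alongside a per-node disk figure derived from the earlier extrapolation; the structure and the disk number are assumptions for illustration, not Dgraph requirements:

```python
from dataclasses import dataclass

@dataclass
class NodeSpec:
    ram_gb: int
    vcpus: int
    disk_gb: int

# Suggested starting point from the post: 3 machines, 32 GB RAM, 4-8 vCPUs each.
# DISK_PER_NODE_GB is a placeholder; derive it from the measured expansion ratio
# plus compaction headroom, and remember that replicas each store a full copy
# of their group's data.
NODE_COUNT = 3
DISK_PER_NODE_GB = 120  # placeholder

starting_node = NodeSpec(ram_gb=32, vcpus=8, disk_gb=DISK_PER_NODE_GB)

print(f"cluster: {NODE_COUNT} x {starting_node}")
print("Load a representative sample, watch memory/CPU/disk, then scale up or down.")
```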