Cooldown/MemoryLimit Flag for Live Loader

amaster507 commented:

> Memory requirement should not scale with the dataset, or the size of your dataset is limited by the size of your machine’s RAM, which is not desirable.

While it is not desirable, there has to be some kind of limit. To run queries and load the schema, the loader has to use memory. More data, deeper queries, more filters, and a larger schema all obviously lead to needing more memory.

I don’t think there is any way to avoid having the size of a dataset bounded by some amount of memory.

However, I think it still needs optimization so that the memory requirement scales proportionally with dataset size rather than exponentially. For example, if a dataset of 10 million requires 8 GB (just fabricating numbers here), then a dataset of 20 million should require 16 GB, not 64 GB.
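
Purely as an illustration of the difference (using only the fabricated figures above; the cubic curve is just one hypothetical superlinear shape, picked because it happens to reproduce the 64 GB number), a minimal sketch in Go:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	const baseRecords = 10_000_000.0 // fabricated baseline: 10 million records...
	const baseMemGB = 8.0            // ...needing 8 GB

	records := 20_000_000.0
	ratio := records / baseRecords // 2x the data

	// Proportional (linear) scaling: a fixed amount of memory per record,
	// so doubling the data doubles the memory.
	linear := baseMemGB * ratio // 16 GB

	// A hypothetical superlinear curve (cubic, purely illustrative): the same
	// doubling of data blows memory up eightfold.
	superlinear := baseMemGB * math.Pow(ratio, 3) // 64 GB

	fmt.Printf("2x data: linear -> %.0f GB, superlinear -> %.0f GB\n", linear, superlinear)
}
```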

Again, I am not saying the numbers above are accurate; this is just what I have seen discussed:

> Bulkloader uses resources exponentially based on the dataset size, especially RAM. (Fatal error: runtime: out of memory when bulk loader - #9 by MichelDiz)