Hi,
I am seeing an exorbitant amount of RAM usage by Power BI Desktop.
I have circa 100 queries defined, with data pulled from ODBC (3 tables) and Google Analytics (15 properties, but close to 40 tables, as GA limits imports to 8 dimensions per table). Note that none of these tables exceeds 2,000 records and some are empty (for now). The Power BI file is about 20-22 MB. Happy to provide any additional detail if need be.
Anyway, as soon as I load my Power BI report, RAM usage sits at about 6 GB (steady state). During a refresh, it climbs to a staggering 22-23 GB. Naturally and thankfully, after the refresh it drops back to its steady-state level.
Since I am running Power BI Desktop on an Azure VM that I switch on/off only when I need to refresh the data and push it to powerbi.com, I am not that concerned yet. What does concern me is the future scalability of this setup. What will happen if I add a few more GA properties (which is what I think caused the major spike, by the way)? How many records can I handle before I run out of my available 28 GB and need to bump the VM up to XY GB of RAM? Finally, it bothers me that my analytics solution uses three times more RAM than the production application running on AWS.
Am I doing something wrong (I hope I am)? What should the setup look like to avoid this situation? Given that my Power BI report is already set up, is there a way to refresh the data directly from PowerBI.com without having to start the VM and refresh it there (I don't see how, given that some data comes in via ODBC)? I'm quite new to Power BI, so any indication/suggestion is more than welcome.
Thank you all!