Several performance-affecting factors can lead to high memory usage in Power BI, even for queries that appear small. The following are typical causes, with tips for optimizing queries to improve performance:
Typical Reasons for Excessive Memory Use:
Big Data Model:
When Power BI works with a large dataset or a complex data model (many columns, large tables), it can load and process large amounts of data in memory regardless of how small the query is.
Optimization:
- Eliminate any tables or columns that are not in use.
- Use aggregated (summary) tables when feasible.
- Reduce data volume by removing unnecessary rows (such as time periods that are no longer required).
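As a sketch of the steps above in Power Query (M), a query can drop unused columns and old rows before the data is ever loaded into the model. The source, table, and column names here ("Sales", "OrderDate", etc.) are hypothetical:

```powerquery
let
    // Hypothetical SQL source
    Source = Sql.Database("server", "db"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only the columns the report actually uses
    KeepColumns = Table.SelectColumns(Sales, {"OrderDate", "CustomerID", "Amount"}),
    // Drop rows outside the reporting window
    RecentRows = Table.SelectRows(KeepColumns, each [OrderDate] >= #date(2022, 1, 1))
in
    RecentRows
```

Because both steps are applied at load time, the removed columns and rows never consume model memory.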
Complicated computations in queries:
Power Query transformations, calculated columns, and complex DAX measures can greatly increase memory usage. Computations performed row by row, or at a high level of granularity, can consume a lot of memory.
Optimization:
- Break down complex calculations into smaller, more efficient parts.
- Prefer measures over calculated columns: measures are evaluated dynamically, only when needed, whereas calculated columns are computed for every row and stored in memory.
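To illustrate the measure-versus-calculated-column trade-off, here is a minimal DAX sketch, assuming a hypothetical Sales table with Quantity and UnitPrice columns:

```dax
-- Calculated column: materialized for every row of Sales and stored in the model.
LineAmount = Sales[Quantity] * Sales[UnitPrice]

-- Equivalent measure: evaluated only at query time, in the current filter context,
-- so nothing extra is stored per row.
Total Amount = SUMX ( Sales, Sales[Quantity] * Sales[UnitPrice] )
```

If the value is only ever aggregated in visuals, the measure gives the same results without the per-row memory cost of the column.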
Query Folding and Data Transformation:
When Power Query steps can be "folded", they are translated into the data source's native query language and executed at the source, so Power BI receives only the reduced result. Steps that break folding force Power BI to pull the full dataset and transform it locally, which increases memory use.
Optimization:
- Apply filters and remove columns as early as possible, and keep folding-friendly steps ahead of any steps that break folding.
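As a hedged sketch of a folding-friendly step order in Power Query (M), with hypothetical source and column names, the filter and column selection below are the kinds of steps that typically fold back to a SQL source:

```powerquery
let
    // Hypothetical SQL source
    Source = Sql.Database("server", "db"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // These steps typically fold to the source as a WHERE clause and column list:
    Filtered = Table.SelectRows(Orders, each [OrderDate] >= #date(2023, 1, 1)),
    Reduced = Table.SelectColumns(Filtered, {"OrderID", "OrderDate", "Amount"})
    // Any step that breaks folding (e.g., adding an index column) should come last,
    // if it is needed at all.
in
    Reduced
```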
Relationships Between Tables:
Relationships on high-cardinality columns, many-to-many relationships, and bidirectional cross-filtering can all increase memory use and slow down queries.
Optimization:
- Prefer single-direction, one-to-many relationships on low-cardinality key columns, and enable bidirectional filtering only when it is genuinely required.
Large Granularity in Data:
Storing data at a finer grain than the reports require (for example, per-transaction or per-second detail when reports are viewed daily) inflates the model's memory footprint.
Optimization:
- Aggregate data to the grain the reports actually use, such as daily or monthly summaries.
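One way to build such a summary is a DAX calculated table. This is a sketch only, assuming a hypothetical Sales table with OrderDate and Amount columns:

```dax
-- Daily aggregation table: one row per date instead of one row per transaction.
SalesDaily =
SUMMARIZECOLUMNS (
    Sales[OrderDate],
    "TotalAmount", SUM ( Sales[Amount] ),
    "OrderCount", COUNTROWS ( Sales )
)
```

Reports that only need daily totals can then be pointed at SalesDaily, and the detailed Sales table can potentially be trimmed or removed.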
Columnar Data Storage:
Power BI uses a columnar storage model (the VertiPaq engine), storing and compressing values per column rather than per row. High-cardinality columns (e.g., unique IDs or text fields with many distinct values) compress poorly and can therefore consume significantly more memory.
Optimization:
- Remove or reduce high-cardinality columns where possible; for example, split a datetime column into separate date and time columns, and avoid storing long, unique text values in the model.
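The datetime split can be sketched in Power Query (M) as follows; the "Events" table and "Timestamp" column are hypothetical. Two low-cardinality columns (at most ~366 distinct dates per year and 86,400 distinct times) generally compress far better than one high-cardinality datetime column:

```powerquery
let
    // Hypothetical prior query step producing a table with a Timestamp column
    Source = Events,
    AddDate = Table.AddColumn(Source, "EventDate", each Date.From([Timestamp]), type date),
    AddTime = Table.AddColumn(AddDate, "EventTime", each Time.From([Timestamp]), type time),
    // Drop the original high-cardinality column
    Result = Table.RemoveColumns(AddTime, {"Timestamp"})
in
    Result
```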
Filters and Contexts: