If you’ve been working with DAX for a while, chances are you’ve hit that point where reports start slowing down. Visuals take longer to load, slicers feel laggy, or that one KPI card spins forever before displaying a number. Sound familiar?
Let’s break down why that happens and, more importantly, what you can do to fix it. These aren’t just theoretical tips; they come from real projects where tuning DAX performance made or broke the user experience.
First, it’s not always DAX’s fault
Bad report performance isn’t always caused by DAX. Often, it’s the model: too many columns, poor relationships, high-cardinality fields, or unnecessary calculated columns. That said, poor DAX can definitely make it worse, especially when:
- You use iterators on large tables
- You apply too many filters inside CALCULATE
- You create calculated columns for things that should be measures
- You don’t control context properly
Always start with the data model
Before you write a single line of DAX, ask:
- Are my tables structured in a star schema?
- Am I using only the columns I need?
- Is my date table optimised?
- Can I reduce relationship complexity?
Real case: I reduced a fact table from 40+ columns to the 12 that were actually used. That shrank the model size by 30% and cut page load time almost in half. Huge win.
Tip: Use DAX Studio’s VertiPaq Analyzer to see what’s bloating your model.
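On the date-table question: one common optimisation is a lean calculated date table containing only the columns your reports actually slice by. A minimal sketch — the date range and column set here are illustrative assumptions, not from any specific model:

```dax
Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2025, 12, 31 ) ),  -- assumed range
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "MMM" )
)
```

Mark it as a date table, and consider turning off Power BI’s Auto date/time option, which otherwise generates a hidden date table for every datetime column in the model.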
Avoid overusing calculated columns
If you’re using calculated columns for metrics like profit or margin, stop and think: could this be a measure?
Example – don’t do this (calculated column):

```dax
Profit = Sales[Revenue] - Sales[Cost]
```

Do this instead (measure):

```dax
Profit := SUM(Sales[Revenue]) - SUM(Sales[Cost])
```

Calculated columns are computed at refresh time and stored in the model, so they increase its size. Measures are evaluated at query time and take up no storage at all.
Use iterators sparingly
Functions like SUMX and AVERAGEX are powerful, but they calculate row by row — and that’s costly on large datasets.
If you’re doing this:

```dax
Total Sales := SUMX(FactSales, FactSales[Quantity] * FactSales[UnitPrice])
```

ask yourself: can I pre-calculate TotalPrice in Power Query?

Then use:

```dax
Total Sales := SUM(FactSales[TotalPrice])
```

Much faster, much cleaner.
Use variables — always
Variables aren’t just for code readability; they also improve performance by preventing the same expression from being evaluated more than once.

Instead of:

```dax
Profit Margin := (SUM(Revenue) - SUM(Cost)) / SUM(Revenue)
```

use:

```dax
Profit Margin :=
VAR TotalRevenue = SUM(Revenue)
VAR TotalCost = SUM(Cost)
RETURN
    DIVIDE(TotalRevenue - TotalCost, TotalRevenue)
```

It’s safer, clearer, and faster. DIVIDE() also protects against divide-by-zero errors.
Use Performance Analyzer + DAX Studio
Performance Analyzer (View > Performance Analyzer) shows how long visuals take to render and which ones are the bottleneck.
Copy the DAX query and run it in DAX Studio to see what’s really going on, including query plans and memory usage.
Real example: A matrix visual was taking 10+ seconds. Turned out, a CALCULATE(FILTER(...)) construct was being re-evaluated for every row. We rewrote it with TREATAS and shaved 8 seconds off the load time.
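The shape of that rewrite, sketched with illustrative table and column names (not the actual report’s):

```dax
-- Slow: FILTER iterates the whole expanded fact table, re-evaluated per cell
Region Sales Slow :=
CALCULATE (
    SUM ( FactSales[Amount] ),
    FILTER ( ALL ( FactSales ), FactSales[Region] IN VALUES ( DimRegion[Region] ) )
)

-- Faster: project the selected values directly onto the fact column
Region Sales Fast :=
CALCULATE (
    SUM ( FactSales[Amount] ),
    TREATAS ( VALUES ( DimRegion[Region] ), FactSales[Region] )
)
```

TREATAS applies the selection as a column-level filter, so the engine never has to materialise or iterate the fact table row by row.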
Reduce cardinality wherever you can
Columns with lots of unique values are bad for compression, and that slows everything down.
Common offenders:
- Transaction IDs
- Free text fields
- Timestamps down to the second
Fixes:
- Remove if not needed
- Round timestamps to the hour or date
- Replace IDs with grouped segments or hashes
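To gauge how much a timestamp column is costing you before and after rounding, a quick sketch — FactSales[OrderTimestamp] is an illustrative name, not from a real model:

```dax
-- Distinct raw timestamps (often close to the row count)
Raw Timestamp Cardinality := DISTINCTCOUNT ( FactSales[OrderTimestamp] )

-- Distinct values once truncated to whole days (INT drops the time fraction)
Day Cardinality :=
COUNTROWS (
    DISTINCT ( SELECTCOLUMNS ( FactSales, "Day", INT ( FactSales[OrderTimestamp] ) ) )
)
```

If the gap between the two is large, do the actual rounding upstream in Power Query so the stored column itself compresses better.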
Simplify CALCULATE whenever possible
The FILTER(ALL(...)) pattern is overused — and slow when misapplied.
Instead of this:

```dax
CALCULATE(
    SUM(Sales[Amount]),
    FILTER(ALL(Sales), Sales[Product] = "Bike")
)
```

try this:

```dax
CALCULATE(
    SUM(Sales[Amount]),
    Sales[Product] = "Bike"
)
```

Direct Boolean filter arguments are far more efficient: the engine expands them into a filter over just that one column, rather than iterating every row of the whole table.
Tools that should be in your kit
| Tool | Use |
|---|---|
| DAX Studio | For query diagnostics, memory usage |
| Performance Analyzer | For visual timing |
| VertiPaq Analyzer | To analyse model size |
| Tabular Editor | To manage and edit measures, roles, calc groups |
Performance Tuning Quick Checklist
- Convert calculated columns to measures where possible
- Reduce high-cardinality columns
- Use variables to avoid redundant logic
- Avoid unnecessary FILTER() inside CALCULATE()
- Test complex visuals with Performance Analyzer
- Optimise your star schema and remove unused columns
- Use tools like DAX Studio and Tabular Editor regularly
Tuning DAX isn’t just about squeezing milliseconds. It’s about building models and reports that scale. When you’re working on business-critical dashboards, every second counts. Make your models lean, your DAX smart, and your users happy.
If this post helped, and you want to dive deeper, I dedicate an entire chapter to this in my book, Mastering DAX in Power BI. You’ll find tools, walkthroughs, and techniques I’ve used in real production environments.
👉 Get the book and start building faster, smarter DAX today.

