
Description

(00:00:00) The DAX UDF Dilemma

(00:00:32) The Context Transition Trap

(00:00:47) VAL vs EXPR: The Core Decision

(00:01:39) The Best Customers Example

(00:02:52) When to Use VAL and EXPR

(00:04:54) The Context Transition Problem

(00:05:57) Fixing the Context Transition Trap

(00:08:59) Materializing with Add Columns

(00:13:06) Parameter Types and Casting

(00:16:12) Authoring Checklist for UDFs



The Two Modes That Change Everything — VAL vs EXPR

In DAX UDFs, parameter mode isn’t decoration; it’s semantics. It changes when evaluation happens, which changes the result.

What breaks most UDFs: using VAL where EXPR is mandatory. You pass a snapshot, then change filters inside the function and expect it to breathe. It won’t.

Mini proof: a ComputeForRed UDF sets Color = "Red" internally and returns “some metric.” With a VAL parameter the metric is already a finished number before the function runs, so the internal Red filter has nothing to act on; with an EXPR parameter the metric is evaluated under that filter and the result actually changes.

Decision framework: use VAL when the caller’s value should be frozen once (thresholds, constants, pre-computed scalars); use EXPR when the function changes filter context or must re-evaluate per row.

Subtlety: EXPR ≠ automatic context transition. Measures get an implicit CALCULATE in row context; raw expressions do not. If your UDF iterates rows and evaluates an EXPR without CALCULATE, it will ignore the current row. The fix lands in the function, not the caller.
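
A minimal sketch of that mini proof, assuming the preview UDF syntax (type keywords may differ) and a hypothetical [Sales Amount] measure plus a Product[Color] column; the only difference between the two functions is the parameter mode.

    DEFINE
        -- VAL: the argument is evaluated once, in the caller's filter context,
        -- before the body runs, so the internal Red filter has nothing left to change.
        FUNCTION ComputeForRedVal = ( metric: SCALAR VAL ) =>
            CALCULATE ( metric, 'Product'[Color] = "Red" )

        -- EXPR: the argument travels as an expression and is evaluated inside
        -- the body, after the Red filter is applied.
        FUNCTION ComputeForRedExpr = ( metric: SCALAR EXPR ) =>
            CALCULATE ( metric, 'Product'[Color] = "Red" )

    EVALUATE
        ROW (
            "All colors (VAL)", ComputeForRedVal ( [Sales Amount] ),
            "Red only (EXPR)", ComputeForRedExpr ( [Sales Amount] )
        )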

The Context Transition Trap — Why Your UDF Ignores the Current Row

Row context becomes filter context only via CALCULATE (or by invoking a measure). Inline expressions don’t get that for free.

Fix (inside the UDF): wrap the EXPR parameter in CALCULATE at each point where it is evaluated in row context.

Anti-patterns: patching the call site with CALCULATE instead of the function body; switching a parameter to EXPR and assuming that alone makes it row-aware.

Rule of thumb: iterator + EXPR ⇒ wrap the EXPR with CALCULATE at the exact evaluation point.
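
A sketch of that fix, assuming the preview UDF syntax and a hypothetical Customer table with a [Sales Amount] measure; the CALCULATE inside the iterator is what turns each customer’s row context into filter context.

    DEFINE
        FUNCTION AvgMetricPerCustomer = ( metric: SCALAR EXPR ) =>
            AVERAGEX (
                VALUES ( Customer[CustomerKey] ),
                -- Without this CALCULATE, "metric" would evaluate in the outer
                -- filter context and return the same number for every customer.
                CALCULATE ( metric )
            )

    EVALUATE
        ROW ( "Avg per customer", AvgMetricPerCustomer ( [Sales Amount] ) )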

Stop Recomputing — Materialize Once with ADDCOLUMNS

Correctness first, then cost. EXPR + CALCULATE can re-evaluate the formula multiple times. Don’t pay that bill twice. Pattern: materialize once, reuse everywhere (sketch after the list).
  1. Build the entity set: VALUES(Customer[CustomerKey]) (or ALL(Customer) if logic demands).
  2. ADDCOLUMNS to attach one or more computed columns, e.g.
    Base = ADDCOLUMNS( VALUES(Customer[CustomerKey]), "Metric", CALCULATE(MetricExpr) )
  3. Compute aggregates from the column: AvgMetric = AVERAGEX(Base, [Metric]).
  4. Filter/rank using the column: FILTER(Base, [Metric] > AvgMetric); TOPN(..., [Metric]).
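
A sketch of the four steps as one query, using the Customer[CustomerKey] grain from the list and a hypothetical [Sales Amount] measure standing in for MetricExpr; the expensive expression is evaluated once per customer and every later step reads the stored column.

    EVALUATE
    VAR Base =
        ADDCOLUMNS (
            VALUES ( Customer[CustomerKey] ),
            "Metric", CALCULATE ( [Sales Amount] )    -- step 2: evaluated once per customer
        )
    VAR AvgMetric =
        AVERAGEX ( Base, [Metric] )                   -- step 3: reuse, no recomputation
    RETURN
        FILTER ( Base, [Metric] > AvgMetric )         -- step 4: reuse again; TOPN would read the same column
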
Benefits: one evaluation per customer; the average, the filter, and any ranking all read the same stored numbers, so they cannot drift apart.

Guardrails: materialize only the columns you actually reuse, and keep the base table as small as the logic allows (VALUES rather than ALL unless the logic demands every row).

Parameter Types, Casting, and Consistency — Quiet Data Traps

Type hints are a contract. Coercion timing differs: a VAL argument is evaluated, and cast, once at the call site; an EXPR argument is evaluated wherever the body uses it, so the cast lands there. Traps: silent casts that round or truncate, and the same argument behaving differently depending on where it is evaluated. Safe practice: declare the narrowest type that fits, cast explicitly at the boundary, and keep the declared type consistent with how the body uses the value.
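
A small sketch of the type contract, assuming the preview’s type keywords and implicit coercion of arguments to the declared type; the CustomerLabel name and the literal argument are illustrative only.

    DEFINE
        -- The STRING hint is the contract: whatever the caller passes is treated
        -- as text inside the body, so the cast sits in the signature rather than
        -- buried somewhere in the expression.
        FUNCTION CustomerLabel = ( keyValue: STRING VAL ) =>
            "Customer " & keyValue

    EVALUATE
        { CustomerLabel ( 123 ) }    -- a numeric key arrives where text was declared and is coerced on the way in
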
Authoring Checklist — UDFs That Don’t Betray You
  1. Pick the parameter mode on purpose: VAL for frozen inputs, EXPR when the function changes filters or iterates.
  2. Wrap every EXPR evaluated in row context with CALCULATE, inside the function, not at the call site.
  3. Materialize expensive expressions once with ADDCOLUMNS and reuse the column.
  4. Declare parameter types deliberately and cast at the boundary.
Mnemonic: Mode → Move → Make. Choose the right mode, force the move (context transition), make once (materialize).

Compact Walkthrough — From Wrong to Right
  1. Naive: BestCustomers(metric: VAL) → iterate customers, compute average, filter metric > average.
    Result: empty set (you compared one frozen number to itself).
  2. Partially fixed: switch to EXPR but pass an inline expression inside an iterator.
    Still wrong (no implicit CALCULATE).
  3. Correctness: keep EXPR, wrap evaluations with CALCULATE in AVERAGEX and FILTER.
    Now per-customer logic works.
  4. Performance:


    VAR Base = ADDCOLUMNS ( VALUES ( Customer[CustomerKey] ), "Metric", CALCULATE ( metric ) )
    VAR AvgMetric = AVERAGEX ( Base, [Metric] )
    RETURN FILTER ( Base, [Metric] > AvgMetric )

    One evaluation per customer; reuse everywhere.
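
The same logic packaged as a function: a sketch assuming the preview UDF syntax, with a hypothetical [Sales Amount] measure passed as the EXPR argument.

    DEFINE
        FUNCTION BestCustomers = ( metric: SCALAR EXPR ) =>
            VAR Base =
                ADDCOLUMNS (
                    VALUES ( Customer[CustomerKey] ),
                    "Metric", CALCULATE ( metric )    -- context transition forced inside the function
                )
            VAR AvgMetric = AVERAGEX ( Base, [Metric] )
            RETURN
                FILTER ( Base, [Metric] > AvgMetric )

    EVALUATE
        BestCustomers ( [Sales Amount] )    -- respects whatever slicers and filters are active at the call
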
Quick checks: fewer slicers ⇒ more “best customers”; a narrow brand slice ⇒ fewer; totals reconcile.

Conclusion: The Three Rules You Can’t Skip

Pick the parameter mode on purpose, force the context transition inside the function, and materialize once before you reuse. If this killed a few ghost bugs, subscribe. Next up: advanced UDF patterns—custom iterators, table-returning filters, and the performance booby traps you’ll step over instead of into.

Become a supporter of this podcast: https://www.spreaker.com/podcast/m365-show-modern-work-security-and-productivity-with-microsoft-365--6704921/support.

Follow us on:
LinkedIn
Substack