Is Calendar-Based Time Intelligence Changing the Custom Time Intelligence Mindset?

Introduction
With calendar-based Time Intelligence, the need for custom Time Intelligence logic is greatly reduced.
Now we can create custom calendars that meet our Time Intelligence accounting needs.
You may have read my article about Advanced Time Intelligence: most of the custom logic described there is no longer needed.
But there are still cases where we need custom statistics, such as a run rate.
In the past, SQLBI published an article about calculating a running average.
This piece applies the same principles described there in a slightly different way.
Let's see how we can calculate the running average over three months using the new Calendars.
Using classic Time Intelligence
First, we use the standard Gregorian calendar and the classic Time Intelligence date table.
I use the same method as described in the SQLBI article linked in the References section below.
Running Average by Month =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Date'[Date] )
// 2. Generate the Date range needed for the Moving average (three months)
VAR DateRange =
DATESINPERIOD( 'Date'[Date]
,MaxDate
,-3
,MONTH
)
// 3. Generate a table filtered by the Date Range generated at step 2
// This table contains only three rows
VAR SalesByMonth =
CALCULATETABLE(
SUMMARIZECOLUMNS(
'Date'[MonthKey]
, "#Sales", [Sum Online Sales]
)
,DateRange
)
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX(SalesByMonth, [#Sales])
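Outside of DAX, the logic of this measure can be expressed as a minimal Python sketch; the sales figures and month labels below are made up for illustration:

```python
# Sketch of the measure's logic: average the monthly sales totals
# over the last three months of the current filter context.
# The figures below are made up for illustration.
monthly_sales = {
    "2024-01": 100.0,
    "2024-02": 200.0,
    "2024-03": 300.0,
    "2024-04": 400.0,
}

def running_average(sales_by_month, last_month, window=3):
    # DATESINPERIOD(..., -3, MONTH) keeps the last three months up
    # to MaxDate; AVERAGEX then averages the monthly totals.
    months = sorted(m for m in sales_by_month if m <= last_month)[-window:]
    return sum(sales_by_month[m] for m in months) / len(months)

print(running_average(monthly_sales, "2024-04"))  # 300.0
```

Note that, like the DAX version, the sketch averages one value per month, not one value per day.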
When I run this measure in DAX Studio, I get the expected results:
So far, so good.
Using a regular calendar
Next, I created a Calendar named “Gregorian Calendar” and changed the code to use it.
To make this easier to follow, I copied the date table into a new table called “Gregorian Date Table”.
The only change is in the DATESINPERIOD() call.
Instead of passing a date column, I pass the newly created calendar:
Running Average by Month =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Gregorian Date Table'[Date] )
// 2. Generate the Date range needed for the Moving average (three months)
VAR DateRange =
DATESINPERIOD( 'Gregorian Calendar'
,MaxDate
,-3
,MONTH
)
// 3. Generate a table filtered by the Date Range generated at step 2
// This table contains only three rows
VAR SalesByMonth =
CALCULATETABLE(
SUMMARIZECOLUMNS(
'Gregorian Date Table'[MonthKey]
, "#Sales", [Sum Online Sales]
)
,DateRange
)
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX(SalesByMonth, [#Sales])
As expected, the results are the same:

The performance is very good, as this query finishes in 150 milliseconds.
Using a custom calendar
But what happens when you use a custom calendar?
For example, a calendar with 15 months in a year and 31 days in each month?
I created such a calendar for my previous article, which describes the use cases for calendar-based Time Intelligence (see the link at the top and in the References section).
If you look at the measure's code, you will notice that it is different:
Running Average by Month (Custom) =
VAR LastSelDate = MAX('Financial Calendar'[CalendarEndOfMonthDate])
VAR MaxDateID = CALCULATE(MAX('Financial Calendar'[ID_Date])
,REMOVEFILTERS('Financial Calendar')
,'Financial Calendar'[CalendarEndOfMonthDate] = LastSelDate
)
VAR MinDateID = CALCULATE(MIN('Financial Calendar'[ID_Date])
,REMOVEFILTERS('Financial Calendar')
,'Financial Calendar'[CalendarEndOfMonthDate] = EOMONTH(LastSelDate, -2)
)
VAR SalesByMonth =
CALCULATETABLE(
SUMMARIZECOLUMNS(
'Financial Calendar'[CalendarYearMonth]
, "#Sales", [Sum Online Sales]
)
,'Financial Calendar'[ID_Date] >= MinDateID
&& 'Financial Calendar'[ID_Date] <= MaxDateID
)
RETURN
AVERAGEX(SalesByMonth, [#Sales])
The reason for the changes is that this table does not have a date column that can be used with the DATESINPERIOD() function. Therefore, I have to use custom code to calculate the range of ID_Date values.
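The lookup performed by the two CALCULATE() variables can be sketched in Python; the calendar rows and month labels below are made up for illustration:

```python
# Sketch of the first custom-calendar measure: instead of
# DATESINPERIOD(), look up the ID_Date boundaries from the
# calendar rows. Each tuple is (ID_Date, end-of-month label);
# the rows are made-up illustration data.
calendar = [
    (1, "M1"), (2, "M1"),
    (3, "M2"), (4, "M2"),
    (5, "M3"), (6, "M3"),
]

def id_window(first_eom, last_eom):
    # MinDateID: smallest ID_Date in the month two periods back
    # (the EOMONTH(LastSelDate, -2) lookup in the measure).
    min_id = min(i for i, m in calendar if m == first_eom)
    # MaxDateID: largest ID_Date in the last selected month.
    max_id = max(i for i, m in calendar if m == last_eom)
    return min_id, max_id

print(id_window("M1", "M3"))  # (1, 6): all dates from M1 through M3
```

The resulting (min, max) pair plays the role of the ID_Date range filter in the CALCULATETABLE() call.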
These are the results:

As you can see, the results are good.
Improving by using the date index
But when I analyze the performance, it is not that good: it takes about half a second to calculate the results.
We can improve performance by removing the need to retrieve the minimum and maximum ID_Date, making the calculation much more efficient.
I know that each month has 31 days.
Therefore, to go back three months, I have to go back 93 days.
I can use this to create a faster version of the measure:
Running Average by Month (Financial) =
// Step 1: Get the last Month (ID)
VAR SelMonth = MAX('Financial Calendar'[ID_Month])
// Step 2: Generate the Date Range from the last 93 days
VAR DateRange =
TOPN(93
,CALCULATETABLE(
SUMMARIZECOLUMNS('Financial Calendar'[ID_Date])
,REMOVEFILTERS('Financial Calendar')
,'Financial Calendar'[ID_Month] <= SelMonth
)
,'Financial Calendar'[ID_Date], DESC
)
// 3. Generate a table filtered by the Date Range generated at step 2
// This table contains only three rows
VAR SalesByMonth =
CALCULATETABLE(
SUMMARIZECOLUMNS(
'Financial Calendar'[ID_Month]
, "#Sales", [Sum Online Sales]
)
,DateRange
)
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX(SalesByMonth, [#Sales])
In this case, I used the TOPN() function to retrieve the previous 93 rows of the Financial Calendar table and use this list as a filter.
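The arithmetic behind the 93-row window can be sketched in Python; the ID_Date values are illustrative, standing in for the Financial Calendar rows:

```python
# Sketch of the TOPN(93, ...) idea: with fixed 31-day months, the
# last three months are exactly the 93 highest ID_Date values at
# or before the end of the selected month (data is illustrative).
DAYS_PER_MONTH = 31
WINDOW_DAYS = 3 * DAYS_PER_MONTH  # 3 months x 31 days = 93 days

# Sequential ID_Date values covering five 31-day months.
all_ids = list(range(1, 5 * DAYS_PER_MONTH + 1))

def last_93_ids(sel_month_id):
    # Keep only dates up to the end of the selected month, then
    # take the 93 largest -- the TOPN(..., DESC) equivalent.
    eligible = [i for i in all_ids if i <= sel_month_id * DAYS_PER_MONTH]
    return sorted(eligible, reverse=True)[:WINDOW_DAYS]

ids = last_93_ids(5)
print(min(ids), max(ids))  # 63 155: the last three of five months
```

This only works because every month has exactly 31 days; with variable-length months, the lookup version above is still needed.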
The results are the same as the previous version:

This version only needs 118 ms to complete.
But can we do even better?
Next, I added a new column to the Financial Calendar that assigns a rank to each row. Now each date has a unique number directly related to its order:

The measure using this column is as follows:
Running Average by Month (Financial RowRank) =
// Step 1: Get the rank of the last Date
VAR MaxDateRank = MAX('Financial Calendar'[ID_Date_RowRank])
// Step 2: Generate the Date Range from the last 93 days
VAR DateRange =
CALCULATETABLE(
SUMMARIZECOLUMNS('Financial Calendar'[ID_Date])
,REMOVEFILTERS('Financial Calendar')
,'Financial Calendar'[ID_Date_RowRank] <= MaxDateRank
&& 'Financial Calendar'[ID_Date_RowRank] >= MaxDateRank - 92
)
// 3. Generate a table filtered by the Date Range generated at step 2
// This table contains only three rows
VAR SalesByMonth =
CALCULATETABLE(
SUMMARIZECOLUMNS(
'Financial Calendar'[ID_Month]
, "#Sales", [Sum Online Sales]
)
,DateRange
)
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX(SalesByMonth, [#Sales])
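A rank column like ID_Date_RowRank can be derived by ordering the dates; this Python sketch uses made-up ID_Date values to show the idea:

```python
# Sketch of building a RowRank column: rank every row by ID_Date
# so each date gets a consecutive number reflecting its order.
# The ID_Date values below are made up for illustration.
id_dates = [105, 101, 103, 102, 104]

# Sort once, then map each ID_Date to its 1-based position.
rank_of = {d: i + 1 for i, d in enumerate(sorted(id_dates))}
print([rank_of[d] for d in id_dates])  # [5, 1, 3, 2, 4]

# The measure's filter then becomes simple integer arithmetic:
# keep ranks in [max_rank - 92, max_rank].
max_rank = rank_of[105]
window = [d for d in id_dates if max_rank - 92 <= rank_of[d] <= max_rank]
```

Because ranks are consecutive integers, the window filter needs no lookup at query time, only a subtraction.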
The result is the same, so I won't show it again.
But here is a comparison of the execution statistics:

As you can see, the version using TOPN() is slower than the one using the RowRank column.
But the difference is very small.
More importantly, the version that uses the RowRank column requires reading more data to complete the calculation; see the Rows column for details.
This means more RAM usage.
But with this small number of rows, the difference is marginal.
Which version you use is up to you.
Using a weekly calendar
Finally, let's look at a week-based calculation.
In this case, I want to calculate the running average over the last three weeks.
Since calendar-based Time Intelligence allows the creation of a week-based calendar, the measure is very similar to the second one above:
Running Average by Week =
// 1. Get the last Date for the current Filter Context
VAR MaxDate = MAX( 'Gregorian Date Table'[Date] )
// 2. Generate the Date range needed for the Moving average (three weeks)
VAR DateRange =
DATESINPERIOD( 'Week Calendar'
,MaxDate
,-3
,WEEK
)
// 3. Generate a table filtered by the Date Range generated at step 2
// This table contains only three rows
VAR SalesByWeek =
CALCULATETABLE(
SUMMARIZECOLUMNS(
'Gregorian Date Table'[WeekKey]
, "#Sales", [Sum Online Sales]
)
,DateRange
)
RETURN
// 4. Calculate the Average over the three values in the table generated in step 3
AVERAGEX(SalesByWeek, [#Sales])
The important part is that I use the “WEEK” parameter in the DATESINPERIOD() call.
That's all.
This is the result of the query:

The performance is very good: the execution times are less than 100 ms.
Note that weekly calculations are only possible with calendar-based Time Intelligence.
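The weekly logic mirrors the monthly sketch shown earlier, just with a week index; the figures below are made up for illustration:

```python
# Sketch of the weekly version: average the weekly sales totals
# over the last three weeks (figures are made up).
weekly_sales = {1: 70.0, 2: 140.0, 3: 210.0, 4: 280.0}

def running_average_weeks(sales, last_week, window=3):
    # DATESINPERIOD(..., -3, WEEK) + AVERAGEX equivalent: the last
    # three weekly totals up to the selected week, averaged.
    weeks = sorted(w for w in sales if w <= last_week)[-window:]
    return sum(sales[w] for w in weeks) / len(weeks)

print(running_average_weeks(weekly_sales, 4))  # 210.0
```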
The conclusion
As you have seen, calendar-based Time Intelligence makes life easier for custom scenarios: we only need to pass the calendar instead of a date column to the functions. And we can calculate weekly intervals.
But the current feature set does not include a semester interval. If we have to calculate results based on semesters, we have to use classic Time Intelligence or write custom code.
And we still need custom logic, especially if we don't have a date column in our calendar table. In such cases, we cannot use the standard Time Intelligence functions, as they still work with date columns.
Remember: the most important task when working with calendar-based Time Intelligence is to create a consistent and complete calendar table. In my experience, this is a very complex task.
As a side note, I found some interesting packages on daxlib.org for calculating the running average.
I've added a link to them in the References section below.
These packages follow a completely different pattern, but I wanted to include them to give a complete picture for this article.
References
The SQLBI.com article mentioned above on calculating the running average:
Time Series packages on daxlib.org that take a different approach:
My previous article, where I describe calendar-based Time Intelligence:
As in my previous articles, I use the Contoso sample data. You can download the ContosoRetailDW dataset for free from Microsoft here.
Contoso data can be freely used under the MIT license, as described in this document. I modified the dataset to use more modern dates.


