I recently implemented a customer reporting project using the Virtual List technique. In this particular implementation, the customer needed a cross-tab style project schedule report with 28 columns and about 30 rows. Each “cell” in the report is an unstored calculation field that evaluates a repeating global variable to derive the correct value to display. Creating the report is a combination of scripting to load values into the global variable and reevaluation of the 800+ unstored calculations in the report.
Concern about the report's performance drove me to explore ways to measure it. In particular, I wanted to be able to compare script performance with calculation performance so that I would know where my efforts would yield the best improvements.
Evaluating Script Performance
Evaluating script performance was a simple matter of capturing the start and end times in variables and calculating the difference at the end of the script. At the beginning of the script I use the undocumented function Get ( UTCmSecs ) to get millisecond timestamp precision:
Set Variable [ $startTime ; Value: Get ( UTCmSecs ) ]
At the end of the script I simply calculate the difference between start and end times and divide by 1000 to get a script run-time duration:
Set Variable [ $$TotalScriptTime ; Value: ( Get ( UTCmSecs ) - $startTime ) / 1000 & " Seconds" ]
Note: Make sure the data viewer is not visible when evaluating script performance. The display of fields and variables in the data viewer will slow script execution down.
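If you want a steadier number, one option (not part of the original report, just a sketch) is to run the report-building script several times in a loop and average the result. Here “Build Report” is simply a placeholder for your own report script:

# Hypothetical benchmark wrapper – averages several runs of a report script
Set Variable [ $iterations ; Value: 5 ]
Set Variable [ $startTime ; Value: Get ( UTCmSecs ) ]
Loop
    Exit Loop If [ Let ( $i = $i + 1 ; $i > $iterations ) ]
    Perform Script [ "Build Report" ]
End Loop
Set Variable [ $$TotalScriptTime ; Value: ( Get ( UTCmSecs ) - $startTime ) / ( 1000 * $iterations ) & " Seconds (average per run)" ]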
Evaluating Layout Performance
Of course, script run time is only part of the story. I also wanted to know how long it took the report to reevaluate all 800+ unstored calculations. This would provide excellent insight into which report elements (scripts vs. calcs) had the biggest effect on performance. I was also curious how different calculation methods affected performance (such as ExecuteSQL vs. FileMaker functions based on relationships).
To accomplish this I added three global variables to the layout: $$reportStartTime, $$reportEndTime, and $$reportTotalTime.
The start and end time global variable objects have conditional formatting applied. Since conditional formatting evaluates whenever the layout is refreshed, I can use a Let () function in the conditional formatting calculation to create and set global variables. I can also take advantage of the stacking order of objects on the layout: applying “Arrange/Send to Back” to the start time object in Layout mode causes the start time variable to be calculated first. Here is the calculation used in the conditional formatting for the start time:
Let ( $$reportStartTime = Get ( UTCmSecs ) ; "" )
The end time and total time variables are created with conditional formatting as well. In this case, the end time object's position is set using “Arrange/Bring to Front,” and the total time object is also moved to the front. Here is the calculation applied in the conditional formatting of the end time:
Let ( [ $$reportEndTime = Get ( UTCmSecs ) ; $$reportTotalTime = ($$reportEndTime - $$reportStartTime) / 1000 & " Seconds" ] ; "")
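Put together, and assuming the three layout objects are merge variables (my shorthand below, not screenshots of the actual layout), the setup looks roughly like this. The stacking order is what guarantees the start time evaluates before the end time:

Start time object, sent to back:  <<$$reportStartTime>>
    Conditional formatting formula:  Let ( $$reportStartTime = Get ( UTCmSecs ) ; "" )
End time object, brought to front:  <<$$reportEndTime>>
    Conditional formatting formula:  Let ( [ $$reportEndTime = Get ( UTCmSecs ) ; $$reportTotalTime = ( $$reportEndTime - $$reportStartTime ) / 1000 & " Seconds" ] ; "" )
Total time object, brought to front:  <<$$reportTotalTime>>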
I tuck these globals away in the undisplayed right-hand margin of the layout and slide them out whenever I need to evaluate the performance of the report layout. These global variable objects are easily copied to other layouts or solutions where performance evaluation is needed.
By the way, it takes about 3.3 seconds for the report scripting to execute, while the report's unstored calculations take about 0.015 seconds to reevaluate. Clearly, the scripting of this report is the best target when looking for ways to improve its performance.
Having the global variables in the view area of the layout all the time is quite feasible. If you modify the conditional formatting calcs to obey a $$debug variable, you could have them display data only when you set that $$debug variable, and null otherwise.
Good point, Peter; that would make the solution a bit easier to maintain. Thanks for stopping by and contributing to the conversation.
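For anyone who wants to try Peter's suggestion, a minimal sketch of the end time conditional formatting might look like this (with $$debug set to 1 from a script or the data viewer whenever you want the timings, and cleared otherwise):

Let ( [
    $$reportEndTime = Get ( UTCmSecs ) ;
    $$reportTotalTime = If ( $$debug = 1 ; ( $$reportEndTime - $$reportStartTime ) / 1000 & " Seconds" ; "" )
] ; "" )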
Don’t forget to make sure the calculations you want to benchmark really happen after the start time is recorded and before the end time is recorded. Sometimes stacking order is not sufficient – especially if you have multiple parts on the layout. I have personally found unstored calculation fields more reliable for benchmarking layouts than conditional formatting.
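For example, an unstored calculation field (say zz_BenchEnd, just an illustrative name) placed as the last object on the layout can capture the end time the same way the conditional formatting does:

// zz_BenchEnd – unstored calculation, text result
Let ( [
    $$reportEndTime = Get ( UTCmSecs ) ;
    $$reportTotalTime = ( $$reportEndTime - $$reportStartTime ) / 1000 & " Seconds"
] ; "" )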
BTW, if you want higher precision or more details (i.e. not only discover how long the whole report takes to display but also which of all the calcs is the slowest one), you can use FM Bench Detective (http://fmbench.com/). Once you have it installed, adding more benchmarks to a layout or scripts is very quick – no need to define new variables or write new calculations.
Thanks again, HOnza, for participating in the conversation. I am going to test out some of what you suggest and report back here with my findings.
This is very useful for any FileMaker developer looking to improve their solutions. Very often, measuring in seconds is not accurate enough, especially if your solution is already fast and you just want to make it even more performant.
Thanks for sharing this!
Thanks for stopping by, Alexander. Glad to hear you got some benefit from the blog.