
Top 5 AL mistakes that undermine performance

This post is my shortlist of the most common beginner mistakes in AL that hurt application performance. These are not necessarily the worst performance killers - selecting records from a ledger table with a filter on a field without an index could be worse than all of my top 5 combined. Rather, these are the issues I've run into most frequently over the past years.


1. Checking if a record is empty before calling FindSet on the same record

if SomeRecord.IsEmpty() then
    exit;
    
SomeRecord.FindSet();
repeat
    // process SomeRecord
until SomeRecord.Next() = 0;

No need to overcomplicate things - IsEmpty does not add value in this code, unless the subsequent FindSet is called with the ForUpdate switch, or the ReadIsolation property on the SomeRecord variable is set to UpdLock or RepeatableRead. When the transaction isolation is stricter than ReadCommitted, this additional check can prevent a long-lived intent lock on the table - and even then, only if your code is running in an on-premises environment. Otherwise, just drop this line: it doesn't help.
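For completeness, here is a sketch of the exceptional case just described, where the IsEmpty probe does pay off. The Status field and its values are hypothetical, and the ReadIsolation property requires a recent BC version:

```al
// IsEmpty runs a cheap probe without taking update locks, so the
// locking FindSet below only executes when there are rows to update.
SomeRecord.SetRange(Status, SomeRecord.Status::Open);
if SomeRecord.IsEmpty() then
    exit;

SomeRecord.ReadIsolation := IsolationLevel::UpdLock;
SomeRecord.FindSet();
repeat
    // update SomeRecord here
until SomeRecord.Next() = 0;
```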


Better way to do the same

Just call FindSet straight away.

if SomeRecord.FindSet() then
    repeat
        // process SomeRecord
    until SomeRecord.Next() = 0;

Note

There are still some cases when IsEmpty before FindSet can be useful to fine-tune performance in specific on-premises scenarios. More on this in an upcoming post.


2. Calling FindSet before DeleteAll

SomeRecord.SetFilter(SomeField, FilterValue);
if SomeRecord.FindSet() then
    SomeRecord.DeleteAll();

When writing code in AL, we can choose between two statements to delete database records: we can either delete one record at a time with Delete, or delete a whole batch with a single DeleteAll call. On the SQL side, there is practically no difference between them - both are translated into a query of the same shape:

DELETE FROM <TableName> WHERE <Conditions>

The subtle difference lies in the condition part of the query - the WHERE clause that defines what exactly will be deleted. Delete always builds its conditions from the primary key fields of the current record, so that at most one record satisfies the criteria.

DeleteAll, on the other hand, does not care about the current record (or whether a recordset has been retrieved from the database at all). Record filters are all it takes into account when preparing the query.
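To illustrate the difference (the table name, the "Entry No." primary key, and the "Document No." field are all hypothetical), here is roughly what each call translates into:

```al
// Delete: the WHERE clause is built from the PK of the current record
SomeRecord.Get(10000);
SomeRecord.Delete();
// DELETE FROM "Some Table" WHERE "Entry No." = 10000

// DeleteAll: the WHERE clause is built from the filters only
SomeRecord.SetRange("Document No.", 'INV-001');
SomeRecord.DeleteAll();
// DELETE FROM "Some Table" WHERE "Document No." = 'INV-001'
```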


Better way to do the same

Apply all required filters and delete.

SomeRecord.SetFilter(SomeField, FilterValue);
SomeRecord.DeleteAll();

Note 1

If the filters cover the primary key, DeleteAll will delete a single record, just like Delete does - and without reading the record from the table first.
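For example, assuming a hypothetical single-field primary key "Entry No.":

```al
SomeRecord.SetRange("Entry No.", 10000);
SomeRecord.DeleteAll();
// One DELETE statement with no preceding SELECT, roughly:
// DELETE FROM "Some Table" WHERE "Entry No." = 10000
```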


Note 2

Sometimes in BaseApp code, you can find IsEmpty being invoked before DeleteAll to check if there are any records to be deleted. Similar to the first example, it can be beneficial in some scenarios involving concurrent updates - more on this topic in upcoming posts.
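The pattern in question looks roughly like this (a sketch; the filter is illustrative, and whether the extra probe helps depends on the locking scenario):

```al
SomeRecord.SetRange(SomeField, FilterValue);
// Cheap probe first, so the DELETE is only issued
// when there is actually something to delete.
if not SomeRecord.IsEmpty() then
    SomeRecord.DeleteAll();
```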


3. Attempting to insert a record to check if it already exists

InitializeRecord(SomeRecord);
if not SomeRecord.Insert() then
    SomeRecord.Modify();

First of all, even if all inserts succeed and no duplicate primary keys are encountered, this code structure prevents buffered inserts, disabling a significant BC server performance optimisation.

Secondly, if a duplicate is found, the insert attempt raises a SQL error which is then sent back to Business Central, where it is intercepted and analysed. This is neither the fastest nor the most elegant way of ensuring the uniqueness of the key values.
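As a sketch of the first point (the loop and the "Entry No." field are hypothetical): buffered inserts let the server accumulate rows and send them to SQL in batches, but only when the return value of Insert is not inspected.

```al
// Can be buffered: the server is free to postpone the actual
// INSERT statements and send them to SQL in one batch.
for i := 1 to 100 do begin
    SomeRecord.Init();
    SomeRecord."Entry No." := i;
    SomeRecord.Insert();
end;

// Cannot be buffered: checking the result of Insert forces
// each row to be sent to SQL immediately.
for i := 1 to 100 do begin
    SomeRecord.Init();
    SomeRecord."Entry No." := i;
    if not SomeRecord.Insert() then
        SomeRecord.Modify();
end;
```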

The best way to avoid this structure is to make sure that the record you want to insert is unique - use a number series or an incremental primary key. Sometimes the data is received from an external source and can contain records which already exist in BC and must be updated. In this case, IsEmpty is a cleaner and more efficient way of probing the database.


Better way to do the same

Try to restructure your code and data to ensure uniqueness of the records upfront, or use IsEmpty.

InitializeRecord(SomeRecord);
SomeRecord.SetRecFilter();

if SomeRecord.IsEmpty() then
    SomeRecord.Insert()
else
    SomeRecord.Modify();

4. Calculating totals in a loop

SomeRecord.SetFilter(SomeField, FilterValue);
if SomeRecord.FindSet() then
    repeat
        TotalAmount += SomeRecord.DecimalField;
    until SomeRecord.Next() = 0;

The CalcSums function does the same much more efficiently. Use it!

CalcSums is more efficient because it shifts the calculation to the SQL server. Instead of selecting the whole recordset and iterating over all records, CalcSums lets the database engine do the calculation and return a single numeric value in the response.


Better way to do the same

Use CalcSums.

SomeRecord.SetFilter(SomeField, FilterValue);
SomeRecord.CalcSums(DecimalField);
TotalAmount := SomeRecord.DecimalField;

5. Calculating FlowFields' values in each iteration of a loop

SomeRecord.SetFilter(SomeField, FilterValue);
if SomeRecord.FindSet() then
    repeat
        SomeRecord.CalcFields(DecimalField);
        TotalAmount += SomeRecord.DecimalField;
    until SomeRecord.Next() = 0;

Remember that each call to CalcFields sends a separate query to the SQL server. If your code iterates over a recordset and some flowfields must be calculated on each iteration, there is a more efficient way of doing this. SetAutoCalcFields does not perform any calculations itself, but adds the listed fields to the selection list of the data access provider. The following FindSet then includes the calculation of the selected flowfields in the same SQL query that retrieves the table records. The reduced number of database queries has a positive effect on performance.


Better way to do the same

Call SetAutoCalcFields before the loop to select the fields that must be calculated.

SomeRecord.SetFilter(SomeField, FilterValue);
SomeRecord.SetAutoCalcFields(DecimalField);
if SomeRecord.FindSet() then
    repeat
        TotalAmount += SomeRecord.DecimalField;
    until SomeRecord.Next() = 0;



4 Comments


Sławek Guzek
Nov 11, 2023

"We can either delete one record at a time with Delete, or call DeleteAll to delete a whole batch with a single query. On the SQL side, there is practically no difference between them - both are translated into the same query"


Firstly - the code example is completely irrelevant to the description (as it does the DELETEALL twice rather than deleting one record at a time).


Secondly - the explanation is not exactly correct. There is a massive difference in that DELETE always uses the primary key of the current record in the WHERE condition, regardless of filters on the record, and DELETEALL uses only filters set on the record and ignores the current record primary key fields.

adrogin
Nov 12, 2023

Sure this description could be phrased better. What I'm pointing out here is that Delete requires primary key fields to be initialised, and DeleteAll, on the contrary, does not care about the PK. It's exactly the difference between the two functions that I'm trying to highlight.


Sławek Guzek
Nov 11, 2023

"IsEmpty does not add value in this code" - actually, it does.


When IsEmpty is executed, SQL Server uses the most optimal index - exactly because IsEmpty does not return anything. If there is an index on the table that includes all the fields used in the filters, the IsEmpty implementation will use it.


FindSet, on the other hand, aims at returning all the fields - hence even if there is an optimal index, SQL Server may still decide to scan the table instead of using the index and joining to the base table. Which means the entire table may be scanned to return nothing.

adrogin
Nov 12, 2023

Agree, it may be useful in some cases - hence my note that there are exceptions to this. But each of these exceptional cases should be carefully analysed. If there is a good index for IsEmpty, the SQL optimiser is likely to use the same index for the FindSet (+ key lookup). And vice versa - if FindSet triggers a table scan, IsEmpty will probably run the same scan. The query optimiser may prefer a table scan over an index seek and key lookup for FindSet in certain situations, but just placing IsEmpty before FindSet at random will most probably make things worse instead of improving performance.
