Dec 11, 2018

Pointers can be really helpful, especially since they can improve performance and readability, but they are also dangerous. I spent nearly a day tracking down why code a former colleague wrote about 5 years ago all of a sudden led to access violations. The code used to work fine; the problem only surfaced when I changed the size of a record. Consider this code:

var
  arr: array of TSomeRec;
  ptr1: ^TSomeRec;
  ptr2: ^TSomeRec;
begin
  SetLength(arr, 0);
  // [...]
  SetLength(arr, Length(arr) + 1);
  ptr1 := @arr[High(arr)];
  ptr1.SomeField := SomeValue;
  // [...]
  SetLength(arr, Length(arr) + 1);
  ptr2 := @arr[High(arr)];
  ptr2.SomeField := SomeValue;
  // [...]
  SomeVariable := ptr1.SomeField; // boom
end;

(Of course this is a very simplified example. There is a lot of code where I put the […] comments.)

So why is accessing ptr1 a problem? It was assigned and pointed to the right kind of variable. And it still points to that memory.

It’s not the pointer that was changed, it’s the array. arr is a dynamically allocated array. Each call to SetLength changes the length of the array. If the previously allocated memory block is too small to fit the new length, a new memory block is allocated, the content of the array is copied to the new block, and the old block is marked as unused.

And this is where ptr1 becomes invalid. It still points to the old memory block, which is now marked as unused and will be reused for other data as soon as necessary.

So: Boom 💣

This is called a “stale pointer” and is the cause of a lot of debugging headaches.

So, why did this work before I changed the record size?

It’s due to the allocation strategy of the memory manager. When it gets a memory request, it checks for free blocks and assigns the block that fits best. Depending on how large the requested memory block is, it takes that memory from one of several pools. Each pool contains blocks of available memory of a fixed size, e.g. pool1 contains blocks of 16 bytes, pool2 contains blocks of 64 bytes etc. (these are just examples; the real sizes depend on which memory manager is used and how it has been configured).

When I changed the record size, the record became smaller, so the first array entry was allocated from a different pool than before, one that contained smaller blocks. Where previously the memory allocated to the array was large enough to fit more than one record, it was now so small that it fit only one. Increasing the length of the array therefore required requesting a larger block of memory and copying the array contents, so the pointer became stale.

Discuss this in Delphi Praxis.

Posted on 2018-12-11 at 10:49