I implemented a small function that allocates more and more memory (200000 `unsigned long`s appended to a static array) upon each notification arrival. I was never able to observe NotificationService run in parallel on separate threads to process different notifications, so I am not sure whether it is actually possible to properly test this differential. I observed the memory usage notification twice, since the static array is first allocated when a new NSE process starts. After the second time the process was killed; I didn't see the "Cleaning-up extension" log after the notification was displayed. I tested that notifications work correctly in normal circumstances, and that forcing a call to `serviceExtensionTimeWillExpire` by adding a `sleep()` works correctly: the notification is either displayed decrypted or with a proper error message.
Discussion here: https://stackoverflow.com/questions/62566948/notification-service-extension-lifecycle. Additionally, I logged the allocated array size after each notification; after the second memory usage notification the array size dropped back to 200000, which means it was freshly allocated, so a new process had been started.
The discussion linked above mentions that it is possible to launch two NSE processes with a debugger; however, inter-process safety is already tested by the parent differentials. I also tested that the entire notification functionality (normal notifs, rescinds, and badge-only notifs) works correctly without using the memory-leak simulation function.
I tested everything on a release build connected to my local keyserver. To test this differential properly it would be necessary to have two NSE threads operating on one NSE instance to process two different notifications; I don't think it is possible to recreate such conditions.