I've been using Everything for many years now, and as a software developer it has always felt too good to be true, as if its developer must be cheating somehow
![Wink ;)](./images/smilies/icon_e_wink.gif)
On the one hand, Everything is incredibly fast at searching, which suggests that the in-memory database is likely a sorted array (or several array chunks), giving fast lookup times, cache friendliness, and great scaling across multiple threads. Maybe each array item also stores the index or memory address of its parent folder, for fast path building and to save memory.
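To make my guess concrete, here's a minimal sketch (purely my own assumption about the layout, not anything confirmed by the developer) of how a flat array of entries with parent indices could rebuild full paths without storing them:

```python
# Hypothetical layout: each entry is (name, parent_index), where -1 means
# "no parent" (a volume root). Full paths are rebuilt by walking parent links,
# so each name is stored only once, however many children the folder has.
entries = [
    ("C:", -1),               # index 0
    ("Program Files", 0),     # index 1, child of 0
    ("Everything", 1),        # index 2, child of 1
    ("Everything.exe", 2),    # index 3, child of 2
]

def build_path(i: int) -> str:
    """Walk parent indices up to the root, then join the names."""
    parts = []
    while i != -1:
        name, i = entries[i]
        parts.append(name)
    return "\\".join(reversed(parts))

print(build_path(3))  # C:\Program Files\Everything\Everything.exe
```

The nice property is that searching only ever scans the compact `name` column, and the (comparatively rare) path construction for displayed results is a short pointer chase.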
But on the other hand, Everything also seems to be highly dynamic: files and folders are constantly being added to or removed from the database, which wouldn't work well with an array, since sorted insertion and deletion involve moving a lot of memory around and presumably updating many references to items.
I also thought those changes might be stored in separate arrays, like a database diff, but I suppose that would hurt search performance too much.
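For what it's worth, here's the "diff array" idea I had in mind as a toy sketch (again just my speculation, not Everything's actual design): keep the bulk of the index in one big sorted array that is rarely rebuilt, buffer recent changes in small side structures, and merge them at query time.

```python
import bisect

base = ["alpha.txt", "beta.txt", "delta.txt"]  # big sorted array, rarely rebuilt
added: list[str] = []                          # small sorted overlay of new names
removed = {"beta.txt"}                         # names pending deletion

def insert(name: str) -> None:
    # Cheap: sorted insert into the tiny overlay instead of
    # shifting the big base array.
    bisect.insort(added, name)

def search(substring: str) -> list[str]:
    # Scan both structures, skipping anything marked as deleted.
    hits = [n for n in base if substring in n and n not in removed]
    hits += [n for n in added if substring in n]
    return sorted(hits)

insert("gamma.txt")
print(search(".txt"))  # ['alpha.txt', 'delta.txt', 'gamma.txt']
```

The overlays could then be folded back into the base array periodically. My worry, as said above, is that checking two or three structures per query (plus the `removed` lookups) eats into the raw scan speed that makes the single-array design attractive.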
So does anyone know how Everything manages to be so fast and so dynamic at the same time? I'm always hesitant to use arrays with many entries once adding and removing items becomes a common operation. Or maybe I'm just underestimating how fast modern computers are.
Thanks!