I like the staging idea, but curiously I haven't seen any papers investigating this.

Obviously, the staging itself carries an overhead cost, which hurts the small-data use case, and it raises the question: from what data size onwards should which algorithm be used?

For example, I just had a look at sort in Clang's STL implementation, which seems to set a boundary at N=1000. Since the number is such a round multiple of 10, it looks like it might have been picked arbitrarily; in any case, there are no paper references in the comments to back it up.
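To make the idea concrete, here is a rough sketch of such a staged dispatch. The N=1000 cutoff and the is_sorted pre-check are illustrative assumptions on my part, not a reconstruction of what libc++'s std::sort actually does:

    #include <algorithm>
    #include <vector>

    // Hypothetical staged sort: inspect the input first, then dispatch.
    // The cutoff of 1000 and the sortedness pre-check are illustrative,
    // not what libc++ actually implements.
    template <typename T>
    void staged_sort(std::vector<T>& v) {
        // Inspection stage: an O(1) size check plus an O(n) sortedness scan.
        if (std::is_sorted(v.begin(), v.end()))
            return;                                // already sorted, nothing to do

        if (v.size() < 1000) {
            // Small input: binary insertion sort, low overhead, O(n^2) worst case.
            for (auto it = v.begin(); it != v.end(); ++it)
                std::rotate(std::upper_bound(v.begin(), it, *it), it, it + 1);
            return;
        }

        // Large input: fall back to the general-purpose O(n log n) sort.
        std::sort(v.begin(), v.end());
    }

Whether the extra inspection pass pays for itself is exactly the kind of question I would hope a study could answer.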

Are people aware of formal studies that explore whether it is worth the trouble to inspect the structure of the data and then dispatch to different algorithms based on the findings?




You're saying you think there's still optimization to be done re. the N=1000 constant?

Other than that, I'd suggest that big-O complexity already tells you that the inspection is worth it, assuming inspection of an array is constant time.
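Even if the inspection ends up being a linear scan rather than strictly constant, a quick measurement along these lines would let you see how its cost compares to the sort as n grows. The sizes and the is_sorted pre-check here are arbitrary choices for illustration:

    #include <algorithm>
    #include <chrono>
    #include <cstdio>
    #include <random>
    #include <vector>

    // Rough timing harness: compare the cost of an O(n) inspection
    // (std::is_sorted) against the O(n log n) std::sort it precedes.
    // The input sizes are arbitrary.
    int main() {
        std::mt19937 rng(42);
        for (std::size_t n : {1000, 100000, 10000000}) {
            std::vector<int> v(n);
            for (auto& x : v) x = static_cast<int>(rng());

            auto t0 = std::chrono::steady_clock::now();
            bool sorted = std::is_sorted(v.begin(), v.end());  // inspection
            auto t1 = std::chrono::steady_clock::now();
            if (!sorted) std::sort(v.begin(), v.end());        // the actual work
            auto t2 = std::chrono::steady_clock::now();

            auto us = [](auto d) {
                return static_cast<long long>(
                    std::chrono::duration_cast<std::chrono::microseconds>(d).count());
            };
            std::printf("n=%zu inspect=%lld us sort=%lld us\n", n, us(t1 - t0), us(t2 - t1));
        }
    }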



