
> As I vaguely recall, even Plain Old Array accesses take logarithmic time on real machines.

How does adding an offset to a pointer and dereferencing the result require logarithmic time?
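For concreteness, here is a minimal sketch (in Python, standing in for the machine-level operation) of what an array access compiles down to: one multiply, one add, one load. Nothing in it iterates, so the work does not grow with the index — the assumption behind calling it constant time.

```python
# A plain array access a[i] is computed as: base_address + i * element_size,
# then a single memory load from that address. Modeled as arithmetic:
def element_address(base, index, element_size):
    # one multiply + one add, regardless of how large index is
    # (as long as index fits in a machine word)
    return base + index * element_size

# hypothetical values for illustration
addr = element_address(base=0x1000, index=2, element_size=4)
print(hex(addr))  # the address of the third 4-byte element
```

The step count here is fixed; the logarithmic-time claim only arises if the index itself is an arbitrary-precision number whose arithmetic cost grows with its bit length.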

> However, if we don't use bignums, they're effectively constant time.

What are these bignums that are used for array access on "real machines"?
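For readers unfamiliar with the term: a bignum is an arbitrary-precision integer, whose arithmetic cost grows with the number's bit length rather than being a fixed-width machine operation. Python's built-in `int` is one, which makes the distinction easy to demonstrate:

```python
# Python ints are bignums: they can exceed any machine word size,
# and operations on them cost time proportional to their bit length.
n = 10**100                 # a 101-digit integer, far beyond 64 bits
print(n.bit_length())       # 333 bits -- adding two such numbers must
                            # touch every word of that representation
```

Machine-word indices (the normal case for array access) have a fixed maximum width, which is why indexing with them is treated as constant time.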


