Broken Binary Searches, Oh My!

So it turns out that nearly every binary search implementation is broken. Why? Fixed-size integers, a holdover from machine programming that still lingers in supposedly modern languages like Java: the textbook midpoint calculation (low + high) / 2 silently overflows once the indices get big enough. LtU'ers are haggling over whether it's a type error, a bad data structure, a language defect, a failure of correctness proofs, or just a bug. Lispers are talking smugly about their automatic fixnum-to-bignum conversion.
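For anyone who hasn't read the post that kicked this off, the culprit really is that one midpoint line. Here's a minimal Java sketch (class and method names are mine, not from any particular library) showing the broken version next to one common repair:

```java
public class BinarySearchDemo {

    // The textbook version. The midpoint line is where fixed-size integers
    // bite: once low + high exceeds Integer.MAX_VALUE (possible when the
    // array holds more than about 2^30 elements), the sum wraps negative
    // and a[mid] throws ArrayIndexOutOfBoundsException.
    static int brokenSearch(int[] a, int key) {
        int low = 0;
        int high = a.length - 1;
        while (low <= high) {
            int mid = (low + high) / 2;          // can overflow
            if (a[mid] < key)      low = mid + 1;
            else if (a[mid] > key) high = mid - 1;
            else                   return mid;   // found
        }
        return -(low + 1);                       // not found
    }

    // One common repair: never form low + high at all.
    static int fixedSearch(int[] a, int key) {
        int low = 0;
        int high = a.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;    // stays in range
            if (a[mid] < key)      low = mid + 1;
            else if (a[mid] > key) high = mid - 1;
            else                   return mid;
        }
        return -(low + 1);
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 5, 7, 9};
        System.out.println(fixedSearch(a, 7));   // prints 3
        System.out.println(fixedSearch(a, 4));   // prints -3 (encoded insertion point)
    }
}
```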

Of course, even Common Lisp arbitrarily limits the size of arrays. On my 64-bit SBCL, the array-dimension-limit is a little over one quintillion elements, or (at one byte per element) about an exabyte. That's a bigger array than even Google is likely to need in the near future. But I think it's valid to ask why code should be limited by anything except the hardware it's running on.

So here's my take on the whole issue: yes, Java/C++/C suck. But a mathematically perfect computer will never be built. Every premature optimization is just a concession to the reality of chips and electrons. Computer Science likes to believe it's pure Mathematics, but Engineering will always be there to keep it humble. Double-check those proofs, and then start running tests.
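In that spirit, a test along the following lines would have caught the bug long before anyone owned a billion-element array. You don't even need to allocate one; feeding the boundary indices straight into the two midpoint expressions is enough (a sketch, assuming the broken and fixed midpoints from the earlier snippet):

```java
public class MidpointTest {
    public static void main(String[] args) {
        // The indices a search over an array of 2^30 + 1 elements actually
        // reaches while hunting for the last element: low converges up to
        // high, and their sum no longer fits in a 32-bit int.
        int low = 1 << 30;
        int high = 1 << 30;

        int broken = (low + high) / 2;        // wraps to a negative index
        int fixed  = low + (high - low) / 2;  // stays equal to low

        System.out.println("broken midpoint = " + broken);  // -1073741824
        System.out.println("fixed midpoint  = " + fixed);   //  1073741824

        // Run with java -ea: the fixed midpoint passes, the broken one trips.
        assert fixed  >= low && fixed  <= high : "fixed midpoint out of range";
        assert broken >= low && broken <= high : "broken midpoint out of range";
    }
}
```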