
"1st" is not a "natural" choice for the starting point of ordinal numbers.

If anything, it is an artificial choice, because in programming it is derived from a property of most, probably of all, human languages that have ordinal numerals.

In most, probably in all, human languages the ordinal numerals are derived from the cardinal numerals by way of the counting sequence 1, 2, 3, ...

When cardinal numbers appeared, their initial use was only to communicate how many elements are in a set, which was established by counting 1, 2, 3, ...

Later, people realized that they could refer to an element of a sequence by the number reached at that element when counting the elements of the sequence, so ordinal numerals appeared, derived from the cardinals by applying some modifier.

So any discussion about whether 1 is more natural than 0 as the starting index goes back to whether 1 is more natural as the starting point of the counting sequence.

All human languages have words for expressing zero as the number of elements of a set, but the speakers of ancient languages did not consider 0 a number, mainly because it was not obtained by counting.

There was no need to count the elements of an empty set; you just looked at it, and it was obvious that the number was 0.

Counting with the traditional sequence can be interpreted as looking at the sequence in front of you, pointing to the right of an element, saying how many elements are to the left of your hand, then moving your hand one position to the right.

It is equally possible to count by looking at the sequence in front of you, pointing to the left of an element, saying how many elements are to the left of your hand, then moving your hand one position to the right.

In the second variant, the counting sequence becomes 0, 1, 2, 3, ...

Human languages do not use the second variant for two reasons: zero was not perceived as having the same nature as the other cardinal numbers, and the second variant has one extra step, which is not needed, because when looking at a set with one element it is obvious, without counting, that the number of elements is 1.

So neither 0 nor 1 is a more "natural" choice for starting counting, but 1 is more economical when the counting is done by humans.

When the counting is done by machines, 0 as the starting point is slightly more economical, because it can be simpler to initialize all the bits or digits of an electronic or mechanical counter to the same value, for 0, than to different values, for 1.

While 1 was the more economical choice for humans counting sheep, 0 is always slightly simpler, both for hardware and software implementations and for programmers, who are less likely to make "off-by-one" errors when using 0-based indices, because many index-computing formulas, especially for multi-dimensional arrays, become a little simpler.
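
To make the "index-computing formulas become simpler" claim concrete, here is a small illustration in C (the function names and the row-major layout are my own, chosen only for the example):

    #include <stddef.h>

    /* Row-major position of element (i, j) in an R-by-C matrix stored linearly. */

    /* 0-based: indices 0 <= i < R, 0 <= j < C map to positions 0 .. R*C-1. */
    size_t offset_0based(size_t i, size_t j, size_t C)
    {
        return i * C + j;
    }

    /* 1-based: indices 1 <= i <= R, 1 <= j <= C map to positions 1 .. R*C.
       A "-1" correction creeps in for every dimension except the last,
       and one more if the linear positions themselves are 0-based. */
    size_t offset_1based(size_t i, size_t j, size_t C)
    {
        return (i - 1) * C + j;
    }

Compilers for 1-based languages typically fold such constant corrections into the array's base address, which is the kind of trick mentioned further down in connection with Fortran.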

In conclusion, the choice between 0 and 1 never has anything to do with "naturalness", but it should always be based on efficiency and simplicity.

I prefer 0, even though I programmed for many years in 1-based languages, such as Fortran and Basic, before first using the 0-based C and the other languages influenced by it.



> whether 1 is more natural as the starting point of the counting sequence.

Seriously?

- "Hey Timmy, how many pens do you have on your desk?"

- "Let's count them! Zero, One, Two."

- "So how many is that?"

- "Well since I ended my counting sequence with 'two', it totally makes sense to announce that I have three pens."


I think you're conflating "natural" with "familiar".

Here are two processes Timmy could use to count the pens on his desk. (I have numbered them starting with 1, because I considerately adapt to the needs of my audience :-).)

1. Say "0". Then increment for each object. So he goes: zero, (points to pen) one, (points to pen) two, (points to pen) three.

The advantage of this is that you don't need a special case when the number of pens turns out to be zero. Hey, Timmy, how many unicorns on your desk? Timmy starts by saying "zero", looks around, no unicorns, finished: it's the same process as for pens, it just stopped earlier.

2. Number the objects from 0, and then the count is the next number after the ones you listed. So he goes: (points to pen) zero, (points to pen) one, (points to pen) two, so the number is three.

This corresponds more closely to how indexing works in computers, and matches up nicely with the so-called von Neumann construction in mathematics, but in other ways I find it less natural than #1. But I am confident that you could teach it to children, and they would find it natural, and think it weird to mix up "the number of objects" with "the number of the last object". In this case, the only thing that changes from your dialogue is that Timmy says "Well, since the next number is three, it totally makes sense to announce that I have three pens." You count the pens, then you say how many there are. Zero, one, two, three. Not zero, one, two, two as you would prefer. What are you, some kind of weirdo or something? :-)
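
A rough C sketch of the two processes (my own rendering; the NULL-terminated "desk" representation is just an assumption for the example):

    #include <stddef.h>

    /* A "desk" here is a NULL-terminated array of pen names, e.g.
       const char *desk[] = { "red", "blue", "black", NULL }; */

    /* Process 1: start the count at 0 and increment once per pen.
       An empty desk needs no special case: the loop simply never runs. */
    int count_by_incrementing(const char **desk)
    {
        int count = 0;                       /* "say zero" */
        for (const char **pen = desk; *pen != NULL; pen++)
            count++;                         /* point at a pen, say the next number */
        return count;
    }

    /* Process 2: label the pens 0, 1, 2, ...; the count is the next number
       after the last label, i.e. "the number of the last pen" plus one. */
    int count_by_labeling(const char **desk)
    {
        int last_label = -1;                 /* nothing labeled yet */
        for (const char **pen = desk; *pen != NULL; pen++)
            last_label++;                    /* label this pen: 0, 1, 2, ... */
        return last_label + 1;               /* one past the last label */
    }

Both return 3 for three pens and 0 for an empty desk; the disagreement above is only about what the numbers spoken along the way refer to.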


I did not explain in detail how a human should count, because I think that for a programmer it should have been clear that the exit criterion for the loop is, in both cases, to have no elements to the right of your hand.

So your Timmy has a bug in his loop code, which caused a premature exit :-)

He should have counted 0, 1, 2, 3. As I said, humans do not count like this, because there are 4 steps instead of 3.

For humans, it is easier to have an "if" before the loop, selecting between the special case of an empty set and a non-empty set, whose elements must be counted.

Nonetheless, an electronic counter would have really counted 0, 1, 2, 3, unlike you.


You wrote a lengthy post trying to explain how it could totally be possible that humans start counting at zero, yet the whole premise of it is wrong: we start counting at 1 because it's natural. You cannot assume otherwise.

> He should have counted 0, 1, 2, 3

Is that how _you_ count? You imagine some pointer going _between_ items, start at zero, and stop once you have nothing to the right of your pointer?

Or, little Timmy could also skip the "zero" part (because, you know, he's neither dumb nor blind, and sees that he has several pens on his desk), start at 1 while pointing _at_ the first pen, and count up to 3.


Pointing directly at the objects that you are counting is not natural; it is just a bad habit.

People who are interrupted while counting and have to restart later frequently forget whether they have already counted the object they were pointing at.

Pointing between the objects avoids these errors.

When I was young, I also had the impression that counting from 1 is natural. When I first encountered 0-based languages, I was annoyed by them, especially because I knew how Fortran compilers are implemented and I was familiar with all the tricks that are used by compilers to minimize the extra operations that are required to work with 1-based arrays.

So my initial naive thought was that maybe the implementers of C were not competent enough. However, after many extra years of experience, I believe that all the tricks for the efficient implementation of 1-based arrays are not worthwhile, because even for programmers it is more efficient to think using 0-based indices.

So now I really consider 0, 1, 2, 3 ... as a more "natural" counting sequence, unlike when I was a child.


Can you always truly count between the objects? Are you in between or are you at some object at any given time? What if you count molecules?

With memory slots, is the pointer in between or at a slot?

Should pages in books also start at 0? If you say "show me page 4", should you look between pages 3 and 4?


You don't need to start at 0 to count intervals between objects; it's just easier to say "one" and point your finger to the space to the right of the leftmost item.


> … and point your finger to the space to the right of the leftmost item.

What if there is no leftmost item? Because you skipped the first step, essentially assuming that there was at least one item and including it in your initial condition, that process fails when the set of objects to be counted is empty. Humans are good at dealing with special cases like this; they'll notice the discrepancy and report "zero" instead of following through with the predetermined process. Computers, not so much. So instead we start with a count of zero and all items in the as-yet-uncounted set as our initial condition, and increment the count by one as we move each item from the "uncounted" set to the "counted" set, which correctly handles the case of the empty set.


This whole subthread was about what comes as a natural procedure to humans. Arguing that computers may need a different process is moot.

Anyway, to handle the exception, you could start the algorithm with a "check for the empty collection" base case; that would also make it a natural recursive definition for a computer.
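
For instance, a minimal C sketch of that recursive shape, over a hypothetical linked list (the node type is mine, purely for illustration):

    #include <stddef.h>

    struct node {
        struct node *next;
    };

    /* The empty collection is the base case; every non-empty collection
       is "one more than the rest of it". */
    int count(const struct node *list)
    {
        if (list == NULL)           /* base case: empty collection, count is 0 */
            return 0;
        return 1 + count(list->next);
    }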


My point was that the natural procedure for computers is also more natural for humans, since it involves fewer special cases. Humans are better than computers at dealing with ad-hoc special cases but that doesn't make them natural.


Having fewer special cases doesn't make it more natural when those special cases are intuitive and forcing a single homogeneous process feels weird.

The brain is very good at finding patterns for special cases and very bad at following repetitive orders.


(And then, having pointed at the pens, Timmy got confused and tried to write down his answer using his finger. Annoyed that no actual writing occurred, he cut off his finger and threw it away. The next day, he was again confused to find his desk all cluttered with crap that somewhat resembled pens, despite having so bloodily thrown away his pointer.)


Timmy, how many pens do you have on your desk?

Answer: none


I doubt that cardinals came before ordinals. To me, at least at first glance, they both seem equally conceptually primitive.

As for efficiency, this should be the task of the compiler. We shouldn't have to start counting with the "zeroth" just to save the compiler a few cycles (or a few mental cycles for whoever writes the compiler/interpreter). That's why we don't program in assembly, after all.

But I agree with GP: we shouldn't start counting from 1 either, but from "1st". Keeping ordinals separate from cardinals would be closest to how natural language works, and I bet it would eliminate all kinds of errors and gotchas.



