Is linear algebra older than 0? Hang on (no, it is not: zero as a number dates back to at least the 7th century, while linear algebra wasn't formalised until around the 17th century)
In my CS course, at least, it was treated as “engineering”, so we did both linear algebra and C programming. For everyone, counting from 1 was more natural, and the C convention of counting from 0 had to be re-taught a few times throughout the course (it started with Java loops, where there's no malloc to motivate it; OOP in Java was probably the first unit anyone did for CS). As a habit it tended to stick even where we didn't really use it, or in languages that don't index from 0 (e.g. Lua), given how grueling C programming was and how many of the other languages were downstream of it.
I guess you could analogise it to saying the “17th century” means 1600–1699 (which makes the first century run 1–99, I guess, since there was no year 0). In CS you're counting from the very start of a thing, i.e. how many apple-widths you travel to reach the first apple, versus the more common counting of how many apples you've picked up once you have the first one. Or something, idk.
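For what it's worth, here's a minimal C sketch of that apple-widths framing (the array name and values are made up for illustration): C literally defines a[i] as *(a + i), so the index is an offset from the start, not an ordinal.

```c
#include <stdio.h>

int main(void) {
    int apples[3] = {10, 20, 30};

    /* In C, a[i] is defined as *(a + i): the index is the distance
       (in element widths) from the start of the array, not the
       ordinal of the element. The first apple sits zero apple-widths
       from the beginning. */
    printf("%d\n", apples[0]);     /* first element: offset 0 */
    printf("%d\n", *(apples + 0)); /* the same element, spelled as an offset */
    printf("%d\n", *(apples + 2)); /* apples[2]: the third apple */
    return 0;
}
```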
I’m drunk and avoiding housework, sorry