I love how 70's C is an objectively garbage language. I can just imagine half of these design decisions being the reasons it was mocked in forums & discussions.
A few of my favorite quotes:
> [...] No more than the first eight characters are significant, and only the first seven for external identifiers.
This aged poorly.
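For flavor, here's the kind of collision that rule permits (hypothetical identifiers, purely for illustration; any modern compiler keeps them distinct):

```c
/* Both names truncate to "very_lon" under the 8-character rule,
 * so an old compiler could silently treat them as one variable. */
int very_long_name_a;
int very_long_name_b;   /* the "same" identifier as the line above! */
```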
> An integer constant is a sequence of digits. An integer is taken to be octal if it begins with 0, decimal otherwise. The digits 8 and 9 have octal value 10 and 11 respectively.
Honestly just weird to have octals accept 8 and 9 when they aren't in range. It seems pretty easy to make the lexer reject that, but whatever.
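For the curious, the old behavior is easy to mimic; a minimal sketch (my own toy parser, not the historical lexer):

```c
#include <stdio.h>

/* Sketch of the old, permissive octal rule: every decimal digit is
 * accepted, including 8 and 9, each contributing its face value at
 * the current power of 8. So "019" parses as 1*8 + 9 = 17. */
static int old_octal(const char *s)
{
    int value = 0;
    while (*s >= '0' && *s <= '9')
        value = value * 8 + (*s++ - '0');
    return value;
}

int main(void)
{
    printf("%d\n", old_octal("019"));  /* prints 17 */
    /* int x = 019;  -- a modern C compiler rejects this outright */
    return 0;
}
```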
> C bases the interpretation of an identifier upon two attributes of the identifier: its storage class and its type. The storage class determines the location and lifetime of the storage associated with an identifier; the type determines the meaning of the values found in the identifier’s storage.
So much of C's oddities came from this.
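To spell out the split in modern terms (my own throwaway example, not from the manual):

```c
/* One declaration, two orthogonal attributes:
 *   storage class "static"  -> where/how long: file scope,
 *                              whole-program lifetime
 *   type "unsigned long"    -> what the stored bits mean
 */
static unsigned long counter;

void bump(void)
{
    /* Same type, different storage class: automatic storage,
     * alive only for the duration of this call. */
    unsigned long delta = 1;
    counter += delta;
}
```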
> [...] It is also possible to interpret char as signed, 2’s complement 8-bit. Integers (int) are represented in 16-bit 2’s complement notation.
Oh, so it was sane once upon a time. I know a lot of the pain in C-compiler land comes from the fact that the 2's complement requirement hasn't been part of the standard for a while.
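You can still watch compilers exercise that latitude on plain char; a quick probe (assumes a typical 2's-complement target):

```c
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Whether plain char is signed is implementation-defined through
     * C17, so CHAR_MIN is -128 on some targets and 0 on others. */
    printf("CHAR_MIN = %d\n", (int)CHAR_MIN);

    char c = (char)0x80;
    /* Typically prints -128 where char is signed 2's complement,
     * 128 where it is unsigned. */
    printf("c = %d\n", (int)c);
    return 0;
}
```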