The 2012 San Diego fireworks mishap was caused by a faulty computer file misreading "second" as "millisecond." (Warning: loud)

No, you're just confused.

No, that's 100% you:

If the programmer assumes the file he's reading from is in milliseconds, he's going to divide by 1000 before passing to sleep

  1. What "sleep" function are you referring to? In win32, Sleep takes milliseconds. In Unix, sleep takes seconds. However, your example code used "sleep_ms", so your function expects milliseconds. So if a programmer has a value that he thinks is in milliseconds, and he's passing it to a function that expects milliseconds, why the fuck would he divide it by 1000? That just makes every delay run 1000 times too short: his milliseconds effectively become microseconds (sketched after this list).

  2. You didn't detail this scenario in your post, and we can easily imagine another: the code expects values in seconds, but the programmer "assumed the value was in milliseconds", so he puts millisecond values rather than second values in the file. The show takes 1000 times longer than expected.
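Here's a rough sketch of both points in C, assuming the "sleep_ms" from your example takes milliseconds. The 5000 ms cue value is made up, and sleep_ms is stubbed to print the delay instead of actually sleeping:

```c
#include <stdio.h>

/* Stand-in for the sleep_ms from your example: assumed to take
 * milliseconds, stubbed to print the delay instead of really sleeping. */
static void sleep_ms(long ms)
{
    printf("  actual delay: %ld ms\n", ms);
}

int main(void)
{
    long file_value = 5000;  /* cue file says 5000, meaning 5000 ms (5 s) */

    /* Point 1: the value IS milliseconds and sleep_ms WANTS milliseconds.
     * Dividing by 1000 anyway makes every delay 1000x too short:
     * the intended 5000 ms runs as 5 ms, i.e. ms behave like microseconds. */
    printf("pointless divide-by-1000 before a ms function:\n");
    sleep_ms(file_value / 1000);   /* 5 ms instead of 5000 ms */

    /* Point 2: the code expects SECONDS, but the file was written in
     * milliseconds, so 5000 gets treated as 5000 seconds and the show
     * runs 1000x too slow. */
    printf("ms values fed to code that expects seconds:\n");
    printf("  actual delay: %ld s (meant to be %ld ms)\n", file_value, file_value);

    return 0;
}
```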

There is no version of "the programmer assumed the value was in milliseconds, but in actuality it was in seconds" that ends up with the show running faster.

The reverse mix-up is a different story. If "the programmer assumed the value was in seconds, but actually it was in milliseconds", then he's going to enter values much smaller than intended, or interpret values as meaning something much longer than they actually are, which is how you get a show that runs faster.
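A sketch of that direction too, with made-up numbers, since it's the only mix-up that can shorten the show:

```c
#include <stdio.h>

int main(void)
{
    /* He wants a 30 s gap and writes "30", but the player reads the
     * file as milliseconds: the gap becomes 30 ms, 1000x too fast. */
    long written = 30;
    printf("wanted %ld s, player sleeps %ld ms\n", written, written);

    /* Or he reads a value that really means 5000 ms (5 s) and
     * interprets it as 5000 s, i.e. 1000x longer than it really is. */
    long read_back = 5000;
    printf("file means %ld ms, he reads it as %ld s\n", read_back, read_back);

    return 0;
}
```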

Also, just to be clear, this isn't what actually happened at the show. This is 100% hypothetical.
