This comment was posted to reddit on Feb 14, 2019 at 12:04 am and was deleted within 13 minutes.

I thought we had to prove the statement, and I wondered for a long time how to deal with the case where f(n1) = 0 but g(n1) != 0 for some n1. On the bright side, now we have a ready-made counterexample.

Say you have f(n) = n-1 and g(n) = n

Clearly f(n) < g(n) for all n >= 1, and also -g(n) < f(n) (since -n < n - 1 whenever n >= 1). So -g(n) < f(n) < g(n).

But it is impossible to go in the other direction, because f(1) = 0 and g(1) = 1. For the inequality k1\*f(n) <= g(n) <= k2\*f(n) to hold at n = 1, we would need k1\*0 <= 1 <= k2\*0, i.e. 0 <= 1 <= 0, which is impossible no matter how the constants are chosen.

So the reason the statement fails is that the hypothesis -g(n) < f(n) < g(n) still allows f(n) = 0 at a point where g(n) != 0, and at such a point no constant multiple of f(n) can bound g(n) from above. We can go from the sandwich on f to a bound by g, but we can't be sure that f(n) can sandwich g(n) in return.
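The counterexample above is easy to sanity-check numerically. A minimal sketch (the function names f and g just mirror the ones in the comment):

```python
# Counterexample: f(n) = n - 1, g(n) = n.
def f(n):
    return n - 1

def g(n):
    return n

# The hypothesis -g(n) < f(n) < g(n) holds for every n >= 1:
assert all(-g(n) < f(n) < g(n) for n in range(1, 1000))

# But the reverse sandwich k1*f(n) <= g(n) <= k2*f(n) breaks at n = 1,
# where f(1) = 0 and g(1) = 1: k2*0 = 0 can never reach 1, for ANY k2.
for k2 in [1, 10, 10**6]:
    assert not (g(1) <= k2 * f(1))
print("counterexample confirmed")
```

The loop over a few sample values of k2 isn't a proof, of course; the proof is the one-line observation that k2\*0 = 0 < 1 for every k2.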