How Snapchat's filters detect your face

I'm not an expert by any means, but contrary to what many people think, computers aren't 100% accurate all the time, so there are often lots of little glitches/bugs that get "ignored" because in the big picture they don't have much of an effect. For example, CPUs have a clock speed, and clocks are used to perform actions at a set frequency. So if you have a 2 Hz clock speed, instead of the computer performing all of its actions instantaneously, it will perform an action every 0.5 s. This prevents most race conditions, because it makes sure things happen sequentially instead of simultaneously. To put it really simply, let's say there's something that outputs "yes" or "no" depending on whether two conditions are met. Without a clock, one of the conditions might get checked faster and reach the "yes or no" step before the other one is done computing, which gives you a "no" even if the other condition would have been met. With a clock, the checks run, and there's a bit of slack (for lack of better phrasing) so both results are ready by the next clock cycle.
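Here's a toy Python sketch of that idea (not how real hardware clocks work, and the delays are made up) where two condition checks run concurrently, and reading the combined answer too early gives the wrong result, while waiting for the "tick" gives the right one:

```python
import threading
import time

TICK = 0.5  # one clock cycle of a hypothetical 2 Hz clock

results = {"a": None, "b": None}

def check_a():
    results["a"] = True            # fast condition, ready almost immediately

def check_b():
    time.sleep(0.3)                # slow condition, still computing for a while
    results["b"] = True

threading.Thread(target=check_a).start()
threading.Thread(target=check_b).start()

# Without a clock: read the output as soon as possible
time.sleep(0.01)
print("no clock:", bool(results["a"] and results["b"]))    # False — b isn't done yet

# With a clock: only read the output on the next tick
time.sleep(TICK)
print("with clock:", bool(results["a"] and results["b"]))  # True — both had time to finish
```

The "no clock" read comes back "no" even though both conditions were going to be met; the "with clock" read waits long enough that both answers exist before they're combined.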

So it probably did identify the change, but instead of reacting instantly, it kept trying to adjust for a certain amount of time (since it would be really bad if the filter got dropped after every little discrepancy in its facial recognition), and once it hit a certain threshold of time where no face was found, it stopped trying to apply the filter.
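A minimal sketch of that kind of grace-period logic (Snapchat's real code obviously isn't public, and the class name and 0.5 s threshold here are made-up values just to show the pattern):

```python
import time

class FilterTracker:
    """Keep drawing the face filter for a short grace period after detection drops out."""

    GRACE_PERIOD = 0.5  # assumed threshold in seconds before giving up on the face

    def __init__(self):
        self.last_seen = None

    def update(self, face_detected: bool) -> bool:
        """Called once per frame; returns True if the filter should still be drawn."""
        now = time.monotonic()
        if face_detected:
            self.last_seen = now
            return True
        # No face this frame: keep the filter only if the dropout is recent
        return self.last_seen is not None and (now - self.last_seen) < self.GRACE_PERIOD

# Example: a one-frame detection hiccup doesn't drop the filter,
# but a long stretch with no face does.
tracker = FilterTracker()
print(tracker.update(True))    # True  — face found
print(tracker.update(False))   # True  — brief dropout, still within the grace period
time.sleep(0.6)
print(tracker.update(False))   # False — face has been gone past the threshold
```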
