John Wick has entered the game

It's running at ~140fps. After recording and uploading, you're seeing it at 60fps (or lower, depending on the refresh rate of your monitor).

However, while it's being recorded/encoded, the software grabs the most recently rendered frame to show you. To illustrate with a simple timeline, where | is a frame and . is a millisecond:

144FPS
|.......|.......|.......|.......|.......|.......|.......|.......|.......|

60FPS
|.................|.................|.................|.................|
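To put rough numbers on those timelines (just the arithmetic, nothing engine-specific), a quick Python sketch:

    # Frame interval in milliseconds for a given frame rate.
    for fps in (144, 60):
        print(f"{fps}fps -> a new frame every {1000 / fps:.1f} ms")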

In those spaces between frames, you have no information displayed about what is happening. At 144fps the time between updates is ~7 milliseconds; at 60fps it's ~17 milliseconds. When a game renders at, for example, 144fps but your monitor (or video player) only displays (not renders) at 60fps, it will show the most recently rendered frame, with ^ marking discarded renders and * marking displayed renders:

144FPS -> 60FPS
|......^......*..|...^......*.....|^......^......*.|....^......*....|

Effectively = ~80FPS
|.............|.............|.....................|...........|
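The same idea in code, as a rough sketch: it assumes the display always grabs whichever render finished most recently, and just counts how many renders ever make it to the screen.

    # Render ticks at 144 Hz, display refreshes at 60 Hz; each refresh shows the
    # most recently finished render, anything in between is never displayed.
    RENDER_FPS, DISPLAY_FPS = 144, 60

    shown = set()
    for k in range(DISPLAY_FPS):                 # one second of display refreshes
        t = k / DISPLAY_FPS                      # time of this refresh, in seconds
        shown.add(int(t * RENDER_FPS))           # index of the most recent finished render

    discarded = RENDER_FPS - len(shown)
    print(f"{len(shown)} of {RENDER_FPS} rendered frames reach the screen each second; "
          f"{discarded} are rendered but never displayed")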

You're rendering at 144fps, displaying at 60fps, and seeing information at about 80fps. Basically, what you know of the world-state is more up-to-date at 144 rendered -> 60 displayed than it would be at 60 rendered -> 60 displayed.
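And the "more up-to-date" part as a back-of-the-envelope sketch (the function name and numbers below are just illustrative, and it assumes the render and display clocks aren't synced): render at one rate, refresh a 60Hz display, always grab the most recently finished render, and check how old the frame on screen is at the moment of each refresh.

    import random

    def average_frame_age_ms(render_fps, display_fps=60, runs=200, seconds=2.0, seed=1):
        """Average age (ms) of the on-screen frame at each display refresh."""
        rng = random.Random(seed)
        render_dt = 1.0 / render_fps
        display_dt = 1.0 / display_fps

        total_age, refreshes = 0.0, 0
        for _ in range(runs):
            t = rng.random() * render_dt                      # random phase between the clocks
            while t < seconds:
                last_render = int(t / render_dt) * render_dt  # most recent finished render
                total_age += (t - last_render) * 1000.0       # age in milliseconds
                refreshes += 1
                t += display_dt
        return total_age / refreshes

    for render_fps in (60, 144):
        age = average_frame_age_ms(render_fps)
        print(f"{render_fps} rendered -> 60 displayed: frame on screen is "
              f"~{age:.1f} ms old on average at each refresh")

You should see something like ~8 ms of staleness at 60 rendered and ~3.5 ms at 144 rendered, which is why a high render rate still helps even on a 60Hz display.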
