PCGamer: GTX 1070 beats the 980 Ti and Titan X in every game we tested.

> Gameworks is hacked into the game by nVidia so that disabling it involves either lowering the graphics settings or setting PhysX to run on the CPU which drastically reduces performance

That's wrong to the point of being farcical. Look up what exactly PhysX is. It's two products with the same name.
First is the base PhysX SDK, which provides the core physics for games. All of it runs on the CPU, regardless of what graphics card is in your computer and regardless of whether you enable any of the GPU hardware-accelerated effects.
These are the physics calculations that Project Cars used - they were platform agnostic (it made no difference whether you were running Nvidia, Intel or AMD), except that AMD's drivers have really shitty CPU overhead, so the game was more CPU-bound on AMD than on Nvidia. That's not Nvidia's or the Project Cars devs' fault - they could have used another CPU-based physics engine (like Havok) and the results would be exactly the damn same in a CPU-bound game.
It's simply a fact that CPU-bound games run worse on AMD hardware because AMD's drivers have (or had - I've read their latest drivers bring some good improvements here) greater CPU overhead.

Then there's a layer on top of that which allows adding CUDA-based enhancements (hardware acceleration), which is obviously limited to Nvidia.
These enhancements are all cosmetic - mainly cloth and particle effects (flapping flags, more sparks in explosions, etc.). Of course AMD cards get none of these, and they are pretty intensive even on Nvidia cards - they're supposed to be. That's why they're optional.

When you set PhysX to "low" (in a game like Borderlands) you still get all of the PhysX SDK physics calculations that run the game. When you set it to "high" on Nvidia, your CPU is still doing all the same calculations as on "low" - you just get a few enhancements on top.
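To make that "low vs high" point concrete, here's a toy sketch in plain Python - this is NOT the PhysX API, and every function name here is made up for illustration. It just shows the architecture being described: the base simulation step always runs and always produces the same gameplay state, while the enhancement setting only layers extra cosmetic output on top.

```python
# Toy illustration (not the actual PhysX API): base gameplay physics is
# identical whether or not optional cosmetic "enhancements" are enabled.

def step_base_physics(positions, velocities, dt=0.016, gravity=-9.81):
    """Core gameplay physics - always runs, always on the CPU."""
    new_vel = [v + gravity * dt for v in velocities]
    new_pos = [p + v * dt for p, v in zip(positions, new_vel)]
    return new_pos, new_vel

def step_frame(positions, velocities, enhancements=False):
    positions, velocities = step_base_physics(positions, velocities)
    cosmetic_particles = []
    if enhancements:
        # Extra eye candy (sparks, debris) - purely cosmetic,
        # never feeds back into the gameplay state.
        cosmetic_particles = [(p, 0.0) for p in positions for _ in range(10)]
    return positions, velocities, cosmetic_particles

low  = step_frame([10.0], [0.0], enhancements=False)
high = step_frame([10.0], [0.0], enhancements=True)
assert low[0] == high[0] and low[1] == high[1]  # same gameplay physics either way
```

Turning enhancements off here changes nothing about the simulation the game actually runs on - which is exactly why setting PhysX to "low" doesn't break the game's physics.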

There are 600+ games that use the PhysX SDK with no hardware acceleration and around 60 that use both CPU and GPU [source].

> The effects are never demanding or uniquely cool looking,

You clearly have no idea how intensive physics simulations of particle effects, fluid dynamics, or hair-strand dynamics are.
Here's a clue: they're pretty much the most demanding thing you can ask a CPU or GPU to do in animation. There's a reason Pixar needed to double the size of its render farm for Monsters, Inc. - hair simulation is computationally intensive.
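A quick back-of-envelope sketch shows why. The numbers below are illustrative assumptions (not Pixar's actual figures): a character with a million simulated hairs, each hair made of a handful of segments, and a rough per-segment cost covering integration, constraint solving and collision tests.

```python
# Back-of-envelope cost of simulating a furry character's hair.
# All figures are made-up but plausible assumptions for illustration.

strands = 1_000_000          # simulated hairs on one character
segments_per_strand = 15     # segments per hair strand
ops_per_segment = 200        # rough cost: integrate + solve constraints + collide
fps = 24                     # film frame rate

ops_per_frame = strands * segments_per_strand * ops_per_segment
ops_per_second = ops_per_frame * fps
print(f"{ops_per_frame:,} operations per frame")    # 3,000,000,000
print(f"{ops_per_second:,} operations per second")  # 72,000,000,000
```

Billions of operations per frame, for one character, before you render a single pixel - and that's the conservative version. That's the scale of work these "never demanding" effects involve.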

Honestly, the fact that you got so many upvotes just shows how technically ignorant the vast majority of redditors on games-related subs are.

Now, the only valid argument - and the one your video mostly makes (I like AdoredTV; his async video was one of the best around, but he's a little tinfoil-hat in this one) - is that Nvidia's free graphical plugins work better on newer Nvidia cards than on AMD cards or Nvidia's own older ones. And they should make more of an effort to make their software compatible across the board...

...Yes it does and yes they should.

Ultimately, though, the devs are to blame for accepting the temptation that GameWorks offers - a free, full-quality commercial middleware solution versus AMD's barely-beta-level open-source stuff. It's not hard to understand why some game devs take the temptation of saving a lot of money with out-of-the-box middleware rather than developing their own.

/r/Games Thread Parent Link - pcgamer.com