Xbox One won't get 1080p on Witcher 3 even with DirectX 12

It may, but there seems to be a really widespread misunderstanding of the benefits of DX12. It won't magically make everything better. There are three main benefits based on my understanding (I work with game engine programming, but not with graphics specifically):

  • Less driver overhead on the CPU. This is the big win, but it's important to know a few things. Every application has a "main thread": it's usually the thread that kicks off the application, and in games it's typically responsible for updating the game state. Most AAA games also have a separate dedicated "render thread". Many modern engines are moving toward more parallelism, but these two threads usually still exist in some form and are responsible for delegating work to other threads. In most engines, while the main thread is updating the current frame, the render thread is rendering the previous frame (again, not true for some modern engines, but true for most). DX12's reduced overhead shows up on the render thread. If your game's bottleneck is on the main thread instead, you won't see any performance benefit at all, since the render thread was waiting on the main thread anyway. This is usually the case in processing-heavy games such as MMOs or RTSes. It does mean, however, that you can push a lot more graphics per frame in exchange, assuming your GPU isn't already at 100%.

  • More developer control over the GPU, meaning each game can use the GPU in whatever way makes sense for it. Keep in mind the graphics card still has the same processing power; it can just be utilized in a more custom way.

  • And finally a cleaner and simpler API, meaning graphics vendors (nVidia/AMD) don't have to put as much bullshit in their drivers, allowing them to focus on optimization and other good stuff.

So with all this said, I wouldn't expect Witcher 3 to get a huge performance benefit out of DX12 even on PC. The GPU tends to sit at 100% already, meaning the game probably won't benefit from the decreased CPU overhead (the GPU, not the CPU, appears to be the bottleneck). It could benefit from the increased control over the GPU, but that would most likely require restructuring the rendering pipeline to exploit that extra control, which is potentially a truckload of work. On the plus side, we may still see a performance gain from leaner drivers!

Again, though, I may have missed something, as I usually work on the non-graphics parts of the engine. Also, since DX12 is a completely new API, it's not a drop-in replacement for DX11, as many people seem to think. Porting from one to the other can mean anything from just remapping API calls to restructuring your entire rendering pipeline, depending on your engine, and everything in between carries different performance implications.

/r/pcmasterrace Thread Parent Link - cinemablend.com