Why does it seem like every game is DX9 or DX11? Why did developers hardly touch DX10?

Basically, DX9 was just getting into full swing on consoles and PCs when DirectX 10 was introduced, and DirectX 10 was exclusive to the vile and loathed Windows Vista.

Since Windows Vista had basically no market share and DirectX 9 was just beginning to take off, there was no financial reason to make a DX10 game, but plenty of financial and technical reasons to stick with DX9. Unreal Engine 3 games debuted in Q4 2006, a few short months before DX10 and Vista hit, and Call of Duty 4, described as "the most photorealistic game we've ever seen," shipped on DX9 in 2007, right in the middle of DX10's rollout.

A few games did ship with DX10 support, including Metro 2033, Far Cry 2, S.T.A.L.K.E.R., and Crysis, but those were the exception rather than the rule.

A couple of years later, Windows 7 came out, bringing DX11 with it. Many, many users were eager to move past Vista and off the aging Windows XP, the consoles were beginning to show their age, and PC hardware was rapidly pulling ahead of them in raw power.

Because DX11 offered everything DX10 did, plus better performance and new features, it eclipsed DX10 outright; why target the older API? On the other hand, the consoles were stuck at roughly DX9-level capabilities, and the installed base of older PC hardware was overwhelmingly DX9-class. So now you had two primary audiences, DX11 GPU owners and DX9 GPU owners, with only a small smattering of DX10 cards in between.
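
This isn't the whole story, but it helps explain why DX10 had no niche left: Direct3D 11 introduced "feature levels," so a single D3D11 code path can also drive DX10-class (and even some DX9-class) GPUs by asking for the highest level the hardware supports. A minimal sketch of that device-creation pattern, using the standard D3D11 API (Windows-only, error handling mostly omitted):

```cpp
// Minimal sketch: create a D3D11 device that falls back to DX10/DX9-class
// hardware via feature levels. Not production code.
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // List levels best-first; the runtime grants the highest one the
    // GPU and driver actually support.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,  // DX11-class GPUs
        D3D_FEATURE_LEVEL_10_1,  // DX10.1-class GPUs
        D3D_FEATURE_LEVEL_10_0,  // DX10-class GPUs
        D3D_FEATURE_LEVEL_9_3,   // high-end DX9-class GPUs
    };
    const UINT count = sizeof(requested) / sizeof(requested[0]);

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    granted = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                   // no software rasterizer module
        0,                         // no creation flags
        requested,
        count,
        D3D11_SDK_VERSION,
        &device,
        &granted,                  // which level we actually got
        &context);

    if (SUCCEEDED(hr))
    {
        // e.g. granted == D3D_FEATURE_LEVEL_10_0 on a DX10 card:
        // same D3D11 API, just a reduced capability set.
        context->Release();
        device->Release();
    }
    return 0;
}
```

In other words, a studio targeting modern PCs could write against D3D11 once and still reach DX10-era cards, which left very little reason to ship a dedicated DX10 renderer.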

And what better way for hardware vendors to move new silicon than to push the much more widely used DX11, tempting the DX9/DX10 holdouts to upgrade, and to play up the "PC Master Race" meme by showing how much better their hardware was than the consoles?

Thus, there was now both a market and real demand for DX11 titles, since users had rolled over to Windows 7 and new hardware. At the same time, that console generation had the longest lifespan ever, which meant DX9 had to be kept alive to service it.

And that is why you see those two APIs everywhere and DX10 almost nowhere.
