Sep 18, 2020

It's a good question actually. Intel tried to make a GPU called Larrabee that was mostly a bunch of small x86 cores with giant vector units. Turns out it couldn't compete in rendering performance on existing games (in 2010) without the fixed-function units that GPUs have, so they canceled it as a GPU. It did result in the AVX-512 instruction set, though.

I think the idea still has promise, but there's a chicken-and-egg problem: you'd really need to rearchitect game engines and content pipelines to take full advantage of the flexibility before you'd see a benefit. It's possible that it would work better today, and it's also possible that Intel just gave up too early. In some cases we're already seeing people bypass the fixed-function rasterizer in GPUs and rasterize manually in compute shaders [1] [2].

[1] Doom Eternal: http://advances.realtimerendering.com/s2020/RenderingDoomEte...

[2] Epic Nanite: https://twitter.com/briankaris/status/1261098487279579136
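For anyone curious what "rasterizing manually" means: the core of it is just a per-pixel edge-function (signed area) test, which is also what the fixed-function hardware does. Here's a toy CPU sketch of that idea, not taken from either talk above (all names are mine); a real compute-shader version would run the pixel loop in parallel and write depth/visibility atomically:

```python
def edge(ax, ay, bx, by, px, py):
    # Twice the signed area of triangle (a, b, p);
    # positive if p lies to the left of the directed edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    """Return the integer pixel coordinates covered by a CCW triangle."""
    xs = (v0[0], v1[0], v2[0])
    ys = (v0[1], v1[1], v2[1])
    # Clip the triangle's bounding box to the framebuffer.
    x_min, x_max = max(int(min(xs)), 0), min(int(max(xs)), width - 1)
    y_min, y_max = max(int(min(ys)), 0), min(int(max(ys)), height - 1)
    covered = []
    for y in range(y_min, y_max + 1):
        for x in range(x_min, x_max + 1):
            # Sample at the pixel center, as hardware rasterizers do.
            px, py = x + 0.5, y + 0.5
            w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
            # Inside (or on an edge of) the triangle if all three
            # edge functions agree with the winding order.
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                covered.append((x, y))
    return covered

pixels = rasterize_triangle((0, 0), (8, 0), (0, 8), 16, 16)
```

The w0/w1/w2 values double as unnormalized barycentric coordinates, which is how attribute interpolation falls out of the same test; what the software approach buys you (as in Nanite) is freedom to change the scheduling, e.g. one triangle per thread for micropolygons instead of one triangle across many pixels.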