Well you know how those chips have billions and billions of transistors on them? Someone – or rather, teams of people – had to plan out where to put them all. That starts with software simulations to make sure the design is sound. Then the design gets turned into prototypes that spend time in testing; fixes get made, new iterations of the design are produced, and the cycle repeats over and over. Making prototypes is expensive on its own, since it means you're not manufacturing a product you can sell.
At the same time, you need to write drivers for the chip so that users can actually run it on their computers. Most platforms use OpenGL and Vulkan these days for games, but AI and general-purpose computing also rely on OpenCL and CUDA so you can run your own custom apps on the graphics card. Windows users have to contend with DirectX as well, and so on. And those drivers need to be reliable and perform very well to keep framerates high.
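To make "run your own custom apps on the graphics card" concrete, here's a minimal sketch of a CUDA program – a toy kernel that adds two arrays of numbers on the GPU. The kernel name, array sizes, and block size here are just illustrative choices, not anything specific to a particular chip or driver:

```
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each GPU thread adds one pair of elements.
// Thousands of these run in parallel across the chip's cores.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;               // a million floats (arbitrary size)
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Unified memory: visible to both the CPU and the GPU.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    int blocks = (n + 255) / 256;
    vecAdd<<<blocks, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();             // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);         // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Every line of that program depends on the driver stack the vendor had to write: allocating GPU memory, launching the kernel, and synchronizing all go through it, which is why the driver work is such a big part of the total cost.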
It’s not just making the chip, even though that is complex as hell. It’s also about the ecosystem around the chip. A GPU with no drivers is no good, and all the engineers who designed the product need to be paid for their work. It all adds up.