Why was chipset-based graphics discontinued?


In the old days, before Intel's Core series, motherboards used to have a small graphics chipset on them rather than graphics integrated into the CPU. Why was this discontinued? Current-gen mainstream AMD CPUs do not have processor-based graphics like the Intel ones do (the G-series parts come very late and are technically low-end stuff). Sometimes you do not need a discrete GPU. So why was chipset graphics discontinued? It could have been used to decrease CPU complexity on Intel's side, while AMD would not have needed to launch APUs, saving development time.

In: Technology

Those chips cost money and take up valuable mobo space. Integrated graphics costs money too, but less than those separate chips did.

Oh yes, and making it harder for AMD to make money is something Intel is very, very keen to do. Intel offers mobo makers a way to save money, and it causes AMD to spend lots of staff-years designing new graphics circuits instead of having mobo makers buy chips. That’s what Intel calls a WIN-WIN.

Both Intel and AMD, over the years, have moved many functions that would previously have been handled by part of a motherboard chipset to being part of the CPU die/package. This includes just about everything that would have also been bundled with a chipset GPU on older motherboards.

From a technical perspective, there could be a huge bandwidth penalty for placing a GPU directly on the motherboard, since it would have to use the PCI Express lanes coming from the chipset unless it cannibalized the lanes coming directly from the CPU (taking them away from a discrete graphics card or PCI Express solid-state drives). Also keep in mind that integrated GPUs don't have their own RAM: they must share RAM with the CPU. Hanging off a chipset PCI-e link, the graphics chip would be bandwidth-starved to hell just trying to access system RAM.
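A rough back-of-envelope calculation shows how big that gap is. The figures below are illustrative assumptions, not measurements: a DMI 3.0-style CPU-to-chipset link (roughly equivalent to PCIe 3.0 x4) versus dual-channel DDR4-3200 system memory.

```python
# Back-of-envelope comparison of the memory bandwidth an iGPU would see
# depending on where it lives. All figures are illustrative assumptions.

# A chipset-hosted GPU reaches system RAM over the CPU<->chipset link,
# e.g. Intel's DMI 3.0, which is roughly a PCIe 3.0 x4 link.
pcie3_lane_gbs = 0.985                 # ~0.985 GB/s usable per PCIe 3.0 lane
dmi_bandwidth = 4 * pcie3_lane_gbs     # ~3.9 GB/s, shared with SATA, USB, NICs...

# An on-die iGPU talks to the memory controller directly.
# Dual-channel DDR4-3200: 2 channels x 8 bytes wide x 3200 MT/s.
ddr4_bandwidth = 2 * 8 * 3200 / 1000   # ~51.2 GB/s

print(f"chipset link:            ~{dmi_bandwidth:.1f} GB/s")
print(f"dual-channel DDR4-3200:  ~{ddr4_bandwidth:.1f} GB/s")
print(f"on-die advantage:        ~{ddr4_bandwidth / dmi_bandwidth:.0f}x")
```

And that chipset link isn't even dedicated to the GPU; it carries storage, USB, and network traffic at the same time, so the real-world figure would be worse.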

There’s just no incentive to create a solution like this.

Basically, in terms of both cost and performance, it makes more sense to place a basic GPU on the same die/package as the CPU than anywhere else.

There was a move to consolidate features from various chips on the board into the CPU

Originally, motherboards hosted the CPU, a Northbridge (memory access, high-speed ports, and the iGPU), and a Southbridge (slow-access ports like PCI and USB).

As time went on and we consolidated towards PCI-e, the Southbridge lost its purpose and was dropped; secondary port types are now covered by PCI-e-to-whatever converter chips.

Then we moved memory controllers into the CPU and gave it internal PCI-e support. The Northbridge (which used to host the iGPU as well) lost most of its purpose, so its remaining features were absorbed by the CPU too. Starting in 2011, mainstream AMD and Intel CPUs had absorbed all of the Northbridge's features, so there was no longer a high-speed-connected chip left to house the iGPU; it had to be built into the CPU if it was to exist at all.

This gave us a lot fewer compatibility issues, and allows for much smaller boards where the only thing that needs cooling is the CPU (the Northbridge used to have a heatsink and sometimes a fan). This means you can build a mini-ITX PC and put it in a tiny case, which wouldn't be possible if you had to fit a CPU, Northbridge, Southbridge, and RAM onto the board.

We wanted to connect the memory directly to the CPU instead of going through a "north bridge" chipset. Basically, the CPU absorbed the chipset; there was room for it, thanks to shrinking transistors. That also means motherboards can become simpler.

But as a result, we are now dependent on the CPU for a lot of things. Typically, desktop and laptop CPUs include an integrated graphics part, while server CPUs don't. AMD decided not to include one even on its desktop parts; only the laptop parts have one, the reasoning being that most people have a graphics card in a desktop anyway.

I do agree it's useful to have integrated graphics even on a desktop that doesn't use it, if only to diagnose issues with the graphics card.