Nvidia's CEO said they spent $10 billion on R&D for their new chip. What exactly does an R&D worker do all day? Where does this money go?


How does a chip go from being an academic theory to a $10 billion R&D Project?

In: Technology

28 Answers

Anonymous 0 Comments

Well, you know how those chips have billions and billions of transistors on them? Someone – or rather, teams of people – had to plan out where to put them all. That starts with software simulations to make sure the design is good. Then the design is turned into prototypes that spend time in testing, with fixes made and new iterations on the design over and over again. Making prototypes can be expensive on its own, since it means you’re not manufacturing a product you can sell.
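To give a rough feel for what “software simulations” means here (this is a toy illustration, not anything Nvidia actually uses – real design flows use hardware description languages and commercial simulators): engineers model the circuit in software and check its outputs against the spec before anything is ever fabricated. The sketch below does that for a one-bit full adder in plain C++.

```cpp
#include <cassert>
#include <cstdio>

// Toy gate-level model of a 1-bit full adder built from basic gates.
// Real chips are described in HDLs (Verilog/VHDL) and simulated with
// specialized tools, but the principle is the same: model it, then test it.
struct AdderOut { bool sum; bool carry; };

AdderOut full_adder(bool a, bool b, bool cin) {
    bool s1    = a ^ b;                     // XOR gate
    bool sum   = s1 ^ cin;                  // XOR gate
    bool carry = (a && b) || (s1 && cin);   // AND/OR gates
    return {sum, carry};
}

int main() {
    // "Making sure the design is good" in miniature: compare the gate-level
    // model against the arithmetic spec for every possible input.
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            for (int cin = 0; cin <= 1; ++cin) {
                AdderOut out = full_adder(a, b, cin);
                int expected = a + b + cin;
                assert(out.sum   == (expected & 1));
                assert(out.carry == ((expected >> 1) & 1));
            }
    printf("All input combinations match the spec.\n");
    return 0;
}
```

Now scale that idea up from three inputs and a handful of gates to tens of billions of transistors, and you can see why simulation alone eats enormous amounts of engineering time and compute.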

At the same time, you need to write drivers for the chip so that users can actually run it on their computers. Most platforms use OpenGL and Vulkan these days for games, but AI and general computing also use OpenCL and CUDA so you can run your own custom apps on the graphics card. Windows users have to contend with DirectX as well, and so on. And those drivers need to be reliable and perform very well to keep framerates high.
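For a concrete (and very simplified) taste of what “running your own custom apps on the graphics card” via CUDA looks like, here’s the standard introductory vector-add example – nothing specific to the new chip. The point is that the driver and runtime Nvidia ships are what make a few lines like this actually execute on the GPU.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Minimal CUDA kernel: each GPU thread adds one pair of elements.
__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Allocate GPU memory and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // The driver stack turns this launch into work the GPU actually runs.
    vector_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f (expect 3.0)\n", hc[0]);
    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}
```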

It’s not just making the chip, even though that is complex as hell. It’s also about the ecosystem around the chip. A GPU with no drivers is no good, and all the engineers who designed the product need to be paid for their work. It all adds up.

Anonymous 0 Comments

Among other things, they design and test various prototypes, which cost **a lot** more than the finished product because they’re one-offs or made in small batches, so you get none of the cost savings of large-scale production. And that’s just the cost of producing the chip; there’s also verification first, to make sure the chip will do what you think it will. When I was in school 10 years ago learning this stuff, we couldn’t even afford the verification, let alone actually produce the simple integrated circuits we designed, as the verification alone cost on the order of tens of thousands of dollars per circuit. And our circuits were utterly trivial compared to what goes into a modern GPU. We’re talking maybe a few thousand transistors for our stuff vs. tens or hundreds of billions for a GPU.
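One reason verification gets so expensive: you can’t just brute-force every input the way you can with a tiny student circuit. A quick back-of-the-envelope sketch (the 64-bit multiplier and the one-billion-simulations-per-second rate are illustrative assumptions, not figures from any real tool):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Exhaustively checking a single 64-bit x 64-bit multiplier block
    // would mean 2^128 input combinations -- hopeless by brute force.
    double cases            = std::pow(2.0, 128);
    double sims_per_second  = 1e9;      // generous assumed simulation rate
    double seconds_per_year = 3.15e7;
    printf("Years to brute-force: %.2e\n",
           cases / sims_per_second / seconds_per_year);  // ~1e22 years
    // Which is why real verification leans on formal methods,
    // constrained-random testing, and expensive EDA tools instead.
    return 0;
}
```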

Anonymous 0 Comments

A number like that rarely includes only the money spent on people’s working time; it also covers the capital investments needed to make all of this happen.

Just think about all the prototypes you’re going to need for something like this, and then realize you need an entire small-scale fab just to make those prototypes, which by itself can already cost billions.

Anonymous 0 Comments

I don’t know specifically what someone does at Nvidia, but R&D can cover all sorts of things.

Let’s say you want to build a better mouse trap. You have an idea, you try it. It doesn’t work. So you tinker with that idea to see if you can make it work. You have to buy new bits of springs and wire and maybe wood or plastic. You take the time to test it, monitoring how many mice it catches. Maybe it doesn’t work at all and you have to start over, so more time is spent on developing a different way of catching a mouse. More materials are needed, too. Overall, your money is going to time (or a salary for a paid worker), materials, and so on.

In Nvidia’s case, it’s the same thing. Let’s say Nvidia knows they need to produce a chip that can do 10 billion calculations a second, but the current chip in the GPU can only do 7 billion. They also can’t just make a new chip that’s 50% larger; it just won’t fit in the space provided. So they have to find new ways to make GPUs that are smaller but do more. It will take hundreds of people to make this happen. I imagine most of the money they spend is on salaries, with materials and new equipment as the next most expensive items.

Anonymous 0 Comments

Honestly, R&D is everything. It’s Researching & Developing new technology. An “R&D worker” could be one of the technicians working on the lasers, another working in metallurgy to find new or better materials that can be cut at the tiniest of scales, or a third team writing experimental code. It’s a big umbrella term: $10B in R&D is $300M here, $20M there, $100,000 somewhere else.

Anonymous 0 Comments

$10 billion sounds like a lot, until you remember that a $10 billion budget can be burned through by roughly 5,000 people working 8 years at $250k each.
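As a sanity check on that back-of-the-envelope math (the headcount and per-person cost are purely illustrative assumptions, not Nvidia figures):

```cpp
#include <cstdio>

int main() {
    // Illustrative assumptions only -- not actual Nvidia headcount or pay.
    double people = 5000;
    double years  = 8;
    double cost_per_person_per_year = 250000;  // salary + benefits + overhead
    double total  = people * years * cost_per_person_per_year;
    printf("Total: $%.1f billion\n", total / 1e9);  // -> $10.0 billion
    return 0;
}
```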

https://www.levels.fyi/companies/nvidia/salaries can give you a sense of what approximate salaries are at Nvidia. Not many people at that company make under $150k per year, and many are over $300k.

Modern GPUs are absolutely huge and complex: tens of billions of transistors that must be both fast and efficient in design. That requires many large teams of very well-paid people working in parallel on many different parts without overwriting each other’s work. Then, as DehackEd mentioned, there’s the software and testing departments. Building rent and maintenance also get added to that cost, as do bonuses, 401(k) contributions, sick days, and paid vacation for employees.

Anonymous 0 Comments

You should check out this video (https://youtu.be/2ehSCWoaOqQ?si=Ku2IFkAixeHPGaZl) of a foundry tour. That’s where they make chips.

TL;DR: it’s a 20-minute (edited) video of walking around highly specialized machines that each do one thing and cost $20 million a pop.

Anonymous 0 Comments

R&D is research and development. This covers a ton of things.

Some people plan how to give the product the features they want. Like, you want better raytracing capabilities on a GPU? What is actually needed for that? Someone needs to design the chip itself.

Sometimes this involves researching new engineering. We might not know how to do what we want already.

Some people research new techniques to manufacture these things. The chips we make today weren’t remotely possible to manufacture 20 years ago. New equipment and techniques are sometimes needed.

Some people work on taking all that research and scaling it up for production level scales. Like, how do you go from a handful of chips in a lab over several weeks to hundreds of thousands a year?

So it can look like anything from relatively basic research in physics and engineering to large-scale manufacturing concerns.

Anonymous 0 Comments

– Good researchers with the appropriate skills and background are rare and command high compensation.
– Research is largely automated these days and involves a lot of complex computation, so you need a budget for all the hardware and electricity to actually do it, and for modeling as complex as chip design, that can be a lot. Every large IT company has a dedicated computational cluster specifically for internal research nowadays, and those are far from free.
– Research goes beyond desk work; it also involves prototyping, and for novel technology like chips that might mean constructing new assembly lines at factories and new one-of-a-kind hardware, which can get ludicrously expensive.

Anonymous 0 Comments

It is vague, because it could mean $10B for just the current model, or it could include all the previous models/architectures it took to finally get to where we are.

A lot of the money goes towards the machines. These chips have to be made using specific and very specialized machines. Those machines typically cannot be reused because they are designed to assemble and make the chips at a microscopic level, and we get smaller and smaller with each new generation of chips. In other words, we can put more stuff on the chips only if the machines can work more precisely at a smaller scale.