Do single event upsets ever affect normal computing?


I just read about [single event upsets](https://en.wikipedia.org/wiki/Single-event_upset) and it’s pretty fascinating. One thing that got me was that a speedrunner of Super Mario 64 experienced a single event upset.

So that leads me to believe that commercial electronics and regular CPUs and GPUs must have a chance to experience these single event upsets. When I research it, there’s only discussion on how it affects space electronics and FPGAs. But there’s gotta be a chance it affects my normal laptop, right? Why would FPGAs be more susceptible to SEUs than CPUs?

If I’m writing a Python script and I set a boolean to False, what’s the probability it gets flipped to True instead? If I’m logging into a website, what’s the chance that the server misinterprets my input? If it can affect an N64 in someone’s living room, there’s gotta be a non-zero chance, right?

In: Engineering

7 Answers

Anonymous 0 Comments

Just to add some perspective here – a single bit error would, in most circumstances, mean that one pixel out of the 2,073,600 on a 1080p frame that flashed up on your screen for a 60th of a second is a slightly wrong shade, a difference only barely distinguishable to the human eye even in ideal circumstances.
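A quick sketch of that point (my own illustration, not part of the original answer): how far one flipped bit can move a single 8-bit color channel depends entirely on which bit position flips. The channel value of 200 below is an arbitrary example.

```python
# Sketch: how far a single flipped bit moves one 8-bit color channel.
# A 1080p frame has 1920 * 1080 = 2,073,600 pixels, each with three
# 8-bit channels, so one flipped bit touches one channel of one pixel.

def flip_bit(value: int, bit: int) -> int:
    """Return `value` with bit number `bit` (0 = least significant) flipped."""
    return value ^ (1 << bit)

channel = 200  # arbitrary example brightness, on a 0-255 scale
for bit in range(8):
    flipped = flip_bit(channel, bit)
    print(f"bit {bit}: {channel} -> {flipped} (shift of {abs(flipped - channel)})")
```

A flip in the lowest bit shifts the channel by 1/255 (invisible), while a flip in the highest bit shifts it by 128/255 (quite visible) – but either way the damage is confined to one channel of one pixel for one frame.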
