Do single event upsets ever affect normal computing?

I just read about [single event upsets](https://en.wikipedia.org/wiki/Single-event_upset) and it’s pretty fascinating. One thing that got me was that a Super Mario 64 speedrunner apparently had a single event upset hit their console mid-run.

So that leads me to believe that commercial electronics and regular CPUs and GPUs must have some chance of experiencing these single event upsets. When I research it, I only find discussion of how they affect space electronics and FPGAs. But there’s gotta be a chance it affects my normal laptop, right? Why would FPGAs be more susceptible to SEUs than CPUs?

If I’m writing a Python script and I set a boolean to False, what’s the probability it gets set to True instead? If I’m logging into a website, what’s the chance the server side misinterprets my input? If it can affect an N64 in someone’s living room, there’s gotta be a non-zero chance, right?

In: Engineering

7 Answers

Anonymous 0 Comments

FPGAs are basically on the very edge of physics, grinding out maximally fast electronic responses that can only be beaten by designing a whole damn custom chip (an ASIC). There is no room for error checking.

Indeed, if two FPGAs are communicating at different clock speeds, there’s a narrow window around each clock edge where the incoming signal hasn’t settled yet (this is called metastability). Read the data inside that window and it’s a 50/50 whether you get the right value. This isn’t even the space laser bs yet, this is something that happens in almost every device, and the solutions that reduce the impact slow down both systems.
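
To picture that 50/50 read, here’s a tiny Python toy model of an asynchronous signal being sampled in another clock domain. The clock period and unstable-window numbers are made-up illustration values, not real FPGA timing specs:

```python
import random

# Toy model of sampling an asynchronous signal near a clock edge.
# All timing numbers below are invented for illustration only.
CLOCK_PERIOD_NS = 10.0      # hypothetical receiving clock period
UNSTABLE_WINDOW_NS = 0.2    # hypothetical window around the edge where sampling is unreliable

def sample(true_value: bool, offset_from_edge_ns: float) -> bool:
    """What the receiving flip-flop 'sees' when the input changes
    offset_from_edge_ns before the sampling clock edge."""
    if offset_from_edge_ns < UNSTABLE_WINDOW_NS:
        # Input changed too close to the edge: the captured value is a coin flip.
        return random.random() < 0.5
    return true_value

def run_trials(n: int) -> float:
    wrong = 0
    for _ in range(n):
        true_value = random.random() < 0.5
        # The sender's clock is unrelated to the receiver's, so the data change
        # lands at a uniformly random point within the receiving clock period.
        offset = random.uniform(0.0, CLOCK_PERIOD_NS)
        if sample(true_value, offset) != true_value:
            wrong += 1
    return wrong / n

if __name__ == "__main__":
    # Expect roughly (0.2 / 10.0) * 0.5 = ~1% misreads in this toy model.
    # Real designs add synchronizer flip-flops to shrink that window,
    # which is exactly the "slows both systems down" trade-off.
    print(f"fraction of misread bits: {run_trials(1_000_000):.5f}")
```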

Once you’re literally counting individual bits ASAP and calculating results off of that single data read, you gotta account for anything, and I mean ANYTHING.
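
And “accounting for anything” often looks like brute redundancy. One well-known trick in radiation-tolerant designs is triple modular redundancy: keep three copies of the data and take a majority vote, so a single flipped bit gets outvoted. A minimal Python sketch of the voting idea (just an illustration, not anyone’s actual implementation):

```python
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote across three redundant copies of the same word.
    A single event upset in any one copy is outvoted by the other two."""
    return (a & b) | (a & c) | (b & c)

# Example: an SEU flips bit 2 in one of the three copies of 0b1010.
original = 0b1010
corrupted = original ^ 0b0100
assert majority_vote(original, corrupted, original) == original
print("corrected:", bin(majority_vote(original, corrupted, original)))
```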
