Do single event upsets ever affect normal computing?


I just read about [single event upsets](https://en.wikipedia.org/wiki/Single-event_upset) and it’s pretty fascinating. One thing that got me was that a speedrunner of Super Mario 64 experienced a single event upset.

So that leads me to believe that commercial electronics and regular CPUs and GPUs must have a chance of experiencing these single event upsets. When I research it, though, there’s only discussion of how it affects space electronics and FPGAs. But there’s gotta be a chance it affects my normal laptop, right? Why would FPGAs be more susceptible to SEUs than CPUs?

If I’m writing a Python script and I set a boolean to False, what’s the probability it gets set to True instead? If I’m logging into a website, what’s the possibility that the server side misinterprets my input? If it can affect an N64 in someone’s living room, there’s gotta be a non-zero chance, right?

In: Engineering

7 Answers

Anonymous 0 Comments

For a Python script that chance is vanishingly small, because a Boolean is not a single bit. Modern programming languages don’t typically express things in such primitive representations anymore. Moreover, computers have techniques for detecting and correcting these errors (ECC memory, for instance). They are a non-issue for the vast majority of computational needs.
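The claim that a Python `bool` is not a single bit is easy to check yourself. A rough sketch (assuming CPython, where every value is a full object with a type pointer and reference count; exact sizes vary by build):

```python
import sys

# In CPython, True/False are full objects, not single bits.
# A random bit flip in that memory is far more likely to corrupt
# the object's header (likely crashing the interpreter) than to
# silently turn a False into a True.
bool_size_bytes = sys.getsizeof(True)
bool_size_bits = bool_size_bytes * 8

print(bool_size_bytes, "bytes =", bool_size_bits, "bits")
```

On a typical 64-bit CPython build this reports a few dozen bytes, so a single flipped bit has only a small chance of even landing on the part of the object that encodes the value.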
