A glitch is basically any unexpected behaviour. That *can* come from errors (bugs) in the code, in which case a developer needs to find the offending code and fix it (debugging). A glitch can also be caused by faulty hardware or other external problems, though even then developers usually end up debugging the code, until they either determine the hardware is at fault or give up and chalk it up as a one-off issue.
The process of debugging is a skill developers practise like any other. As other answers have said, there is only so much code, and the developers should already be familiar with how it works. They should have an idea of which sections of code to look at based on the bug description and any data dumps/error logs (that's why detailed bug reports are important). Ideally, the code is also structured well enough that it's relatively easy to determine which code is responsible for which behaviours; failing at that makes debugging harder, and is a major cause of [technical debt](https://en.wikipedia.org/wiki/Technical_debt).
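As a toy illustration (the function here is made up, not from any real bug report), this is the kind of detail a good error log gives a developer: the exact error type and the input that triggered it, which together point straight at the responsible code.

```python
# Hypothetical buggy function: crashes when given an empty list.
def average(values):
    # Bug: division by zero when len(values) == 0.
    return sum(values) / len(values)

try:
    average([])
except ZeroDivisionError as err:
    # A detailed bug report would capture exactly this: the error
    # and the input that caused it, so the developer knows which
    # function and which edge case to look at.
    print(f"error: {err!r} with input []")
```

With that log line in hand, a developer doesn't have to search the whole codebase; the traceback already names the function and the failing condition.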