How accurate is our hearing at detecting off-key notes in music?

As you know, we have an innate sense of musicality and can sense when certain tones are off key and wrong. I'm curious: to what degree can we sense it? Is there a limit of deviation from the correct pitch beyond which we can't recognize that the note is off pitch anymore and it starts sounding normal?

6 Answers

Anonymous 0 Comments

As you said, we’re really, really good at hearing scales. [This video](https://www.youtube.com/watch?v=ne6tB2KiZuk) always blows my mind.

As far as scale deviation goes, it depends on culture, since not all humans use the same scales. Asian and Middle Eastern music, for example, use some wildly different scales from Western music; that's part of why music from India (for example) can sound so jarring to the Western ear.

I'd say first that it's less the key or scale than the "distance" between notes (called "intervals") that we hear as an audience. A songwriter can set an expectation of a key and then play with those intervals to mess with a listener in fun ways. "Smells Like Teen Spirit" is a very famous example of a chord progression that deviates, pleasantly, from what the listener "wants" to hear. Rock music usually revolves around 4 chords in a pattern, which is why you can almost always guess the next chord in a song. But in SLTS, Cobain breaks that into 2 pairs of chords: the 1st and 2nd are a pair, then he shifts to a "surprise" 3rd chord but plays the same interval from 3 to 4 as he does from 1 to 2. That gives the song that "allllmossst" feeling, with a bit of anxiety in the listener's ear. Cobain used this trick very, very well.

Finally, you can make "deviations" sound normal by doing whole key changes midway through a song, sometimes repeatedly. Listen to "Africa" by Toto (or the Weezer cover). That song actually jumps through 3 distant scales and keys from verse to chorus, and you barely notice it, but it's just jarring enough to make it a total ear worm.

Anonymous 0 Comments

It comes down to the basic principles of what sound is and what chords are. The waveforms of pleasant-sounding chords overlap in ways that constructively add to each other, resulting in a rich but still relatively coherent waveform. When a note is off, it doesn't add to the other notes quite right, so the resulting waveform is much messier, causing a dissonance we can hear. As to how far off a note can be, there is probably some correlation between the math of how the waveforms add up and the point at which a human can detect the mismatch.
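One rough way to put numbers on that "messiness" (my own illustration, not part of the answer above): the combined waveform of two tones repeats with a period set by the greatest common divisor of their frequencies, so an in-tune interval repeats quickly and coherently, while a slightly detuned one takes far longer to repeat.

```python
from math import gcd

# Two notes a perfect fifth apart (3:2 ratio): 440 Hz and 660 Hz.
# Their combined waveform repeats at the gcd of the two frequencies.
in_tune = gcd(440, 660)   # repeats 220 times per second: very coherent

# Detune the upper note by 5 Hz and the shared period gets far longer,
# i.e. the combined waveform is much less regular ("messier").
detuned = gcd(440, 665)   # repeats only 5 times per second

print(in_tune, detuned)  # 220 5
```

This is a simplification (real notes aren't integer-frequency sine waves), but it captures why an off note makes the summed waveform lose its regularity.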

Another interesting thing to note is that some people have perfect pitch and can tell if a note is off just on its own. But most people do not and wouldn’t be able to tell without some reference note. As such, if you were to play a solo on an instrument that has all of its notes shifted slightly out of tune by the same amount, most people wouldn’t be able to notice as all the waveforms would overlap nicely with each other.
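That last point can be checked directly: shifting every note by the same factor leaves all the ratios (and therefore the intervals) untouched. A quick sketch, using rough frequencies for an A major triad as the example:

```python
# Approximate frequencies of an A major triad (A4, C#5, E5) in Hz.
notes = [440.0, 554.37, 659.26]

# Shift the whole instrument sharp by 1% -- every note scaled equally.
shifted = [f * 1.01 for f in notes]

# The interval ratios between adjacent notes are unchanged, so the
# music still sounds in tune with itself (only absolute pitch moved).
orig_ratios = [b / a for a, b in zip(notes, notes[1:])]
new_ratios = [b / a for a, b in zip(shifted, shifted[1:])]
print(all(abs(x - y) < 1e-9 for x, y in zip(orig_ratios, new_ratios)))  # True
```

Only a listener with absolute (perfect) pitch, or one holding a correctly tuned reference, would notice the shift.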

Anonymous 0 Comments

It depends.

In practical terms, the intervals between pitches in music work because of their approximate frequency ratios rather than their absolute values having meaning on their own. And if you recall from math class, the difference between two numbers in a fixed ratio depends on how big the numbers are. An octave in music is a 2:1 ratio, where the higher note vibrates at twice the frequency of the lower one. Most musicians tune using a reference pitch of A4 = 440 Hz; its octave is 880 Hz, so there are 440 Hz between the two. If you start at 880 Hz, that note's octave is 1760 Hz and the distance is 880 Hz. But in terms of music notation, the same 12 notes are present in each octave, so the 12 semitone steps of the lower octave are squeezed into 440 Hz while the same 12 steps of the higher octave are spread across 880 Hz: each step up there covers roughly twice as many Hz. (Strictly, each semitone multiplies the frequency by 2^(1/12) rather than adding a fixed amount, but the doubling-per-octave point holds.) We can more easily identify the tuning of notes that are close together and higher in pitch, but have more difficulty judging the interval between notes that are further apart or lower in pitch.
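The doubling of Hz-per-semitone from one octave to the next can be sketched with the standard equal-temperament formula (my own illustration):

```python
# Equal-tempered frequency: each semitone multiplies by 2**(1/12).
def freq(semitones_above_a4: int, a4: float = 440.0) -> float:
    return a4 * 2 ** (semitones_above_a4 / 12)

gap_low = freq(1) - freq(0)     # A4 -> A#4: about 26.2 Hz
gap_high = freq(13) - freq(12)  # A5 -> A#5: about 52.3 Hz

# The same one-semitone interval spans exactly twice as many Hz
# an octave higher, because the ratio (not the difference) is fixed.
print(round(gap_low, 1), round(gap_high, 1))  # 26.2 52.3
```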

Anonymous 0 Comments

It varies *enormously*.

My housemate can't hear the difference between notes closer than about a tone apart, and I have another friend who, hilariously, couldn't even tell when she was over-blowing a note while being forced to learn the euphonium as a child (she wasn't sure even when the notes were an octave apart!)

I, in contrast, regularly find my cheapo guitar tuner not quite sensitive enough to do the job without going back and tweaking by ear after the fact, and that's a [cent]( https://en.wikipedia.org/wiki/Cent_(music) ) or two at most (a cent being a hundredth of a semitone). I'm sure there are plenty of professional musicians who out-do me by a fair margin – I'm just a dabbler in the musical arts who happens to have a fairly good ear.
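For a sense of scale, the distance between two frequencies in cents is 1200·log2(f2/f1), so that "cent or two" corresponds to being only about half a Hz off at A440. A sketch (the 440.5 Hz figure is just my example):

```python
from math import log2

def cents(f_ref: float, f: float) -> float:
    """Signed distance from f_ref to f in cents (100 cents = 1 semitone)."""
    return 1200 * log2(f / f_ref)

# A string tuned to 440.5 Hz instead of A4 = 440 Hz is ~2 cents sharp.
print(round(cents(440.0, 440.5), 2))
```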

Of course, the timbre of a sound makes a fair difference. If the sound you're listening to is a pure sine wave, then it'll only have one frequency in it, and that makes measuring that single frequency easier (by ear or with whatever other apparatus you might want). If it's something with lots of frequencies that sit neatly on the harmonic series (i.e. the main frequency and simple multiples of it) then that's even better^1. There's a limit to this, though – a square wave is what you get in the limit of summing the odd harmonics (with ever-smaller amplitudes), and it ends up sounding a little "noisy", so there's presumably a sweet spot somewhere between sine and square waves.
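That harmonic-sum limit can be sketched numerically: the Fourier series of a square wave is (4/π)·Σ sin(2π(2k−1)ft)/(2k−1) over the odd harmonics, and adding more terms pushes the sum toward ±1 (with the famous Gibbs overshoot at the jumps). My own illustration:

```python
from math import pi, sin

def square_partial(t: float, f: float = 1.0, n_odd: int = 1000) -> float:
    """Partial Fourier sum of a square wave: odd harmonics, 1/n amplitudes."""
    return (4 / pi) * sum(
        sin(2 * pi * (2 * k - 1) * f * t) / (2 * k - 1)
        for k in range(1, n_odd + 1)
    )

# Midway through each half-cycle the sum is very close to +/-1,
# i.e. the tone has converged on a square wave.
print(round(square_partial(0.25), 2), round(square_partial(0.75), 2))
```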

If the sound has lots of non-harmonic overtones (e.g. the base frequency multiplied by some non-obvious ratio like 1.43x or whatever) then that can make it difficult to hear it as a single note at all, let alone identify the base frequency precisely. Examples of sounds rich in this way include the "gravelly" voices of certain singers.

It’s worth noting that “being exactly on key” isn’t necessarily what we mostly train our hearing to, though. A lot of music uses deliberately not-quite-in-tune notes for stylistic reasons and we learn to perceive these notes as the note they’re nearest to, because that’s how that kind of music works.^2

Is there a fundamental limit to how good a human ear can get? Presumably, there's a point at which the difference between two notes is smaller than the gap between two consecutive hair cells in the cochlea. That might not be a hard limit, though, as there are subtle clues even to higher frequency resolutions than that. Consider what neural networks do to upscale video, for example – we're probably doing similar things all the damn time in our auditory cortices (and elsewhere in our brains, of course). I can't imagine that would extend your resolution by a *long* way, though.

I’d love to hear from somebody with a really *really* in-depth understanding of acoustics & ear physiology to get some actual *numbers* to this 😛


^1 Our ears/brains are very good at responding to the harmonic series and we’ll even “hear” frequencies lower than our ears can actually directly detect by *inferring* them from higher harmonics. A sound with a rich set of overtones fitting the harmonic series will hit all these sweet spots, giving our brains a lot to work with to nail down the base frequency.

^2 Obvious examples include sliding into/out of a note, or using vibrato (wibbling around slightly either side of a note) which we perceive as the note at one end or other of the bend, or the one in the middle of the wibble, even if most people could easily tell the difference between the note as played and the note as written if we heard them side by side.

A particularly amusing example is the first note of “Yesterday” by the Beatles, which isn’t really one note or the other at all, causing much consternation among those trying to score the tune! I wanted to link an Adam Neely video that touched on this, but haven’t managed to find it, sadly…

Anonymous 0 Comments

There are two different kinds of “off key” notes.

The first is a “wrong note.” This is a note that is played outside of the expectations set by the music around it. Maybe it is outside of the key the music is in, or descends when the melody wants to ascend. A wrong note is quite easy to identify in a piece of music that you are familiar with but very difficult to confidently identify in music you aren’t familiar with. Even if it sounds jarring or odd, maybe that was the point. If someone plays the “wrong note” with enough confidence, who’s to say it’s wrong?

The second is "out of tune." This is when the tone produced is *close* to the intended note but not quite there. The hallmark of an out-of-tune note is that it creates "waves" or "beats" when played alongside the other notes it is meant to harmonize with. Music that is badly out of tune will sound "off" even to an untrained ear, though it might be confused with poor technique in some other aspect of the playing. It can take a relatively trained ear to really hear when something is out of tune, but if you can hear it, you can be pretty confident that the music is "wrong." Musicians almost never play out of tune on purpose.
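Those beats fall straight out of a trig identity: sin(a) + sin(b) = 2·sin((a+b)/2)·cos((a−b)/2), so two close tones sound like a single tone at the average frequency whose loudness swells |f1 − f2| times per second. A sketch (the 3 Hz detuning is my example, not from the answer):

```python
from math import sin, cos, pi

f1, f2 = 440.0, 443.0  # one note in tune, the other 3 Hz sharp

def mixed(t: float) -> float:
    return sin(2 * pi * f1 * t) + sin(2 * pi * f2 * t)

def carrier_times_envelope(t: float) -> float:
    # sin a + sin b = 2 sin((a+b)/2) cos((a-b)/2):
    # a 441.5 Hz "carrier" inside a slowly varying cosine envelope.
    return 2 * sin(2 * pi * (f1 + f2) / 2 * t) * cos(2 * pi * (f1 - f2) / 2 * t)

# The two forms agree; the |cos| envelope peaks twice per cycle,
# so the loudness swells at |f1 - f2| beats per second.
t = 0.1234
print(abs(mixed(t) - carrier_times_envelope(t)) < 1e-9)  # True
print(abs(f1 - f2))  # 3.0 beats per second
```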

Anonymous 0 Comments

If by 'wrong or off key' you mean hearing whether a note is out of tune, the typical limit is quite small, even for an untrained ear.

Musical pitch is a logarithmic scale, with each octave being a doubling of the sound frequency (or a halving, going down). In Western scales there are 12 semitones per octave, with each semitone spanning 100 cents. Normal adults are able to recognize pitch differences as small as 25 cents very reliably. One trained professional demonstrated the ability to tell pitches apart with a variation of only 5–6 cents. Some tone-deaf adults could not tell the difference between notes that were a whole tone or more (200+ cents) apart!
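Converted into Hz around A4 = 440 Hz (using the standard relation f2 = f1·2^(cents/1200)), those thresholds are strikingly small:

```python
def shift_by_cents(f: float, n_cents: float) -> float:
    """Frequency after moving f by the given number of cents."""
    return f * 2 ** (n_cents / 1200)

a4 = 440.0
# 25 cents (typical untrained threshold): about 6.4 Hz sharp of A4.
print(round(shift_by_cents(a4, 25) - a4, 1))
# 5 cents (the trained professional above): only about 1.3 Hz.
print(round(shift_by_cents(a4, 5) - a4, 1))
```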

Wikipedia link for [cent (music)](https://en.wikipedia.org/wiki/Cent_(music)#Human_perception)