You are reading content from Scuttlebutt
@aljoscha %wjgq9H1KBoWhyvWuxNcEvEb+eX0T/PV3V/cbVix+fhs=.sha256

I think there's too much in Western music theory that doesn't make a lot of sense, and this bothers me, since deep down, music is just math on sound waves.

An xkcd comic about non-experts failing to meaningfully criticize a purposefully developed system.

Western music notation is designed for efficient reading of western classical music. It is neither designed for easy writing, nor for easy reading of arbitrary music.

Correct usage of enharmonic equivalents is crucial for efficient sight-reading. c# e# g# I can parse, memorize, and play in an instant; c# f g# is nonsensical garbage I have to treat as three individual notes rather than a meaningful unit. If the piece is written in C# major, i.e., has 7 sharps, I don't even parse those three particular notes, I simply note I am to play the tonic and move on. It's the same pattern of dots in any key, merely translated on the y-axis.

Moving the pattern to certain positions yields not major chords but minor chords. And hey, those happen to be the correct choice while the piece stays in that key. The pattern just says "play the default chord". If I need to do something exceptional, say, in the key of C major, to play A major rather than a minor, the notation explicitly signals that something exceptional is going on, by adding a sharp.
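A minimal sketch of that "same pattern, different default quality" point, assuming C major and 12-tone equal temperament (the helper names are mine, for illustration only):

```python
# Toy sketch: the same "every other scale step" pattern, started on each
# degree of C major, yields major, minor, or diminished triads purely by
# position. Semitone values assume 12-tone equal temperament.
SEMITONE = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}
SCALE = ["C", "D", "E", "F", "G", "A", "B"]

def triad(degree):
    # Root, third, fifth: the identical visual pattern, shifted on the staff.
    return [SCALE[(degree + step) % 7] for step in (0, 2, 4)]

def quality(notes):
    root, third, fifth = (SEMITONE[n] for n in notes)
    gaps = ((third - root) % 12, (fifth - root) % 12)
    return {(4, 7): "major", (3, 7): "minor", (3, 6): "diminished"}.get(gaps, "other")

for d in range(7):
    print(triad(d), quality(triad(d)))
```

Degrees 1, 4, 5 come out major, degrees 2, 3, 6 minor, degree 7 diminished, without the notation ever spelling that out per note.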

This also explains why you'd ever find an e# minor chord: going from g# major to e# minor looks perfectly natural in western notation (because it is the relative minor, a super common concept), whereas going from g# major to f minor is nonsensical, and, written in the key of g# major (or in anything from which you would sensibly reach g# major), looks completely jarring.

Western notation happens to correctly convey absolute pitches, but those are fairly unimportant to most listeners. They have shifted over time! Significantly more important for the effects that music has is how one sound relates to the next. And that's where western notation really shines, in conveying relations between notes.

And it pays off for the trained practitioner. Reading music written by composers with perfect pitch who don't care about music theory, by contrast, is an effing pain.

Stuff like C# being the same as Db, but contextually having to use one of these over the other, just bothers me. Why would you have to call me only Andre on Tuesdays but call me Staltz on Wednesdays? It's the same person!!

Imagine someone asks you how to get to the train station. Why would you have to give them different directions depending on where they ask you? It's the same place!! You should just give them GPS coordinates, that is so much simpler and will definitely help them.
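The analogy can be made concrete: in 12-tone equal temperament C# and Db are the same pitch class, but the spelling records which staff letter (which scale degree) the note occupies. A minimal sketch, with invented helper names:

```python
# C# and Db name the same equal-tempered pitch class, but the spelling
# carries extra information: which letter name (staff position) it occupies.
LETTER = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def pitch_class(spelling):
    letter, accidentals = spelling[0], spelling[1:]
    return (LETTER[letter] + accidentals.count("#") - accidentals.count("b")) % 12

print(pitch_class("C#"), pitch_class("Db"))  # same pitch class: 1 1

# A major scale spelled from any tonic uses each of the 7 letters exactly
# once; that is the "directions from where you are" part of the notation.
def major_scale(tonic):
    letters = "CDEFGAB"
    start = letters.index(tonic[0])
    degrees = [(pitch_class(tonic) + step) % 12 for step in (0, 2, 4, 5, 7, 9, 11)]
    out = []
    for i, pc in enumerate(degrees):
        letter = letters[(start + i) % 7]
        diff = (pc - LETTER[letter]) % 12
        acc = "#" * diff if diff <= 2 else "b" * (12 - diff)
        out.append(letter + acc)
    return out

print(major_scale("D"))   # D major contains C#
print(major_scale("Ab"))  # Ab major contains Db
```

Whether the pitch class 1 is written C# or Db depends on which letter slot the key has left open, exactly like directions depend on where you start.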

@aljoscha %3UJjQjVECEmp4nWKJ3ORww0E5fNOrF7sQ87mz84TSGg=.sha256

Gah, sorry for ranting. But this particular topic happens to trigger me and I've independently encountered this "improvement" one time too often this week.

@andrestaltz %b46wRy2V76g36kpDcC9L+bbrVDQe5iNNebuL1m5mquw=.sha256

@Aljoscha

Western music notation is designed for efficient reading of western classical music. It is neither designed for easy writing, nor for easy reading of arbitrary music.

Well, we can agree on that, for sure. Western music notation feels a lot more biased than e.g. mathematical notation, and I think there are "language" aspects to it. Most spoken languages are not entirely logical; they could make small tweaks to be more logical, but they don't. I think Western music notation is like that, and nothing I have seen so far makes it universal and absolute. In particular, I think there could be alternative music notations that preserve all the qualities of the classic Western one but improve on the "pay off" efficiency (they would be quicker to learn). E.g. there is nothing inherently accidental about the accidental notes. People just happened to favor C major for whatever historical reason. And to favor major and minor scales, by the way. Music today is far more diverse, and we need music notation that breaks those biases.

I am equally triggered by claims that Western music notation is perfect and you just need to accept it and practice until you're good. :))))))

@aljoscha %tVa4wB5E+unQXGFrAQ6Zl8OvYF9jrV8CapoB5DRONcc=.sha256

I'm not claiming it's perfect. I'm pointing out it's a hybrid between absolute and relative (to the tonal center) instructions on what to do.

What triggers me is when people (frequently computer scientists) completely fail to see the value in those relative instructions (or their existence), design a system that only conveys absolute instructions, and then claim it to be superior.

Look at an article like https://musicnotation.org/tutorials/intervals/. They just assume that current notation is dumb, without devoting a single word to considering why it "distorts" intervals the way it does.

Or they have lovely passages like

In other words, the visual representation is not proportional to the sound it is representing. What one sees does not correspond to what one plays or hears.

No! Of course a diminished seventh sounds completely different from a major sixth, even on a piano. Because neither of them occurs in a vacuum.

Or https://musicnotation.org/tutorials/reading-playing-music-intervals/. Playing/improvising/singing by interval between individual notes is how they claim making music works. But that's just a stepping stone to thinking relative to a tonal center. For actual mastery, thinking in intervals is a bad habit that needs to be overcome. Traditional music notation is not "badly designed"; it is tailored to (a particular kind of) mastery.

That whole website is a shining example of never trying to falsify one's hypotheses. They claim that interval notation is full of "ambiguities and inconsistencies". Do they ever stop to question whether that assumption might be false? No! What they should do is let 100 classically trained musicians transcribe Bach by ear, then analyze how many inconsistencies there are. I'm happy to wager that not a single classically trained pianist, conductor or composer would put down a major sixth where Bach wrote a diminished seventh.

@andrestaltz %Hf3lFC/1ylg4yOtFkTraIFsxxzxvS5zr0xdqm8csMEM=.sha256

@Aljoscha I know you're talking about notes in relation to the root in some tonality, but you don't seem to have addressed the basic argument that I started with, which is: why name only 7 notes in our 12-tone equal temperament note system? There are literally 12 notes, why are 5 of these named in relation to their neighbors? There should be unique names for all 12, e.g. A,B,C,D,E,F,G,H,I,J,K,L. Then of course you could pick a key like H minor and play I# or Jb or whatever.
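A minimal sketch of that hypothetical scheme, one unique name per semitone (the A..L letters come from the post; anchoring A at pitch class 0 is my arbitrary assumption):

```python
# Sketch of the proposed 12-letter naming: one unique name per semitone
# of 12-tone equal temperament, no accidentals needed for naming.
NAMES = "ABCDEFGHIJKL"

def name(pc):
    return NAMES[pc % 12]

# A natural-minor scale started on H (pitch class 7) under this scheme:
h_minor = [name(7 + step) for step in (0, 2, 3, 5, 7, 8, 10)]
print(h_minor)
```

Note that under such a scheme every scale is a different subset of letters, so the "same letters, one accidental" shorthand of key signatures no longer applies.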

That would make a lot more sense than the current classic music notation we have, and honestly I still haven't heard any convincing argument for why we stick with A,A#,B,C,C#,D,D#,E,F,F#,G,G#; all I hear is dogma, and I've been hearing it since I was 5 years old learning this shit. Like the YouTube video I linked, it's kind of crazy how dogmatic people get about this. See e.g. how long humans insisted on Pythagorean tuning, even to the point that the Catholic church persecuted those who used "bad intervals", until the Enlightenment came and we finally got rid of some bad logic in music tuning. We need to keep getting rid of illogical things in music.

@andrestaltz %a1VLLupOf3dBZVNvUFU1b/Az+8YQ/UC9LGRidQAdi0Y=.sha256

Oh wow, I made a quick search for "isomorphic keyboard/piano" and ended up finding something that resonates with me A LOT: the Dodeka keyboard and the Dodeka music notation.

@aljoscha %YOs2pJBTpKqh+v6PDmueSKAYP/qQ/AVT0DUUHivd1+U=.sha256

you don't seem to have addressed the basic argument that I started with, which is: why name only 7 notes in our 12-tone equal temperament note system?

Compression.

@aljoscha %IqM/AE8VW40wlJl3GqUVgUQRN305/oRMMCbJtKGqYZY=.sha256

Western classical music has a significant bias toward seven of the twelve possible notes. The notation is an organically grown, highly simplified variation on Huffman coding that provides a reasonable amount of compression with negligible decoding latency.

This is not primarily about written notation being more compact (though that's a nice bonus, all those 12-tone notations are comparatively unwieldy), but about increasing the bandwidth at which you can meaningfully read. Once you've internalized the decompression step, you will be more efficient than if you had to decode a 12-tone notation. Citation or experimental evidence needed, of course, but that's the underlying idea. It's not like past humans were too stupid to try out 12-tone-based notation. It just happens to be less efficient (citation needed again, admittedly; I'm at work and can't actually research this right now).
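A back-of-the-envelope version of the compression argument; the 95% in-key bias and the cost model are invented purely for illustration:

```python
# Toy comparison of symbol cost when note usage is biased toward 7 in-key
# notes. "Staff-like": a cheap symbol for the 7 diatonic notes, plus one
# extra accidental mark for the other 5. "Flat": all 12 names cost the same.
# The 0.95 bias figure is made up for illustration.
from math import log2

diatonic_bias = 0.95  # assumed share of in-key notes in a typical piece
staff = diatonic_bias * log2(7) + (1 - diatonic_bias) * (log2(7) + 1)
flat = log2(12)
print(f"staff-like: {staff:.2f} bits/note, flat 12-name: {flat:.2f} bits/note")
```

As long as in-key notes dominate, the 7-names-plus-accidentals scheme beats a flat 12-name alphabet on average cost, which is the Huffman intuition in miniature.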

Current notation is certainly not optimal in this respect, I'd be happy to read up on proposals that address the inefficiencies of writing in melodic minor. Unfortunately, I have yet to find a proposal that is even aware of issues of relative orientation and compression.

@aljoscha %AIsfEkY5oiGQ5sMQzsbPCpeff+cCamkqIbPKmg5Lik8=.sha256

For what it's worth, I agree that thinking about notes as A,A#,B,C,C#,D,D#,E,F,F#,G,G# is indeed a bad system. But imo the correct response is movable-do solfege, not a 12-tone-based notation.
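A minimal sketch of the movable-do idea: the syllables name scale degrees relative to the tonic, so the mapping has the same shape in every key (the scale spellings below are hardcoded for illustration):

```python
# Movable-do solfege: syllables name scale degrees, not absolute pitches,
# so the tonic is always "do" regardless of key.
SOLFEGE = ["do", "re", "mi", "fa", "sol", "la", "ti"]

def solfege(scale):
    # Map each note of a 7-note major scale to its degree syllable.
    return dict(zip(scale, SOLFEGE))

print(solfege(["C", "D", "E", "F", "G", "A", "B"])["G"])     # 5th degree in C major
print(solfege(["A", "B", "C#", "D", "E", "F#", "G#"])["E"])  # 5th degree in A major
```

The fifth degree is "sol" in both keys; the syllable encodes the relation to the tonal center, which is exactly the information the 12-absolute-name schemes throw away.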
