I'm not claiming it's perfect. I'm pointing out that it's a hybrid of absolute and relative (to the tonal center) instructions on what to play.
What triggers me is when people (frequently computer scientists) completely fail to see the value of those relative instructions, or even their existence, design a system that only conveys absolute instructions, and then claim it is superior.
Look at an article like https://musicnotation.org/tutorials/intervals/. They simply assume that current notation is dumb, without devoting a single word to considering why it "distorts" intervals the way it does.
Or they have lovely passages like this:
"In other words, the visual representation is not proportional to the sound it is representing. What one sees does not correspond to what one plays or hears."
No! Of course a diminished seventh sounds completely different from a major sixth, even on a piano, because neither of them occurs in a vacuum.
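To make that concrete, here is a minimal sketch (the note encoding and the A-minor example are my own, not from the article): on the keyboard, a diminished seventh and a major sixth cover exactly the same nine semitones, so the "absolute" picture cannot tell them apart; the spelling is what records the difference in function.

```python
# A minimal sketch, assuming MIDI-style pitch numbering; the note encoding and
# the A-minor example are mine, not the article's.

PITCH_CLASS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}
LETTERS = "CDEFGAB"

def to_midi(note, octave):
    """MIDI number of a spelled note, e.g. to_midi('G#', 4) == 68."""
    letter, accidental = note[0], note[1:]
    shift = accidental.count("#") - accidental.count("b")
    return 12 * (octave + 1) + PITCH_CLASS[letter] + shift

def simple_interval(lower, upper):
    """(diatonic size, semitones) of an ascending interval within an octave."""
    size = (LETTERS.index(upper[0][0]) - LETTERS.index(lower[0][0])) % 7 + 1
    return size, to_midi(*upper) - to_midi(*lower)

# Same piano keys, same nine semitones ...
print(simple_interval(("G#", 4), ("F", 5)))   # (7, 9) -> a diminished seventh
print(simple_interval(("Ab", 4), ("F", 5)))   # (6, 9) -> a major sixth
# ... but the spelling is not arbitrary: in A minor, G# is the leading tone
# that resolves up to A while F resolves down to E; Ab implies different keys
# with different resolutions. The notation records that relative context.
```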
Or take https://musicnotation.org/tutorials/reading-playing-music-intervals/. Playing/improvising/singing by interval between individual notes is how they claim music-making works. But that's just a stepping stone to thinking relative to a tonal center; for actual mastery, thinking in intervals is a bad habit that needs to be overcome. Traditional music notation is not "badly designed", it is tailored to (a particular kind of) mastery.
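To make the distinction concrete, a minimal sketch (the encoding and the example melody are my own, not from the article): the same tune written once as a chain of intervals between successive notes and once as degrees relative to the tonic. The degree view is the one that a key signature plus staff position gives you essentially for free, and it is the one that survives transposition.

```python
# A minimal sketch of the two ways of thinking (the encoding and the example
# melody are mine, not the article's). Both notes stay within one diatonic scale.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]       # pitch classes of the scale, tonic first

def as_intervals(midi_notes):
    """Relative to the previous note: one slip and everything after it drifts."""
    return [b - a for a, b in zip(midi_notes, midi_notes[1:])]

def as_degrees(midi_notes, tonic=60, scale=C_MAJOR):
    """Relative to the tonal center: each note keeps its own meaning."""
    return [scale.index((n - tonic) % 12) + 1 for n in midi_notes]

melody = [60, 60, 67, 67, 69, 69, 67]  # "Twinkle, Twinkle" in C major
print(as_intervals(melody))            # [0, 7, 0, 2, 0, -2]
print(as_degrees(melody))              # [1, 1, 5, 5, 6, 6, 5]
```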
That whole website is a shining example of never trying to falsify one's hypotheses. They claim that interval notation is full of "ambiguities and inconsistencies". Do they ever stop to question whether that claim might be false? No! What they should do is have 100 classically trained musicians transcribe Bach by ear and then analyze how many inconsistencies turn up. I'm happy to wager that not a single classically trained pianist, conductor, or composer would put down a major sixth where Bach wrote a diminished seventh.