Tassilo relies very heavily on the alleged analogy between language and music. I am convinced that this analogy is inappropriate because music is not language in any but the most metaphorical sense. Language - even a programming language - requires semantics, and semantics is strongly dependent on reference: the relation between symbols and the physical world. No such relation is required in music. The series of sounds C, E, G and Bb does not refer to anything in the physical world, whereas the series of sounds "the cat is on the mat" does refer to cats, mats and the physical relation of one object being on top of another. At best, music can be seen as analogous to the "emotive" part of language, i.e., the part consisting not of declarative or interrogative sentences, but of expressions like "Damn!", whose only purpose is to express emotions rather than to convey or to request information.
The crucial difference between the two becomes obvious when we note (as I implicitly did in my discussion of Stockhausen's Kurzwellen) that sound events without an underlying systematic structural organization can be perceived as music, but nothing of the sort is possible with language.
Finally, Tassilo points out that naive listeners deepen their musical experiences (and presumably their intuitive grasp of the structure of a musical work) when they "return again and again to the Beethoven symphonies, learning to hear more in the process". I cannot overstate how much I agree with this point, and it is precisely the point I made in an earlier post when I insisted that "It is by hearing, say, Beethoven's Fifth Symphony in performances by Nikisch, Toscanini, Furtwängler, Szell, Norrington, and Eötvös that will give one a comprehensive grasp on that symphony as a work of art."