Computers demand unambiguous notation, programmers want brevity, and maintainers of software want clarity. Users of mathematical notation also demand brevity and clarity, but the demand for unambiguity is less pressing in mathematics. Hence, unambiguous notation has mainly been developed in mathematical logic and computer science, with computer science being the field where unambiguity is needed most. It is therefore no surprise that computer science has developed a number of completely unambiguous formalisms (i.e. programming languages).
In many respects, mathematical notation is superior to that of computer science. This is because mathematics has had a longer time to develop its notation, and because computer science has been restricted to a character set of 96 characters and to typewriters that could merely arrange those characters as simple, linear strings. The present paper aims at combining the best of the two worlds.
Such a combination of notation from two worlds will necessarily offend both, as not all properties of each world will be included or even appreciated.
Section 2 presents a number of ambiguities that appear in contemporary mathematical notation. Section 3 presents the choices that were made in the development of the notation. Section 4 develops the notation itself.

Klaus Grue, August 27, 1996