The inverse word play in this conference title indicates that space and music are always already intertwined. But while space and music relate to the symbolic order, the time-based (or time-basing) sonic signal rather performs a Möbius loop with “space”. It makes sense, therefore, to clearly differentiate, in the “spatial” context, between acoustics, sound, and music.
While the enframing is “spatial” (theatre stages and opera houses, but also the radio apparatus and computing architectures), the sonic event itself is a temporal product(ion). Any “musical” composition is a kind of geometrization of the genuinely temporal fabric, the woven carpet of sound, whereas the acoustic signal is a function of time. Musical notation (and its technological equivalent, “digitization”, as much as any sound “archive”) is a geometrization of sono-temporal patterns; for the acoustic signal, by contrast, there is no “space” but rather genuine time functions such as delay (known from echolocation). Where sound takes place, there is no space; McLuhan's term “acoustic space” is an oxymoron.
The archaeology of knowledge on the relationship between music and space becomes media-active archaeology when it comes to re-enacting faded-away past acoustics, be it ancient theatres or the songs of the Homeric Sirens. Sound, here, is no dramaturgical supplement but becomes genuine media theatre. With echography, radar, and the sonograph, however, “space” became a direct product(ion) of signal (re-)transmission.
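As a minimal illustration of how such techniques let “space” emerge from time alone, consider generic echo ranging (a hedged sketch, not a description of any specific apparatus named above): the only quantity the device actually registers is a delay between emission and returning echo; distance is a derived figure, computed afterwards.

```latex
% Generic echo-ranging relation (sketch): the measured quantity is the
% round-trip delay \Delta t; "distance" d is computed from it afterwards.
% v = propagation speed of the signal (roughly 343 m/s for sound in air,
% roughly 1500 m/s in water, the speed of light for radar).
d = \frac{v \cdot \Delta t}{2}
% Worked example with assumed values: an underwater echo returning after
% \Delta t = 0.2 s gives d = (1500 \cdot 0.2) / 2 = 150 m.
```

In this sense the “spatial” readout of sonar, radar, or sonography is nothing but a rescaled time measurement, which is the point made above: space here is a product(ion) of signal delay.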