Research


Research is a systematic investigation of some aspect of thought or reality which leads to transferable knowledge. 
In artistic research, this knowledge, embedded in compositional or performative work, may be expressed through diverse media, including but not confined to written text.

 

Laura Agnusdei (2nd-year master’s)

Combining wind instruments and electronics within the timbral domain

My research at the Institute of Sonology focuses on composing with wind instruments and electronics. The starting point is my background as a saxophone player, and my compositional process aims to expand the timbral possibilities of my instrument while still preserving its recognisability. In line with my artistic interest in blurring the perceptual distinction between acoustic and electronic sounds, I process acoustic sounds from my instrument using digital software such as Spear, CDP and Cecilia – carefully selecting procedures based on analysis and resynthesis techniques. Furthermore, future developments of my project will lead me to work with different recording techniques and to explore how the perception of wind instrument sounds can change depending on their position relative to a microphone and the space in which a recording takes place.

Musical points of reference in my compositions are drawn from different contexts – free jazz, electroacoustic composition, experimental rock – and my interest in timbre has also encouraged me to explore many extended techniques on my instrument. Additionally, when composing for saxophone I consider the special position that instrument occupies in music history, straddling African-American culture, pop music and contemporary classical music. Aside from the saxophone, however, I plan to use other wind instruments in order to investigate their specific timbral peculiarities and how to use their sonorities in a personal way.

More precisely, I am interested in working with acoustic instruments because of the possibilities they offer not only in timbre but also in expressiveness, and in how these qualities can be translated into the electronic domain. My research will therefore take place in the studio as well as in live performance. The final outcome of my studies should be to incorporate the discoveries I make while mixing and processing into my live set. Moreover, in my improvisational practice (both solo and in groups) I want to expand my research in order to combine live processed sounds with purely acoustic ones, and to experiment with different amplification techniques.

 

Görkem Arikan (2nd-year master's Instruments & Interfaces)

Interactive Chair: A Physical Interface for Live Electronic Music Performance

In live computer music, many remarkable works have been created that utilise new sound sources and digital sound synthesis algorithms. However, in live electronic music concerts there is often a lack of visual/physical feedback between the musical output and the performer, since looking at a screen conveys little to an audience about a performer's actions. Therefore, in my concerts I have been looking for ways to minimise the need to look at the screen by using various kinds of setups, largely those consisting of MIDI controllers and sensor-based systems that transform physical acts into sound.

At the Institute of Sonology, I am focusing on building an interactive system consisting of an office chair equipped with various sensors that allow the performer to emphasise their movements, specifically tilting, rotating and bouncing. Inevitably, a crucial part of the project is how I interpret these movements, which relies on mapping strategies as well as on the quality and functionality of the sound engine. Overall, my goal is to design an interface mediating between the system and the performer in order to enable an expressive performance.
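One basic mapping strategy of the kind described above can be sketched as follows. This is a minimal, hypothetical illustration – the sensor names, value ranges, and parameter choices are assumptions for demonstration, not the actual system:

```python
# Hypothetical sketch: mapping chair-sensor readings (tilt, rotation, bounce)
# to synthesis parameters. Sensor ranges and parameter names are illustrative.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a sensor reading into a parameter range, clamped to it."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

class Smoother:
    """One-pole low-pass filter to tame jittery sensor data."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.state = None

    def __call__(self, x):
        self.state = x if self.state is None else (
            self.alpha * x + (1 - self.alpha) * self.state)
        return self.state

smooth_tilt = Smoother()

def map_sensors(tilt_deg, rotation_deg, bounce_g):
    """Turn raw chair movements into a dict of synthesis parameters."""
    return {
        "cutoff_hz": scale(smooth_tilt(tilt_deg), -30, 30, 200, 8000),
        "pan":       scale(rotation_deg, 0, 360, -1.0, 1.0),
        "amp":       scale(bounce_g, 1.0, 2.5, 0.1, 1.0),
    }
```

Even in this toy form, the two ingredients – range scaling and smoothing – mark the difference between raw sensor data and something playable.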

During my research I will explore prior studies and methodologies including sound synthesis techniques, digital signal processing algorithms, mapping strategies, micro-controller programming, wireless data-transfer techniques and devices, as well as the potential use of compatible sensors. Following the realisation of such a system I will conduct a user study and then discuss the results in the written part of my thesis (including its entire construction, a user evaluation, and future implementations of the system). 

The system will be presented in a public performance by a musician familiar with the system, in addition to an exhibition where the audience will have the opportunity to physically experiment with the system simply by sitting and moving in the chair.

 

Matthias Hurtl (2nd-year master’s)

DROWNING IN ÆTHER
software defined radio – a tool for exploring live performance and composition

In my practice I am often fascinated by activities happening in outer space. Currently, I am interested in the countless number of signals and indeterminate messages from many of the man-made objects discreetly surrounding us. Multitudes of satellites transmit different rhythms and frequencies, spreading inaudible and encoded messages into what was once known as the æther. Whether we hear them or not, such signals undeniably inhabit the air all around us. Radio waves, FM radio, WiFi, GPS, cell phone conversations – all of these signals remain unheard by human ears as they infiltrate our cities and natural surroundings. Occasionally though, such signals are heard by accident, emerging as a ghostly resonance, a silent foreign voice, or as something creating interference in our hi-fi systems. Yet aside from these accidental occurrences, tuning into these frequencies on purpose requires a range of tools, such as FM radios, smartphones, wireless routers and navigation systems.

Presently, my research at the Institute of Sonology includes placing machine transmissions into a musical context, exploring what inhabits the mysterious and abstract substance once referred to as æther. This exploration fundamentally delves into how one might capture these bodiless sounds in a tangible system, so they can be used as frequency material, much like an oscillator or any other sound source. Additionally, I employ methods of indeterminacy: a methodology assisting the emergence of unforeseeable outcomes. Specifically, this includes using external parameters that engage with chance and affect details or the overall form of my work.
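The core signal-processing step behind treating a radio carrier like an oscillator can be sketched with a few lines of NumPy. This is an assumed, simplified workflow: real software-defined radio capture delivers complex IQ samples from hardware, whereas here the input is synthesised so the example is self-contained:

```python
# Minimal sketch (assumed workflow): mixing a complex IQ signal, as a
# software-defined radio would deliver it, down into the audible band so a
# carrier frequency can be played like an oscillator. The IQ input here is
# synthesised; a real setup would read samples from an SDR device.
import numpy as np

sample_rate = 48_000             # audio rate, for simplicity
carrier_hz = 12_000              # toy stand-in for a captured carrier

t = np.arange(sample_rate) / sample_rate
iq = np.exp(2j * np.pi * carrier_hz * t)        # stand-in for captured IQ

def shift_to_audio(iq, shift_hz, sample_rate):
    """Multiply by a complex local oscillator to shift the spectrum down,
    then keep the real part as an audio signal."""
    n = np.arange(len(iq))
    lo = np.exp(-2j * np.pi * shift_hz * n / sample_rate)
    return np.real(iq * lo)

# Land the 12 kHz carrier on 440 Hz, i.e. play it as a concert-A oscillator.
audio = shift_to_audio(iq, carrier_hz - 440, sample_rate)
```

The same multiplication-by-a-local-oscillator idea is what an SDR receiver does internally when tuning; here it simply relocates an inaudible frequency into hearing range.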

Principally, this research project will focus on grabbing sound out of thin air and using it in a performative setup or within a composition. I expect this to involve uncovering signals that are as concealed as possible, finding methods for listening to patterns in the static noise, and using tools to generate sound from bursts of coded or scrambled signals.

 

Slavo Krekovic (2nd-year master's Instruments & Interfaces)

An Interactive Musical Instrument for an Expressive Algorithmic Improvisation

The aim of this research is to explore design strategies for a touch-controlled hybrid (digital-analogue) interactive instrument for algorithmic improvisation. In order to achieve structural and timbral complexity in the resulting sound output while maintaining a great degree of expressiveness and intuitive ‘playability’, I will examine simulations of complex systems – drawing inspiration from natural systems – and their external influence via touch-sensor input. The sound engine should take advantage of the specific timbral qualities of a modular hardware system, with an intermediate software layer capable of generating complex, organic behaviour, influenced by touch-based input from the player in real time. The system should embody an ongoing search for balance between deterministic and more chaotic algorithmic sound generation in a live performance situation.

The research focuses on the following questions: What are the best strategies for designing a ‘composed instrument’ that is capable of autonomous behaviour while remaining responsive to external input? How can the limitations of traditional parameter control of hardware synthesizers be overcome? How can the deterministic and more unpredictable attributes of a gesture-controlled interactive music system be balanced for an expressive improvised performance? The goal is to use the specific characteristics of various sound-generation approaches but to push their possibilities beyond the common one-to-one parameter-mapping paradigm, thus allowing more advanced control leading to interesting musical results.
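The contrast with one-to-one mapping can be made concrete with a small sketch: instead of each gesture driving exactly one parameter, a weight matrix lets every gesture influence several parameters at once. The gesture and parameter names below are hypothetical illustrations, not the instrument's actual design:

```python
# Illustrative many-to-many mapping: each control gesture contributes to
# several synthesis parameters through a weight matrix, in contrast to the
# common one-to-one paradigm. All names and weights are hypothetical.

# Rows: gestures (pressure, x position, y position).
# Columns: parameters (pitch mod, filter cutoff, grain density).
WEIGHTS = [
    [0.8, 0.3, 0.0],   # pressure
    [0.1, 0.7, 0.2],   # x position
    [0.0, 0.2, 0.9],   # y position
]

def many_to_many(gestures):
    """Combine normalised gesture values (0..1) into parameter values (0..1).

    Each output parameter is a weighted sum over all gestures, clipped at 1.0,
    so a single touch shapes the sound along several dimensions at once."""
    params = []
    for col in range(len(WEIGHTS[0])):
        v = sum(g * WEIGHTS[row][col] for row, g in enumerate(gestures))
        params.append(min(1.0, v))
    return params
```

Cross-coupled weights like these are one simple way a single touch can produce the kind of organic, entangled response that one-to-one knob assignments cannot.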

 

Hibiki Mukai (2nd-year master’s)

An Interactive and Generative System Based on Traditional Japanese Music Theory 

Notation in western classical music originated from the need for a score that was easy for the general public to understand and that could be passed on to future generations. Since then, this western notation system has been widely used as a universal language throughout the world. However, its popularity does not mean that no important sonic information is lost. In contrast, most traditional Japanese music was handed down to the next generation orally, but was also accompanied by original scores that conveyed subtle musical expressions. These 'companion scores' were written with graphical drawings and Japanese characters.

In my research at the Institute of Sonology, I plan to reinterpret traditional notation systems of Japan (i.e. Shōmyō 声明 and Gagaku 雅楽) and to design real-time interactive systems that analyse the relationships between these scores and the vocal sounds made by performers. In doing this, I plan to generate notation that exists in digital form. This will allow me to control parameters such as pitch, rhythm, dynamics and articulation by analysing intervals in real time. Furthermore, this research will culminate in using this system to realise a series of pieces for voice, western musical instruments (e.g. piano, guitar, harp) and live electronics.
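The interval-analysis step can be sketched in miniature: track successive pitches and derive control values from the melodic intervals between them. The mapping rules below (leap width to dynamic, interval size to articulation) are invented for illustration and are not the author's actual system:

```python
# Hypothetical sketch of interval analysis driving control parameters:
# successive pitches from a vocal line yield signed semitone intervals,
# and each interval is mapped to (dynamic, articulation) values.
# The mapping choices are illustrative assumptions.

def intervals(midi_pitches):
    """Signed semitone intervals between successive MIDI pitches."""
    return [b - a for a, b in zip(midi_pitches, midi_pitches[1:])]

def interval_to_params(semitones):
    """Map one interval to a (dynamic, articulation) pair."""
    width = abs(semitones)
    dynamic = min(1.0, width / 12.0)             # wider leap -> louder
    articulation = "legato" if width <= 2 else "marcato"
    return dynamic, articulation

melody = [62, 64, 69, 67]                        # D4, E4, A4, G4
controls = [interval_to_params(i) for i in intervals(melody)]
```

In a real-time setting the pitch list would come from a live pitch tracker rather than fixed MIDI values, but the interval-to-parameter logic would be the same.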

Overall, I believe it is possible to adapt this traditional Japanese music theory to a western system by using electronics as an intermediary that processes data. In addition, by re-inventing traditional Japanese notation I expect it to become easier to access expressive ideas within western notation. Using this type of score, I also aim to extend many of the techniques of western musicians and composers by designing interactive relationships between instruments, the human voice, and the computer.

 

Yannis Patoukas (2nd-year master’s)

Exploring Connections Between Electroacoustic Music and Experimental Rock Music of the Late 60s and early 70s

From the late 60s until the late 70s, the boundaries between music genres and styles, including popular music, became blurred. This situation resulted from rapid and numerous socio-economic changes and an unforeseen upheaval in the music industry. Rock music, later labelled “progressive rock” or “art rock”, separated itself aesthetically from “mass pop” and managed to blend art, avant-garde and experimental genres into one style. The shift from merely capturing a performance to using the recording as a compositional tool also led to increasing experimentation that created many new possibilities for rock musicians.

Undoubtedly, many bands were aware of the experiments in electroacoustic music in the 1950s (elektronische Musik, musique concrète) and drew on the compositional techniques of avant-garde and experimental composers. However, many questions arise about why and how art rock was connected to experimental and avant-garde electroacoustic music, and about whether it is possible to trace common aesthetic approaches and production techniques between the two genres.

My intention during my research at the Institute of Sonology is to elucidate and exemplify possible intersections between experimental rock and the field of electroacoustic music, especially in terms of production techniques and aesthetic approaches. The framework of this research will include a historical overview of the context from which experimental rock emerged, exploring why and how certain production techniques were used in rock music at that period, and investigating whether the aesthetic outcome of these techniques relates to experiments in the field of electronic music.

Parallel to this theoretical research, I also plan to attempt a reconstruction of some production techniques which I will explore for the sake of developing my own aesthetic and compositional work. The driving force behind this type of reconstruction will include exploring tape manipulation, voltage control techniques, and their application to different contexts (such as fixed media pieces, live electronics and free improvisation).

 

Orestis Zafiriou (2nd-year master's)

Mental Images Mediated Through Sound

I am interested in researching correlations between physical and perceptual space in the process of composing music. Concentrating on the ability of human perception to create images and representations of the phenomena it encounters, I propose a compositional methodology in which the behaviour and the spatiotemporal properties of matter are translated into musical processes and represented through electronic music.

Sounds in my work represent objects and events, as well as their movement in space-time. This implies that a musical space is formed by a succession of mental images unfolded in the perceptual space of the composer when encountering these events. With this method I also aim to point out the importance of music as a means to communicate objective physical and social processes through a subjective filter, both from the standpoint of a composer as well as from the perception of a listener.

Orestis Zafeiriou studied mechanical engineering at the Technical University of Crete, completing his thesis on acoustics, specifically the physical properties of sound and its movement through different media (active noise control using the finite-element method). In addition to his studies, he actively composes music in the band Playgrounded, which released its first album, Athens, in 2013 and its second, In Time with Gravity, in October 2017.

 

Katrina Burch Joosten (2nd-year master's)

Diagrammatic Blue Deeps: Transforming Sound for Poetry

My compositional work at the Institute of Sonology combines electroacoustic music, extended vocalism, digital processing, poetry, and abstract theatre. Using digital sound-transformation tools, I explore the voice and poetry to investigate how phononormativity is shaped by the temporality of language. My research considers the role of what I call ‘erotic abduction’ and of symbolic language in the formation of impermanent, sounding mental images.

My research opens from a feminist perspective: I seek to question how and why the de-reification of sound can become a starting point for the deracination of the listening subject. I probe feminist writings to map out philosophies of (dis)embodiment, intuitive rationality, and anthropologies of alienation and transformation. Interweaving these philosophical threads and complex memory systems sustains my compositional practice and situates my work within a broader social milieu. Finally, I seek to intersect differing starting points for subjectivity, through voice and poetry, in order to generate sonic myths centred on emancipatory notions of technological futurity.