ICLI 2018, 4th International Conference on Live Interfaces. Inspiration, Performance, Emancipation.
Edited by José Alberto Gomes, Miguel Carvalhais, Rui Penha.
Thursday, June 14
|14:00|FBAUP, PS13B|Doctoral Symposium|
The Doctoral Symposium is limited to presenting participants and chairs.
|21:00|Passos Manuel|Conference Opening|
|21:30|Passos Manuel|Performance Session 1|
Friday, June 15
|9:00|Casa da Música|Registration|
|10:00|Casa da Música, Cyber|Keynote|
|11:00|Casa da Música, Cyber|Paper Session 1|
|12:30|Casa da Música|Lunch|
|14:00|Casa da Música, Cyber|Paper Session 2|
|15:30|Casa da Música, Sala de Ensaios 2|Performance Session 2|
|17:00|Casa da Música|Coffee Break|
|17:30|Casa da Música, Cyber|Performance Session 3|
Saturday, June 16
Rajele Jain From the Natyashastra
How can a person who does not feel sorrow cry in pain? How can a miserable person appear joyful in happiness? When one feels sorrow or joy and sheds tears or feels thrilled, that is called his emotion; and so the bhava is called emotional.
That which conveys the meaning intended by the poet through words, physical gestures and facial changes is a bhava.
There are four ways of expression (or acting) — physical, verbal, material and emotional.
Rasa is the cumulative result of vibhava (stimulus), anubhava (involuntary reaction) and vyabhicari bhava (voluntary reaction). For example, just as when various condiments and sauces and herbs and other materials are mixed, a taste (different from the individual tastes of the components) is felt, or when the mixing of materials like molasses with other materials produces six kinds of tastes, so also along with the different bhavas (emotions) the Sthayi bhava becomes a "taste" (rasa, flavour, feeling).
Rasa is the seed of all (Sthayi) bhava-s (of the spectators).
Based on the elements and functions of interaction and mediation fundamentally described in the Natyashastra (Indian dramaturgy), a definition of interface is extracted that can also enrich current research on digital interfaces. In particular, the constitution of the audience and the importance of emotions, their triggers and carriers, are often neglected in the largely technologically shaped discussions about interfaces, while in marketing applications emotion degenerates into a simple manipulation strategy. Indian theory and practice on the possibility of conveying meaning is a rich source for an understanding of what an interface could be.
Andrew McPherson Comparative Musical Instrument Design
The design of digital musical instruments (DMIs) serves many simultaneous goals, both aesthetic and technical. While most instruments are first and foremost artistic products, their creation and use can also yield insight on how musicians creatively interact with technology, and DMIs can even inform human-computer interaction research beyond the musical domain. This talk discusses a comparative approach to musical instrument design, in which two or more variations on the same instrument are created and compared in a performance context. Several case studies will be presented, drawing on the work of members of the Augmented Instruments Laboratory at Queen Mary University of London. In our lab, comparative instrument design has been used to investigate themes including accessibility to novices, skill transfer for experts, perception of the audience, hackability and appropriation. The talk will present the specific instruments and what we learned from them, concluding with a general reflection on how individual DMI designs can simultaneously serve goals of research and artistic practice.
Vincent Goudard Gestural Ergonomics of Visual Interfaces: The MP.TUI Library for Max
The design of digital musical instruments, freed from the physical constraints of acoustics, is essentially driven by issues in ergonomics and representation related to the musical context. Moreover, the programmability of virtual instruments allows dynamic reconfigurations of mapping relationships between gestural interfaces and synthesis. In this respect, graphical interfaces stand on the edge between representation and control. Recently enhanced by the advent of multitouch, they allow all kinds of tangible interactions. Their customization (behaviour, shape, colour, etc.) plays a crucial role, whether for the virtuosity of professional musicians, for the accessibility of people with disabilities or for particular contexts such as collective interaction on the same touch-screen. I will first raise a few aspects of visual ergonomics that inspired this research, then present recent developments of dynamic, polyphonic and customizable touch-screen interfaces, based on the concept of "dynamic intermediate model" and an ad-hoc protocol for expressive control.
Raul Masu and Nuno N. Correia Penguin: Design of a Screen Score Interactive System
In this paper, we present Penguin, a system for live scoring, and Studio I, a piece composed for the system and an accordion. The system comprises two modules: one that generates a musical stream in real time and another that manages a live scoring process. Penguin is designed to be used in interactive performances alongside traditional instruments. Studio I is a piece for Penguin and accordion. The interaction design of the system and the piece were fine-tuned in collaboration with the instrumentalist. We provide a general description of Penguin and present the design process that led to the development of the interactive performance. The design process led to two main contributions. Firstly, we identify and frame a new performer role that mixes performing and conducting elements. Secondly, we discuss how the design process of the system affected the ownership of the aesthetic of the music.
Raffaella Folgieri, Maria Elide Vanutelli, Paola Maria Sala, Ludovico Dei Cas, Dario Dei Cas, and Claudio Lucchiari The Creative Mind: DRACLE Further Development
The presented performance, using an EEG-BCI (Brain Computer Interface), is dedicated to artists, scholars and experts interested in the whole world of creativity and the related psychological and neuro-cognitive mechanisms. The aims of this work are: to identify possible biomarkers (EEG) related to the creative process in specific tasks, exploring it in a real-time ecological setting; to investigate the relation between explicit and implicit mechanisms, and between creativity, personality traits, and semantic memory; and to validate a tool to study creativity. In a previous pilot study, we revealed the presence of significant relations between personality components, EEG indices and creative processes, suggesting that the use of a self-echo setting may be applied also to boost creativity in people with specific thinking styles and personality traits, and to empower creativity in a tailored fashion. In this paper we extend the experimentation, consolidating the previously obtained results.
Henrique Portovedo, Paulo Ferreira-Lopes and Ricardo Mendes HASGS: The Repertoire as an Approach to Prototype Augmentation
This paper discusses the development of HASGS regarding augmentation procedures applied to an acoustic instrument. This development has been driven by the compositional aspects of the original music created specifically for this augmented instrumental electronic system. Instruments are characterized not only by their sound and acoustical properties but also by their performative interface and repertoire. This last aspect has the potential to establish a practice among performers while creating the ideal of a community contributing to the past, present and future of that instrument. Augmenting an acoustic instrument places some limitations on the designer's palette of feasible gestures because of the intrinsic performance gestures and the existing mechanical interface, which have been developed over years, sometimes centuries, of acoustic practice. We conclude that acoustic instruments and digital technology are able to influence and interact with each other, creating Augmented Performance environments based on the aesthetics and intentions of the repertoire being developed.
Tiago Ângelo, Rui Penha, José Alberto Gomes and Pedro Rebelo Actuated Musical Instruments: a State of the Art
This article provides an overview of the state of the art in research driven towards the modification of the timbral properties of acoustic musical instruments through the use of electromechanical actuators (actuated instruments), allowing synthetic sound generation to blend with the sound diffusion patterns of acoustic instruments. A selection of acoustic instruments and experimental research representing four Hornbostel-Sachs classes (idiophones, membranophones, chordophones and aerophones) is presented, and their novel characteristics and subsequent implementation are discussed, focusing on the techniques employed in the acoustical actuation.
Visda Goudarzi, Enrique Tomás and Artemi-Maria Gioti Collaborative Design Methods towards Evaluation of a Tangible Interface
This paper is a reflection on our experience of designing an interactive instrument and of its evaluation and redesign using a collaborative creativity process. The paper examines the interface from three different perspectives: designer, performer, and expert audience. The designer describes and evaluates the chain of decisions taken to release an experimental tangible interface for professional use by a duo of electronic musicians. The performers examine the usability aspects, and a group of composers participate in a creative workshop to explore different aspects of the interface in a collaborative creativity process.
Alex McLean, Dave Griffiths and Ellen Harlizius-Klück Digital Art: A Long History
A digital representation is one based on countable, discrete values, but definitions of Digital Art do not always take account of this. We examine the nature of digital and analogue representations, and draw from a rich pre-industrial and ancient history of their presence in the arts, with emphasis on textile weaves. We reflect on how this approach opens up a long, rich history, arguing that our understanding of digital art should be based on discrete pattern, rather than technological fashion.
Thor Magnusson Ergomimesis: Towards a Language Describing Instrumental Transductions
This speculative paper proposes a terminology of ergomimesis for engaging with the way new musical instruments derive their design from previous music technologies. What new instruments translate from earlier technologies is not simply the simulation of an interface, but a whole constellation of embodied contexts, where trained movements, musical actions, human-instrument relationships and other processes are transduced or moved over to a technology of a different material substratum (from organic to digital material). The concept of ergodynamics in a musical instrument is subsequently contextualised in relation to the semiotics of mapping, against the background of the Peircian analysis of the sign.
Marije Baalman, Simon Emmerson and Øyvind Brandtsegg Instrumentality, Perception and Listening in Crossadaptive Performance
Crossadaptive processing describes situations where one performer's output affects the audio processing of another, thus imposing direct modulation on the sound of another performer's instrument. This is done by analysis of the acoustic signal, extracting expressive features and creating modulation vectors that can be mapped to audio processing parameters. Crossadaptive performance can be situated between the performance practices of the audio processing musician, augmented (acoustic) instruments, live algorithms, group improvisation and interconnected musical networks. The addition of crossadaptive processing to these musical practices brings up questions of agency and instrumentality. Performance with crossadaptive techniques produces complex behaviours that are difficult to describe by the performer or the listener. This paper covers issues of transparency and technical language, and of instrument and ensemble learning. For the performer, a shared ensemble identity may emerge. For the listener, we discuss the role of intention and emergent musical behaviour.
Michael Rottmann Before Ink Starts to Blink: Scripts and Diagrams on Paper as Interfaces for Machines and Humans (in Creative Processes)
Creative processes, which can be treated as live performative acts, are seen nowadays as an interplay of humans, materials, media and machines. Interfaces are a part of this and are often understood as technical devices which bridge between humans and machines – Ivan Sutherland's Sketchpad counts as a prime example. Picking up this historical case, this media-theoretical paper introduces scripts and diagrams on paper as interfaces for machines and humans. Drawing on historical case studies, it will be shown that both media, with regard to their operativity, have to be considered even as "auto-interfaces", which allow one, for example, to influence one's own self. Scripts and diagrams, as well as the interface concept, are therefore reflected on media-theoretically. Thus, the paper expands the interface discourse and links it to media theory, especially to diagrammatics and notational iconicity, and provides a better understanding of creative processes based on handwriting or hand-drawing.
Ricardo Melo and Miguel Carvalhais The Interactor Cedes Control: An Heuristic for Planned Serendipity in Interactive Systems
As part of our development of a framework for serendipity in interactive systems, we identified specific heuristics that, when implemented in the design of systems, encourage serendipitous experiences, meaning experiences that are unpredictable and valuable. One of these heuristics, Interactor Cedes Control, is the subject of this paper: here serendipity is not the result of a natural occurrence or of a designed system of which the interactor is unaware, but of occasions where the interactor purposefully relinquishes control over the interaction, as a creative methodology or in order to increase the delight and surprise in otherwise mundane activities. To that end we begin this paper with an overview of the serendipitous potential and history of the digital medium, followed by an argument for artificially created serendipity that enables the design of serendipitous systems. Lastly, we identify the distinct methods (namely Generative Systems, Automatisation, Randomisation, and Multiple Agents) which constitute the Interactor Cedes Control heuristic of the larger framework.
Thanos Polymeneas Liontiris, Thor Magnusson, Chris Kiefer and Alice Eldridge The Ensemble as Expanded Interface: Sympoetic Performance in the Brain Dead Ensemble
This paper reports on an interactive and interconnected music ensemble from the perspective of the interface. More specifically it aims to canvass the dynamic relationships established within the Brain Dead Ensemble. It describes how the reconfigured relationships between performers and instruments are inherent to this ensemble from a technical point of view. In addition, it aims to survey the phenomenological aspect of the relationships established between the performers of this ensemble and how these relationships suggest the possibility of an ensemble itself conceived as interface.
Cécile Chevalier and Chris Kiefer Listening Mirrors
We introduce ongoing developments of Listening Mirrors, a sound art installation and live interface for musicians and non-musicians alike. The piece, in its construction and interaction design, investigates ways in which collective sonic expression can be made possible using Audio Augmented Reality (AAR) technology and acoustic mirrors, whilst asking how such environments promote collective sonic expression. Listening Mirrors is composed of a virtual acoustic mirror (an iOS app built with openFrameworks and libpd, with bone-conduction headphones) and parabolic acoustic mirrors (incl. piezo mic), networked with transducers for real-time collective performance. The installation creates interplay between real and virtual sound worlds, and explores the nature of human experience within these borders by drawing on Merleau-Ponty's Ontology of the Flesh.
Paul Granjon and Patrick Hénaff Guido and Am I Robot? Case Study of Two Robotic Artworks Operating in Public Spaces
This article is a case study of two artworks that were commissioned for and exhibited in art venues in 2016 and 2017. The first artwork, Guido the Robot Guide, guided the visitors to an art-science exhibition, presenting the exhibits with a robot's perspective. Guido was the result of a collaboration between artists and engineers. The concept was an irreverent robot guide that could switch transparently from autonomous mode to operator control, allowing for seamless natural interaction. We examine how the project unfolded, its successes and limitations. Following on Guido, the lead artist developed the robotic installation Am I Robot? where the idea of a hybrid autonomous/remote-manual mode was implemented fully in a non-utilitarian machine that was exhibited in several art galleries. The article provides a concise contextualisation and details technical and design aspects as well as observations of visitors' interactions with the artworks. We evaluate the hybrid system's potential for creative robotics applications and identify directions for future research.
Koichi Samuels and Hadi Bastani Digital Media, Live Interfaces and Inclusion: ethnographic perspectives
This paper discusses the potential of digital media and live interfaces in musical composition and performance for subverting exclusionary structures towards inclusion. Coming from backgrounds in electronic music and ethnography, the authors present two case studies that investigate music making practices with live interfaces. These case studies explore the relation between musical experimentation and the use of digital media in catalysing new forms of practice that move beyond restrictive categorisations and limiting boundaries constructed as a result of historical, social, and political processes. While the cases are differentiated in their approach, they converge in their emphasis on the inclusive potential of the digital media.
Jung In Jung Bridging Abstract Sound and Dance Ideas with Technology: Interactive Dance Composition as Practice-Based Research
In this paper, I argue that the engineering perspective of this field of research should be broadened to include, in particular, creative composition processes in collaboration with professionally trained contemporary dancers. I also argue for the value of music and dance composition using interactive systems as practice-based research by examining the electroacoustic composition pedagogy of composer Simon Emmerson and the composition method of choreographer Trisha Brown. As a consequence, and based on my own compositional experience, I propose how the relationship between new musical interfaces and performers can be arranged to avoid a merely utilitarian approach.
Renick Bell and Joana Chicau A Trans-Disciplinary Tool for Collaborative, Choreographed, and Embodied Audio-Visual Live Coding
Joanne Armitage and Shelly Knotts Vibez: A Small Sense of Presence at a Distance
The paper describes a performance by live coding duo ALGOBABEZ in which they communicate telematically using biometric sensors and haptic devices. Inspired by the recent relocation of one of the band members to Australia, ALGOBABEZ are interested in how they can recreate a sense of the other's physical presence in performance and/or what additional data they could share to build a sense of empathy between performers. As algorithmically inquisitive beings, they are also interested in how algorithms may disrupt, disturb or subvert this process, and give the opportunity for performers to actively adjust the honesty level of their biometric data stream.
Joana Chicau and Renick Bell Círculo e Meio
The performance reflects how language boundaries are enacted through the computing environment and society, exploring how movement, gestures, discourses, and behaviours are choreographed and communicated through these apparatuses, and how our hybrid systems and transdisciplinary research co-construct each other. It is informed by recollection of sources that reference principles of non-linear composition, non-hegemonic time and space constructs, and techno-feminist understandings. It combines two connected digital interfaces. Using a shared choreographic vocabulary, the performers create meaning around the act and conditions of coding.
Ryan Kirkbride, Lucy Cheesman and Laurie Johnson Fingerprints
Fingerprints is an improvised performance for collaborative live coding that explores ownership and identity within group creativity. It is performed by The Yorkshire Programming Ensemble (TYPE) and utilises a real-time concurrent multi-user text editor to facilitate meaningful creative exchanges in collaborative processes within the practice of live coding. The editor, Troop (Kirkbride 2017), allows multiple performers to share the same text buffer and write their own code while also interacting with code written by their co-performers. In Fingerprints, each performer will work on code independently and create sound using "their own" musical algorithm before attempting to reshape their collaborators' work. As this process continually repeats, the piece evolves and the performers are asked whether they can retain their own identities within a state of perpetual flux or if the communal process takes on a greater identity of its own.
Masato Kakinoki Act: Study
In this performance, fixity and fluidity of history, digital materials and of acts of documenting and recording will be explored by improvising the act of studying history, and improvising with the immediate recordings of the act. The recorded sounds of reading, writing, squiggling and perhaps occasional mumbling will be fluidified through the author’s physical manipulation. What the author does in the performance can be regarded as learning, examining, organising, disorganising, manipulating or forging both certain history and the act of learning it. The nature and the poetic sentiment of learning are other inherent elements behind the performance. The author is working on research-based art practice, particularly interested in the field of the culinary culture and human history, and of the sexual and marital culture and human history. He exposes his ongoing research process in the format of performance.
Filipe Lopes, Gilberto Bernardes and Clara Cardoso Variações sobre Espaço #6
We present Variações sobre Espaço #6, a mixed media work for saxophone and electronics that intersects music, digital technologies and architecture. The creative impetus supporting this composition is grounded in the interchange of the following two concepts: 1) the phenomenological exploration of the aural architecture (Blesser & Salter 2007), particularly reverberation as a sonic effect (Augoyard & Torgue 2005), through music performance; and 2) the real-time analysis of the intervallic content of both the performance and the reverberation (i.e. impulse responses), which ultimately leads to a generic control over consonance/dissonance (C/D). Their conceptual and morphological nature can be understood as sonic improvisations where the interaction of sound producing bodies (i.e. the saxophone) with the real (e.g. performance space) and the imaginary (i.e. computer) acoustic response of a space results in formal elements mirroring their physical surroundings.
Marcin Pietruszewski formulae si:v
The 'formulae si:v' is an experimental opera; a duo for synthetic voice and an algorithmic script for auditory scene formulation; an elemental synthetic laboratory where the sensible, the intelligible, the artificial, and the natural are animated and combined. Integrating a state-of-the-art machine learning program, a novel hybrid sound and speech synthesis design, and an original spatialisation score, the work probes the experimental capacity of sound synthesis at the intersection between microsound, psychoacoustics and computational linguistics.
Alice Eldridge, Chris Kiefer, Thor Magnusson and Thanos Polymeneas Liontiris Brain Dead Ensemble: An Acoustically Networked Feedback Quartet
The Brain Dead Ensemble are an acoustically networked feedback quartet/assemblage in which the structural, acoustic feedback pathways within and between "open" instruments create a fundamentally distributed musical agency. The current ensemble consists of two feedback cellos, a feedback bass and a Threnoscope, acoustically coupled to form a multi-instrument, multi-channel system - an expanded music interface. The feedback cellos and bass are electro-acoustic-digital resonator instruments. Each instrument has pickups under each of its strings and one or more transducers built into the acoustic instrument body, inducing electromagnetically-controlled feedback which can be subject to digital processing. The classical model of a bowed instrument is inverted: the player no longer controls and excites the strings to produce sound, but negotiates with an ongoing, lively, self-resonating instrument. The Threnoscope is a software system created by ixi audio for drones, live coding and microtonal, spatialised composition. All the instruments are networked acoustically: the seven channels of the Threnoscope are diffused to a quadraphonic PA plus the integral speakers of the string instruments. The acoustic result of these feedback processes is characterised by a variety of sonic colours including airy microtonal micro-melodies, serene yet colourful drones, complex spectral gestures, and vast explosions surfacing gradually or unpredictably into screams. Performances are improvised; an emergent, negotiated form of performance which involves the steering and shaping of evolving, distributed, sonic energies rather than the instigation and exchange of discrete musical ideas. No one is in control, although everyone is playing.
Henrique Portovedo Performance Proposal
This project is part of the research driven by the saxophonist and sound designer Henrique Portovedo, designated Multidimensionality of Contemporary Performance. Starting as an artistic exploratory project, the conception and development of the HASGS (Hybrid Augmented System of Gestural Symbiosis) for saxophone became, as well, a research project including a group of composers and engineers. The project has been developed at the Portuguese Catholic University, University of California Santa Barbara, ZKM Karlsruhe and McGill University Montreal, with insights from researchers such as Henrique Portovedo, Paulo Ferreira Lopes, Ricardo Mendes, Curtis Roads, Clarence Barlow and Marcelo Wanderley. The pieces for this performance were composed by Balandino di Donato, Giuseppe Silvi, Nicolas Canot and Tiago Ângelo. This performance will not only provide insights on the development of Augmented Instruments, but will also provide data analysis for programmers and composers to prepare pieces for this specific augmented instrument. The pieces presented will be analysed according to new notational and compositional paradigms within HASGS, as well as contribute to tracing the evolutionary trajectory of the instrument according to the repertoire.
Joe Watson The Thing Breathed
The Thing Breathed is a modular synthesis composition for live performance. It explores nested feedback networks instantiated in analogue synthesis, presenting a chaotic complexity that occludes attempts to fully understand the system. It is a ‘black box’ to its performer, who spends performance time searching for rare yet fruitful zones of sonic interest that have been discovered through rehearsal and experiment. As such the nature of the performance is one of risk and commitment, steering rather than commanding, performative rather than pre-programmed.
Pedro Louzeiro and Henrique Portovedo Comprovisação nº 9, for augmented saxophone (HASGS), ensemble and real-time composition and notation system (Comprovisador)
"Comprovisação nº 9" is a musical performance bringing together a soloist who uses an augmented saxophone (HASGS), an ensemble of musicians who sight-read an animated staff-based score, and a real-time composition and notation system (Comprovisador) operated by both the soloist and a performance director/mediator.
Amy Alexander and Curt Miller A performance of PIGS: Percussive Image Gestural System
PIGS (Percussive Image Gestural System) is an instrument created by Amy Alexander for improvised visual performance with musicians. It focuses on layered visuals that are not bound to traditions of rectangular frames and “movie” structures — and on developing a performable instrument suited to improvisation. PIGS uses live gestural data as improvisational elements to create visual forms. Gestures can be used independently, or repeated with algorithmic variation through the use of drum interfaces to create visually rhythmic structures. To facilitate improvisation of video as a rhythmic “instrument,” PIGS incorporates a variety of percussive interfaces including MIDI drums, iPads, and Leap Motion. Currently Alexander collaborates with musician and sound artist Curt Miller, who has created a software instrument in parallel with PIGS in which he combines live clarinet with real-time processing of recorded source material.
Lee Westwood and Danny Bright Noise Peddler: A live exploration of the pedalboard as performance interface
Noise Peddler is a part-composed, part-improvised performance for two people, two pedalboards, and four amplifiers involving the re-appropriation of guitar effects pedals to create independent musical interfaces capable of generating and manipulating their own sounds. The result is a visually symmetrical live performance that utilises dual stand-alone pedalboards, generative MIDI/CV control, and video projection to explore the area between composition and free improvisation. The hybrid performance system employs a selection of cutting-edge modern pedal technologies alongside well-established analog circuits, and explores their potential as an independent interface, away from the guiding force of a traditional acoustic instrument.
Alexandra Tibbitts, John Sullivan and Brice Gatinet A Method for Gestural Control of Augmented Harp Performance
Here we present an interdisciplinary collaboration and performance, featuring a gestural control system designed to augment harp performance. From a performer's perspective, the developed interface system and prototype presented opportunities for real-time control and manipulation of the traditional instrument. Collaborators exchanged ideas and commentary, as well as problem-solved, in real time. This was advantageous for direct and efficient regulation and implementation of the hardware and software into the artistic phase of this project and the resulting composition. For the performer, the device needed to be lightweight, ergonomic, and user-friendly. In this performance, the device uses the amplified harp, as well as the voice of the player and electric tape, as sound sources for computer-based audio effects and processing.
Adriana Sá and John Klima CityStrings
CityStrings is performed with an audio-visual instrument and a long wire stretched in space. The audio-visual instrument combines a custom zither (multi-string instrument) and AG#3, a 3D software that processes sound and image based on pitch analysis from the zither input. The wire - “magnetic wire” - is amplified via a transducer constructed from a coil of wire wound round a magnetic shaft. Both instruments allow for certain unpredictable sonic events, which conveys an understanding of expression. The role of the image is quite different: projected over the performers, it works as a reactive stage scene without distracting attention from the music.
Joanne Armitage and Shelly Knotts Vibez
Vibez is a telehaptic performance by the live coding duo ALGOBABEZ. The work brings together strands of research in biometrics, haptics, telematic performance and algorithmic systems. The performers use sensors to track and send their biometric information (HR, HRV and GSR) to their geographically distant improvisation partner. The other performer receives this information as haptic messages via an armband and uses this to feel a sense of physical closeness with and empathy for their collaborator. The performers can choose to subvert the process by moving an 'honesty' slider up or down, randomising the data to various levels.
Alex McLean Yaxu: Feedforward
Feedforward is a text editor designed for the TidalCycles live coding environment. The feedforward project began in February 2018 and is under active development. It forms the basis for experiments in pushing the limits of text-based live coding interfaces, including through in-line visual feedback, keyboard shortcuts into the transformation of pattern, and the live manipulation of edit history, both from past and present performances. This is a continuation of work begun with my first live coding interface, 'feedback.pl', from 2003 until around 2009, when I first began work on TidalCycles. Feedback.pl supported live self-modification of code, in order to provide in-line visual feedback to the user. Feedforward is also heavily inspired by work of others in this area, including the SuperCollider History class by Alberto de Campo et al., the Gibber family of live coding environments by Charlie Roberts et al., and the work by Thor Magnusson on Ixi Lang. It also intends to draw from experiments in intelligence augmentation, most famously Douglas Engelbart's 1968 'Mother of All Demos' and, more recently, the Dynamicland project.
Luís Aly Performative Sound Design
Performance art demands from the sound designer great invention and flexibility, which barely complies with the typical fixed-media practices employed in the field. To organically bind sonic elements with open performative structures, I strive to design a computational system as a sound-instrument capable of integrating the compositional elements of the performance in a volatile way, within specific (sonic) constraints defined beforehand. The sound-instrument will be rooted in two main components: sound archives and meta-creation models. The first departs from existing mass preservation mechanisms, providing the performance some coherence across its multiple renditions. The second defines a set of creative models that can explore biofeedback stimuli from both the stage performers and the sound designer to navigate and explore the large sound archives in real time. As such, I aim to guarantee an expressive, dynamic, and interactive symbiosis between performative action and sound design.
Jack Armitage Supporting Live Craft Process in Digital Musical Instrument Design
Despite digital lutherie’s goal of enabling liveness in performance, digital lutherie as a process often lacks liveness. The tools of digital lutherie, adapted from domains where liveness was neither feasible nor important, can make craft process feel dull, blind and isolated. Understanding and supporting live craft process in digital lutherie is important for advancing and disseminating the art, and for improving digital luthiers’ control over the liveness of their instruments. This requires a shift in focus from declarative and explicit knowledge of instruments to the study of liveness, craft process and tacit knowledge in digital lutherie. This research aims to provide a foundation for this shift through integration of traditional and digital lutherie, and detailed comparison of digital luthier behaviour in different live crafting environments.
Raul Masu DMIs Design: Fostering Authorship of Composers and Creativity of Performers
My project aims to study the adoption of scores in mixed performances with DMIs and traditional instruments, fostering performers' creativity while keeping the composer's authorship over the piece. I intend to develop my research in the context of professional music performances and pedagogic scenarios. In this paper I introduce related work on DMIs, touching on the concepts of the composed instrument and the composer-performer. I give an overview of existing literature that investigates relations between score and music technology, and I describe in detail the methods I intend to apply to achieve my goal. In general, my project relies on a User-Centered Design approach. My research involves two artifacts (scores and DMIs) and three actors (composers, performers, teachers). All the actors and artifacts are described. The conclusions of the paper present some work already done.
Terhi Marttila Playful Readings and Deeper Meanings
I explore the specific case of interactive artworks which are predominantly based on the use of speech, text or language (ergodic literature) and which utilise this materiality to deliver a profound or somewhat serious message about a specific topic. Through case studies, a technology survey and a practical project, I look at both the history and the current and future state of language as material for play in interactive arts.
Francisca Gonçalves Acoustic Ecology as Tool for Environmental Awareness: The Ocean Soundscape
In a context in which awareness of the impact of urban sound on our society is being raised, we are still facing the problem of increasing noise pollution. Moreover, according to studies in emerging fields such as soundscape ecology, animal sound communication has changed due to a soundscape transformation caused by increasing anthropogenic noise. This also applies to underwater ecosystems: pile driving, shipping and renewable energies are some of the threats we face today, as they contribute strongly to underwater noise pollution. Recording underwater soundscapes can help us establish a sound map to comprehend the development of this environment, which is so important for the planet's climatic conditions. Moreover, the study of oceanic sound dynamics can reveal useful information about our planet's health. And not only that: by utilizing these underwater recordings in an artistic context, we can help raise awareness of the acoustic problems marine life has to face today.
firstname.lastname@example.org @Liveinterfaces facebook.com/liveinterfaces