luva

Live code GUI for interactive concerts (2021)

     Since 2D visual abstraction is an objectively shared experience (the object remains the same when observed by two different people), with LUVA we projected the possibility of sharing the composition process with the audience.

     At the LUVA interface, augmented instruments, audiences and the space itself can be projected onto a macro-scale sonic composition: a concert. Live recordings of the audience, the space and the instruments can be processed, redesigned and reshaped in real time. This means that everyone in the concert hall is not merely a listener but an active participant, and the material extends to sound-field recordings such as long-wave synthesis of the geographic location or even echolocation reverberance.

This combination of agents manipulates its own sound recordings through circular music-generation methods and genetic-algorithm calculation.

Show your screens

In the live-code music scene, the 'show your screens' movement promotes transparency and inclusion by displaying the code as it is typed. But does abstraction in art reach the idea of transparency? We are used to depending on, and being supported by, symbols and numbers. In a world where technology and science dominate, it is difficult to accept that there are cognitive dimensions in the experience of appreciation that cannot be summarized by data. Numbers, letters and objects can obscure the real concept of 'transparency' in a musical performance.

Another point worth discussing is transparency itself: should we aim for transparency at all? What advantage does a performance gain by showing the process of composition?

I understand this wish because I sensed the same problem when writing musical scores. Transparency is related to the 'pitch' of your artistic expression, or to an empirical validation of cognition. I sense this global issue also in the general idea of a concert and its sound spatialization: everything comes from outside, not from inside.

Crossover & Genetic Algorithms

Recordings of the public, or sound-field recordings, are recorded live, saved as sample files, processed, streamed and evaluated by genetic algorithms. We calculate the best-fitness generation and codify the result as a Lindenmayer system (L-system), which is translated into three simple characters:

   F = development
   + = variation
   - = contrast

   So the best fit will look like this:

   "FF+--F+F---F+FFFF"
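
   As a rough illustration, the crossover-and-fitness loop could be sketched like this in Python. The fitness function, population size and mutation rate here are hypothetical placeholders; in LUVA the actual fitness would be derived from the live recordings themselves.

```python
import random

ALPHABET = "F+-"  # F = development, + = variation, - = contrast
LENGTH = 17       # length of the example best-fit string above

def fitness(genome: str) -> float:
    # Hypothetical fitness: reward genomes that balance development ('F')
    # against variation and contrast. In LUVA the fitness would come
    # from analysis of the live audio material instead.
    f_ratio = genome.count("F") / len(genome)
    return 1.0 - abs(f_ratio - 0.5)

def crossover(a: str, b: str) -> str:
    # Single-point crossover between two parent genomes.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome: str, rate: float = 0.05) -> str:
    # Randomly replace characters with a small probability.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in genome)

def evolve(generations: int = 50, pop_size: int = 30) -> str:
    population = ["".join(random.choice(ALPHABET) for _ in range(LENGTH))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]  # keep the fittest half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
print(best)  # an evolved string over {F, +, -}
```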

   We can use the processed material to ponder, combine and connect via steno Message.
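
   A minimal sketch of how such a best-fit string could be read back as an ordered sequence of compositional operations, following the hypothetical F/+/- mapping given above:

```python
# Translation table for the three L-system characters described above:
# F = development, + = variation, - = contrast.
OPERATIONS = {"F": "development", "+": "variation", "-": "contrast"}

def interpret(lsystem: str) -> list[str]:
    # Turn an evolved L-system string into an ordered list of
    # compositional gestures to apply to the recorded material.
    return [OPERATIONS[char] for char in lsystem]

steps = interpret("FF+--F+F---F+FFFF")
print(steps[:3])  # ['development', 'development', 'variation']
```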