Musical Gesture Generator
Garabato, the Spanish word for scribble, has been under development for the past two years. It allows the user to create musical gestures by drawing shapes and scribbles that control pitch, dynamics, length, density, and timbral evolution. The first four parameters use an XY plot in which the X axis is time and the Y axis represents the corresponding parameter range. Timbral evolution required the implementation of machine learning: the user draws the trajectory of the timbral change over previously trained “timbral areas” on the plot. The final output of Garabato is a musical gesture that can be heard in real time and that can also be exported as audio and/or as a score (in .xml format).
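To make the XY mapping concrete, here is a minimal sketch, not Garabato's actual code, of how a drawn polyline might be sampled into a pitch contour: X is time, Y is a normalized parameter value mapped onto a pitch range. The function names, the MIDI range, and the example scribble are all assumptions for illustration.

```python
# Hypothetical sketch of sampling a drawn XY curve (X = time,
# Y = normalized parameter value) into a pitch contour.

def sample_curve(points, t):
    """Linearly interpolate a drawn polyline at time t.
    points: list of (x, y) with x ascending and y in 0..1."""
    if t <= points[0][0]:
        return points[0][1]
    if t >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= t <= x1:
            frac = (t - x0) / (x1 - x0)
            return y0 + frac * (y1 - y0)

def y_to_midi(y, low=36, high=84):
    """Map a normalized Y value to a MIDI pitch in [low, high]."""
    return round(low + y * (high - low))

# A rising scribble over 4 seconds, sampled once per second:
scribble = [(0.0, 0.0), (2.0, 0.5), (4.0, 1.0)]
contour = [y_to_midi(sample_curve(scribble, t)) for t in (0, 1, 2, 3, 4)]
# contour == [36, 48, 60, 72, 84]
```

The same interpolation scheme would apply to the dynamic, length, and density plots, with the Y axis rescaled to each parameter's range.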
Multiple projects and tools are used in Garabato. Machine learning, for example, is executed using two main tools: Wekinator (www.wekinator.org) and the ml.* package in Max (www.benjamindaysmith.com). The heart of the device uses the Max packages bach, dada, and cage, a series of CAC (Computer-Aided Composition) tools developed by the bach project (www.bachproject.net). An early version of Garabato uses multiple folders of prepared-piano samples as the initial sound source and a couple of third-party VSTs to transform them. Another version uses MIR (Music Information Retrieval), extracting information from custom samples with IRCAM descriptors inside a system built with the MuBu package in Max (https://ismm.ircam.fr/mubu/).
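The trained "timbral areas" could be navigated in the spirit of a nearest-neighbour mapping of the kind Wekinator supports. The sketch below is a simplified illustration, not Garabato's implementation: the area labels, centroid positions, and trajectory points are invented, and real descriptor spaces have many more dimensions than the two plot coordinates used here.

```python
# Hypothetical sketch: classify each point of a drawn trajectory into
# the nearest previously trained "timbral area" on the plot.

import math

# Trained areas: label -> centroid (x, y) on the plot (invented values).
areas = {"metallic": (0.2, 0.8), "muted": (0.5, 0.3), "airy": (0.85, 0.7)}

def nearest_area(point, centroids):
    """Return the label of the centroid closest to point."""
    return min(centroids, key=lambda label: math.dist(point, centroids[label]))

# A drawn trajectory crossing the three areas:
trajectory = [(0.1, 0.9), (0.4, 0.4), (0.9, 0.8)]
path = [nearest_area(p, areas) for p in trajectory]
# path == ["metallic", "muted", "airy"]
```

In practice the classifier's output would drive the sample selection or VST parameters, producing the continuous timbral change the trajectory describes.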
Currently, I am developing a piece titled Scribbles for trumpet, piano, and percussion, written for the SPLICE Ensemble.