On June 9th, an Algorave was held in Santa Barbara that primarily consisted of members of MAT 276IA, a class I taught on sound synthesis and algorithmic composition in the Media Arts & Technology program at UCSB. Everyone except for the guest headliner, Chad McKinney, used Gibber in their performances. I was really happy with the performances and it was great to hear all the different types of music that came out; no two were alike.
One other notable aspect of the concert was the use of a new system for ensemble performances that I’ve named Gabber. Gabber provides clock synchronization and a bunch of other neat features. I’ll be presenting it at ICLC (International Conference on Live Coding) in July. Thanks to Hafiz, Anis, and Marco for being willing guinea pigs with the new system.
Some months back, the post-punk band 65daysofstatic posted what I believe to be the first example of an algorithmic composition by a popular music group published via real-time rendering in the browser. Using Gibber, of course. It received a really positive response when the band posted it to Facebook, especially after I reconfigured Gibber’s server to not crash under the load of hundreds of fans trying to view the piece simultaneously.
The model is interesting to me because it enables fans to potentially play around with the code, remix it and republish their results. I hope to see more of this going forward with Gibber. For another interesting take on publishing music via code, see Alex McLean’s recently announced Peak Cut album. Peak Cut is distributed both as audio files and as executable source code on a USB flash drive. It also includes a bootable image of Linux with Alex’s Tidal live coding software pre-installed.
When you launch Gibber, the 65dos demo is one of the first ones that appears in the file browser. Take a listen!
The second was a performance featuring live coding in Gibber by Roger alongside vocals by his partner, Hazel Smith (who contributed text and vocals to Bird Migrants as well); the performance also featured Bird Migrants. While looking for information on the performance I found a nice review of it in the Sydney Morning Herald.
A month or so back I gave a talk at JSConf.Asia about Gibber. The talk runs a little over half an hour, with a short performance at the very end. It goes into a little more technical depth about how Gibber is made; here it is if you’re interested in such things:
I have a backlog of people doing interesting stuff with Gibber to post about; a great problem to have! Jesse Allison has been leading a research group (the Experimental Music and Digital Media program, or EMDM) at Louisiana State University who have been using Gibber in a variety of ways. First, they’ve been using it to perform in the Laptop Orchestra of Louisiana; below is a pic of a recent performance:
The Laptop Orchestra of Louisiana performing with Gibber.
They’ve also been using it to teach middle school and high school students through their EMDM Academy; very cool!
The EMDM program has also released a number of interesting open source software packages. The first is a web GUI framework for digital musical instruments that recently received a major overhaul / upgrade: NexusUI. The second is BRAID, a tool for quickly prototyping digital musical instruments. BRAID currently combines NexusUI with gibber.audio.lib, and although it will support more audio libraries in the future I’m happy Gibber got to go first. These recent projects are primarily the work of Ben Taylor, an incredibly talented creative coder studying with Jesse at LSU.
From the I-should-have-posted-about-this-months-ago category:
A while back, Alex McLean led a workshop on digital weaving that used Gibber; for a while the “Recent” menu in the Gibber file browser was filled with examples of weaving created by participants using Gibber’s 2D graphics mode. You can read about Alex’s work and also get a link to a Gibber publication to try your hand at weaving a la Gibber:
The workshop was part of a grant involving a number of people titled “Weaving Codes, Coding Weaves”. Live coder / artist / developer Dave Griffiths has since done some really impressive work on software (and hardware!) for digital weaving:
I’m going to try to be a little more active on this blog; I especially want to highlight all the work other people have been doing involving Gibber / p5.gibber. But to start, some info about a few nice feature updates to Gibber.
First, the new pattern library. There are now lots of ways to manipulate the underlying values of audiovisual sequences. You can reverse, invert, rotate, scale, transpose, crop… many tools for creating variation over time in sequences. Check out the Gibber pattern tutorial for details (just close the welcome screen that pops up):
I’ve also added some tools for visualizing pattern manipulations; you can see them at work in this video:
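To give a feel for what these transformations do, here’s a rough sketch in plain JavaScript. This is not Gibber’s actual pattern API or its internals — the function names and the mirror-around-the-first-value definition of `invert` are my own illustrative assumptions — just a picture of how a few of these operations change a sequence’s underlying values:

```javascript
// Sketch of pattern transformations on a plain array of values.
// (Illustrative only — not Gibber's API.)

// reverse: play the values back-to-front
const reverse = p => [...p].reverse()

// rotate: shift the values left by n steps, wrapping around
const rotate = (p, n) => p.map((_, i) => p[(i + n) % p.length])

// transpose: add a constant offset to every value
const transpose = (p, amt) => p.map(v => v + amt)

// invert: mirror each value around the pattern's first value
const invert = p => p.map(v => p[0] - (v - p[0]))

const pattern = [0, 2, 4, 7]
console.log(reverse(pattern))       // [7, 4, 2, 0]
console.log(rotate(pattern, 1))     // [2, 4, 7, 0]
console.log(transpose(pattern, 12)) // [12, 14, 16, 19]
console.log(invert(pattern))        // [0, -2, -4, -7]
```

Because each transformation returns a new list of values, they can be chained and re-applied over time — which is exactly what makes them useful for creating variation in a running sequence.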
One common request for Gibber is for more instruments and styles of sounds. I’ve addressed this to some extent by adding SoundFont integration to Gibber. SoundFonts are collections of samples that match the General MIDI spec (https://en.wikipedia.org/wiki/General_MIDI). Adding SoundFont support means there are now a variety of more traditional sounds (piano, organ, cello, flute, etc.) that can be used in giblets. You can try out the SoundFont integration here:
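One small piece of background that’s handy when thinking about sample-based instruments like these: a SoundFont player pitches its samples to whatever note you request, and the standard mapping from MIDI note number to frequency (A4 = note 69 = 440 Hz, twelve equal-tempered steps per octave) looks like this. This is the general MIDI tuning relationship, not a snippet of Gibber’s code:

```javascript
// Standard equal-tempered MIDI note-number to frequency mapping,
// with A4 (MIDI note 69) tuned to 440 Hz.
const midiToFreq = n => 440 * Math.pow(2, (n - 69) / 12)

console.log(midiToFreq(69)) // 440 (A4)
console.log(midiToFreq(60)) // ~261.63 (middle C)
```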
A few events coming up in the next couple of months. First, a long-form performance (~ 1 hour?) on October 26th as part of the ACADIA Conference (Association for Computer Aided Design in Architecture) in Los Angeles. I’ll be performing during their hackathon session: http://2014.acadia.org/hackathon.html
Next, multiple performances as part of the ACM Multimedia Conference in Orlando. I’ll be performing twice a day on the 5th and 6th of November as part of the interactive arts program. I’ll also be presenting a long paper on Gibber on the 4th. http://acmmm.org/2014/
Most importantly, I’m hoping to release a significant refactoring of Gibber sometime in the next couple of weeks before all this happens…