Spring 2016, a Mini Maker-in-Residence Program at UNC
Maker-in-Residence Elliot Inman, Amateur Musicology / Musical Circuits
Recordings from the final concert and extracts from the workshops. Four Arduino stations, each with variations on the circuits described here.
Final Concert
Final Concert 4-12-2016 I
Final Concert 4-12-2016 II
Workshop Extracts
Workshop Extract I
Workshop Extract II
Workshop Extract III
Syllabus
Arduinos and MIDI: Coding a Musical Idea
In this workshop, we will build a circuit with the Arduino that can be used to send MIDI messages to control a sound module or virtual synthesizer on your laptop. With this basic circuit, you can design all kinds of musical controllers, from a classic beatbox or bass sequencer to an experimental sound interface customized to meet the demands of your own imagination. This workshop will focus on how to program the controller to do basic things: play a pitch for a specific duration, change the tempo of a series of notes, modify volume and articulation, and control other essential features of a musical idea. No previous experience with electronics or programming is required.

The workshop is open to all UNC students. To ensure we have sufficient kits for all participants, please sign up to attend. Note that this same workshop is repeated, once on Thursday evening and again on Sunday afternoon. Please sign up for only one session of this workshop. All materials will be available for use in the Makerspace. Please bring your own laptop.

Workshop: Thursday, March 3, 6-8pm
Workshop repeated: Sunday, March 6, 2-4pm
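As a preview of the kind of program we will write, here is a minimal sketch that sends MIDI note-on and note-off messages over the Arduino's serial port. It assumes a standard 5-pin MIDI output circuit wired to the TX pin; the exact circuit, pins, pitches, and timings in the workshop kits may differ, so treat this only as an illustration of playing a pitch for a set duration and controlling tempo and loudness.

// Minimal Arduino sketch: send MIDI note-on/note-off messages over serial.
// Assumes a standard 5-pin MIDI output circuit on the Arduino TX pin (pin 1).

const byte NOTE_ON  = 0x90;  // note-on, MIDI channel 1
const byte NOTE_OFF = 0x80;  // note-off, MIDI channel 1

void setup() {
  Serial.begin(31250);       // standard MIDI baud rate
}

void playNote(byte pitch, byte velocity, int durationMs) {
  Serial.write(NOTE_ON);     // start the note
  Serial.write(pitch);
  Serial.write(velocity);
  delay(durationMs);         // hold the note for the requested duration
  Serial.write(NOTE_OFF);    // stop the note
  Serial.write(pitch);
  Serial.write((byte)0);
}

void loop() {
  // A simple ascending figure: change the delays to change the tempo,
  // change velocity to change loudness, shorten durations for staccato.
  playNote(60, 100, 250);    // middle C
  playNote(64, 100, 250);    // E
  playNote(67, 100, 250);    // G
  delay(250);                // rest
}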
Arduinos and MIDI: Designing a Musical Instrument
In this workshop, we will focus on using an Arduino to create a musical interface, a musical “instrument” that gathers gestures from a performer to control the music. We will consider what makes a musical instrument playable and build circuits to experiment with alternative ways of controlling sound. What makes a piano a piano, a clarinet a clarinet, and Buchla’s Music Easel something completely different? We will develop programs that incorporate functions and features that make our physical interface more usable as a musical instrument, and we will see how those design choices affect the music we can create with it.

This workshop is open to all UNC students, but students should be familiar with a breadboard and Arduino, either through the first workshop or other experience. To ensure we have sufficient kits for all participants, please sign up to attend. Note that this same workshop is repeated, once on Monday evening and again on Wednesday evening. Please sign up for only one session of this workshop. All materials will be available for use in the Makerspace. Please bring your own laptop.

Workshop: Monday, March 21, 6-8pm
Workshop repeated: Wednesday, March 23, 6-8pm
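To give a sense of the “instrument” side of the coding, here is a small illustrative sketch that reads a performer’s gesture from a potentiometer on analog pin A0 (the workshop kits may use different sensors and pins) and maps it to a MIDI pitch, sending a new note only when the gesture actually changes.

// Illustrative sketch: map a gesture (here, a potentiometer on A0) to MIDI pitch.
// A new note is sent only when the gesture crosses into a new pitch, so the
// interface responds to the performer rather than spamming repeated notes.

const byte NOTE_ON  = 0x90;  // note-on, MIDI channel 1
const byte NOTE_OFF = 0x80;  // note-off, MIDI channel 1
int lastPitch = -1;          // last note sent; -1 means none yet

void setup() {
  Serial.begin(31250);       // standard MIDI baud rate
}

void loop() {
  int reading = analogRead(A0);              // 0-1023 from the sensor
  int pitch = map(reading, 0, 1023, 48, 72); // two octaves, C3 to C5

  if (pitch != lastPitch) {                  // only act on a real change of gesture
    if (lastPitch >= 0) {                    // turn off the previous note
      Serial.write(NOTE_OFF);
      Serial.write((byte)lastPitch);
      Serial.write((byte)0);
    }
    Serial.write(NOTE_ON);                   // start the new note
    Serial.write((byte)pitch);
    Serial.write((byte)100);
    lastPitch = pitch;
  }
  delay(10);                                 // short scan interval
}

Deciding when a gesture counts as a new note is exactly the kind of design choice the workshop explores: it shapes whether the interface feels like a playable instrument.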
A Brief History of Musical Time: A Concert Featuring Participants in the Musical Circuits Series Workshops
Tuesday, April 12, 6-8pm.
As part of the Day of Making, BeAM@KSL will host an experimental music concert. Participants in the Musical Circuits Series at UNC Makerspaces will perform a concert of musical works composed for instruments built during the workshop series. Solo and group performances.
The composer Edgard Varèse defined music as “organized sound.” One of the defining characteristics of the way in which we organize sound is time. “Sound over time” is a definition that works well to describe all of the music from Bach and Beethoven to Taylor Swift and Kamasi Washington.
But with the advent of modern, inexpensive electronics and software synthesis, we now have the power to control time with a degree of precision far beyond the capabilities of even the most well-trained fingers playing the best instruments. Turning a dial or clicking a button, we can execute complex sequences of timed notes that no human could replicate on a Steinway or Stradivarius. Music is not measured in quarter notes; it is measured in milliseconds.
With that in mind, how does all of this technology advance our ability to express our own musical imaginations? To sing a song we could hear but could not sing? What music can we play now that we could previously only envision but not express?
At the same time, what have we potentially lost or ceded to this technology, allowing the machine we imagine we are controlling to take control of the sounds we are making? Have we created, as e.e. cummings said, “a fine specimen of hypermagical ultraomnipotence”? Or have we simply created a high-speed cacophony with, as William Blake might say, a “fearful symmetry”? What can or should we do to ensure that the machines that allow complete control of time still allow us to produce music that sounds…well, human?
~ WEI 2016