Composing in Code: Musician-Programmers Are Changing the Way Music Is Made

Radiohead's Jonny Greenwood performs in Austin, Texas, in 2012. Greenwood is renowned not only as an amazing guitarist but also as a tinkerer and computer programmer who has used programming in his music. Andy Sheppard/Redferns via Getty Images

Software developer Dave Yarwood released a new programming language this fall. It's called Alda, and it has an unexpected audience: composers.

Yarwood, a classically trained musician who studied composition at the University of North Carolina at Chapel Hill, was composing music before he was composing code. Musicians, he says, are often drawn to programming.


"I think there are a lot of parallels between the two worlds," Yarwood writes in an email. "Both fields are very regular and mathematical in nature, yet contain a strong element of artistic creation."

Links between math and music are certainly well-established, and technology has always shaped musical creation. Starting in the early 20th century, musicians adopted sound amplification, then recording, then synthesis. Composing on a computer predates the PC, going back at least to 1957, when an engineer and an IBM mainframe in New Jersey proved a computer could play a series of notes in a preset arrangement.


No Instruments Necessary

Today, musicians can compose entire concertos without touching any instrument besides a computer, typically using composition/notation software. It's a user-friendly approach, with graphical user interfaces (GUIs) and instant auditory feedback. Anyone can compose with it.

GUIs can be distracting, though, and instant feedback enables a kind of "trial-and-error" composing process that, Yarwood learned, is best avoided.


"My composition professors taught me that this is a poor approach to composing music, because it limits the way that you think about music," he says.

Yarwood wanted more control, and something closer to the blank-slate experience of sitting down at a piano with pencil and paper. So he taught himself to program and joined the select group of computer-savvy musicians who compose using audio programming languages (APLs) rather than notation software. There are dozens of APLs, many of them open source, and they let composers work directly with the computer, skipping the software middleman.

Some APLs demand advanced programming skills. Others are more approachable but still not beginner-friendly. Yarwood created Alda in part to give beginning programmers access to the benefits of composing in code. The Clojure-based language is unusually intuitive. Type piano: c d e f into a text editor, hit enter, and a virtual piano plays the notes. Tack on > c, and it continues up the C scale.
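Based on that description, a minimal Alda score might look like the sketch below (note names follow Alda's letter-based syntax, and > is its octave-up operator; the exact playback depends on your MIDI setup):

```
piano:
  c d e f    # the first four notes of the C major scale
  g a b > c  # continue the scale; '>' jumps up an octave
```

Because a score is just plain text, it can be versioned, diffed, and edited in any text editor like ordinary source code.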



Blurring the Line

Composer David Cope, professor emeritus at the University of California, Santa Cruz, says via email he believes programming is key to composing on computer "if you really want the music to sound like you."

Cope is a computer scientist as well as a composer, and for him, the roles are inseparable.


"I've used algorithms since I first began composing around [age] five, at that point using what we call paper algorithms — instructions that I then followed to complete the work," he explains. "In the mid-'70s, when I had access to my first mainframe, I considered that computers would make this process a lot faster and probably more accurate." And they did.

Cope, a forerunner in the computer-music field, works entirely by "algorithmic composition." He creates computer programs that compose music.

"People are beginning to realize that computers are extensions of arts and composers."
David Cope, Composer, Professor Emeritus at University of California, Santa Cruz

In other words, his music is composed by artificial intelligence.

Cope began his "Experiments in Musical Intelligence" in 1981 during a bout of composer's block. Out of desperation (and curiosity), he decided to let his computer do some of the creative work. He began developing a program that would both know his musical style and keep track of his ideas as he composed, so it could suggest a next note or next measure when he was stuck.

What he ended up making was a program capable of creating new compositions that replicated the styles of masters like Bach and Beethoven so closely as to be indistinguishable from them.

A follow-up program composed original works in its own style, established through feedback from Cope.


Is It Art?

Some call the products of algorithmic composition "artificial music." It does raise some questions about the nature of art and artistry. (Is there a certain degree and type of human contribution required?) Early on, musicians wouldn't play Cope's work. The tide is turning, though.

"People are beginning to realize that computers are extensions of arts and composers," Cope says.


The increasingly intimate relationship between music and computing appears to expand human creativity, not restrict it. Composers can easily experiment with new techniques and forms and quickly adapt their music to any instrument for which an audio sample exists.

With the software Muse and a motion sensor, they can compose purely through movement, selecting and editing notes by waving their hands.

Even turning over some creative control to computers can lead to new creative possibilities in the human realm. It's a concept Dave Yarwood finds exciting, and he's experimenting with it, developing algorithms that limit his composition range to certain audio characteristics.

His results, he says, are songs "filled with ideas that a human being could perhaps never come up with, being afraid of dissonance or afraid to 'break the rules' of traditional music theory."

And then, there are "brief moments of compelling rhythms and melodies that sound almost human-composed."

For Yarwood, it's the juxtaposition that's so fascinating.