The Linnterview: A Conversation with Roger Linn
Roger Linn hasn’t lost an ounce of passion when it comes to shaping the way musicians interact with machines. According to him, there’s more life in them yet. All that’s needed is that human touch.
Interview: Ross Bencina
Photos: Daniel Sievert
Roger Linn began his musical life as part of the vibrant West Coast scene in the late 1970s. A jack of all trades, he was living most people’s rock and roll fantasy, even co-writing a hit for Eric Clapton. But as often happens, fate had other plans.
Roger Linn: “After high school, I was immediately out on the road playing guitar with various bands. And when I was 22, I started working on the prototype of my first drum machine idea — just for myself. Suddenly musicians and major artists were calling me to say, ‘Can I have one of your drum machines?’ I said, ‘No, you don’t understand, I’m a guitar player, producer, songwriter. This is just for me at this stage.’ But they still wanted to buy one anyway. I recognised that as a guitar player, I was one of many. But as a designer of musical products I had more to offer. So I just rode the horse in the direction it was going.”
And the horse bolted with Roger on it. Roger’s creations have gone on to appear on countless hit songs and provided a foundation for whole musical genres. His first drum machine, the LM-1, was released in 1980. At a time when drum machines used analogue sound synthesis, the LM-1 was the first to use sampled drum sounds stored on computer chips. The only other sampling-based musical instrument on the market at the time was Australia’s own Fairlight CMI.
The drum machines created in the early 1980s by Linn Electronics Inc. became staples of the ’80s synth pop sound. Roger then went on to join forces with Akai to produce the MPC60 — an all-in-one sampling and sequencing workstation that became the tool of choice for hip hop producers.
Roger was visiting Sydney recently to present masterclasses on his most recent creation: Tempest — a hybrid analogue-digital performance drum machine created in collaboration with Dave Smith Instruments. Roger took some time out from signing gear and delivering masterclasses at Sound Devices to pop into the studio there and talk about his past work, the Tempest, the future of electronic musical instruments, and his current project, the LinnStrument.
EARLY DAYS
Ross Bencina: What brought about the genesis of your first drum machine?
RL: I was thinking about how to get the drums into my recordings without actually having to call a drummer to come in. I liked being able to play along with a drumbeat while jamming or writing, and drummers usually didn’t like that very much.
Funny story: Giorgio Moroder, a disco producer in the ’70s who produced Donna Summer and the Flashdance soundtrack, told me that when he was younger in Germany, he would ask his drummer to record 20 minutes of a simple beat, which he would later use as a background for writing and recording. He said it was very frustrating for the drummer, so he really enjoyed my machines because he didn’t have to feel bad about the drummer suffering through that mechanical process.
RB: When you originally had the idea, did you have repetitive dance beats in mind or more articulated, expressive drumming?
RL: I basically wanted the ability to create any beat with arrangement elements like intros, fills, endings, etc. But it was also important that it be able to produce high quality sound and for the beats you created to have a natural and human feel. That’s why I included things like multiple dynamic levels and swing timing. I recognised that a machine couldn’t replace a human drummer’s ability to listen to the other musicians and respond to them with creative and appropriate musical parts, but I at least wanted the grooves to have a great sound and feel.
The funny thing is that today, I listen to almost no drum machine music. I usually like to listen to music that has rubato timing. I don’t like the constant beat all the time. I like to hear things that are expressive, where the tempo changes or the percussion goes away for a bit and comes back, and it’s a creation made not just by the composer, but by the magic that happens when you get good musicians performing together in a room.
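The swing timing Roger mentions can be illustrated with a short sketch. In the classic MPC-style scheme the second sixteenth note of each pair is placed later within its parent eighth note: 50% is straight time, while around 66% approaches a triplet-swing feel. The function below is an illustrative assumption about that scheme, not Linn’s actual implementation.

```python
def swing_offsets(num_steps, step_dur=0.125, swing=0.54):
    """Return onset times (in seconds) for sixteenth-note steps
    with MPC-style swing.

    swing is the position of the second sixteenth within its
    eighth note: 0.50 = straight time, ~0.667 = triplet swing.
    (Illustrative sketch only.)
    """
    eighth = 2 * step_dur
    times = []
    for i in range(num_steps):
        pair_start = (i // 2) * eighth
        if i % 2 == 0:
            times.append(pair_start)          # on-beat: unchanged
        else:
            times.append(pair_start + swing * eighth)  # off-beat: delayed
    return times
```

At 50% the off-beats land exactly halfway through each eighth note; raising the percentage pushes only the off-beats later, which is what gives the groove its lope without touching the downbeats.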
UNDER THE INFLUENCE
RB: Your machines have become key ingredients in whole new genres of music. In a way the machines have become part of the music that’s made with them. How do you feel about that? Can you relate to it, and does that factor in to your design process?
RL: It feels great to know that a product I made has had an influence on music, because I love music. But I’m often surprised by the music made on my products, and it certainly is true that the capabilities and limitations of the machines affect the type of music that people create on them.
For example, it’s easy to make music in perfect time on a drum machine or sequencer, so a lot of the first records in the ’80s that used drum machines or sequencers tended to sound very rigid and robotic, which actually became a style. I didn’t like that because I had worked very hard to put features like multiple dynamic levels, swing and drumbeat arrangements in the original LM-1 so that it could sound more human and natural. But most of the early popular recordings made with the LM-1 didn’t use any note dynamics, swing or arrangements.
I remember the first hit made with the LM-1, which was The Human League’s Don’t You Want Me. It began with a constant sixteenth-note synthesiser sound with all notes at one volume. I thought, “Couldn’t they have put some dynamic variation in there to make it sound more human?” It never occurred to me that in many cases they didn’t want it to sound more human. In fact, the nature of much of this new electronic style was to make it sound as inhuman and rigid as possible. Though at the same time there were guys like the drummer of Toto, Jeff Porcaro, who used the LM-1 on a song for Elton John called Nobody Wins, and programmed it superbly to sound like there was a real drummer on the track.
Peripheral to this is the subject of technology influencing art. The example I love to use is the effect of the invention of the camera on painting. Once the camera was invented, there was less need for portrait painters, and some say this helped spawn the Expressionist movement in art, because the human painter had to do something that the camera couldn’t. The same thing happened when ARP’s String Ensemble came out in the ’70s. Many string players who would come in to a session and receive union scale for holding one note over Hey Jude suddenly weren’t in quite the same demand. The technology influenced how the music was made. So the bad news is that some musicians weren’t being hired, but the good news is that these machines democratised music-making for many musicians, allowing them to record the music they heard in their heads, even if they didn’t have the money to hire other musicians.
Madrona Labs Soundplane
Soundplane has a walnut playing surface that can be configured as either a 150-note keyboard with position and pressure sensing on each key, or as one continuous multi-touch surface.
Haken Continuum
The Continuum fingerboard senses finger pressure and position on its continuous rubber surface allowing expressive timbre control and seamless polyphonic glissandi.
Eigenlabs Eigenharp
The Eigenharp has 120 highly sensitive keys, two strip controllers and an optional breath pipe. Each key is like a tiny joystick allowing subtle note expression in three dimensions.
EXPRESS YOURSELF
RB: Why do your designs skew towards dance music creation, despite the genre not being a personal preference?
RL: I tend to enjoy softer music or even silence much of the time. However, my job isn’t to make music but rather to make the machines that make the music. And to do that, it’s my job to try to spot long-term trends in the way people are making music.
One of those trends is the real-time integration of composition, recording, editing and performance into one seamless experience. The traditional model is for a musician to first write a song, then record it, then edit and mix it, then perform it live. However, some of the most compelling performances I’ve seen recently involved one musician creating an entire piece of music from scratch using looping and sequencing, layering the composition part by part, then arranging the parts and sonically manipulating it, all as a real-time performance and without stopping. Like watching a painter create a painting, the journey becomes the reward.
This was a primary design criterion for our new Tempest drum machine. Because its job is to make rhythmic music, we designed it so you can do almost everything without ever stopping the beat. For example, you can create a beat, then while it plays you can tweak and refine the individual sounds, and without stopping, copy it and enhance the copy, make another copy and manipulate it, then arrange these three beats in real time, adding a variety of sonic performance manipulations. That’s all in real time without stopping.
RB: Was this the main design goal of the Tempest?
RL: Capturing performance gestures was certainly a major design goal. To my thinking, beat-oriented electronic music performance has become somewhat static. If you go to a club where someone’s DJing with a computer, you’ll often see someone with a pale, screen-lit face, moving his fingers around a trackpad in a performance that resembles checking email. Even if using physical controllers, the UI usually consists of data-input elements like knobs, sliders, buttons and a trackpad. There’s not much to capture subtle human gestures, like vibrato in a violin, string bends on a guitar, or breath expression on a reed instrument.
The decline of human gesture capture in the user interfaces of electronic music machines is a negative trend. The drum machine has evolved into its own musical instrument, not unlike a guitar, keyboard or violin. And talented musicians have developed complex and virtuosic finger gestures using the pads and controls. However, these user interface elements don’t permit the capture of subtle human gestures found on acoustic musical instruments. In Tempest, we tried to solve this problem by incorporating — in addition to the 16 pressure- and velocity-sensitive pads — two linear touch strips that are sensitive to both position and pressure. For example, you can use one of these touch strips to filter a beat’s playback, with filter frequency controlled by position, and resonance controlled by pressure. And the other touch strip can use position to affect the pitch of the beat’s playback while pressure modulates the attack or decay time of all the sounds simultaneously. With Tempest I wanted to make something that gave these electronic performers an instrument they could really perform with.
I was very pleased to see that when we first released Tempest, the first user videos we received were not about solitary offline production but rather about performance, using the machine as a real-time musical instrument. And the gestures were wonderful to watch, with musicians moving their fingers in ways not unlike the performance gestures you’d see from players of acoustic instruments. The UI inspired them to practise and perform new gestures in order to create more interesting music.
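The touch-strip mapping Roger describes — position controlling filter frequency, pressure controlling resonance — can be sketched as a simple function. The ranges, function name and exponential curve below are illustrative assumptions rather than Tempest’s actual internals; the cutoff is mapped exponentially so equal finger movement corresponds to an equal musical interval.

```python
def strip_to_filter(position, pressure,
                    min_hz=40.0, max_hz=12000.0, max_res=0.95):
    """Map normalised touch-strip readings (0..1) to filter settings.

    position -> cutoff frequency, exponential curve so the strip
                feels musically even from bass to treble
    pressure -> resonance, scaled linearly up to max_res
    (Illustrative ranges, not the Tempest's.)
    """
    cutoff = min_hz * (max_hz / min_hz) ** position
    resonance = pressure * max_res
    return cutoff, resonance
```

The interesting design point is the pairing: one finger supplies two continuous, independent dimensions at once, which is exactly the kind of subtle gesture capture a bank of knobs can’t give a performer in real time.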
People are thinking about the human computer interface much more than they ever have, so maybe the time for new human interfaces in music has come
COLLABORATING WITH DAVE SMITH INSTRUMENTS
RB: You worked on the Tempest in collaboration with Dave Smith. What was the nature of the collaboration?
RL: One way to look at it is that I started from the outside in and Dave started from the inside out. Dave is excellent at designing the circuit boards and software, and his superb analogue voice circuits are great examples of that.
By comparison, I like to start from the musician’s perspective, beginning with a product concept, then visualising it in a 3D model and renderings, gradually creating and fine-tuning the feature descriptions, user interface, display screens and industrial design. Then I drill down into the details, like creating a drumbeat data structure optimised for non-stop recording, editing and performance; and designing the custom drum pads and touch strips. I gradually fill in the details of the 3D model until it contains details like the circuit boards, controls, metalwork and graphics. But I didn’t do this alone. Dave’s employees were very helpful in the design because they are more knowledgeable about current music than I, and were passionate about the great ideas they came up with.
That said, there was a lot of overlap between Dave and me. Dave contributed greatly to the selection and arrangement of the synth controls, an area that he knows very well. Also, it was important for Dave to keep it small and portable, something that could fit in a backpack. And although I don’t have Dave’s circuit design skills, I had some ideas that affected the analogue circuits. For example, I pushed for adding distortion and compression to the stereo output, which had to be analogue in order to keep the signal path fully analogue. But I felt that the distortion should have more of a guitar amp sound, and because of my experience making guitar amp models for my AdrenaLinn guitar processor, I knew that a simple distortion wouldn’t cut it. It needed a highpass filter before the distortion to remove bass mud, and a lowpass filter after the distortion to mimic the high-frequency rolloff of guitar amp speakers. The design of the compressor was a similar process. I wanted it to have the sort of pumping sound that compressors can create. We ended up using a compressor chip from THAT Corporation, the same chip used in some of dbx’s compression products.
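The signal chain Roger describes — highpass, then distortion, then lowpass — can be sketched digitally. This is an illustrative one-pole-filter and tanh approximation of that topology, not the Tempest’s analogue circuit; the cutoff frequencies and drive amount are assumptions chosen for the example.

```python
import math

def one_pole_coeff(cutoff_hz, sample_rate):
    """Feedback coefficient for a one-pole smoothing filter."""
    return math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)

def amp_style_distortion(samples, sample_rate=48000,
                         hpf_hz=150.0, lpf_hz=5000.0, drive=8.0):
    """Highpass -> tanh waveshaper -> lowpass.

    The highpass removes bass mud before clipping; the lowpass
    mimics a guitar speaker's high-frequency rolloff.
    (Digital sketch of an analogue chain; values are illustrative.)
    """
    a_hp = one_pole_coeff(hpf_hz, sample_rate)
    a_lp = one_pole_coeff(lpf_hz, sample_rate)
    hp_track = lp_state = 0.0
    out = []
    for x in samples:
        # one-pole highpass: input minus its lowpassed version
        hp_track = (1 - a_hp) * x + a_hp * hp_track
        hp = x - hp_track
        shaped = math.tanh(drive * hp)                # soft clipping
        lp_state = (1 - a_lp) * shaped + a_lp * lp_state
        out.append(lp_state)
    return out
```

The ordering is the whole point: clipping generates harmonics from whatever reaches it, so filtering out the lows first keeps the distortion from turning bass energy into mud, and filtering afterwards tames the harsh new highs the waveshaper creates.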
RB: And what about the overall sound of the machine, did you try many different circuits?
RL: Well, Dave’s been refining his analogue voice for 35 years, so that’s really his domain. Analogue sounds are at the heart of Tempest and a big reason for its sound quality. Rather than a sample-centric drum machine, we wanted a machine with the sonic malleability of synthesis — as opposed to the static ‘snapshot’ nature of sampling — but also with the warmth and quality of Dave’s analogue synthesis. At the same time, we realised that analogue synthesis can’t make all types of drum sounds, so we decided to add sampled sound sources. While Tempest isn’t a sampler, we included a large library of high-quality drum, percussion, noise and effects samples. What’s important is that the analogue lowpass filter, highpass filter, VCA, feedback and modulations all come after the sample, giving you the accuracy of samples with the malleability and sonic quality of analogue synthesis. And the sounds people are creating with Tempest are surprisingly original and compelling.
FROM PRESSING PLAY TO PAGANINI
RB: Do you see today’s market as different when compared to the market for your early drum machines?
RL: One thing that’s changed is that the line between artist and listener is blurring. I see it as a continuum. At one end you’ve got the passive listener, somebody who just slips a different song into their iPhone and plays it, which is not a creative process. If you move about an inch towards centre, you get a DJ who’s skilfully selecting entire songs to match the event, and if you move about another couple of inches in toward the centre you’ve got a guy manipulating pre-made loops, beats and sequences in a creative way — something I call ‘Object Oriented Composition’. He’s being more creative but he’s not really creating the individual elements of the music. At the other end of the continuum, you’ve got virtuoso violinists, saxophonists, guitarists, etc., who create all the notes and the subtle nuances that make each note beautiful. These days the technology allows people to place themselves anywhere on this continuum, so they are. You can pretty much choose what you want to do.
One of the negative aspects of this trend is that a lot of young people want to be able to perform music but don’t want to have to learn how to play those pesky notes. They can get themselves a computer and some loop programs, and are content simply to manipulate and combine loops and beats created by others. Some would say that’s not as creative as creating all the notes themselves, and I think that’s probably an accurate statement, but then again what they create is often pretty amazing. For example, with a fairly low amount of practice they can create a very exciting performance that excites an entire group of people in a dance club. By comparison, a beginning violinist can take a few years before she’s able to make pleasing tones, and must meet with a few friends if she wants to make a chord. Both paths have their merits but I must admit that I sometimes lament the loss of the note as a focus of musical creativity. I sometimes say that I’m on a one-man quest to save the note from extinction.
INSTRUMENT EVOLUTION
RB: Are next generation music controllers like the LinnStrument still your focus?
RL: The most exciting thing for me now is to finish the development of LinnStrument. It’s the product that I’m most excited about, and it’s probably the product that will sell the least. This general field of instruments is called alternate controllers, and what I like to say is that the road to alternate controllers is paved with the bodies of those who have tried and died. Most have been financial failures. But then again the zeitgeist today is about human interface, with multi-touch on phones and tablets, and innovative human/computer interfaces like Kinect or the forthcoming Leap controller. People are thinking about the human computer interface much more than they ever have, so maybe the time for new human interfaces in music has come.
Not too many people are studying orchestral instruments any more. In the Darwinism of musical instruments, what’s survived these days is primarily keyboard and guitar, and guitar is often used for old-fashioned music. In contemporary popular music, I hear some wonderful singers with their vocal gymnastics but not much in the way of a truly expressive solo instrument voice. I think this is because a MIDI keyboard is the main input device for synthesis, which isn’t very good at capturing expressive performance because it’s little more than an array of velocity-sensitive switches. If you’ve ever tried to perform a convincing solo violin, sax, cello or guitar on a MIDI keyboard, even with the most advanced synthesis, you’ll know that it’s very difficult.
The truth is that there’s a wonderful array of expressive software synthesis out there, from analogue, to physical modelling, to FM, to waveguide, to additive and more, but a MIDI keyboard isn’t very good at unlocking their tremendous expressive capabilities. What I’m trying to do with LinnStrument is provide a truly expressive input surface for synthesis, in order to unlock that expressive potential.
Thanks to Sound Devices for graciously hosting the interview at its Sydney studio, and to Daniel from Sound Devices for providing some great images of Roger and the masterclass event.