Music on Tablets – A Taste of Some iPad Apps
A whole new world of sound-generating and -modifying possibilities is available today with tablet computers and smartphones. Here we show you some of the many ways the Apple iPad can be used for sound production and performance.
by Warren Burt, May 2015
There has been a lot of discussion lately about tablet computers and their use in pro (and semi-pro, and amateur) audio. For the past two years I’ve been researching this and have made quite a lot of music with these beasties. There are basically two flavors of tablet usable for audio, the Apple iPad and the Android tablet. A number of the apps that run on iPad and Android tablets also run on iPhones and Android smartphones. The question, in fact, is not whether usable audio programs can run on these devices, but which platform you prefer and how much “screen real estate” you require to work in a way that suits your musicality.
For this article, I’ll concentrate on the Apple iPad. It has (as of this writing, at any rate) the most advanced audio facilities, and the most apps of the most diverse kinds. In future articles in this series, I’ll cover the use of Android devices, and even music apps for smartphones. For now, though, I want to concentrate on the iPad. First of all, we might ask: what is a tablet computer? Depending on how much you pay, the answer is, more or less, that it’s a very powerful netbook computer, but with a touchscreen instead of an ASCII keyboard. And although on my commute I see a lot of people on the trains controlling iPads from separate Bluetooth keyboards, I think the touchscreen is actually what distinguishes these devices from netbooks or other small portable computers. I should also emphasize that these are still early days for these devices. Last year, Apple updated its iOS operating system, and simultaneously Audiobus, the main program that allows music and sound apps to talk with each other, was also upgraded. This caught many app developers by surprise, and eight months later some of the main music apps I used have still not been updated, and are, until that happens, unusable. This sort of thing will sort itself out eventually, but for the moment it can be a pretty wild ride using these devices.
Why would one want to use tablets for music making? Just as with the netbook, the first answer is one of convenience and portability. When tablets first came out, the question was how much quality you were willing to trade for portability. Thankfully, now that things have developed, that’s not such an issue. I’ve made fully professional pieces on my iPad, and these have been issued on CD. Admittedly, these were pieces with a limited track count, mostly interactive works where I was performing sound-changes on various apps and recording them in a multi-track recording program, but they were still professional-level recordings. The second reason is the touchscreen itself, which allows a kind of control that other computer interfaces, such as mice and keyboards, don’t. The Canadian media theorist Marshall McLuhan pointed out that the first use of a new technology is usually to do things that the old technology could do perfectly well; only later do people figure out uses that are intrinsic to the new technology itself. So one could, for example, with the right interfaces, make a multi-track recording of guitars, basses and drums, and mix and edit the recording all on the iPad, and the results would be perfectly acceptable. In this case, the iPad is just being used as a smaller, cheaper computer. In fact, there is already a book describing this process. (“The Musical iPad Quick Pro Guide: Creating, Performing, and Learning Music on Your iPad” by Thomas Rudolph and Vincent Leonard, published by Hal Leonard. Watch this space for a review of this book in a future issue of SoundBytes.) But what’s been more interesting for me has been using the iPad and its touchscreen to control interesting aspects of sound making.
When I first started using the iPad, I was most attracted to the idea of using it as a kind of touch-surface gesture controller for other gear. The program Lemur, from Liine, was specifically designed to do just this kind of thing. I quickly made a whole suite of control surfaces that would control algorithmic composing programs on my netbook, connecting the iPad and the netbook via Wi-Fi. Here’s an example:
With this control surface, all push buttons and no sliders, I was able to control various aspects of a program on the netbook that produced a three-voice piano-sound canon sonifying one of Julien Sprott’s fractals. The top line allowed me to select one of eight different tempi, and the next line controlled the transpositions of the three voices. Next down, buttons allowed one to select the range, in octaves, of the melodies, and below that, the scales that the melodies would be mapped into. Then volume ranges could be selected, as well as the behaviour of the virtual piano’s pedals. And at the middle left edge was a controller to turn the whole thing on and off. As you can see, this gave me a lot of control in a very small area. I made a number of pieces like this, some with sets of sliders, some with just buttons, and some using other kinds of controllers available in Lemur, such as a “bouncing ball in a box” control that let one derive control information while playing with “laws of physics” such as friction and inertia.
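The sonification idea behind that piece, mapping a chaotic stream of numbers into a chosen scale and octave range, can be sketched in a few lines. The following Python fragment is purely illustrative and is not the author’s actual program: it uses a simple logistic map rather than one of Sprott’s quadratic maps, and the scale, range, and base note are arbitrary choices.

```python
# Hypothetical sketch: sonifying a chaotic map by quantizing its
# iterates to a musical scale. The map, scale, and ranges here are
# illustrative choices, not the actual Sprott-fractal program.

def chaotic_series(x0=0.4, r=3.9, n=16):
    """Iterate the logistic map x -> r*x*(1-x), a simple chaotic map."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def map_to_scale(values, scale=(0, 2, 4, 7, 9), octaves=3, base=48):
    """Quantize values in (0, 1) to MIDI notes in a chosen scale/range."""
    steps = len(scale) * octaves
    notes = []
    for v in values:
        i = min(int(v * steps), steps - 1)
        octave, degree = divmod(i, len(scale))
        notes.append(base + 12 * octave + scale[degree])
    return notes

melody = map_to_scale(chaotic_series())
print(melody)  # 16 MIDI note numbers, all within the chosen scale
```

Changing the `scale`, `octaves`, or `base` arguments corresponds to pressing the range- and scale-selection buttons on the control surface.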
Soon after doing these pieces, however, I realized that the sound-producing apps on the iPad itself were powerful enough to do interesting sound making on their own, and I began working with them. One thing that really enabled me to work this way was the development of the app called Audiobus, which allows one to chain various sound- and MIDI-producing apps together. Apple’s initial philosophy for the iPad was that apps would be “sandboxed”: each app should be a self-contained universe, and the device should run one app at a time. Needless to say, that philosophy has nothing whatever to do with the creative way musicians use gear, interconnecting stomp-boxes and sound processing equipment in the hardware world, and using things like DAWs running the VST protocol in the computer world. So Audiobus very definitely filled a particular need that musicians have. Here’s the interface of Audiobus as I used it in a recent piece:
In this diagram, Analog MIDI Sequencer, a MIDI sequencing app, is controlling the Fairlight CMI Pro app (more on that in a moment). The sound output of the Fairlight app is routed into Bird Stepper, a rather wonderful automatable effects app, and the output of Bird Stepper goes into Steinberg’s Cubasis, their iPad-friendly version of Cubase. I’ve only got one chain of apps in this instance of Audiobus, but one can create as many parallel chains as one’s CPU can handle.
Here’s a screenshot of Analog MIDI Sequencer. In this preset, the app randomly selects among 16 different sets of MIDI notes, velocities, durations, and channel numbers (1–8). I’ve also set up four different presets that I can switch between, although that’s not shown in this screenshot. At the right side of the screenshot you can see a taskbar with controls for all the apps currently in Audiobus, so you can control aspects of your apps in real time without having to bring each one into screen focus.
The MIDI output of Analog MIDI Sequencer is routed into the Peter Vogel Fairlight CMI app. This is a recreation in software of some of the favorite features of the original Fairlight CMI, the world’s first commercial sampler, from 1978. In fact, the app came about when Peter Vogel, the engineer of the original CMI, noticed that a modern iPad had more computer power than the microprocessors that had powered the original CMI. So it seemed to him that the CMI sampling program and the iPad were a natural fit. Recently, they have added a new page (“Page 6”) to the program, which, like the original CMI, allows the user to draw waveforms of varying kinds of complexity. I drew eight waveforms which tended to the noisy end of the spectrum, and then loaded them into “Page 3,” the instrument page, so that the eight samples were controlled by eight different MIDI channels. The CMI is then able to play my eight homemade samples responding to the Analog MIDI Sequencer. Note the Audiobus control bar at the bottom middle of the screenshot. (The CMI app, by the way, is just one of dozens of interesting and powerful synthesizer and sampling apps for the iPad. I hope, in a future article, to give a rundown on a number of these.)
Next in our chain of apps in Audiobus is Bird Stepper, a very fun effects sequencer. Bird Stepper has eight effects to choose from, and you can place them in any order. It then has sequencer graphs in which you can draw automation curves which will control aspects of your effects. In this patch I controlled Spectre, a filter/delay effect, and Ambience, a reverb/delay effect, with graphs like the one in Figure 5. These curves were being moved through very slowly, controlled by the internal clock. For other applications, you can, of course, take a sync pulse from a DAW to have effects in sync with the project clock.
The final stage in our chain is to route the sound into Steinberg’s Cubasis, a scaled-down but still very powerful version of Cubase. In this piece, I simply used Cubasis to record four passes of the patch (using a different preset on Analog MIDI Sequencer for each pass), each one independent of the others. These four tracks were then mixed and panned together. The “Mixdown” facility of Cubasis was used to make a two-track WAV file of the short piece, which was then transferred to AudioShare using the “Share” facility in Cubasis. (AudioShare is a sound editor which allows transfer of audio from one program to another on the iPad. Programs like this allow one to further escape the “sandboxing” syndrome.) Here’s what the Mixer page of Cubasis looks like:
Once the piece was edited in AudioShare, I then exported the WAV file, via iTunes, to my PC. All operations, from making the original waveforms and designing the MIDI sequences and Bird Stepper effects, through multi-track recording, mixing, and editing of the final piece, were done within the iPad itself. The result is clean-sounding audio of professional standard. The iPad has had the potential, for many years, to be a serious music-making platform. Now, it’s finally arrived there.
Before we finish this first look at the iPad, I want to show a few more apps which do interesting sonic things in unique ways. Regular readers of my reviews in SoundBytes know that I consider microtonal capability an essential ingredient of any serious music-making platform. Fortunately, this is an active area of development in the iPad app world. There are a number of programs which can now be used to generate microtonal scales and save the results in the universal Scala format, and a number of synthesizer apps which can use these Scala files to play microtonally. For example, here’s a screenshot of the Wilsonic app (full disclosure: I’m a beta-tester for this app), which allows one to explore some aspects of the tuning universe of the tuning theorist Ervin M. Wilson. This page is used for exploring the “Eikosany,” a twenty-note scale type based on multiplying harmonics against each other.
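That “multiplying harmonics” construction is quite concrete: take six harmonic factors, form every product of three of them (there are twenty such combinations), and reduce each product into a single octave. The short Python sketch below is purely illustrative (Wilsonic itself is an iPad app, not Python), using the common factor set 1, 3, 5, 7, 9, 11.

```python
from itertools import combinations

# Sketch of the Eikosany construction: all products of three factors
# chosen from six harmonics, reduced into one octave. The factor set
# {1, 3, 5, 7, 9, 11} is one common choice among many.

def eikosany(factors=(1, 3, 5, 7, 9, 11)):
    ratios = set()
    for a, b, c in combinations(factors, 3):
        r = a * b * c
        # reduce the product into the octave [1, 2)
        while r >= 2:
            r /= 2
        ratios.add(r)
    return sorted(ratios)

scale = eikosany()
print(len(scale))  # 20 distinct pitches, hence "Eikosany" (Greek: twenty)
```

Since all six factors are odd, no two of the twenty products can differ by only a power of two, so octave reduction never merges pitches and the scale always has exactly twenty notes.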
ThumbJam is a very wonderful sampling app which allows you to use its samples or import your own, and it can also import (via Wi-Fi) Scala files so that you can play your samples in any microtonal scale you wish. It also accepts MIDI, and can record and sequence loops of various kinds. It’s a very useful and fun program. In this screenshot, at the bottom, you can see that the Acoustic Guitar samples are being played in a 20-note-to-the-octave Eikosany scale.
Just before we leave, I should mention that there are a number of iPad apps which offer unique ways of processing sounds and samples, usually through novel interfaces that allow one to control sounds in interesting and innovative ways. Here are two apps, among many, with such interfaces and capabilities.
First is Sector. It divides your input sound into 2 to 32 segments. You then specify how the segments will succeed each other, and with what probabilities. You also specify how the segments will be warped, or not, and the probability of each kind of warping being chosen each time a segment is accessed. Tempo, sync, fixed versus random sequencing, and a number of other parameters can be controlled. The result is a unique kind of sound fragmenting and sequencing. And, of course, Sector can be used in combination with other apps through Audiobus, so all its sequencing and fragmenting possibilities can be used in conjunction with other sound processing and recording programs.
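The overall shape of that process, segment the sound, then repeatedly choose and perhaps warp a segment by weighted probability, can be sketched abstractly. This Python fragment is a hypothetical illustration, not Sector’s actual engine; the segment count, weights, and the two toy “warps” (play as-is, play reversed) are all invented for the example.

```python
import random

# Hypothetical sketch of a Sector-style process: cut a buffer into
# segments, then repeatedly pick the next segment and a warp for it,
# both by weighted probability. All parameters here are illustrative.

def segment(buffer, n_segments):
    """Split a sample buffer into n roughly equal segments."""
    size = len(buffer) // n_segments
    return [buffer[i * size:(i + 1) * size] for i in range(n_segments)]

def sequence(segments, weights, warps, warp_probs, steps, rng):
    """Choose segments by weight, then maybe warp each one."""
    out = []
    for _ in range(steps):
        seg = rng.choices(segments, weights=weights, k=1)[0]
        warp = rng.choices(warps, weights=warp_probs, k=1)[0]
        out.extend(warp(seg))
    return out

identity = lambda s: s          # toy warp 1: play segment as-is
reverse = lambda s: s[::-1]     # toy warp 2: play segment reversed

rng = random.Random(1)
buf = list(range(64))           # stand-in for an audio sample buffer
segs = segment(buf, 8)          # 8 segments of 8 samples each
out = sequence(segs, [1] * 8, [identity, reverse], [0.7, 0.3], 16, rng)
print(len(out))  # 16 steps x 8 samples = 128
```

Skewing the `weights` list makes some segments recur more often, which is the kind of control the app exposes graphically.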
Second is Borderlands Granular. This program lets you import samples, which are then displayed on the screen as waveforms. These waveforms can be moved about the screen, expanded, contracted, and zoomed as you please; double-tapping at a particular place then creates a circle surrounded by smaller circles. These circles are the parameters of a granulator. You can move the circles about, which selects different parts of the waveform to draw your grains from, and you can have a number of these grain generators operating simultaneously. You can even activate the “G” button, which turns on the iPad’s accelerometers, so that you can move the granulating circles about by tilting the screen! (If you tilt too much, you lose that granulating circle.) This is the most “game-like” interface I’ve seen: it lets one play sound modifications like a game, yet the results are still very musical and interesting. In this screenshot, there are five waveforms being displayed. On the upper three, grain-generating circles (crop circles?) have been placed. In this example, all three circles are made as small as possible, so that grains are drawn from a single point in the waveform. This creates sustained clouds of pitch and timbre, depending on what part of the waveform is accessed. By moving the circles around, I can create new chords and textures based on the original sounds I placed on the screen.
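Granulation “from a single point” can itself be sketched in a few lines: short, windowed grains are read repeatedly from one position in a source buffer and overlapped into an output cloud. This Python fragment is a minimal illustration only; Borderlands’ real engine is far richer, and the grain length, hop size, and Hann window here are arbitrary choices.

```python
import math

# Minimal granulation sketch: repeated, overlapped, windowed grains
# drawn from a single read position in a source buffer. Parameters
# are illustrative, not Borderlands' actual engine.

def granulate(source, position, grain_len, hop, n_grains):
    out = [0.0] * (hop * (n_grains - 1) + grain_len)
    grain_src = source[position:position + grain_len]
    for g in range(n_grains):
        start = g * hop
        for i, s in enumerate(grain_src):
            # Hann window smooths each grain's edges, avoiding clicks
            w = 0.5 - 0.5 * math.cos(2 * math.pi * i / (grain_len - 1))
            out[start + i] += s * w
    return out

# Source: one cycle of a sine wave as stand-in audio.
src = [math.sin(2 * math.pi * k / 256) for k in range(256)]
cloud = granulate(src, position=64, grain_len=32, hop=8, n_grains=10)
print(len(cloud))  # 8*9 + 32 = 104 output samples
```

Moving the `position` argument corresponds to dragging a granulating circle along the waveform; widening the circle would correspond to randomizing `position` per grain.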
This article has just barely scratched the surface of the many kinds of apps available for the iPad. And I haven’t even mentioned the different kinds of interfaces available, interfaces which allow you, for example, to process an instrument’s sounds live. Nor have I mentioned the possibilities of live performance with MIDI controllers. The iPad, and other tablets and smartphones, offer a whole new world of sound-generating and -modifying possibilities, and in future articles we hope to show you some of these.