Thursday, September 30, 2010

HP Creative Camp

Welcome, HP Creative Camp participants who have found your way here. And thank you for all the positive feedback I got at camp, and in email afterwards. If you want to follow my music/microcontroller experiments add this bookmark:

http://gordophone.blogspot.com/

I had the distinct pleasure of participating in HP Software's Creative Camp (modeled on Foo Camp) in Sep. 2010 at the Stanford Sierra Conference Center. I gave a hastily prepared talk on the work I've been doing with microcontrollers and music, demoed my recently completed 3-dimensional trombone, and showed some weird stuff I've been doing around sonification (sorry, no public posts about that yet). I also "performed" at the big party event, making some sounds with my iPhone+TouchOSC and OSCUlator+Logic, accompanied by other Creative Camp musicians. It was a blast. A video summary of the event was made, and I'm at 4:51.

Oh, and the Silent Disco (everyone puts on wireless headphones) put on by DJ Motion Potion was pretty damn cool! Robbie gave a presentation earlier in the day (also hastily prepared, according to Robbie, but way, *way* better than mine) about DJ technology and history. As a traditionally trained musician, I've always found DJ culture puzzling, but the way he framed the whole presentation around passion really spoke to me. Thanks, Robbie.

P.S. I looked over the comments from the camp, and someone commented that my session caused them to "rocket into ideation mode". That's exactly what happened to me when I attended the New Music Controllers Workshop at Stanford, and if I managed to trigger that in someone else's mind, that's pretty great. Thanks for sharing that!

Tuesday, August 31, 2010

The 3-D Trombone

When I was at the New Music Controllers workshop at CCRMA this summer, after I'd demoed my trombone controller and mentioned that I was interested in using a retractable string for a slide (based on a suggestion from Chris Graham), a CCRMA grad student named Michael Berger clued me into a very cool device - the GameTrak Controller.

The GameTrak is a pretty amazing little device. Think of a joystick, which can measure X and Y axis movements, then add a retractable cord that protrudes through the handle and can measure Z-axis displacement. There are actually two of these in each GameTrak, and the player wears a pair of gloves that attach to clips on the end of the Z-axis cords. Internally, there are 6 potentiometers that hook to a small board that does all the analog to digital conversion and appears as a USB HID device. Hooking one of these up to, say, Max/MSP or pd is super simple. Here are some cool things that have been done with GameTraks:

Game Trak Theory (A CCRMA performance)
Cop de Cap by Experimental Headbang Orchestra (Stanford)
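
Since each tether reports two deflection angles (the X and Y pots) plus a cord length (the Z pot), recovering an approximate 3-D hand position is just a bit of trigonometry. Here's a rough sketch in plain C++ — the angle convention and units are my assumptions, and a real setup would need calibration against the actual pot readings:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Convert one tether's readings to an approximate Cartesian hand position.
// ax and ay are the deflection angles (radians) reported by the two base
// pots; len is the extended cord length. With no deflection, the hand
// sits directly above the base at height len.
Vec3 tetherToCartesian(double ax, double ay, double len) {
  double x = len * std::sin(ax);
  double y = len * std::sin(ay);
  double z2 = len * len - x * x - y * y;
  double z = z2 > 0 ? std::sqrt(z2) : 0.0;  // remaining height, clamped
  return {x, y, z};
}
```

With both angles at zero, the position comes out to (0, 0, len), i.e. straight up from the base.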

Since the Wii pretty much destroyed the GameTrak in the gaming market, they're available very cheaply now - I think I now own 6, and I got them for $20 each.

Originally I was only interested in cannibalizing one for the z-axis retractable cord, to use that for the slide of my trombone. But once I got one of the GameTraks open, I thought "why limit the slide to linear motion? Why can't we build a trombone "slide" that operates in 3 dimensions?" And so the 3-D Trombone was born.

Opening up the GameTrak is very easy, and the spring/joystick mechanism for one-half of the device can easily be removed with just a small Phillips screwdriver. I remounted the assembly in a project box, which is a lot larger than I'd like it to be, but it's a prototype.

I built another breath controller using garden irrigation tubing and the same Freescale Pressure Sensor I used for the Gordophone. For overtone selection, I epoxied some momentary switches into piece of PVC tubing (in the Gordophone, these switches are in the joystick handle that is used to move the slide).

After wiring everything up, I made two changes to the Arduino sketch that does the sensor reading and MIDI event generation:

- Rescaled the "slide" motion limits, since the GameTrak can measure about 6 feet of z-axis motion, but a trombone slide is only a couple of feet.
- Coded things so that the X and Y axis controllers produce MIDI continuous controller data on controllers number 16 and 17.
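
For the curious, here's roughly what those two changes look like as standalone C++. The calibration constants are made up — the real thresholds depend on your GameTrak and the slide range you want:

```cpp
#include <algorithm>

// Hypothetical calibration: the raw ADC span corresponding to about two
// feet of cord travel (roughly a third of the GameTrak's six-foot reach).
const int SLIDE_RAW_MIN = 0;
const int SLIDE_RAW_MAX = 341;

// Clamp and rescale the z-axis reading so two feet of physical motion
// covers the full 0..1023 "slide" range the sketch expects.
int rescaleSlide(int raw) {
  int clamped = std::min(std::max(raw, SLIDE_RAW_MIN), SLIDE_RAW_MAX);
  return (int)((clamped - SLIDE_RAW_MIN) * 1023L / (SLIDE_RAW_MAX - SLIDE_RAW_MIN));
}

// Scale a 10-bit X or Y pot reading down to a 7-bit value, ready to
// send as MIDI continuous controller 16 or 17.
int potToCC(int raw) {
  return (int)(raw * 127L / 1023);
}
```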

Finally, I put together a patch and an effect in Logic, as follows:

- A simple sine wave instrument using the ES2 synth. The breath controller is mapped to the oscillator amplitude.
- The EVOC 20 TrackOscillator Filter. I used Logic's "Learn" mode to set things up so that the slide's X axis motion controls the Formant Shift, and the Y axis motion controls the LFO intensity. I set the LFO frequency at 100 Hz so it really distorts the sound (I really lean into that distortion at around 0:52 and 1:02).

I made the X and Y axis controls very non-subtle, so I could tell when they were working. This video shows the whole thing in action.


video

One thing that's very tricky is maintaining a constant slide position on the Z-axis while moving around the X and Y axes. While playing a traditional trombone, one can move around, but the player's body position relative to the instrument remains constant. With the 3-D Trombone, all that changes, and it (so far) seems like a radically different experience. More experimentation is in order.

Sunday, July 4, 2010

A week of good CCRMA



The Knoll - CCRMA Home

This past week I cashed in some vacation time and attended the 5-day New Music Controllers Workshop, led by Edgar Berdahl and Wendy Ju, at Stanford's Center for Computer Research in Music and Acoustics (CCRMA, pronounced "karma"). It was a great experience, with terrific guest lecturers and a fun group of participants.

(Update 7/22/10: Video of all the workshop demos is now online on the CCRMA website.)

The workshop is a one-week version of the same material covered in the Physical Interaction Design for Music course offered at Stanford (Ed and Wendy teach that course as well). Topics covered in the course include the Verplankian Physical Interaction Design Framework, basic electronics, the Arduino, using Max/MSP and pd, and using sensors to interact with the real world.

The guest lectures were one of the high points of the workshop. We heard from Bill Verplank (Interaction Design), Dan Overholt (Music Interface Technology Design Space), Alexandros Kontogeorgakopoulos (Cardiff School of Art and Design - Haptic Digital Audio Effects), and Ge Wang (Chuck programming language, Smule). I particularly enjoyed seeing all the cool projects that Dan Overholt did, including Overtone Violin, Sphere Spatializer, and his Overtone Labs work. And if you've never seen Ge Wang talk, you really, really should. He blew me away with his geek chops, his musician chops, and his business chops -- all in one hour.

I also really enjoyed meeting all the other participants. We were all over the range in terms of experience with music performance, synthesis, hardware, software... but everyone chipped in to help each other fill in their knowledge gaps.

Here are some photos and videos of the workshop:



Ian's Thumb Piano (force sensitive resistors, an accelerometer, and some Max/MSP code)

Ian is also no stranger to geeking out with electronics and music. Here he is with his Guitamoton (built previously):



and a video of it in action:

video




A sensor glove from Jenifer (Masters student in Intermedia Music Technology at the University of Oregon)



Jeremy and Chekad testing their "conductor" device that allows the player to change the tempo of a performance by conducting. They had about two days to build it, and it actually worked!

For the final demo/concert in the CCRMA Stage, we had a great audience of CCRMA faculty and students, including John Chowning (!).



Between sessions, I wandered around the CCRMA building. Man, that place is one big house full of cool toys:


The door to the Max Lab, named after Max Matthews



Inside the lab - a music/hardware/software hacker's paradise. And we had full access to it for the week.




Yes, there's one whole bin just for accelerometers.

On the CCRMA ground floor there's a museum of sorts. Check this stuff out:



A NeXT cube. Man, I used to support those things at U of M. The optical drive qualified as a percussion instrument.


A Yamaha DX-7, the first commercially successful digital synthesizer. The FM synthesis patent that Stanford licensed to Yamaha was at one point the most lucrative patent held by the University.



An early prototype FM synthesizer from Yamaha? Check out the console:



Notice how there are actually four separate monitors.



Interesting aural possibilities...

And, in the 2nd floor common area:


When reading Computer Music Journal, it's best to have an ample supply of Tabasco on hand.

After the demo/performance, some beer was consumed, and then Chekad pulled out his violin. I had no idea he was such an accomplished violinist, both in western styles and in the styles of his native Iran. He jammed with Dan Overholt, and then with Alexandros, who managed despite lacking a piano bench:


video


video

A nice end to the week.

-Gordon

Friday, June 4, 2010

Updated Trombone Controller

Over the past month or two, I've been slowly working on a refinement to the trombone controller, and I've finally got something that's physically stable and reasonably playable. Here's an update:

First, here's a photo of the prototype:



If the wooden handle (used to hold the instrument) looks like an axe handle to you... you'd be right. I'm still working on how the performer's left hand is involved, but for the time being, I've relegated it to a supporting role.

The major change between this prototype and the previous prototypes is the slide. The new prototype is essentially a copy of one half of Thomas Henriques' Double Slide Controller. The basic construction is pretty simple, and involves:

  • A piece of 1/8" x 1 1/2" x 36" aluminum stock (Home Depot) - the substrate for the "trombone slide"
  • A 24" x 2" piece of plexiglass (Tap Plastics - a 15 minute walk from my house) - the bearing surfaces of the "slide"
  • A 5" x 3" piece of plexiglass that holds:
  • Two 3/4" x 2 3/4" blocks of teflon, which I shaped with a router and which ride along the long plexiglass piece. These are equivalent to the outer parts of the "slide"
  • A repurposed joystick controller, which the player holds in his/her right hand. The controller handle is used to move the "slide", but also has 5 buttons that control overtone selection and a slide quantization mode (I'll explain that later). The buttons were epoxied into holes drilled into the joystick handle.
  • A project box that contains a Freescale pressure sensor, and all the wiring interconnects.
  • A "mouthpiece" that allows the performer to blow into the pressure sensor.

To set the stage, here's a little video that shows how the instrument is held, and how the slide moves:


video

Sensors

There are three types of sensors on the instrument:

Breath is detected via a Freescale pressure sensor. In the photo below, the mouthpiece (the clear plastic vinyl tubing that the player blows into) is connected to a box with some 1/4" tubing. The box contains the pressure sensor, along with the other wiring interconnects. At the top of the mouthpiece, there is a "T" connector that allows half of the airflow to exit (so the player feels like s/he is blowing through the instrument) and the rest goes to the pressure sensor. The Arduino code reads the sensor and produces MIDI Breath Controller data.
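
A minimal sketch of that conversion in plain C++ — the ambient and full-breath readings here are invented, and the real values depend on the sensor and how much air the "T" connector bleeds off:

```cpp
// Hypothetical calibration points for the pressure sensor's ADC readings.
const int AMBIENT = 80;       // resting reading, no breath
const int FULL_BREATH = 600;  // reading at a hard blow

// Convert a raw ADC reading into a 7-bit value for MIDI CC #2
// (Breath Controller), clamping outside the calibrated range.
int breathToCC(int raw) {
  if (raw <= AMBIENT) return 0;
  if (raw >= FULL_BREATH) return 127;
  return (int)((raw - AMBIENT) * 127L / (FULL_BREATH - AMBIENT));
}
```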




Slide Position is detected with a SpectraSymbol 500mm SoftPot linear potentiometer. The SoftPot is adhered to the aluminum stock using the adhesive backing provided with the SoftPot. The aluminum stock is screwed to a slightly wider piece of plexiglass. A mechanism slides along the edges of the plexiglass. The bearing surfaces are made of teflon block. I cut a groove in two pieces of the block using a router, and these grooves line up with and ride along the plexiglass, as shown below.



In the middle of the clear block, you can see what appears to be a setscrew. This is actually a stylus that is manufactured by SpectraSymbol. You can't see it in the photo, but the end of the screw is a small nylon stylus that rides on a small spring that provides a constant force. This makes the pressure on the SoftPot very consistent, which means the slide behavior is very predictable.

The Arduino code reads the slide position and computes the appropriate pitch bend to send. My most recent firmware also includes a mode where the slide positions are quantized - in effect, the slide "clicks into position" automatically. Although a glissando isn't possible with this setup, the instrument's notes are always in tune. The slide quantization mode can be toggled on and off using the thumb of the right hand (there was a spare button on the joystick handle that was perfect for that).
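
Here's the general shape of that quantization logic in plain C++. The bend-range constant is an assumption — it has to match whatever pitch bend range the synth patch is set to:

```cpp
#include <cmath>

const int NUM_POSITIONS = 7;     // trombone slide positions
const int SLIDE_MAX = 1023;      // full-scale SoftPot reading
const int BEND_CENTER = 8192;    // center of the 14-bit MIDI pitch bend range
const double BEND_RANGE = 12.0;  // assumed synth bend range, +/- semitones

// Bucket a raw slide reading into one of the seven positions (0..6).
int quantizePosition(int raw) {
  return (raw * NUM_POSITIONS) / (SLIDE_MAX + 1);
}

// Each position is a semitone flat of the last; position 0 is in tune.
// Returns the 14-bit pitch bend value for a given downward offset.
int semitonesToBend(double semitonesFlat) {
  return BEND_CENTER - (int)std::lround(semitonesFlat * 8192.0 / BEND_RANGE);
}

// Quantized mode: snap to a position, then bend by whole semitones,
// so notes always land in tune.
int quantizedBend(int raw) {
  return semitonesToBend((double)quantizePosition(raw));
}
```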

A future enhancement I'm considering is providing an LED on the instrument that lights up when the player has the slide in one of the seven positions (well, actually, within a certain range of the dead-on position). Trombonists are used to reaching out to touch the bell to gauge where 3rd position is; this LED would do the same thing, but for all seven positions.

Overtone Selection is accomplished via a set of four switches on the handle that the player uses to move the slide. Here's a picture of the handle:



The handle is a repurposed joystick, with some momentary switches epoxied into some (very crudely drilled) holes I made in the handle. The "trigger" button is actuated with the index finger, and the remaining buttons are operated with the second, third, and fourth fingers. By using a simple "chording" method, the player can select any one of eight overtones:

Overtone 0: off off off off
Overtone 1: on off off off
Overtone 2: on on off off
Overtone 3: on on on off
Overtone 4: on on on on
Overtone 5: off on on on
Overtone 6: off off on on
Overtone 7: off off off on

Now, only 8 overtones is not really enough to make a trombone player feel at home (most accomplished players can produce at least 10), so I need to think about this some more, but 8 overtones does give me enough range to play stuff that's interesting.
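
Decoding the chord is a simple table lookup. A sketch — the finger-to-bit assignment here is arbitrary (index finger as the high bit, to match the table above):

```cpp
// Map a 4-switch chord (index finger = bit 3, pinky = bit 0) to an
// overtone number 0..7. Returns -1 for a pattern not in the table;
// a real sketch would probably just hold the previous overtone.
int decodeOvertone(unsigned chord) {
  static const unsigned patterns[8] = {
    0x0,  // 0: off off off off
    0x8,  // 1: on  off off off
    0xC,  // 2: on  on  off off
    0xE,  // 3: on  on  on  off
    0xF,  // 4: on  on  on  on
    0x7,  // 5: off on  on  on
    0x3,  // 6: off off on  on
    0x1   // 7: off off off on
  };
  for (int i = 0; i < 8; i++) {
    if (patterns[i] == chord) return i;
  }
  return -1;
}
```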

Here's a short video of me playing a little improvisation on the instrument. I still have some work ahead of me to produce a patch that makes the instrument play well. The patch you're hearing is one I built for the ES2 FM synth that comes with Apple Logic Express.


video


I have to say that, while the instrument is still pretty glitchy, I'm starting to feel like it's possible to be expressive with it. I'm also feeling like the instrument is pretty consistent; it behaves predictably, which allows me to practice a musical passage, get it right, and then to be able to perform it in a repeatable fashion.

Finally, as a point of comparison addressing the expressiveness of the instrument, here's an improvisation in the same vein, but on my trombone.



video

Monday, May 3, 2010

Gordophone Slide Handle

After trying to place the overtone selector switches on the left hand, I've decided that's no good - it's too hard to hold the instrument with the left hand and also actuate the switches. So I've decided to copy the general layout of Thomas Henriques' Double Slide Controller and put the overtone selectors on the right hand.

To accomplish that, I'll need some sort of a handle I can put tactile switches on. After I thought about it for a while, I realized what I needed would look an awful lot like a joystick handle. So, after a quick trip to Weird Stuff Warehouse, I had two old joysticks ($8 total). Time for disassembly!




I picked up two different joysticks, both of which have a large surface area on the right where I should be able to mount buttons. Also, the joysticks have some existing buttons that could be used for other functions, like patch changing.

I started with the simpler, two-button joystick. Off with the bottom plate...



And expose the inside of the handle, to see what's in there. Pretty simple, really - two switches, and 3 wires out the bottom.



The base housed the X-Y potentiometers. I won't be needing them, so out they came, which released the handle.



Verifying the wiring of the switches - as expected, green is ground.



So that handle is ready to try out. I'll need to attach it to the new slide I'm building. More on that in a later post.

I also bought a Wingman joystick. This one is more complicated - there's a trigger, three pushbuttons, and a small thumb-actuated joystick. Those all could be interesting to use.




Getting the handle out required some... surgery. Let's just say the warranty is definitely voided now.



There are seven wires coming out the bottom. I tried a little "black box reverse engineering" to see which leads corresponded to which switches, but that proved inconclusive.



So I opened up the handle. And all the switches promptly fell out. Everything is held in place by the two halves of the handle. I eventually got everything back together, but it wasn't easy.




Next up: order some sample tactile switches from Mouser and find just the right one. I'll use the joystick's trigger for the index finger, so I'll need three additional switches for the other three fingers.

Tuesday, March 16, 2010

The Marmonizer, Version 4

What is the Marmonizer?

The Marmonizer is a MIDI Harmonizer. To use it, you attach a MIDI instrument to the MIDI IN port, and attach a synthesizer, or a computer running a softsynth, to the MIDI OUT. When you play a note on the MIDI instrument, the Marmonizer sends that note, plus other notes, to its MIDI output.

I originally conceived of the Marmonizer as something that could be used by players of wind synths like the Yamaha WX-5 or the Akai EWI. But there's no reason it couldn't be used by players of other kinds of MIDI instruments. In the clips in this post, I'm using an older Yamaha WX-7.

There are a number of different harmonization algorithms that the Marmonizer knows how to produce. Some are quite simple. For example, one of the harmonizations produces a major triad in first inversion. If you play a C, the Marmonizer will send C, the G below it, and the E below that. If you play an E, the Marmonizer will send E, B, and G#. Here's a clip of this harmonization. You'll hear it unharmonized once, then with the harmonization.
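
In code, that voicing is just a set of semitone offsets applied below the played note. A quick check of the intervals (note names only, octave ignored):

```cpp
#include <string>

// First-inversion major triad, played note on top: the note itself,
// a perfect fourth below, and a minor sixth below (in semitones).
const int TRIAD_OFFSETS[3] = {0, -5, -8};

// Pitch-class name of a MIDI note number, for sanity-checking voicings.
std::string noteName(int midiNote) {
  static const char *names[12] =
    {"C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};
  return names[((midiNote % 12) + 12) % 12];
}
```

For middle C (MIDI note 60) the offsets give C, G, E; for E (MIDI note 64) they give E, B, G#.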

Other algorithms are more complex. For example, there is a harmonization that sounds a raised-ninth chord (the played note becomes the raised ninth), and then sounds a bass note underneath, forming a slash chord. The actual bass note that sounds is randomly selected from one of four possibilities, which can produce some interesting voice leading when you play a melody.

For those of you who are familiar with Michael Brecker's EWI work, this idea of changing the bass note around comes from the patch he uses on Original Ray's from his debut album.

In this clip, I play a line unharmonized, then I play the same line in each of the 11 harmonizations the Marmonizer knows about, and finally the unharmonized line again.

There's also a mode where the Marmonizer cycles through all the harmonizations it knows. Each new note gets a different harmonization. This can produce some pretty zany results when you drive an Asian percussion ensemble patch. In this clip, I'm just double-tonguing a single note for a couple of bars, then a different note. Since the actual output notes are changing as the Marmonizer cycles through its 11 harmonizations, the result is pretty interesting.

Since this is all just code running on a microcontroller, the harmonizations can be more complicated than the ones mentioned above. As of now, I've only begun to think about all the possibilities, but some thoughts are:

  • Allow the player to specify a key, and make the harmonizations make sense in that key.
  • Select several different harmonizations and automatically cycle between them on each new note.
  • Allow different notes to be steered to different MIDI channels. Probably the most useful configuration would be to sound the topmost voice on one MIDI channel, and the other voices on a different channel.
  • Allow new harmonizations to be programmed by the user.
  • Allow changing the harmonization by sending the unit a MIDI program change. For a live setup, the player could use a stomp box to select harmonizations. Possibly allow harmonizations to be grouped into banks (or maybe use a folder paradigm) to allow a performer to choose a set of harmonizations that work well together for a particular piece.
  • Save/load harmonizations via MIDI system exclusive messages.
  • Allow the player to control how many notes of the harmonization sound, perhaps via a knob or expression pedal. Or, make the number of notes sounding a function of the note on velocity.
  • Add new "algorithmic" harmonizations that give the player a high degree of control and reproducibility.
Finally, here's a little improvisation:

The Hardware

The hardware is pretty simple: an Arduino microcontroller, some pushbutton switches, some toggle switches, and some potentiometers. The current version only utilizes one pushbutton, one toggle switch, and one pot, but future versions may enable more controls. It's just a prototype at this point.

For a final version, I'm investigating using Ruin & Wesen's Minicommand, which is an outrageously cool idea, and way more roadworthy than anything I'll ever be able to build.

The code:


/**

The Marmonizer

The Marmonizer is a MIDI harmonizer. It takes MIDI data on its input port and
sends harmonized data on its output port. The types of harmonizations will
eventually be user-programmable and extremely flexible.

Version 4:

Version 4 builds on version 3, which was a simple MIDI harmonizer with some
randomization. Version 4 introduces:

- multiple voicings (11 to be precise)
- allows the player to select which voicings are playing
- allows a cycle mode, where each new note on selects a new harmonization algorithm
- allows a split channel mode, where the top note in each harmonization
goes to a primary MIDI channel, and all others go to a secondary channel
(currently primary and secondary are fixed at 1 and 2, respectively)
- passes continuous controllers
- allows the player to control how many of the possible notes in a
particular voicing are sounding (with a potentiometer).
- fixes a stuck note problem with v3

Limitations: the algorithm always "maps down" so we may roll notes off the
deep end of the MIDI spec.

Gordon Good (velo27 <at> yahoo <dot> com)
Mar 18, 2010

*/

#include <MidiUart.h>
#include <Midi.h>

#include <Debounce.h>

MidiClass Midi;

int ledPin = 13; // LED pin to blink for debugging
#define CYCLE_MODE_PIN 7 // Switch connected to this pin sets cycle mode (new harmonization on each note on)
#define SPLIT_CHANNEL_MODE_PIN 6 // Switch connected to this pin sets split channel mode

#define MAX_VOICES 5 // Maximum number of voices allowed in a harmonization
int nVoices = MAX_VOICES; // number of voices for current harmonization to sound

// A structure that represents a harmonization algorithm. It currently
// includes a name for the algorithm, and a pointer to a function
// that implements the algorithm.
typedef unsigned char* (*harmonizationAlgorithm)(byte); // a harmonizationAlgorithm knows how to harmonize any midi note
typedef struct {
char *name;
harmonizationAlgorithm algorithm;
} Harmonizer;

// Minimum and maximum values we read from potentiometers.
int POT_MIN = 0;
int POT_MAX = 1023;

boolean isCycleMode = false; // If true, a new harmonization sounds on each note on event

// Digital input 2 cycles through the harmonizations when pressed.
int PIN_HARMONIZATION_SELECT = 2;

// All available harmonizers
Harmonizer allHarmonizers[11] = {0};

int nHarmonizers = sizeof(allHarmonizers) / sizeof(Harmonizer); // the number of harmonizers

// The index of the current harmonizer
int harmonizationIndex = 0;

// These are the notes of the current harmonization.
unsigned char harmonization[MAX_VOICES] = {0};

// A structure that keeps track of a sounding note (note number, MIDI channel)
typedef struct {
unsigned char note; // MIDI note number
unsigned char channel; // MIDI channel
} SoundingNote;

// This array keeps track of all the currently sounding notes.
SoundingNote notesOn[128][MAX_VOICES] = {0};

// Boolean that tracks if we are sending the top note to one channel and the
// other notes to a different channel
boolean isSplitChannelMode = false;
#define PRIMARY_MIDI_OUT_CHANNEL 0 // Human-friendly name is channel 1
#define SECONDARY_MIDI_OUT_CHANNEL 1 // Human-friendly name is channel 2

// Instantiate debouncers for the pushbutton switches.
Debounce debouncer_harmonization_select = Debounce(20, PIN_HARMONIZATION_SELECT);

/* ********** Harmonization Algorithms ********** */
// These are all the harmonization algorithms the program knows about

/*
* A harmonization algorithm that sounds like Michael Brecker's
* Oberheim XPander patch on "Original Ray's" from the album
* "Michael Brecker" (MCA Records, 1987).
*/
unsigned char *breckerizeAlgorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -4;
harmonization[2] = -9;
int rnd = millis() % 4;
if (0 == rnd) {
harmonization[3] = -14;
} else if (1 == rnd) {
harmonization[3] = -15;
} else if (2 == rnd) {
harmonization[3] = -25;
} else if (3 == rnd) {
harmonization[3] = -23;
} else {
harmonization[3] = 0;
}
harmonization[4] = 0;
return harmonization;
}
Harmonizer breckerize = {
"Breckerizer",
breckerizeAlgorithm
};

/*
* A tritone chord with the played note on top,
* and a random note on the bottom.
*/
unsigned char *tritoneChordAlgorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -5;
harmonization[2] = -11;
int rnd = millis() % 4;
if (0 == rnd) {
harmonization[3] = -14;
} else if (1 == rnd) {
harmonization[3] = -15;
} else if (2 == rnd) {
harmonization[3] = -25;
} else if (3 == rnd) {
harmonization[3] = -23;
} else {
harmonization[3] = 0;
}
harmonization[4] = 0;
return harmonization;
}
Harmonizer tritoneChord = {
"Tritone",
tritoneChordAlgorithm,
};

/*
* A major triad in first inversion with the played
* note on top (only three note are played).
*/
unsigned char *majorTriadFirstInversionAlgorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -5;
harmonization[2] = -8;
harmonization[3] = 0;
harmonization[4] = 0;
return harmonization;
}
Harmonizer majorTriadFirstInversion = {
"MajTriad",
majorTriadFirstInversionAlgorithm
};

/*
* A series of stacked fourths.
*/
unsigned char *fourthsAlgorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -5;
harmonization[2] = -10;
harmonization[3] = -15;
harmonization[4] = 0;
return harmonization;
}
Harmonizer fourths = {
"Fourths",
fourthsAlgorithm
};

/*
* A series of stacked fifths.
*/
unsigned char *fifthsAlgorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -7;
harmonization[2] = -14;
harmonization[3] = -21;
harmonization[4] = 0;
return harmonization;
}
Harmonizer fifths = {
"Fifths",
fifthsAlgorithm
};

/*
* A fifths-based voicing (from Brian Good)
*/
unsigned char *feetAlgorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -7;
harmonization[2] = -8;
harmonization[3] = -15;
harmonization[4] = -22;
return harmonization;
}
Harmonizer feet = {
"Feet",
feetAlgorithm
};

/*
* A Jon Hassell-style voicing (from Brian Good)
*/
unsigned char *hassell1Algorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -5;
harmonization[2] = -7;
harmonization[3] = 0;
harmonization[4] = 0;
return harmonization;
}
Harmonizer hassell1 = {
"Hassell1",
hassell1Algorithm
};

/*
* Another Jon Hassell-style voicing (from Brian Good)
*/
unsigned char *hassell2Algorithm(byte note) {
harmonization[0] = -2;
harmonization[1] = -5;
harmonization[2] = -7;
harmonization[3] = 0;
harmonization[4] = 0;
return harmonization;
}
Harmonizer hassell2 = {
"Hassell2",
hassell2Algorithm
};

/*
* A rootless Bill Evans-style voicing (from Brian Good)
*/
unsigned char *evans1Algorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -5;
harmonization[2] = -7;
harmonization[3] = -10;
harmonization[4] = 0;
return harmonization;
}
Harmonizer evans1 = {
"Evans1",
evans1Algorithm
};

/*
* Another rootless Bill Evans-style voicing (from Brian Good)
*/
unsigned char *evans2Algorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -7;
harmonization[2] = -9;
harmonization[3] = 0;
harmonization[4] = 0;
return harmonization;
}
Harmonizer evans2 = {
"Evans2",
evans2Algorithm
};

/*
* Yet another rootless Bill Evans-style voicing (from Brian Good)
*/
unsigned char *evans3Algorithm(byte note) {
harmonization[0] = 0;
harmonization[1] = -4;
harmonization[2] = -5;
harmonization[3] = -8;
harmonization[4] = 0;
return harmonization;
}
Harmonizer evans3 = {
"Evans3",
evans3Algorithm
};



/* ********** End Harmonization Algorithms ********** */

/*
* Figure out which MIDI channel this note should go out on. Currently,
* the top note of a voicing goes out on the primary channel, and all
* other notes go out on the alternate channel.
*/
unsigned char determineMidiOutChannel(boolean isTopNote) {
if (!isSplitChannelMode || isTopNote) {
return PRIMARY_MIDI_OUT_CHANNEL;
} else {
return SECONDARY_MIDI_OUT_CHANNEL;
}
}

/*
* Handle a note on event. Map the note to its harmonizations, and turn
* on those MIDI notes.
*/
void noteOnCallback(byte *msg) { // or is it uint8_t?
digitalWrite(ledPin, HIGH);
unsigned char origNote = msg[1];
unsigned char *harmonization = allHarmonizers[harmonizationIndex].algorithm(origNote);
for (int i = 0; i < nVoices; i++) {
unsigned char newNote = origNote + harmonization[i];
unsigned char channel = 0;
if (0 != newNote) {
channel = determineMidiOutChannel(0 == i);
MidiUart.sendNoteOn(MIDI_VOICE_CHANNEL(channel), newNote, msg[2]);
}
notesOn[origNote][i].note = newNote;
notesOn[origNote][i].channel = channel;
}
}

/*
* Look up all the transposed notes for the given note
* and turn them off.
*/
void noteOffCallback(byte *msg) {
digitalWrite(ledPin, LOW);
unsigned char note = msg[1];
for (int i = 0; i < MAX_VOICES; i++) {
unsigned char noteOff = notesOn[note][i].note;
unsigned char channel = notesOn[note][i].channel;
if (0 != noteOff) {
MidiUart.sendNoteOff(MIDI_VOICE_CHANNEL(channel), noteOff, 0);
notesOn[note][i].note = notesOn[note][i].channel = 0;
}
}
if (isCycleMode) {
harmonizationIndex = (harmonizationIndex + 1) % nHarmonizers;
}
}

/*
* Echo any received continuous controller data, e.g. breath controller,
* to the primary output channel, and to the secondary output channel
* if it is enabled.
*/
void continuousControllerCallback(byte *msg) {
MidiUart.sendCC(MIDI_VOICE_CHANNEL(PRIMARY_MIDI_OUT_CHANNEL), msg[1], msg[2]);
if (isSplitChannelMode) {
MidiUart.sendCC(MIDI_VOICE_CHANNEL(SECONDARY_MIDI_OUT_CHANNEL), msg[1], msg[2]);
}
}

void afterTouchCallback(byte *msg) {

}

void channelPressureCallback(byte *msg) {

}

void programChangeCallback(byte *msg) {

}

void pitchWheelCallback(byte *msg) {
int16_t bend = (msg[2] << 7) | msg[1]; // 14-bit value: msg[1] is the LSB, msg[2] the MSB
MidiUart.sendPitchBend(MIDI_VOICE_CHANNEL(PRIMARY_MIDI_OUT_CHANNEL), bend);
}

/*
* Turn off all the notes that are on, and reharmonize them with
* the new algorithm. This will eventually allow the user to
* cycle to a new harmonization without sounding a new note.
*/
void reharmonize(int oldIndex, int newIndex) {
// Not yet implemented. Before we can implement this,
// we need to keep track of which MIDI channel the sounding
// notes are on. We currently don't.
}

/*
* Read the hardware attached to the Arduino, and set global state
* accordingly.
*/
void readHardware() {
// Read the pot that controls how many voices should sound
nVoices = map(analogRead(0), POT_MIN, POT_MAX, 1, 4);
isCycleMode = digitalRead(CYCLE_MODE_PIN);
isSplitChannelMode = digitalRead(SPLIT_CHANNEL_MODE_PIN);
// Read the button that increments the harmonization type
if (debouncer_harmonization_select.update() && debouncer_harmonization_select.read() == HIGH) {
harmonizationIndex = (harmonizationIndex + 1) % nHarmonizers;
}
}

/*
* Enable a digital pin for input, and set the pullup.
*/
void enableDigitalInput(int pin) {
pinMode(pin, INPUT);
digitalWrite(pin, HIGH);
}


void setup() {
// Enable the MIDI library and register callbacks
MidiUart.init();
Midi.setOnNoteOnCallback(noteOnCallback);
Midi.setOnNoteOffCallback(noteOffCallback);
Midi.setOnControlChangeCallback(continuousControllerCallback);
Midi.setOnAfterTouchCallback(afterTouchCallback);
Midi.setOnChannelPressureCallback(channelPressureCallback);
Midi.setOnProgramChangeCallback(programChangeCallback);
Midi.setOnPitchWheelCallback(pitchWheelCallback);

// Set analog ports for input
pinMode(0, INPUT);
pinMode(1, INPUT);
pinMode(2, INPUT);

// Set digital pins for input, enable pullups, set up debouncers
enableDigitalInput(PIN_HARMONIZATION_SELECT);
enableDigitalInput(3); // Not used yet
enableDigitalInput(4); // Not used yet
enableDigitalInput(5); // Not used yet
enableDigitalInput(SPLIT_CHANNEL_MODE_PIN);
enableDigitalInput(CYCLE_MODE_PIN);

// Initialize the available harmonizers
allHarmonizers[0] = majorTriadFirstInversion;
allHarmonizers[1] = fourths;
allHarmonizers[2] = fifths;
allHarmonizers[3] = tritoneChord;
allHarmonizers[4] = breckerize;
allHarmonizers[5] = feet;
allHarmonizers[6] = hassell1;
allHarmonizers[7] = hassell2;
allHarmonizers[8] = evans1;
allHarmonizers[9] = evans2;
allHarmonizers[10] = evans3;
}

/*
* Main loop. Read buttons/switches/pots, update global state,
* and handle any MIDI data that has arrived.
*/
void loop() {
while (MidiUart.avail()) {
readHardware();
Midi.handleByte(MidiUart.getc());
}
}

Saturday, March 13, 2010

Marmonizer Progress

Here's a little video about my progress building the Marmonizer (a MIDI Harmonizer). I'll post the Arduino sketches soon.

The basic idea is that you plug a MIDI instrument into the MIDI in port, and a synthesizer (or a computer with softsynths) into the output. When you play a note into the Marmonizer, you get different, and probably more, notes out. It's really optimized for a monophonic instrument like the WX-7 wind controller I'm playing in the video (disclaimer: I'm a trombonist by training, so I have no sax chops).

In this video, the WX-7 is plugged into the MIDI in of the Marmonizer, and the Marmonizer's MIDI out is plugged into a MIDI interface that's plugged into my Mac, which is running Logic Express 9. The sounds you hear are a Logic ES-2 synth patch that I modified to respond to breath controller data.

video

Thursday, February 11, 2010

Switch-Based Overtone Selector

My latest experiment with overtone selection on the trombone controller is to use four momentary switches, played with the left hand, to select overtones. I used some Radio Shack lever switches and epoxied them to a 1/2" by 1/2" piece of scrap wood I had, then tie-wrapped it to the handle of the instrument (hey, I'm just prototyping).

The four switches are wired to pull Arduino digital pins 2, 3, 4, and 5 to ground when pressed, and I coded up my sketch to give the following overtones for the given switch selections:

Switch
3210 Overtone
0000 OT_1 (B flat)
0001 OT_2 (F)
0011 OT_3 (B flat)
0111 OT_4 (D)
1111 OT_5 (F)
1110 OT_6 (A flat*)
1100 OT_7 (B flat)
1000 OT_8 (C)

Switch 0 is under the index finger, and switch 3 is under the pinky. Here's a short video showing how it is played:


video

In terms of playability, it feels pretty good. I can more or less play a scale and the fingers of the left hand will generally do the right thing.
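Before diving into the full sketch below, the heart of the chording scheme can be distilled into a few lines of plain C++ (a sketch with illustrative names, mirroring the overtone_sw_values table in the real code): pack the four switch states into a nibble, then scan a table of valid chords.

```cpp
#include <cassert>

// Valid switch chords (bit 0 = switch under the index finger, bit 3 = pinky)
// and the overtone number each one selects. Table and names are illustrative.
const unsigned char kChords[]   = {0x00, 0x01, 0x03, 0x07, 0x0f, 0x0e, 0x0c, 0x08};
const int           kOvertone[] = {1, 2, 3, 4, 5, 6, 7, 8};

// Pack the four (already de-inverted) switch states into a nibble and look
// up the overtone. Returns -1 for a combination that selects nothing.
int overtoneForSwitches(bool sw3, bool sw2, bool sw1, bool sw0) {
    unsigned char val = (sw3 << 3) | (sw2 << 2) | (sw1 << 1) | sw0;
    for (unsigned i = 0; i < sizeof(kChords); i++) {
        if (val == kChords[i]) return kOvertone[i];
    }
    return -1;
}
```

An unrecognized combination (say, index + ring only) falls through to -1, which the instrument treats as "keep sounding the current note".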

Here's the sketch:



/*

Prototype sketch for a trombone-like MIDI controller based on the Arduino hardware.

Hardware:

- A set of four switches used to select an overtone. We use "chording" to allow
the 4 switches to select overtones. I'm not sure what the most natural method
of chording is, but let's try the following:

Switch
3210 Overtone
0000 OT_1
0001 OT_2
0011 OT_3
0111 OT_4
1111 OT_5
1110 OT_6
1100 OT_7
1000 OT_8

Switches 0-3 are wired to pull Arduino digital input pins 2-5 low when
pressed.

- A "slide". Currently, this produces pitch bend information, and is implemented
with a 500mm SpectraSymbol SoftPot linear resistance strip.

- A volume controller, implemented with a FreeScale pressure sensor. The player
blows into a tube that goes to a "T" - one leg goes to the pressure sensor, and
the other is open (a "dump tube") so that the player can put air through the
instrument.

Feb 9, 2010
Gordon Good (velo27 yahoo com)

*/
#include <MidiUart.h>
#include <Midi.h>

MidiClass Midi;

// If DEBUG == true, then the sketch will print to the serial port what
// it would send on the MIDI bus.
const boolean DEBUG = false;
//const boolean DEBUG = true;

const int BREATH_PIN = 0; // Breath sensor on analog pin 0
const int SLIDE_LPOT_PIN = 1; // Slide sensor on analog pin 1

const int OT_SW_0_PIN = 2; // Overtone switch 0
const int OT_SW_1_PIN = 3; // Overtone switch 1
const int OT_SW_2_PIN = 4; // Overtone switch 2
const int OT_SW_3_PIN = 5; // Overtone switch 3

const int PANIC_PIN = 6; // MIDI all notes off momentary switch on digital I/O 6

// The overtone series this instrument will produce
const int FUNDAMENTAL = 36; // MIDI note value of our fundamental
const int OT_1 = 48; // First overtone (B flat)
const int OT_2 = 55; // Second overtone (F)
const int OT_3 = 60; // Third overtone (B flat)
const int OT_4 = 64; // Fourth overtone (D)
const int OT_5 = 67; // Fifth overtone (F)
const int OT_6 = 70; // Sixth overtone (A flat - not in tune - need to tweak pitch bend)
const int OT_7 = 72; // Seventh overtone (B flat)
const int OT_8 = 74; // Eighth overtone (C)
const int OT_9 = 76; // Ninth overtone (D)
const int OT_NONE = -1; // No overtone key pressed (not possible with ribbon)

// All overtones for this instrument
const int overtones[10] = {FUNDAMENTAL, OT_1, OT_2, OT_3, OT_4, OT_5, OT_6, OT_7, OT_8, OT_9};
// Switch values for given overtones. 0xff means that overtone can't be selected.
const int overtone_sw_values[10] = {0xff, 0x00, 0x01, 0x03, 0x07, 0x0f, 0x0e, 0x0c, 0x08, 0xff};

const int MIDI_VOLUME_CC = 7; // The controller number for MIDI volume data
const int MIDI_BREATH_CC = 2; // The controller number for MIDI breath controller data

long ccSendTime = 0; // Last time we sent continuous data (volume, pb);
const int MIN_CC_INTERVAL = 10; // Send CC data no more often than this (in milliseconds);
const int PB_SEND_THRESHOLD = 10; // Only send pitch bend if it's this much different than the current value
const int VOLUME_SEND_THRESHOLD = 1; // Only send volume change if it's this much different than the current value
const int NOTE_ON_VOLUME_THRESHOLD = 50; // Raw sensor value required to turn on a note

// If a value larger than this is read from a SoftPot, treat it as if the player is not touching it.
// Note: for some reason, the two SoftPots interact, e.g. just actuating the slide pot gives me
// no-touch values all above 1000, but when also touching the overtone pot, the values can go
// as low as 999. I suspect I may be taxing the 5v supply line.
const int LPOT_NO_TOUCH_VALUE = 1010;

int currentNote = -1; // The MIDI note currently sounding
int currentPitchBend = 8192; // The current pitch bend
int currentVolume = 0; // The current volume

void setup() {
enableDigitalInput(OT_SW_0_PIN, true);
enableDigitalInput(OT_SW_1_PIN, true);
enableDigitalInput(OT_SW_2_PIN, true);
enableDigitalInput(OT_SW_3_PIN, true);
enableDigitalInput(PANIC_PIN, true);
enableAnalogInput(BREATH_PIN, false);
enableAnalogInput(SLIDE_LPOT_PIN, true);

if (DEBUG) {
Serial.begin(9600);
} else {
MidiUart.init(); // Initialize MIDI
}
}

/**
* Enable a pin for analog input, and set its internal pullup.
*/
void enableAnalogInput(int pin, boolean enablePullup) {
pinMode(pin + 14, INPUT); // analog pin n is also digital pin n + 14
digitalWrite(pin + 14, enablePullup ? HIGH : LOW);
}

/**
* Enable a pin for digital input, and set its internal pullup.
*/
void enableDigitalInput(int pin, boolean enablePullup) {
pinMode(pin, INPUT);
digitalWrite(pin, enablePullup ? HIGH : LOW);
}


/**
* Read the slide pot and return a pitch bend value. The values
* returned are all bends down from the base pitch being played,
* and are in the range 8192 (no bend) to 0 (maximum bend down).
* This means that the synth patch needs to be adjusted to provide
* a maximum pitch bend of seven semitones, if you want it to
* behave like a trombone.
*
* Return -1 if the player is not touching the sensor.
*/
int getPitchBendFromLinearPot() {
// Get the raw value from the linear pot
int pbRawVal = analogRead(SLIDE_LPOT_PIN);
if (pbRawVal > LPOT_NO_TOUCH_VALUE) {
return -1;
} else {
return map(pbRawVal, 0, LPOT_NO_TOUCH_VALUE, 0, 16383 / 2);
}
}

int getPitchBend() {
return getPitchBendFromLinearPot();
}

/**
* Read the overtone switches and return the appropriate overtone.
* If an invalid key combination is found, return -1. Note that
* we invert the values from digitalRead, since these switches
* pull to ground, so switch enabled = digital 0.
*/
int getOvertoneFromOvertoneSwitches() {
unsigned char val = !digitalRead(OT_SW_3_PIN);
val = val << 1 | !digitalRead(OT_SW_2_PIN);
val = val << 1 | !digitalRead(OT_SW_1_PIN);
val = val << 1 | !digitalRead(OT_SW_0_PIN);
// now select the appropriate overtone
for (unsigned int i = 0; i < sizeof(overtone_sw_values) / sizeof(overtone_sw_values[0]); i++) {
if (val == overtone_sw_values[i]) {
return i;
}
}
return -1;
}

int getMIDINote() {
int ot = getOvertoneFromOvertoneSwitches();
if (-1 == ot) {
return currentNote;
} else {
return overtones[ot];
}
}

/**
* Read the breath sensor and map it to a volume level. For now,
* this maps to the range 0 - 127 so we can generate MIDI
* continuous controller information.
*/
int getVolumeFromBreathSensor() {
int volRawVal = analogRead(BREATH_PIN);
if (volRawVal < NOTE_ON_VOLUME_THRESHOLD) {
return 0;
} else {
return map(constrain(volRawVal, 30, 500), 30, 500, 0, 127);
}
}

int getVolume() {
return getVolumeFromBreathSensor();
}

void sendNoteOn(int note, int vel, byte chan, boolean debug) {
if (debug) {
Serial.print("ON ");
Serial.println(note);
} else {
MidiUart.sendNoteOn(chan, note, vel);
}
}

void sendNoteOff(int note, int vel, byte chan, boolean debug) {
if (debug) {
Serial.print("OFF ");
Serial.println(note);
} else {
MidiUart.sendNoteOff(chan, note, vel);
}
}

void sendPitchBend(int pitchBend, boolean debug) {
if (-1 != pitchBend) {
if (abs(currentPitchBend - pitchBend) > PB_SEND_THRESHOLD) {
currentPitchBend = pitchBend;
if (debug) {
Serial.print("BEND ");
Serial.println(pitchBend);
} else {
MidiUart.sendPitchBend(pitchBend);
}
}
}
}

void sendVolume(int volume, byte chan, boolean debug) {
if (abs(currentVolume - volume) > VOLUME_SEND_THRESHOLD) {
currentVolume = volume;
if (debug) {
Serial.print("VOL ");
Serial.println(volume);
} else {
MidiUart.sendCC(chan, MIDI_VOLUME_CC, volume);
}
}
}

void sendBreathController(int volume, byte chan, boolean debug) {
if (abs(currentVolume - volume) > VOLUME_SEND_THRESHOLD) {
if (debug) {
Serial.print("BC ");
Serial.println(volume);
} else {
MidiUart.sendCC(chan, MIDI_BREATH_CC, volume );
}
}
}

void allNotesOff() {
for (int i = 0; i < 128; i++) {
sendNoteOff(i, 0, 1, DEBUG);
}
}

void loop() {

if (digitalRead(PANIC_PIN) == 0) {
allNotesOff();
}

int pb = getPitchBend();
int note = getMIDINote();
int volume = getVolume();

if ((-1 != currentNote) && (0 == volume)) {
// Breath stopped, so send a note off
sendNoteOff(currentNote, 0, 1, DEBUG);
currentNote = -1;
} else if ((-1 == currentNote) && (0 != volume) && (-1 != note)) {
// No note was playing, and we have breath and a valid overtone, so send a note on
sendNoteOn(note, 127, 1, DEBUG);
currentNote = note;
} else if ((-1 != currentNote) && (note != currentNote)) {
// A note was playing, but the player has moved to a different note.
// Turn off the old note and turn on the new one.
sendNoteOff(currentNote, 0, 1, DEBUG);
sendPitchBend(pb, DEBUG);
sendBreathController(volume, 1, DEBUG);
sendNoteOn(note, 127, 1, DEBUG);
currentNote = note;
} else if (-1 != currentNote) {
// Send updated breath controller and pitch bend values.
if (millis() > ccSendTime + MIN_CC_INTERVAL) {
sendPitchBend(pb, DEBUG);
sendBreathController(volume, 1, DEBUG);
ccSendTime = millis();
}
}
delay(50);
}

Saturday, February 6, 2010

Using Wesen's MIDIDuino library on a Mac

I ran into a few problems using Ruin & Wesen's excellent MIDIDuino library on my Mac - here are three things to know:

Thing 1 - Incompatibility with Recent Arduino IDEs

Because of some changes in the gcc bundled with recent Arduino IDEs, you need to use an older IDE (version 0013 is known to work) with the MIDIDuino library. I'm sure that Wesen will eventually fix this, but for now, get 0013 and install it on your Mac (this advice applies if you're running on a PC as well).

Thing 2 - Arduino 0013 and 64-bit Snow Leopard

If you're running Snow Leopard, depending on which model of Mac you have, you may be running in 64-bit mode, and if so, you'll get the following error when you try to launch Arduino 0013:



So that search engines can find this, the text in the dialog is:

Cannot launch Java application

Uncaught exception in main method:
java.lang.UnsatisfiedLinkError:/Applications/
Resources/Java/librxtxSerial.jnilib:no suitable image found.
Did find: /Applications/arduino-0013/Arduino/
13.app/Contents/Resource/Java/librxtxSerial.jnilib:
no matching architecture in universal wrapper

To fix this, set Arduino 0013 to run in 32-bit mode. First, find the Arduino app (if it's in your dock, you can Ctrl-click it, then choose Options->Show in Finder). Single-click the app, then choose Get Info. In the inspector that appears, check "Open in 32-bit Mode" and dismiss the inspector.




Thing 3 - Arduino preferences.txt problems

If you've run more recent versions of the Arduino IDE, you may have an Arduino preferences file that the Arduino 0013 can't read. You'll get this error:




Cannot launch Java application

Uncaught exception in main method:
java.lang.NumberFormatException: null

The simplest thing is to delete the file, or rename it. Of course, if you also need to run the newer Arduino IDE from time to time, you'll have to do this every time you switch back. Automating that is left as an exercise for the reader. :-)

To find the preferences file, choose "Preferences" from the File menu, and look at the bottom for "More preferences can be edited directly in the file".

Oh, and this advice about the preferences file also applies to PC users.

Friday, February 5, 2010

More Progress

Tonight I experimented with placing a 100 mm linear pot on the handle of the "trombone" instrument, where the player holds it with the left hand. By touching the pot with one of the four fingers of the left hand, the player is able to select one of five partials (no fingers, one finger, ... four fingers). With this arrangement, I was able to play a decent rendition of Taps:

I also found that playing trombonistically was a lot more natural with this arrangement. For example, if you're playing F (2nd overtone) and want to go up to G, on a trombone, you'd go from first to fourth position, and blow up to the next partial. On my instrument, you would go from first to fourth position, and put the next highest finger down. When I tried this, my body just sort of did it naturally, probably because the physical orientation of the overtone selection was the same in my brain (up).

Now, there are only four fingers to work with (and possibly the thumb, if I can free it up from its job of keeping me from dropping the instrument), while the instrument typically needs a range of eight or more partials. To make this work, we may need to figure out some sort of "chording" for the fingers of the left hand. One thing that occurs to me right away is to use the one-finger-per-overtone approach for the lower partials, then bring the other fingers back into the picture, e.g. (fingers are numbered 1 = index, 4 = pinky):

0th partial (fundamental) - no fingers (B flat)
1st partial - 4 (B flat)
2nd partial - 3 (F)
3rd partial - 2 (B flat)
4th partial - 1 (D)
5th partial - 1 + 2 (F)
6th partial - 1 + 2 + 3 ("A flat")
7th partial - 1 + 2 + 3 + 4 (B flat)

If you're a trombonist, you see that this is missing a few more playable partials (probably another 4-6 semitones is required of the physical instrument).
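To make the proposal concrete, here is one hypothetical way to encode that chording as a lookup table (plain C++ with illustrative names only; fingers map to bits, index = bit 0 through pinky = bit 3):

```cpp
#include <cassert>

// Proposed left-hand chording: fingerMasks[p] is the finger combination
// that selects partial p. Bit 0 = index finger ... bit 3 = pinky.
const unsigned char fingerMasks[8] = {
    0x0, // fundamental: no fingers
    0x8, // 1st partial: pinky
    0x4, // 2nd: ring
    0x2, // 3rd: middle
    0x1, // 4th: index
    0x3, // 5th: index + middle
    0x7, // 6th: index + middle + ring
    0xf  // 7th: all four fingers
};

// Reverse lookup: which partial does a finger chord select? -1 if none.
int partialForFingers(unsigned char mask) {
    for (int p = 0; p < 8; p++) {
        if (fingerMasks[p] == mask) return p;
    }
    return -1;
}
```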

I don't have any good answers for how to solve these problems yet, but I'm revisiting my assumption that left-hand control of the overtone series is a dead-end. If I follow up on this approach, I think a set of momentary switches would work out a lot better than the linear pot.

On that left-hand-is-a dead-end front, I built a prototype mouthpiece with a baffle that splits the airflow into two vertically separated streams. After the epoxy hardens, I plan to use this to investigate the feasibility of using embouchure "gestures" as overtone selectors. It may be a total bust, but if I can make it work, I think it might make the instrument a lot more playable.

Wednesday, February 3, 2010

Force Sensitive Resistor (FSR) as an overtone selector

Tonight, I wired up a Force Sensitive Resistor (FSR) so that it would select an overtone on my MIDI trombone. I put it where the performer grips the instrument so that it could be actuated by the performer's left thumb. Then, I tried to play it myself (by trying to move the slide and actuating the overtone selector). The results were disappointing. Switching between partials required a far more subtle gesture than I was able to produce.

Back to the drawing board!

Monday, January 25, 2010

Marmonizer, v2

Tonight I coded the Marmonizer v2. For those of you with some music theory background, this harmonizer adds notes below the played pitch that spell a major triad in second inversion (a "6/4" chord). The algorithm takes the input MIDI note and outputs, on the MIDI out connector, the original note plus two other notes: one four semitones down, and another five semitones below that (nine semitones below the original).
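Stripped of the MIDI plumbing, the harmonization is just two fixed semitone offsets applied below the incoming note (a minimal plain-C++ distillation; harmonize() is an illustrative name, not the sketch's API):

```cpp
#include <cassert>

// Offsets (in semitones) of the two added voices below the played note,
// spelling a major triad in second inversion with the input note on top.
const int HARM_NOTE_1_OFFSET = -4; // major third below
const int HARM_NOTE_2_OFFSET = -9; // major sixth below the top voice

// Given an input MIDI note, fill out[] with the three chord tones.
void harmonize(int note, int out[3]) {
    out[0] = note;                      // played note, top voice
    out[1] = note + HARM_NOTE_1_OFFSET; // middle voice
    out[2] = note + HARM_NOTE_2_OFFSET; // bottom voice, the bass of the 6/4 chord
}
```

Feeding in E4 (MIDI note 64) yields 64, 60, and 55: E4 over C4 over G3.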

In the examples below, I am playing (in a very hacky fashion, since I have no sax chops) a Yamaha WX-7 wind controller.

When you play a scale with this harmonization, it sounds like this:


And when you fiddle with the slide potentiometer we built for transposition in Marmonizer v1, you get something like this:

Here's the sketch:



/**

The Marmonizer

The Marmonizer is a MIDI harmonizer. It takes MIDI data on its input port and
sends harmonized data on its output port. The types of harmonizations will
eventually be user-programmable and extremely flexible.

Version 2:

Version 2 builds on version 1, which was a simple MIDI transposer, and
implements a simple MIDI harmonizer. The harmonization is simple; the
input note is the top voice of a triad in second inversion (e.g. if the
input note is E4, then the voices below are G3 and C4). For a discussion of
what those note names mean, see http://en.wikipedia.org/wiki/C_%28musical_note%29
The transposition pot is retained.

Limitations: the algorithm always "maps down" so we may roll notes off the
deep end of the MIDI spec.

Gordon Good (velo27 yahoo com)
Jan 25, 2010
*/

#include <MidiUart.h>
#include <Midi.h>
MidiClass Midi;

#define HARM_NOTE_1_OFFSET -4
#define HARM_NOTE_2_OFFSET -9

int trPotPin = 0; // Analog pin for reading the transposition potentiometer
int ledPin = 13; // LED pin to blink for debugging

int transposition = 0; // number of semitones to transpose (negative = transpose down)

void noteOnCallback(byte *msg) { // or is it uint8_t?
digitalWrite(ledPin, HIGH);
MidiUart.sendNoteOn(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition, msg[2]);
MidiUart.sendNoteOn(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition + HARM_NOTE_1_OFFSET, msg[2]);
MidiUart.sendNoteOn(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition + HARM_NOTE_2_OFFSET, msg[2]);
}

void noteOffCallback(byte *msg) {
digitalWrite(ledPin, LOW);
MidiUart.sendNoteOff(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition, msg[2]);
MidiUart.sendNoteOff(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition + HARM_NOTE_1_OFFSET, msg[2]);
MidiUart.sendNoteOff(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition + HARM_NOTE_2_OFFSET, msg[2]);
}

void continuousControllerCallback(byte *msg) {

}

void afterTouchCallback(byte *msg) {

}

void channelPressureCallback(byte *msg) {

}

void programChangeCallback(byte *msg) {

}

void pitchWheelCallback(byte *msg) {

}

void setup() {
MidiUart.init();
Midi.setOnNoteOnCallback(noteOnCallback);
Midi.setOnNoteOffCallback(noteOffCallback);
Midi.setOnControlChangeCallback(continuousControllerCallback);
Midi.setOnAfterTouchCallback(afterTouchCallback);
Midi.setOnChannelPressureCallback(channelPressureCallback);
Midi.setOnProgramChangeCallback(programChangeCallback);
Midi.setOnPitchWheelCallback(pitchWheelCallback);
pinMode(trPotPin, INPUT);
digitalWrite(trPotPin, HIGH);
}

void loop() {
while (MidiUart.avail()) {
// Read the transposition pot, and map the value to a + or - one octave transposition
transposition = map(analogRead(trPotPin), 0, 1023, -12, 12);
Midi.handleByte(MidiUart.getc());
}
}



In the spirit of Agile, I'm doing the minimal coding necessary to achieve a goal, so some of this might seem a bit silly to experienced programmers.

Sunday, January 24, 2010

Marmonizer, v1

One of the things I've wanted to build for a long time is, for lack of a better term, a MIDI harmonizer. It would have:
  • A MIDI in
  • A MIDI out
  • Some number of controls, e.g. knobs, sliders, control surfaces, and inputs for foot controls
The idea is that any MIDI data presented to the input would be transformed by algorithms running on the box, producing some other set of MIDI output. A performer would be able to control the parameters of these transformations using the controls. Also, it would be possible to configure the box so that input parameters also affect the output in non-obvious ways.

The simplest application I can think of for such a device is a simple MIDI transposer, which is what I put together tonight. The breadboard for this experiment has a single slide potentiometer which controls the amount of transposition. If the pot is centered, no transposition is performed. At full travel one direction, the pitch is transposed up 12 semitones (one octave), and at the other end, the pitch is transposed down 12 semitones. Here's the sketch. It's based on the very excellent MidiDuino Library from Ruin & Wesen.

/**

The Marmonizer

The Marmonizer is a MIDI harmonizer. It takes MIDI data on its input port and
sends harmonized data on its output port. The types of harmonizations will
eventually be user-programmable and extremely flexible.

Version 1:

To prove some basic assumptions, the very first version is a simple MIDI transposer.
The input note is transposed up or down, and the amount of transposition is
controlled by a voltage applied to analog input 0, e.g. with a potentiometer.

This proves:
- That we can do the transposition with reasonable latency
- That we've got the Miduino library working properly

Note: this will probably leave dangling notes if the transposition is
changed between a note on and the corresponding note off. The final
code will have to account for user knob-twisting while playing, and
make sure it turns off the right notes. Probably some sort of a map
that relates a received note to all the note on messages it spawned.

Gordon Good (velo27 <at> yahoo <dot> com)
Jan 24, 2010
*/

#include <MidiUart.h>
#include <Midi.h>
MidiClass Midi;

int trPotPin = 0; // Analog pin for reading the transposition potentiometer
int ledPin = 13; // LED pin to blink for debugging

int transposition = 0; // number of semitones to transpose (negative = transpose down)

void noteOnCallback(byte *msg) { // or is it uint8_t?
digitalWrite(ledPin, HIGH);
MidiUart.sendNoteOn(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition, msg[2]);
}

void noteOffCallback(byte *msg) {
digitalWrite(ledPin, LOW);
MidiUart.sendNoteOff(MIDI_VOICE_CHANNEL(msg[0]), msg[1] + transposition, msg[2]);
}

void continuousControllerCallback(byte *msg) {

}

void afterTouchCallback(byte *msg) {

}

void channelPressureCallback(byte *msg) {

}

void programChangeCallback(byte *msg) {

}

void pitchWheelCallback(byte *msg) {

}

void setup() {
MidiUart.init();
Midi.setOnNoteOnCallback(noteOnCallback);
Midi.setOnNoteOffCallback(noteOffCallback);
Midi.setOnControlChangeCallback(continuousControllerCallback);
Midi.setOnAfterTouchCallback(afterTouchCallback);
Midi.setOnChannelPressureCallback(channelPressureCallback);
Midi.setOnProgramChangeCallback(programChangeCallback);
Midi.setOnPitchWheelCallback(pitchWheelCallback);
pinMode(trPotPin, INPUT);
digitalWrite(trPotPin, HIGH);
}

void loop() {
while (MidiUart.avail()) {
// Read the transposition pot, and map the value to a + or - one octave transposition
transposition = map(analogRead(trPotPin), 0, 1023, -12, 12);
Midi.handleByte(MidiUart.getc());
}
}


There are some stubs in there for passing through or acting on most of the other MIDI data types. They don't do anything in this sketch.
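The only real logic in the loop is the pot-to-transposition mapping, which is Arduino's integer map() scaled to plus or minus one octave. It can be checked off-board with a plain C++ re-implementation (mapRange and transpositionForPot are illustrative names; the formula is the same one map() uses):

```cpp
#include <cassert>

// Same integer formula as Arduino's built-in map().
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// A 10-bit ADC reading (0..1023) becomes a transposition of -12..+12 semitones.
int transpositionForPot(int raw) {
    return (int) mapRange(raw, 0, 1023, -12, 12);
}
```

A centered pot (raw value around 512) lands on 0, so the instrument passes notes through untransposed.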

Also, I built the MIDI interface detailed here: MIDI Shield. For now, I've got everything on a breadboard, but I will eventually make a real Arduino shield for it, maybe using one of the Adafruit Protoshields I have on order.

In the future, I'm planning to:
  • Write more interesting transformation algorithms, including complex harmonizations.
  • Allow the parameters of the harmonization to be controlled by the performer using knobs, sliders, foot pedals, randomness, etc.
  • Allow the parameters of the harmonization to be controlled by input parameters, e.g. note on velocity can select a different chord voicing.
  • Allow users to create new combinations of controller/input assignments, and save those as patches that can be recalled easily.
Any other ideas out there?