custom PureData suite for improvised computer-music, built for J. Pfeffer

Philadelphia musician and writer Jonathan Pfeffer recruited me to build some tools for custom musical control between his laptop and his various musical hardware toys.

I built him a suite of “patches” in the code-free visual language PureData (extended).

The “myWhammyJammy”

The Whammy effect by Digitech is a foot-controlled pitch-shifter: you sweep a foot-treadle to bend notes, and a dial selects the pitch direction and amount…
John liked the sound of spinning the Mode dial in the middle of notes.

Whammy_large.jpg
the stock Whammy 4 by Digitech. Without MIDI control, you’re limited to the physics of your foot.

I read the manual for his Whammy 3 and found that it can receive Program Change messages via MIDI, as well as Continuous Controller (CC) messages to (virtually) update the bend-treadle’s position.

the early WhamJam module has only four controls…

  • a Mode selector
  • a slider to remote control the treadle
  • a PC button to send a random Program Change to switch modes
  • a Rate control for periodic triggering of the “PC” button, from every 5 to every 5000 milliseconds.

…and four modes:

  1. off: does nothing
  2. Program: sends random PC messages at the timing set by Rate
  3. Treadle: ramps between random Treadle positions at the timing set by Rate
  4. Both: random Program changes and Treadle sweeps at the timing set by Rate
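
To make that behavior concrete, here’s a rough sketch of the timing logic in Python rather than PD (the patch itself is all objects and wires). I’m assuming the Whammy switches modes via Program Change and maps its virtual treadle to CC 11, which is worth double-checking against the manual; the port name in the usage line is made up.

```python
# Rough sketch of the WhamJam timing logic, not the actual PD patch.
# Assumptions: mode switches via Program Change, treadle position via CC 11.
import random
import time
import mido

TREADLE_CC = 11          # assumed CC number for the (virtual) treadle

def whamjam(mode, rate_ms, out):
    """mode: 'off', 'program', 'treadle', or 'both'; rate_ms: 5..5000."""
    treadle = 0
    while mode != "off":
        if mode in ("program", "both"):
            # jump to a random pitch-shift program
            out.send(mido.Message("program_change", program=random.randrange(16)))
        if mode in ("treadle", "both"):
            # ramp the virtual treadle to a new random position
            target = random.randrange(128)
            step = 1 if target >= treadle else -1
            for v in range(treadle, target + step, step):
                out.send(mido.Message("control_change", control=TREADLE_CC, value=v))
                time.sleep(rate_ms / 1000.0 / 128)
            treadle = target
        else:
            time.sleep(rate_ms / 1000.0)

# whamjam("both", rate_ms=250, out=mido.open_output("Whammy"))  # port name is hypothetical
```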

YT POC-slow.gif

…click the picture to see a YouTube proof-of-concept video of the WhamJam

The final working version of myWhammyJammy had several more modes:

  • off
  • pitch shifter modes
    • jump Mode
    • sweep Treadle
    • Both
  • filter-sweep modes, which apply similar behavior to a “wah” filter effect:
    • jump
    • sweep
    • jump and sweep

…and user-defined scales for the pitch shifter.

Screen Shot 2017-02-10 at 3.24.16 PM.png
focus on GUI for myWhammyJammy

the myAntiEffect2

JP wanted a straight-up software copy of the Anti-effect by Chaos Sound.
The hardware periodically mutes the through-passing audio in random-sounding sequences, with an adjustment for how fast it steps through these sequences of on/off states.

I built my original system to use a truly random binary (Bernoulli) choice with fully adjustable probability, so that, at the extremes, the audio either NEVER mutes or STAYS muted.
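
As a minimal sketch of that logic (the real thing is a metro driving a random object into a gain stage), assuming a set_gain callback standing in for the patch’s audio gate:

```python
# Minimal sketch of the random-mute logic behind the Anti-effect clone.
import random
import time

def anti_effect_gate(p_mute, rate_ms, set_gain):
    """p_mute=0.0 never mutes, p_mute=1.0 stays muted; rate_ms is the step time."""
    while True:
        # draw a Bernoulli sample each step and gate the audio accordingly
        set_gain(0.0 if random.random() < p_mute else 1.0)
        time.sleep(rate_ms / 1000.0)   # 5 ms .. 5000 ms in the finished patch

# anti_effect_gate(0.5, rate_ms=50, set_gain=print)
```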

Screen Shot 2017-02-10 at 3.24.20 PM.png
focus on GUI for myAntiEffect2

My Anti-effect also spans a much wider range of rates (from one update every 5 seconds down to one every 5 ms), with the fast end allowing for nasty square-wave AM-type sounds.

Screen Shot 2017-02-10 at 3.51.35 PM.png
a look inside the sub-patch for the finished myAntiEffect2

I then carried this “rate-defined random changes in audio” idea into the expanded myAntiEffect2, which has modes for:

  1. bypass
  2. unMute: mutes or un-mutes the audio; the classic Anti-effect broken-cable sounds.
  3. reVolume: updates the volume to any value between unity passthrough and full cut, either by jump or by sweep; wild accents and stutter-yodeling.
  4. ringMod: sends the audio through a ring modulator whose carrier frequency periodically jumps or sweeps between sonic frequencies; Dalek and FM sounds abound.
  5. verbDub: periodically sends the incoming audio to a buss feeding the FreeVerb plugin; random dub-reggae and warped ambiance.
  6. verbFreeze: uses the “freeze” mode of FreeVerb to randomly capture and hold sound, with adjustable timing and probability.
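
The ringMod mode is classic ring modulation: multiply the incoming audio by a sine-wave carrier. A bare-bones numpy sketch of the idea, standing in for the [osc~] and [*~] objects in PD:

```python
# Ring modulation: output = input * sine carrier.
import numpy as np

def ring_mod(audio, carrier_hz, sr=44100):
    """Return audio multiplied by a sine carrier at carrier_hz."""
    t = np.arange(len(audio)) / sr
    return audio * np.sin(2 * np.pi * carrier_hz * t)

# Periodically jumping or sweeping carrier_hz between, say, 30 Hz and 3 kHz
# gives the Dalek-like and FM-ish timbres mentioned above.
```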

the 2^5 GroupMixer:

In this case, JP wanted control to mix and route 8 audio channels. Some are software synths in the computer; others are analog inputs (mic, bass guitar, etc.).
I built him a set of 8 “effectstruments,” so that each channel is not just a source but a processing buss. This required building (and then interfacing with) a 64-point matrix mixer.

In addition to building the message routing to address each of the 64 volume controls, I built some presets to recall useful states instantly (such as “1-to-1” and “everything to the pitch shifter”).
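
Conceptually, the mixer state is just an 8×8 grid of gains, and the presets are pre-filled grids. A small Python sketch of that idea (the function names are mine, for illustration, not the actual preset messages in the patch):

```python
# Sketch of the 8x8 routing state and two of the presets mentioned above.
import numpy as np

N = 8  # effectstrument channels

def preset_one_to_one():
    """'1-to-1': each channel feeds only its own output buss (identity matrix)."""
    return np.eye(N)

def preset_all_to(dest):
    """Every channel routed to a single destination buss, e.g. the pitch shifter."""
    m = np.zeros((N, N))
    m[:, dest] = 1.0
    return m

routing = preset_one_to_one()   # rows = source channels, columns = destination busses
# each output j is the sum over i of input[i] * routing[i, j]
```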

Screen Shot 2017-02-10 at 3.26.51 PM.png
all routing in the GroupMixer was programmed by pressing combinations of the two rows of buttons on the Livid Alias 8 MIDI controller.

Click HERE or the photo link above to see a proof-of-concept video of how one interfaces with this re-routing scheme.

Rather than thinking of it as “64 knobs for 64 gain controls,” I made routing a matter of ON/OFF switching via the Alias 8’s two rows of 8 color-codeable buttons. Here a single button’s MIDI message from the Alias could update points, rows, or columns in the matrix arbitrarily.

This simply required treating the bottom row as a “from” and the top row as a “to,” with the buttons pushed in a set order: push and hold a bottom-row button to select an effectstrument channel, then push a top-row button to assign it to a single destination buss at a time (see the sketch after the list below).

  • if an effectstrument is sent to its own out buss, it feeds directly to the master out.
  • otherwise, that effectstrument feeds the input of the target channel.
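
Here’s a rough sketch of that pairing logic, with the one-destination-per-source “radio-button” rule made explicit; the two handlers stand in for the MIDI parsing in the patch:

```python
# Hold a bottom-row button to pick the source, tap a top-row button to assign
# it to exactly one destination buss.
import numpy as np

N = 8
routing = np.eye(N)      # start from the "1-to-1" state
held_source = None       # bottom-row button currently held, if any

def bottom_button(chan, pressed):
    """Bottom row: select (while held) the 'from' channel."""
    global held_source
    held_source = chan if pressed else None

def top_button(dest):
    """Top row: assign the held source to exactly one destination buss."""
    if held_source is None:
        return
    routing[held_source, :] = 0.0      # radio-button: clear the old assignment
    routing[held_source, dest] = 1.0   # this source now feeds only that buss

# hold bottom button 2, tap top button 5: channel 2 now feeds buss 5
bottom_button(2, True); top_button(5); bottom_button(2, False)
```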

In order to monitor all the MIDI messages from the Alias 8 and interpret them into this simple message parsing, I had to build a virtual GUI of the control surface.

Screen Shot 2017-02-10 at 3.24.38 PM.png
the GUI virtualization of the Alias 8 as a “performance mixer/router”

 

This Alias GUI sent status updates to the actual mixer, represented as a grid. The design was heavily inspired by (and is a more satisfying realization of) my hardware work using the Sound Sculpture SwitchBlade with my “PercussioNeuron.” Where my work with the SwitchBlade only allowed one channel at a time to go to a single process buss (for re-sampling), here every non-self channel is a process buss, and each channel can only be assigned to one destination at a time (like “groups” on an analog mixing console). Hence, the controls and display for each channel’s assignment form a “radio-button” control, seen as a row.

 

Screen Shot 2017-02-10 at 3.31.16 PM.png
a peek at a middle layer of the PureData GUI for the GroupMixer, showing the receivers and GUIs that update the routing of each channel (row) to another (column)

 

Needless to say, it was very helpful and satisfying to work with Pfeffer to make this all make sense, with the visual assistance of the light-up buttons.
Under the hood in PD, routing from these inputs to outputs is built as a 64-point lattice of audio paths through 8 “multiplex~” modules. It was very tedious to stitch together.

Screen Shot 2017-02-10 at 3.30.59 PM.png
screenshot of the lowest-layer sub-patch of this mixer, showing the 64-point muting matrix for 8-channel/8-path routing, with an 8-point radio button controlling each row. And yes… I programmed all of that with my mouse.

…Even more tedious to interface with directly.

While the mixer was tedious to program (repeatedly click-dragging to stitch the audio paths correctly), the most difficult part to design was proper handling of the MIDI messages that control and color the backlights of these buttons.

 

Drums and Drones

Jonathan had been using an Alesis Trigger IO with foot and pad triggers to play drum sounds from his favorite DAW, but had to stop and click a new kit to change sounds. He also liked playing drones from a Tascam tape multi-track, but had to stop and eject tapes just to switch sounds.

Both of these workflows seemed cumbersome and limited.
I decided to re-use the 8-way button-switching that controls channels in the mixer to build 3 simple devices in PD, each with 8 “modes.” Here, pressing a top-row button first selects the item, allowing a subsequent press of a bottom-row button to select 1 of 8 “modes” for that effectstrument. This promised much faster selection and control of sounds.

Additionally, each of these sound sources had a single Tweak control and a Mix control.

Screen Shot 2017-02-10 at 3.24.28 PM.png

  • myTrig1: loads 1 of 8 drum sounds to be triggered by a foot pedal.
  • myTrig2: loads 1 of 8 sample pairs to be triggered by the head or rim zone of a vDrum pad.
  • myDroner: chooses 1 of 8 drones to be looped and filtered.
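
A quick sketch of that top-then-bottom selection, with the device names from the list above and a hypothetical load_mode callback standing in for the PD message that actually swaps sounds:

```python
# Top-row press picks the device; the next bottom-row press picks 1 of 8 modes.
DEVICES = ["myTrig1", "myTrig2", "myDroner"]   # mapped to three top-row buttons
selected_device = None

def top_button(idx):
    """Top row: choose which device the next bottom-row press will address."""
    global selected_device
    if idx < len(DEVICES):
        selected_device = DEVICES[idx]

def bottom_button(mode, load_mode=print):
    """Bottom row: load one of 8 sounds/modes into the selected device."""
    if selected_device is not None:
        load_mode(selected_device, mode)

top_button(2); bottom_button(5)   # loads drone slot 6 (index 5) into myDroner
```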

As mentioned before, these “effectstrument” channels were not just sound sources but destinations. The Mix control balanced the output of each drum or drone against the sound of any OTHER channels bussed into it.

For the “myTrig” units, Tweak controlled the decay time of a volume fade triggered with each hit. Thus, one could foot-stomp short blips of a long sound, or, say, of a vocal routed to that drum’s effectstrument input.
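
In PD this is a triggered [line~] ramp scaling the output; the sketch below expresses the same triggered-fade idea in Python over a sample buffer, with Tweak as the decay time (the function and variable names are mine):

```python
# Each hit restarts a gain envelope whose decay time is the Tweak value.
import numpy as np

def triggered_fade(audio, hit_samples, tweak_decay_s, sr=44100):
    """Gain jumps to 1.0 at every hit and decays linearly over tweak_decay_s seconds."""
    gain = np.zeros(len(audio))
    decay_len = max(int(tweak_decay_s * sr), 1)
    ramp = np.linspace(1.0, 0.0, decay_len)
    for hit in hit_samples:
        end = min(hit + decay_len, len(audio))
        gain[hit:end] = np.maximum(gain[hit:end], ramp[:end - hit])
    return audio * gain

# a short Tweak chops a long sound (or a bussed-in vocal) into foot-stomped blips
```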

Screen Shot 2017-02-10 at 4.59.34 PM.png
detail of the myTrig2 sub-patch, showing the mixing and triggered fade-decay of both its own sounds and of audio from other channels.

For the myDroner, the Mix control balanced between a clean mix (the drone loop plus any audio bussed into that unit) and a dirty mix (those two sources ring-modulated against each other). The myDroner’s single Tweak control was a DJ-style cut filter: neutral in the center, sweep left to cut highs, sweep right to cut lows.
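
A sketch of that one-knob cut filter in Python using scipy filters; the exact cutoff curve and filter order here are my guesses, not the values in the patch:

```python
# One knob: low-pass when turned left, flat at center, high-pass when turned right.
import numpy as np
from scipy.signal import butter, lfilter

def dj_cut_filter(audio, tweak, sr=44100):
    """tweak in [-1, 1]: -1 cuts nearly all highs, 0 is neutral, +1 cuts nearly all lows."""
    if abs(tweak) < 1e-3:
        return audio                                   # neutral at center
    if tweak < 0:   # left of center: low-pass, cutoff falls as you turn further
        cutoff = 20000.0 * (20.0 / 20000.0) ** (-tweak)
        b, a = butter(2, cutoff / (sr / 2), btype="low")
    else:           # right of center: high-pass, cutoff rises as you turn further
        cutoff = 20.0 * (20000.0 / 20.0) ** tweak
        b, a = butter(2, cutoff / (sr / 2), btype="high")
    return lfilter(b, a, audio)
```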

Screen Shot 2017-02-10 at 5.00.26 PM.png
detail of the myDroner sub-patch, showing the facets of sound loading, looping, mixing, and filtering.

 

OVERALL:

This involved a LOT of cycles of

  1. planning,
  2. reading the gear’s manuals and product specs,
  3. reformatting my messages,
  4. re-structuring my patch in PureData,
  5. testing and debugging,
  6. backtracking, and
  7. dreaming and brainstorming.

Overall, I’m as proud of this project as I was excited about J. Pfeffer’s outside-the-box thinking and demands. It reminds me how much designers should value the naive.

Hence why we had to build this “from the middle, outwards.”

I look forward to seeing Pfeffer incorporate this into his performance setup.

Let me know if you or anyone you know is interested in developing similar custom hardware/software tools, especially for music creation and/or performance.

 

 

 
