Pixel sequencer for playing a .jpg like a sheet of music

Intro

For a long time I’ve been interested in treating an image as source material for music and raw sound synthesis.  There are a lot of cool interactions between visualization and sound out in the world, some of them very complex.  For this project, I wanted to try the simplest transformation of image to sound that I could think of: reading the pixels of an image as musical notes to be sent off for sound generation!  This approach creates a very simple “Pixel Sequencer” that lends itself to a lot of further experimentation.

1. Sound and MIDI

For this project I used MATLAB, a USB MIDI input/output device (Steinberg UR22), and a Roland RS-70 synthesizer. MIDI is a protocol for sending digital information between musical devices, describing real-time note triggering and many other sound parameters.  I am using it here to trigger notes on the synthesizer.  Thankfully, there are many useful code resources for integrating MIDI hardware into most coding environments.  For this project I used the MIDI functions found in the EEGsynth repository (https://github.com/eegsynth/eegsynth/tree/master/matlab).  Coincidentally, that is a very cool project that uses people’s brain activity to control a synthesizer.  Arguably way cooler than what I am trying to do here ;-).  The main function I use here is midiOut.

So the basic idea is to grab a pixel from an image, perform some transformation of its value into a MIDI note value, and send the note to a synthesizer to produce a sound.  Note that it would also be trivial to send the MIDI notes to a software synthesizer; this just requires a virtual MIDI connection, and loopMIDI is one option for that.
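As a minimal sketch of that pipeline, a single pixel value could be played like this (assuming a MIDI connection has already been opened with midiOut as in the setup code below; the channel, scaling, and timing here are placeholder choices, not the values used for the full sequencer):

```matlab
% Minimal sketch: play a single pixel value as a MIDI note.
% Assumes midiOut('O',...) has already been called (see setup code below).
pixelValue = 200;                             %One channel of one pixel, 0 to 255
note = floor(double(pixelValue)/2);           %Squeeze 0-255 into the 0-127 MIDI note range
velocity = round(double(pixelValue)/255*127); %Reuse the value as key velocity
midiOut('+',1,note,velocity)                  %Note on, MIDI channel 1
pause(0.25)                                   %Hold the note briefly
midiOut('-',1,note,velocity)                  %Note off
```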

The code I used to set up a MIDI connection:

% Setup a MIDI device for output:
ss=midiOut('L');
index=[ss(:).index];
midiDeviceName='2- Steinberg UR22-1';
useDev=index((strcmp({ss(:).name},midiDeviceName) & [ss(:).output]==1));
midiOut('O',useDev)

2. Reading pixels

I decided to do the simplest thing possible and just read across rows of pixels in the image, using the Red, Green, and Blue values for each pixel to send different notes to three different synthesizers (one drumset, one lead, one ambient strings).

The picture I decided to use is this low resolution image of a sunset with “2017” superimposed. Maybe it contains a song about the year ahead of us? Here is the code I used to read the image file in and decompose it into RGB values:

% Load a .jpg file
pn='D:\Dropbox\Inspire A Cassette\';
fn='IMG_1294_2017.jpg';
im=imread([pn fn]);
r=im(:,:,1);
g=im(:,:,2);
b=im(:,:,3);
%Plot it to see that it looks right:
figure;
imshow(cat(3,r,g,b))

3. Playing Pixels as MIDI notes

I used some simple code to make sure the notes are signaled on and off for each pixel while animating the cursor, as shown in the video below. For this demo, Red was a drum machine, Green was some ambient strings that I turned on and off, and Blue was the main lead synthesizer.

%% Midi picture play:
 
%Create vectors of RGB values to read off easily in order:
pg=reshape(g',1,[]);
pb=reshape(b',1,[]);
pr=reshape(r',1,[]);
[h,w]=size(b'); %Note: because b is transposed, h is the image width and w the height
 
pitchScale=3.15; %Scaling factor for pixel R G B values (each 0 to 255)
gt=floor(double(pg)/pitchScale); %This value is used for a MIDI note
bt=floor(double(pb)/pitchScale);
rt=floor(double(pr)/pitchScale);
 
% Collect the coordinates for plotting the linearized pixels back into rows
% and columns of the image file for our cursor:
x=repmat((1:h)',1,w); x=x(:)'; %Column (x) coordinate of each linearized pixel
y=repmat(1:w,h,1);    y=y(:)'; %Row (y) coordinate of each linearized pixel
 
f=figure;
imshow(cat(3,r,g,b)) %Plot the image
hold on;
h1=plot(0,0,'c^','LineWidth',3,'MarkerSize',10); %Plot the cursor as a cyan triangle
axis tight manual
 
i=1e3; %start with an initial offset (optional)
stepSize=200; %How many pixels to jump across for each MIDI tone reading
 
%Create an anonymous function to also use the pixel values for "note velocity"
% i.e. parameters that relate to how hard the key would have been pressed on a keyboard
rgb2vel=@(rgb) round(double(rgb)/255*127);
offVelocity=127;
 
 
%MIDI CHANNEL to RGB: these are the three midi channels I am using to play R, G, and B pixel values
rgb_ch=[7 8 9]; %Requires that you have a synthesizer that can play 3 parts
 
%PLAY THE JPG FILE TO MIDI NOTES!
while i<(length(gt)-stepSize) && ishandle(f)
    %While there are still pixels left and the figure is still open, play
    %pixels out as midi notes,
 
    set(h1,'Xdata',x(i),'Ydata',y(i)) %Update cursor position
    drawnow %Draw it
    %Turn all of the MIDI notes on for the current pixel:
    midiOut('+',rgb_ch(1),rt(i),rgb2vel(pr(i))) %Velocity comes from the raw pixel value
    midiOut('+',rgb_ch(2),gt(i),rgb2vel(pg(i)))
    midiOut('+',rgb_ch(3),bt(i),rgb2vel(pb(i)))
 
    pause(0.1) %Arbitrary note duration of 0.1 second
 
    %Turn all of these notes off again;
    midiOut('-',rgb_ch(1),rt(i),offVelocity)    
    midiOut('-',rgb_ch(2),gt(i),offVelocity)
    midiOut('-',rgb_ch(3),bt(i),offVelocity)
    i=i+stepSize;
end
 
%Make sure all notes are off when the picture is done playing:
for i=1:length(rgb_ch)
   midiOut('.',rgb_ch(i)) 
end

4. The Video

In order to capture the video, I used a separate run of the code without MIDI note generation:

%% Animation generation version:
% Generate all previous variables, but replace the MIDI while-loop with this while-loop
 
writerObj = VideoWriter(sprintf('%stt',pn),'MPEG-4'); % Open movie object on disk:
set(writerObj,'FrameRate',10) %Frame rate of 10 matches our note duration of ~0.1 second
open(writerObj);
while i< (length(gt)-stepSize) && ishandle(f)
    i=i+stepSize;
    set(h1,'Xdata',x(i),'Ydata',y(i))
    frame = getframe;
    writeVideo(writerObj,frame);
end
close(writerObj);

The outcome is mechanical, strange, and repetitive, but I like it! The image only provides the notes and note intensity (like a real sequencer), so I messed around with the knobs on the synthesizers in real time to get sounds I liked during the recording.
 
One of the problems with this method is that the sequencer can experience subtle changes in note playing speed due to changes in the queue of events the computer’s CPU is crunching through. This can cause quite noticeable timing fluctuations when additional demands are placed on the CPU (even just recording the audio at the same time as it is generated; I actually recorded the audio to tape to avoid this problem). It also means that the video must be stretched a bit to match the audio. In addition, the pause(0.1) line does not capture all of the delay between notes, since turning the notes on and off and updating the plots also takes time; I needed to stretch the video by about 20% to align with the notes from the original playing. For these reasons, it became immediately clear that a more sophisticated clock would be necessary for any practical use of this kind of sequencer. I messed around with setting up fixedSpacing timer objects to trigger notes more precisely in time, but this doesn’t completely solve the issue: late calls to the timer callback can only be queued or dropped, both of which alter the playback. MATLAB is probably not the best coding environment for this sort of project, but it worked well enough to get by!
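For reference, a timer-based version could look something like the sketch below. This is an illustration, not the exact code I ran: the period, fixed velocity, and use of fixedRate mode with BusyMode 'drop' are all assumptions, and it reuses the rt/gt/bt, rgb_ch, and stepSize variables from the code above.

```matlab
function playPixelsWithTimer(rt,gt,bt,rgb_ch,stepSize)
% Sketch: trigger notes from a timer callback instead of pause(), so the
% spacing between note-on events is scheduled by the timer, not the loop.
% Assumes midiOut is already set up and rt/gt/bt are the note vectors from above.
i = 1;
t = timer('ExecutionMode','fixedRate','Period',0.1, ...
          'BusyMode','drop', ... %If a callback runs long, drop the next tick rather than queue it
          'TasksToExecute',floor((length(gt)-1)/stepSize));
t.TimerFcn = @(~,~) step();
start(t); wait(t); delete(t);
    function step()
        vel = 100; %Fixed velocity for this sketch
        midiOut('+',rgb_ch(1),rt(i),vel)
        midiOut('+',rgb_ch(2),gt(i),vel)
        midiOut('+',rgb_ch(3),bt(i),vel)
        pause(0.05) %Note duration, kept shorter than the timer period
        midiOut('-',rgb_ch(1),rt(i),127)
        midiOut('-',rgb_ch(2),gt(i),127)
        midiOut('-',rgb_ch(3),bt(i),127)
        i = i + stepSize;
    end
end
```

Even so, as noted above, a dropped or queued tick still perturbs the playback; the timer only makes the common case steadier.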
 
Next steps might be: make a similar audio-to-RGB decoder! I imagine using some sort of Fourier method with 3 tonal ranges could work… But that’s for another day. Thanks for stopping by!
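To sketch what that Fourier idea could look like (purely hypothetical and untested; the file name, frame length, and band edges are all made up), short audio frames could be split into low/mid/high spectral bands whose energies become R, G, and B:

```matlab
% Hypothetical sketch of the reverse direction: audio -> a strip of RGB pixels.
[snd,fs] = audioread('song.wav'); %Hypothetical input file
frameLen = 1024;
nFrames = floor(length(snd)/frameLen);
rgbOut = zeros(1,nFrames,3,'uint8');
for k = 1:nFrames
    frame = snd((k-1)*frameLen+(1:frameLen),1);
    spec = abs(fft(frame));
    spec = spec(1:frameLen/2);        %Keep positive frequencies only
    f = (0:frameLen/2-1)*fs/frameLen; %Frequency axis in Hz
    %Energy in 3 tonal ranges; the band edges here are arbitrary choices
    e = [sum(spec(f<300)); sum(spec(f>=300 & f<2000)); sum(spec(f>=2000))];
    rgbOut(1,k,:) = uint8(round(e/max(e+eps)*255)); %Low/mid/high -> R/G/B
end
figure; imshow(imresize(rgbOut,[64 nFrames])) %Stretch the 1-pixel-tall strip for viewing
```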