Capture midi messages in Processing during playback

The second midi in Processing example uses the Receiver interface to capture all the midi messages during the playback of a midi file. The program uses a custom GetMidi class to implement the Receiver interface. During playback, it displays each NOTE_ON message together with its channel, octave and note information.
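The GetMidi class itself is not listed in this post, but a minimal sketch of such a Receiver could look like the code below. The class body, the note-name table and the octave arithmetic are my own illustration; only the javax.sound.midi interfaces and the ShortMessage accessors come from the standard library.

import javax.sound.midi.MidiMessage;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

// Hypothetical Receiver that prints every NOTE_ON message during playback.
class GetMidi implements Receiver {
  // Note names within one octave, assuming the usual twelve-tone layout.
  final String [] names = {"C", "C#", "D", "D#", "E", "F",
                           "F#", "G", "G#", "A", "A#", "B"};

  public void send(MidiMessage message, long timeStamp) {
    if (message instanceof ShortMessage) {
      ShortMessage sm = (ShortMessage) message;
      // Report only NOTE_ON messages with a non-zero velocity.
      if (sm.getCommand() == ShortMessage.NOTE_ON && sm.getData2() > 0) {
        int key = sm.getData1();
        int octave = key / 12 - 1;   // middle C (60) falls in octave 4
        String note = names[key % 12];
        println("Channel " + sm.getChannel() +
          " octave " + octave + " note " + note);
      }
    }
  }

  public void close() {
  }
}

An instance of the class can then be attached to the sequencer with player.getTransmitter().setReceiver(new GetMidi()), so that every message sent out during playback also passes through it.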

The source code of the example is also in the Magicandlove GitHub repository.

Sample Processing screen during midi playback

Using midi in Processing for playback

This is my first use of midi in Processing. I do not use the MidiBus library for Processing. Instead, I use the standard Java midi package, javax.sound.midi. The Java SE 8 documentation also contains the javadoc for this package.

Screenshot of the Processing sketch

The Processing source code and sample midi files are in the Magicandlove GitHub repository. The midi example files are downloaded from the midiworld website.

The code basically needs a Synthesizer class to render the midi instruments into audio and a Sequencer class to play back the midi sequence.

// Obtain the default synthesizer and sequencer from the midi system.
Synthesizer synth = MidiSystem.getSynthesizer();
Sequencer player = MidiSystem.getSequencer();
// Open both devices before playback.
synth.open();
player.open();

All the midi music files are in the data folder of the Processing sketch. To play back each piece of midi music, we need to convert it into a Java File object and use the following code to play it. The variable f is a File object instance pointing to the midi file in the data folder.

Sequence music = MidiSystem.getSequence(f);
player.setSequence(music);
player.start();
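Putting these pieces together, a minimal Processing sketch along these lines could look like the following. The file name example.mid is only a placeholder for one of the midi files in the data folder, and the try/catch block is there because the midi calls throw checked exceptions.

import javax.sound.midi.MidiSystem;
import javax.sound.midi.Sequence;
import javax.sound.midi.Sequencer;
import javax.sound.midi.Synthesizer;
import java.io.File;

Synthesizer synth;
Sequencer player;

void setup() {
  size(640, 480);
  try {
    // Prepare and open the synthesizer and sequencer as described above.
    synth = MidiSystem.getSynthesizer();
    player = MidiSystem.getSequencer();
    synth.open();
    player.open();
    // example.mid is a placeholder name for a midi file in the data folder.
    File f = dataFile("example.mid");
    Sequence music = MidiSystem.getSequence(f);
    player.setSequence(music);
    player.start();
  } catch (Exception e) {
    println(e.getMessage());
  }
}

void draw() {
  background(0);
}

void exit() {
  // Release the midi devices when the sketch quits.
  if (player != null) {
    player.stop();
    player.close();
  }
  if (synth != null) {
    synth.close();
  }
  super.exit();
}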

Intel Realsense colour image in Processing (Windows only)

The testing is based on the Java wrapper of the Intel Realsense SDK version 2, found in the following GitHub repository.

https://github.com/edwinRNDR/librealsense/tree/master/wrappers/java.

It only provides pre-built binaries for Windows. I used it to test with my Intel Realsense D415 camera. The image below is a screenshot of the camera view.

The source code can be found in the GitHub repository of this post.

 

Movement in Space (version 2) Testing videos

A new version of the Movement in Space project will be exhibited at the end of this year as an installation piece. Here are some testing videos.

The work is rewritten from the original web version into a Processing version. The animation is built with 3 parametric harmonic formulae. The outputs from one animation can be used as the inputs of another formula, in order to simulate an artificial neural network.
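The formulae themselves are not given in this post, but the general idea of one parametric harmonic motion feeding into another formula can be sketched in Processing as follows. The particular functions, amplitudes and frequencies here are placeholders for illustration, not the ones used in the work.

float t = 0;

void setup() {
  size(640, 480);
  noStroke();
}

void draw() {
  background(0);
  // First harmonic formula: a simple Lissajous-style motion.
  float x1 = width / 2 + 150 * sin(2 * t);
  float y1 = height / 2 + 150 * sin(3 * t + HALF_PI);
  // Second formula takes the output of the first one as its input,
  // loosely echoing how one animation can drive another formula.
  float x2 = width / 2 + 100 * sin(0.01 * x1 + t);
  float y2 = height / 2 + 100 * cos(0.01 * y1 + t);
  fill(255);
  ellipse(x1, y1, 10, 10);
  fill(0, 255, 255);
  ellipse(x2, y2, 10, 10);
  t += 0.02;
}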

Face landmark detailed information

Referring back to the post on face landmark detection, the command to retrieve face landmark information is

fm.fit(im.getBGR(), faces, shapes);

where im.getBGR() is the Mat variable of the input image; faces is the MatOfRect variable (a matrix of Rect) obtained from the face detection; and shapes is the ArrayList<MatOfPoint2f> variable that returns the face landmark details for each face detected.

Each face is a MatOfPoint2f value. We can convert it into an array of Point with a length of 68. Each point in the array corresponds to a face landmark feature point of the face, as shown in the image below.
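As a quick illustration, the loop below converts each MatOfPoint2f into the Point array with its toArray() method and draws the 68 points, assuming the input image has already been drawn at the origin of the Processing window. The drawing style is my own; only the toArray() call and the Point fields come from the OpenCV Java binding.

// These imports go at the top of the Processing sketch.
import org.opencv.core.MatOfPoint2f;
import org.opencv.core.Point;

// Assume shapes is the ArrayList<MatOfPoint2f> filled by fm.fit() above.
for (MatOfPoint2f shape : shapes) {
  // Convert the 68 landmark points of one face into a plain array.
  Point [] points = shape.toArray();
  noStroke();
  fill(255, 255, 0);
  for (Point p : points) {
    // Each Point holds the x, y position of one landmark feature point.
    ellipse((float) p.x, (float) p.y, 4, 4);
  }
}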