This is another JavaScript test with the HTML5 Canvas in a blog post. It recreates the bouncing ball exercise.
OpenCV Features2D in Processing
The following Processing code demonstrates the use of OpenCV Features2D to detect key points from the webcam image. It is the first part of a more complex task: identifying 3D motion from a 2D image.
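The Features2D detectors themselves need the native OpenCV library, but the idea behind many keypoint detectors can be sketched in plain Java with a Harris-style corner score. This is a simplified stand-in for illustration, not the OpenCV implementation; the class and method names are hypothetical.

```java
public class HarrisSketch {
    // Harris corner response for a grayscale image stored as double[y][x].
    // R = det(M) - k * trace(M)^2, where M sums gradient products over a 3x3 window.
    static double[][] harrisResponse(double[][] img, double k) {
        int h = img.length, w = img[0].length;
        double[][] ix = new double[h][w], iy = new double[h][w];
        // Central-difference gradients (borders left at zero).
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                ix[y][x] = (img[y][x + 1] - img[y][x - 1]) / 2.0;
                iy[y][x] = (img[y + 1][x] - img[y - 1][x]) / 2.0;
            }
        }
        double[][] r = new double[h][w];
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                double sxx = 0, syy = 0, sxy = 0;
                for (int dy = -1; dy <= 1; dy++) {
                    for (int dx = -1; dx <= 1; dx++) {
                        double gx = ix[y + dy][x + dx], gy = iy[y + dy][x + dx];
                        sxx += gx * gx; syy += gy * gy; sxy += gx * gy;
                    }
                }
                double det = sxx * syy - sxy * sxy;
                double trace = sxx + syy;
                r[y][x] = det - k * trace * trace;
            }
        }
        return r;
    }

    public static void main(String[] args) {
        // Synthetic 16x16 image: a bright square; its corners should score highest.
        double[][] img = new double[16][16];
        for (int y = 4; y < 12; y++)
            for (int x = 4; x < 12; x++)
                img[y][x] = 255;
        double[][] r = harrisResponse(img, 0.04);
        System.out.println(r[4][4] > 0); // corner of the square: strongly positive
        System.out.println(r[8][4] < 0); // edge midpoint: negative
        System.out.println(r[8][8] == 0.0); // flat interior: zero
    }
}
```

Keypoints would then be the local maxima of the response map above a threshold; the real Features2D detectors add scale handling and non-maximum suppression on top of this idea.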
People Detection in Processing with OpenCV
This is the original OpenCV people detection example ported to Processing, using the Java library of OpenCV 2.4.8. It can achieve more than 20 frames per second. Here is a sample test snapshot.
OpenCV Motion Template Example in Processing
The following example ports the original OpenCV motion template sample code from C to Java/Processing. The original source is the motempl.c file in the OpenCV distribution.
The program captures from the default video capture device and passes each frame to the Motion class, which uses accumulated difference images to segment the frame into separate motion regions. It delivers back a list of rectangles indicating where the motion components are, returned to the Processing main program as an ArrayList of the Result class.
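The accumulated-difference idea can be illustrated without OpenCV: difference two grayscale frames, threshold, and take the bounding box of the changed pixels as a single motion rectangle. This is a much-simplified stand-in for motempl's segmentation (which tracks motion history over time and splits it into multiple components); the names here are hypothetical.

```java
public class MotionSketch {
    // Bounding box {minX, minY, maxX, maxY} of pixels whose absolute
    // difference between two grayscale frames exceeds a threshold,
    // or null when nothing moved.
    static int[] motionBounds(int[][] prev, int[][] curr, int thresh) {
        int h = prev.length, w = prev[0].length;
        int minX = w, minY = h, maxX = -1, maxY = -1;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (Math.abs(curr[y][x] - prev[y][x]) > thresh) {
                    if (x < minX) minX = x;
                    if (y < minY) minY = y;
                    if (x > maxX) maxX = x;
                    if (y > maxY) maxY = y;
                }
            }
        }
        return maxX < 0 ? null : new int[] { minX, minY, maxX, maxY };
    }

    public static void main(String[] args) {
        int[][] prev = new int[8][8];
        int[][] curr = new int[8][8];
        // A bright blob appears in the current frame between (2,3) and (4,5).
        for (int y = 3; y <= 5; y++)
            for (int x = 2; x <= 4; x++)
                curr[y][x] = 200;
        int[] box = motionBounds(prev, curr, 50);
        System.out.println(java.util.Arrays.toString(box)); // [2, 3, 4, 5]
    }
}
```

The real motion template method additionally timestamps each changed pixel into a motion-history image, so the direction and age of the motion can be recovered per region.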
Conversion between Processing PImage and OpenCV cv::Mat
This example illustrates the use of the Java version of OpenCV. I built OpenCV 2.4.8 on Mac OS X 10.9. After building, I copied the following two files to the code folder of the Processing sketch:
- opencv-248.jar
- libopencv_java248.dylib
The program initialised the default video capture device, converted each PImage frame into an OpenCV matrix (Mat), duplicated the matrix, and converted the copy back into another PImage for display with the image() function.
It can achieve around 60 frames per second on my old iMac. Here is the screenshot.
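The core of the conversion is repacking pixel data: a PImage stores its pixels as packed ARGB ints, while a CV_8UC3 Mat expects interleaved BGR bytes. A minimal round trip of that repacking in plain Java (leaving out the actual Mat.put/Mat.get calls, which need the native library) might look like:

```java
public class PixelRepack {
    // Packed ARGB ints (PImage.pixels layout) -> interleaved BGR bytes (CV_8UC3 layout).
    static byte[] argbToBgr(int[] argb) {
        byte[] bgr = new byte[argb.length * 3];
        for (int i = 0; i < argb.length; i++) {
            int p = argb[i];
            bgr[i * 3]     = (byte) (p & 0xFF);          // blue
            bgr[i * 3 + 1] = (byte) ((p >> 8) & 0xFF);   // green
            bgr[i * 3 + 2] = (byte) ((p >> 16) & 0xFF);  // red
        }
        return bgr;
    }

    // Interleaved BGR bytes -> packed ARGB ints with full alpha.
    static int[] bgrToArgb(byte[] bgr) {
        int[] argb = new int[bgr.length / 3];
        for (int i = 0; i < argb.length; i++) {
            int b = bgr[i * 3] & 0xFF;
            int g = bgr[i * 3 + 1] & 0xFF;
            int r = bgr[i * 3 + 2] & 0xFF;
            argb[i] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
        return argb;
    }

    public static void main(String[] args) {
        int[] pixels = { 0xFF102030, 0xFFFFFFFF, 0xFF000000 };
        int[] back = bgrToArgb(argbToBgr(pixels));
        System.out.println(java.util.Arrays.equals(pixels, back)); // true
    }
}
```

In the sketch itself, the byte array produced by argbToBgr would be handed to Mat.put, and the array for bgrToArgb would come from Mat.get, with the result written back into PImage.pixels before calling updatePixels().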
JavaScript Test
This is a first test with JavaScript canvas in a WordPress post. There is an external script file with the following content.
3D scanning and hosting
Here is a test run with the Skanect software and the 3D model hosting at Sketchfab.
Winnie the Pooh from chungbwc on Sketchfab.
Telling 32-bit or 64-bit in Java
Here is a short tip to check whether the current JVM is running in 32-bit or 64-bit mode in Java, and thus in Processing.
int arch = Integer.parseInt(System.getProperty("sun.arch.data.model"));
The integer variable arch will give either 32 or 64 depending on the JVM.
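Note that sun.arch.data.model is a Sun/Oracle-specific property and may be absent on other JVMs, in which case parseInt would throw. A slightly more defensive version falls back to the standard os.arch property:

```java
public class ArchCheck {
    // Returns 32 or 64 for the data model of the running JVM.
    // "sun.arch.data.model" is Sun/Oracle-specific; fall back to "os.arch",
    // treating any architecture name containing "64" as 64-bit.
    static int jvmBits() {
        String model = System.getProperty("sun.arch.data.model");
        if (model != null && model.matches("\\d+")) {
            return Integer.parseInt(model);
        }
        String arch = System.getProperty("os.arch", "");
        return arch.contains("64") ? 64 : 32;
    }

    public static void main(String[] args) {
        System.out.println(jvmBits());
    }
}
```

This matters when a sketch has to pick between 32-bit and 64-bit native libraries at load time, as with the Kinect library below.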
Kinect for Windows in Processing 3
Finally, the skeleton part of the library is done. In this very experimental version, I extract only one skeleton and store the joint information in an array (size 20) of PVector type in Processing. The tracking state is not implemented yet; instead, I use the z-depth value to indicate whether a joint is validly tracked. The x and y values are normalised to the range 0 to 1 in screen space. In the next version, I would like to implement an asynchronous tracking event. It is time to integrate all three components:
- Individual RGB and depth images
- Aligned RGB and depth image
- Skeleton tracking
Here is a copy of the sample Processing code.
import pKinect.PKinect;

PKinect kinect;
PVector[] loc;

void setup() {
  size(640, 480);
  kinect = new PKinect(this);
  smooth();
  noFill();
  stroke(255, 255, 0);
}

void draw() {
  background(0);
  loc = kinect.getSkeleton();
  drawSkeleton();
}

void drawSkeleton() {
  // Body
  DrawBone(kinect.NUI_SKELETON_POSITION_HEAD, kinect.NUI_SKELETON_POSITION_SHOULDER_CENTER);
  DrawBone(kinect.NUI_SKELETON_POSITION_SHOULDER_CENTER, kinect.NUI_SKELETON_POSITION_SHOULDER_LEFT);
  DrawBone(kinect.NUI_SKELETON_POSITION_SHOULDER_CENTER, kinect.NUI_SKELETON_POSITION_SHOULDER_RIGHT);
  DrawBone(kinect.NUI_SKELETON_POSITION_SHOULDER_CENTER, kinect.NUI_SKELETON_POSITION_SPINE);
  DrawBone(kinect.NUI_SKELETON_POSITION_SPINE, kinect.NUI_SKELETON_POSITION_HIP_CENTER);
  DrawBone(kinect.NUI_SKELETON_POSITION_HIP_CENTER, kinect.NUI_SKELETON_POSITION_HIP_LEFT);
  DrawBone(kinect.NUI_SKELETON_POSITION_HIP_CENTER, kinect.NUI_SKELETON_POSITION_HIP_RIGHT);
  // Left Arm
  DrawBone(kinect.NUI_SKELETON_POSITION_SHOULDER_LEFT, kinect.NUI_SKELETON_POSITION_ELBOW_LEFT);
  DrawBone(kinect.NUI_SKELETON_POSITION_ELBOW_LEFT, kinect.NUI_SKELETON_POSITION_WRIST_LEFT);
  DrawBone(kinect.NUI_SKELETON_POSITION_WRIST_LEFT, kinect.NUI_SKELETON_POSITION_HAND_LEFT);
  // Right Arm
  DrawBone(kinect.NUI_SKELETON_POSITION_SHOULDER_RIGHT, kinect.NUI_SKELETON_POSITION_ELBOW_RIGHT);
  DrawBone(kinect.NUI_SKELETON_POSITION_ELBOW_RIGHT, kinect.NUI_SKELETON_POSITION_WRIST_RIGHT);
  DrawBone(kinect.NUI_SKELETON_POSITION_WRIST_RIGHT, kinect.NUI_SKELETON_POSITION_HAND_RIGHT);
  // Left Leg
  DrawBone(kinect.NUI_SKELETON_POSITION_HIP_LEFT, kinect.NUI_SKELETON_POSITION_KNEE_LEFT);
  DrawBone(kinect.NUI_SKELETON_POSITION_KNEE_LEFT, kinect.NUI_SKELETON_POSITION_ANKLE_LEFT);
  DrawBone(kinect.NUI_SKELETON_POSITION_ANKLE_LEFT, kinect.NUI_SKELETON_POSITION_FOOT_LEFT);
  // Right Leg
  DrawBone(kinect.NUI_SKELETON_POSITION_HIP_RIGHT, kinect.NUI_SKELETON_POSITION_KNEE_RIGHT);
  DrawBone(kinect.NUI_SKELETON_POSITION_KNEE_RIGHT, kinect.NUI_SKELETON_POSITION_ANKLE_RIGHT);
  DrawBone(kinect.NUI_SKELETON_POSITION_ANKLE_RIGHT, kinect.NUI_SKELETON_POSITION_FOOT_RIGHT);
}

void DrawBone(int _s, int _e) {
  if (loc == null) {
    return;
  }
  PVector p1 = loc[_s];
  PVector p2 = loc[_e];
  // A zero z-depth marks an untracked joint; skip the bone.
  if (p1.z == 0.0 || p2.z == 0.0) {
    return;
  }
  line(p1.x * width, p1.y * height, p2.x * width, p2.y * height);
}
This version of the code (for Windows 7, both 32-bit and 64-bit) is available here.