AR Hockey and Pong
The logic of AR Hockey is similar to the logic of the arcade game Pong: two opposing paddles and a ball that bounces back and forth. To make my game, I would first learn to build Pong myself, and then adapt it to the specific demands of AR Hockey: body tracking, working with projection in real time, and so on.
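The core of that paddle-and-ball logic can be sketched like this. This is a minimal illustration in plain Java (which Processing is built on), not the project's actual code; the class and variable names are my own for illustration:

```java
// Minimal Pong-style update logic: a ball bounces off the top and
// bottom walls, and reverses direction when it hits the left paddle.
class PongBall {
    double x, y;          // ball position
    double vx, vy;        // ball velocity
    final double width, height;  // playing-field size

    PongBall(double width, double height) {
        this.width = width;
        this.height = height;
        x = width / 2;
        y = height / 2;
        vx = 3;
        vy = 2;
    }

    // paddleY is the centre of the paddle on the left edge,
    // paddleHalf is half the paddle's height.
    void update(double paddleY, double paddleHalf) {
        x += vx;
        y += vy;
        // bounce off the top and bottom walls
        if (y < 0 || y > height) vy = -vy;
        // bounce off the left paddle when the ball reaches the edge
        // and vertically overlaps the paddle
        if (x < 0 && Math.abs(y - paddleY) < paddleHalf) vx = -vx;
    }
}
```

Each frame the game moves the ball by its velocity and flips the relevant velocity component on a collision; everything else in Pong (scoring, the second paddle) is built on this same check.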
Hello World: Processing
To make Pong I used Processing, which is an open-source programming environment. To get started I took the default Collision example and built up from that. This was my first try at programming, so I figured things out empirically, and I failed a lot. I tried to fail faster, though, and the process required a lot of patience. Learning Processing by Daniel Shiffman became my Bible, as did the Processing forum. Online tutorials with Pete and Joel also helped me a lot.
Building Up: Computer Vision
The next stage was to blend live video from the webcam into the game and use my hand as a paddle. I had to work out the logic of turning my hand into the paddle. At this point I needed a mini-program, so to speak, to handle the paddle part. This meant taking a problem and slicing it into smaller pieces while staying aware of how they interconnect in the bigger picture. This is something we do quite often as designers (sometimes even unknowingly), so although it was a challenge, I could also relate to this way of working. I started to use OpenCV so I could read the webcam and superimpose the Pong graphics on the video. The last step was to pair my hand with the paddle, so that when my hand moved, the paddle moved too.
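Pairing the hand with the paddle comes down to rescaling the tracked hand position from camera coordinates to screen coordinates, then clamping it so the paddle stays on screen. A minimal sketch in plain Java (hypothetical names, not the project's code; `map` mimics Processing's built-in function of the same name):

```java
// Map a tracked hand position (e.g. a blob centroid from OpenCV,
// in camera-pixel coordinates) onto the paddle's position on screen.
class HandPaddle {
    // Linearly rescale a value from one range to another,
    // like Processing's map() function.
    static double map(double v, double inLo, double inHi,
                      double outLo, double outHi) {
        return outLo + (v - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    // The camera feed and the screen may have different resolutions,
    // so the hand's y coordinate is rescaled, then clamped so the
    // paddle never leaves the screen.
    static double paddleY(double handY, double camHeight,
                          double screenHeight, double paddleHalf) {
        double y = map(handY, 0, camHeight, 0, screenHeight);
        return Math.max(paddleHalf,
                        Math.min(screenHeight - paddleHalf, y));
    }
}
```

For example, a hand tracked at y = 120 in a 240-pixel-tall camera feed lands at y = 240 on a 480-pixel-tall screen.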
To track participants’ movements in interactive installations involving full-body movement, designers and artists often use infrared cameras, similar to CCTV. The way these cameras (and the computers behind them) see us is referred to as computer vision. Such cameras are usually very expensive (at least for a student), and I needed one that was affordable, so I had to build my own. Following online tutorials and advice from my tutors, I turned a webcam (a PS3 Eye) into an infrared camera.
At this stage I had to get Pong and the infrared camera working together, and I needed to find an environment that allowed me to do this. Instead of the PS3 Eye camera I would use Microsoft’s Kinect, and instead of Processing I needed another, more powerful programming environment to do the things required to take AR Hockey to its final stage.
First tests with Processing and Kinect:
Some of the setbacks I faced involved programming. I often didn’t have the right syntax to run the code. At one point the code got so complex that I ended up with spaghetti code: a tangled structure that could have been much simpler. I learned then that I needed to break it apart, which meant learning what coupling is. I got black screens and blank screens, and I couldn’t figure out the right width and height of the camera feed for superimposing the digital paddles and ball. I had to try different approaches, looking at tutorials online and asking questions on forums. When I did get something right, it felt like a small victory, and small victories helped me move forward. I also tried to keep the big picture in mind and use it as motivation when the smaller parts didn’t quite work.
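Learning about coupling meant splitting the sketch into smaller pieces that only talk to each other through narrow interfaces. A minimal illustration of that idea in plain Java (hypothetical names, not the project's actual code):

```java
// One way to untangle spaghetti code: separate tracking, game state
// and drawing behind narrow interfaces, so no part reaches into
// another part's internals (loose coupling).
interface Tracker {                 // e.g. the camera/OpenCV side
    double handY();
}

interface Renderer {                // e.g. the drawing side
    void draw(double ballX, double ballY, double paddleY);
}

class Game {
    private final Tracker tracker;
    private final Renderer renderer;
    double ballX = 0, ballY = 0;

    Game(Tracker tracker, Renderer renderer) {
        this.tracker = tracker;
        this.renderer = renderer;
    }

    // Each frame: read input, update state, draw. Game only depends
    // on the interfaces, so the camera or the display can be swapped
    // out without touching the game logic.
    void frame() {
        double paddleY = tracker.handY();
        ballX += 1;                  // placeholder physics
        renderer.draw(ballX, ballY, paddleY);
    }
}
```

The payoff is that each piece can be tested or replaced on its own, for example swapping a real camera tracker for a mouse-based one while debugging.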
So, at this point I ended up with an augmented-reality Pong game, which I hope will become the AR Hockey game as I advance further.
The source code for this project can be found here:
It was quite a leap for me to get this stage working! Seeing things move and change as I create them was a big motivation to keep going, and I wanted to show it so others could feel the same. To complete this project I think I need to try openFrameworks, perhaps paired with a couple of Kinects and projection mapping that follows the bodies, projecting slightly above each body’s blob (perhaps using dilation). I still need to play more with these things to understand them and to know what I’m doing. But it is a start!