Motion-Tracking Project Lets You Be A Robot Puppeteer

Embedded Systems Design student Annie Dai recently finished her final project for the Spring 2013 semester at Cornell University. Built around an Altera DE-2 FPGA board, the project tracks real-time movement of a subject's upper body (head, torso, and left and right arms) using the DE-2, a VGA monitor, and a video camera on the hardware end.

The movement tracking is achieved by way of skin detection algorithms computed by the FPGA, using data derived from the video feed:

[The video streams were] filtered, averaged and stored in a down-sampled memory block. The down-sampled frames were used to compute the location of the head and arms of the user. Movements of the head, chest and arms are displayed on [the] VGA screen as a 3D robot constructed from several 3D boxes. The 3D projections are changed based on the user’s view of the camera to create a more realistic 3D robot. The user can change the current VGA view through a set of switches. The resulting system is able to mimic the user’s real time body movements.
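The pipeline quoted above — classify skin pixels, down-sample the result into a coarse grid, then locate body parts in that grid — can be sketched in software. The sketch below is a rough illustration, not Annie's FPGA design: the RGB thresholds are the classic textbook skin rule, and the 8×8 block size and majority vote are assumptions standing in for her filtering and down-sampled memory block.

```python
def is_skin(r, g, b):
    # Classic RGB skin-detection rule; illustrative thresholds,
    # not the ones used in the actual FPGA project.
    return r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15

def downsample(mask, block=8):
    # Majority-vote each block x block cell of the binary skin mask,
    # mimicking the down-sampled memory block stored per frame.
    h, w = len(mask), len(mask[0])
    cells = []
    for by in range(h // block):
        row = []
        for bx in range(w // block):
            count = sum(mask[by * block + y][bx * block + x]
                        for y in range(block) for x in range(block))
            row.append(count * 2 > block * block)
        cells.append(row)
    return cells

def centroid(cells):
    # Row/column centroid of the "skin" cells -- a stand-in for
    # locating the head or a hand in the coarse grid.
    pts = [(y, x) for y, r in enumerate(cells) for x, v in enumerate(r) if v]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

# Synthetic 64x64 frame: a skin-toned square on a black background.
frame = [[(200, 120, 90) if 16 <= y < 32 and 16 <= x < 32 else (0, 0, 0)
          for x in range(64)] for y in range(64)]
mask = [[is_skin(*px) for px in row] for row in frame]
print(centroid(downsample(mask)))  # coarse-grid location of the "square"
```

On the FPGA the same steps run in fixed-function hardware on the live video stream, with the coarse grid driving the pose of the 3D boxes drawn on the VGA output.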

This is a pretty cool proof-of-concept, to be sure; it will be interesting to see what these student designers are working on in five more years' time! Annie posted additional project links and the entirety of her design on her Cornell student project page.

(via Embedds)
