Happy Friday! We’ve got some fresh Master’s projects from Cornell University’s School of Electrical and Computer Engineering to share with you today. We’ll get right into it:
This first project is by Master’s students Anil Ram Viswanathan and Zelan Xiao, who have created what could best be described as a head-mounted gaze tracker. The hardware consists of two cameras mounted on a construction-style helmet. Camera one faces forward, in the same direction the user is looking. Camera two points straight at the user’s eyeball. The project’s designers wrote custom algorithms to detect the user’s iris in camera two’s feed and determine which direction the eye is looking. That gaze estimate is used to produce a “bullseye” mark, which is superimposed on camera one’s video to show in real time where the user is looking.
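For the curious, the basic pipeline can be sketched in a few lines of Python. This is a toy version only: a crude centroid-based iris finder and a made-up affine calibration stand in for the students’ actual algorithms.

```python
import numpy as np

def find_iris_center(eye_gray, threshold=60):
    """Estimate the iris/pupil center as the centroid of dark pixels.
    (Toy stand-in for the real iris-detection algorithm.)"""
    ys, xs = np.nonzero(eye_gray < threshold)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def eye_to_scene(pt, A, b):
    """Map an eye-camera point into scene-camera coordinates using a
    precomputed affine calibration (A, b) -- hypothetical values here."""
    x, y = pt
    sx, sy = A @ np.array([x, y]) + b
    return float(sx), float(sy)

# Synthetic eye frame: bright background with a dark "iris" blob at (40, 25)
eye = np.full((60, 80), 200, dtype=np.uint8)
eye[20:31, 35:46] = 30

center = find_iris_center(eye)           # roughly (40.0, 25.0)
A = np.array([[8.0, 0.0], [0.0, 8.0]])   # toy calibration: plain 8x scale
b = np.array([0.0, 0.0])
gaze = eye_to_scene(center, A, b)        # where to draw the bullseye
```

In the real device the calibration between the eye camera and the scene camera would be established per-user, and the bullseye would then be drawn at the mapped coordinates on every frame from camera one.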
Although commercial versions of this technology already exist, they cost thousands of dollars per unit; this one was implemented for under $500. Cheap, lightweight, and powerful. I can immediately see this thing being used for market research, so advertisers can see, for example, where people’s gaze is most likely to be drawn on a television or computer screen. Or how about control interfaces based simply on where a user is looking? Lots of possibilities, to be sure.
Additional information:
- Gaze tracking Project design report (PDF)
Next we have Cornell University Master’s student Thu-Thao Nguyen with her FPGA face-tracking project. This one uses a video camera pointed at the user (or users), sending video data through an FPGA development board. She translated a MATLAB face-detection algorithm into Verilog so the FPGA can locate faces in the frame and track their positions. The FPGA detects skin color, then applies a spatial filter to remove video “noise” and home in on where the skin of the face actually is. The design can track a maximum of two faces at once.
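Here’s a rough Python sketch of that skin-color-plus-spatial-filter idea. The RGB thresholds and the majority-vote window are common toy values for illustration, not necessarily what the Verilog design uses.

```python
import numpy as np

def skin_mask(rgb):
    """Crude per-pixel skin classifier using simple RGB thresholds."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) \
        & (r - np.minimum(g, b) > 15)

def spatial_filter(mask, k=3, min_frac=0.6):
    """Majority vote over a k x k window: a pixel stays 'skin' only if most
    of its neighborhood is skin, which suppresses isolated noise pixels."""
    h, w = mask.shape
    padded = np.pad(mask.astype(int), k // 2)
    counts = np.zeros((h, w), dtype=int)
    for dy in range(k):
        for dx in range(k):
            counts += padded[dy:dy + h, dx:dx + w]
    return counts >= int(min_frac * k * k)

# Synthetic frame: black background, a "face" patch, plus one noisy pixel
frame = np.zeros((20, 20, 3), dtype=np.uint8)
frame[5:15, 5:15] = (150, 80, 60)   # skin-toned square
frame[1, 1] = (150, 80, 60)         # isolated noise pixel

raw = skin_mask(frame)
clean = spatial_filter(raw)          # noise pixel is filtered out
ys, xs = np.nonzero(clean)
face_pos = (float(xs.mean()), float(ys.mean()))   # rough face position
```

On the FPGA the same filtering would be done in fixed-point hardware on the live video stream, which is what makes the real-time tracking possible.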
One suggested application is saving energy: since the FPGA knows whether anyone is looking at the screen, it could be connected to a television set and dim the screen if the user looks away for a set amount of time, or turn the TV off completely if the user walks out of the room.
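The dimming logic itself is simple enough to sketch. Here’s a hypothetical Python version with made-up timeout values; the face detector would feed it one boolean per frame.

```python
import time

class ScreenDimmer:
    """Sketch of the suggested TV-dimming logic: dim after `dim_after`
    seconds without a detected face, power off after `off_after` seconds.
    Both thresholds are hypothetical."""

    def __init__(self, dim_after=30.0, off_after=300.0):
        self.dim_after = dim_after
        self.off_after = off_after
        self.last_seen = time.monotonic()

    def update(self, face_detected, now=None):
        """Call once per frame; returns the desired screen state."""
        now = time.monotonic() if now is None else now
        if face_detected:
            self.last_seen = now
            return "on"
        away = now - self.last_seen
        if away >= self.off_after:
            return "off"
        if away >= self.dim_after:
            return "dim"
        return "on"
```

A real implementation would likely live in the FPGA or a small microcontroller rather than on a PC, but the state machine is the same.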
I’m thinking more along the lines of simply looking at my microwave and having it hand me a heated White Castle cheeseburger. This thing knows me so well.
Additional information:
- FPGA face detection and tracking Project design report (PDF)
In all, these are two very forward-looking projects that are low-cost and efficient compared with similar setups already on the market. Can anyone think of interesting applications for this tech? Have at it – share with us in the comments.
(via Hackaday)