Eyes in the Sky (@EMPAC)

by Michael Mullaney on December 16, 2009

Professor and computer vision expert Rich Radke is outfitting EMPAC with a fresh set of eyes and a new visual cortex.

He and his students are augmenting EMPAC’s Studio 2 with a testbed for researching intelligent, next-gen camera network systems. This involves “stitching together” video feeds from different cameras, investigating new ways of tracking objects and individuals, and recognizing different types of activity and crowd behavior. This is very cool stuff.

In the 50-by-40-by-30-foot black-box studio at EMPAC, Radke and his team of student researchers mounted 12 stationary fixed-focus cameras to the ceiling. The first challenge was stitching together the 12 cameras’ feeds – a technology at the core of Radke’s research portfolio – to create a complete, comprehensive view of the room. This means taking incomplete, imperfect puzzle pieces like these:

and using computer vision algorithms to scrutinize each piece, match them together, balance the colors, and assemble a snapshot of the entire floor. The result, which would be impossible to obtain from any single camera, looks like this:

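To give a flavor of what that stitching step entails, here is a minimal sketch using the open-source OpenCV library: it matches local features between two overlapping frames, estimates the homography relating them, and warps one frame onto the other. The filenames are hypothetical stand-ins, and this is generic textbook stitching rather than Radke’s own algorithms, which also handle the color balancing described above.

```python
# A minimal stitching sketch with OpenCV: match features between two
# overlapping frames, estimate the homography relating them, and warp
# one frame onto the other. Filenames are hypothetical stand-ins; this
# is textbook stitching, not Radke's own algorithms.
import cv2
import numpy as np

left = cv2.imread("camera_a.png")    # frame from one ceiling camera
right = cv2.imread("camera_b.png")   # overlapping frame from a neighbor
gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)

# Detect and describe local features in each frame.
orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(gray_l, None)
kp2, des2 = orb.detectAndCompute(gray_r, None)

# Match descriptors and keep the strongest correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)

# Estimate the homography mapping the right frame into the left frame's
# coordinates, rejecting bad matches with RANSAC.
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp the right frame onto a wider canvas and paste the left frame in.
h, w = left.shape[:2]
canvas = cv2.warpPerspective(right, H, (w * 2, h))
canvas[0:h, 0:w] = left
cv2.imwrite("stitched.png", canvas)
```

OpenCV’s higher-level cv2.Stitcher class wraps this same pipeline, including the exposure compensation that evens out colors from camera to camera.
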
With this new, integrated eye in the sky, Radke and his team are working to develop new algorithms that let computers analyze – in real time – objects moving around the room. This means tracking objects or people as they move from one camera’s vantage point to another, and plotting that movement on the “stitched together” camera feed. Radke said he hopes the information will be useful to other faculty and students who want to explore research possibilities in the EMPAC lab, as well as to visiting artists who want to develop installations that react to the presence and motion of observers.
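
As a rough illustration of that idea – not the team’s actual tracker – the sketch below detects moving blobs in a single camera’s frame with OpenCV’s stock background subtractor and maps their centroids into mosaic coordinates, assuming each camera’s homography into the mosaic is already known from the stitching step:

```python
# A sketch of plotting per-camera detections on the stitched view.
# Assumes OpenCV 4 and that each camera's homography H into the mosaic
# coordinate frame is already known from the stitching step; the stock
# background subtractor here is a stand-in for the team's real tracker.
import cv2
import numpy as np

subtractor = cv2.createBackgroundSubtractorMOG2()

def detect_centroids(frame):
    """Return centroids of large moving blobs in one camera's frame."""
    mask = subtractor.apply(frame)
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]  # drop shadows
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < 500:        # ignore small noise blobs
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def to_mosaic(points, H):
    """Map (x, y) detections from one camera into mosaic coordinates."""
    pts = np.float32(points).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```

Run for each camera, every detection lands in one shared coordinate frame, so a track can be handed off from one camera’s footprint to its neighbor’s as a person crosses the room.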

Radke said he also hopes to use the expansive studio environment to mock up a real-world public space, such as an airport security checkpoint, in order to prototype new computer vision algorithms for object tracking in challenging situations, such as dimly lit or highly crowded scenes. His research is partially supported by the Department of Homeland Security Center of Excellence for Awareness and Localization of Explosives-Related Threats (ALERT), which has an interest in detecting changes in people’s behavior, anomalies, and other out-of-the-ordinary phenomena from video data.
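
In that spirit, here is a deliberately simple sketch of one way to flag “out-of-the-ordinary” moments in video: compute the overall motion energy of each frame with dense optical flow and flag frames that depart sharply from recent history. The filename, window sizes, and threshold are all made up for illustration; real behavior-analysis models are far more sophisticated.

```python
# A deliberately toy take on spotting out-of-the-ordinary activity:
# flag frames whose overall motion energy jumps well outside recent
# history. The filename, window sizes, and 3-sigma threshold are all
# made up; real behavior-analysis models are far more sophisticated.
import cv2
import numpy as np

def motion_energy(prev_gray, gray):
    """Mean dense optical-flow magnitude between consecutive frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.linalg.norm(flow, axis=2).mean())

cap = cv2.VideoCapture("checkpoint_mockup.mp4")  # hypothetical footage
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
history = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    energy = motion_energy(prev, gray)
    prev = gray
    if len(history) > 30:                     # wait for a baseline
        mu, sigma = np.mean(history), np.std(history) + 1e-6
        if abs(energy - mu) > 3 * sigma:      # 3-sigma rule of thumb
            print("possible anomaly: motion energy %.2f" % energy)
    history = (history + [energy])[-300:]     # sliding window
```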

While the current ceiling cameras are too low-resolution to distinguish faces or gestures, in the coming year Radke’s team plans to augment the ceiling camera array with more sophisticated pan-tilt-zoom (PTZ) cameras mounted on the walls of Studio 2, which can quickly zoom in on a selected portion of the scene in great detail. The improved system has applications ranging from targeted evaluation of suspicious surveillance video to gestural analysis of dancers moving through the space.
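
Steering such a camera reduces to simple geometry once the room is calibrated. The sketch below, with entirely made-up coordinates, computes the pan and tilt angles needed to aim a wall-mounted camera at a target whose position in the room is known, say, from the ceiling array’s tracker:

```python
# A back-of-the-envelope sketch of steering a PTZ camera at a target
# picked out by the overhead array. It assumes the target's position in
# the room (meters) and the wall camera's mounting pose are known from
# calibration; all coordinates below are made up for illustration.
import math

def pan_tilt_to(target, camera, pan_zero_deg=0.0):
    """Pan/tilt angles (degrees) that aim the camera at a 3-D point."""
    dx, dy, dz = (t - c for t, c in zip(target, camera))
    pan = math.degrees(math.atan2(dy, dx)) - pan_zero_deg
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Wall camera mounted 3 m up, aiming at a person mid-floor at head height.
print(pan_tilt_to(target=(7.5, 6.0, 1.7), camera=(0.0, 6.0, 3.0)))
```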

For more info on Radke’s research, check out his web site, this story of mine from last winter, and this post that he wrote for The Approach about his endeavor to build a 3-D LIDAR model of the Rensselaer campus. Radke’s student researchers involved in this work include Eric Ameres, Andrew Calcutt and Ziyan Wu.