UH Researcher Working to Make Security Cameras "Smarter"


Video Analytics Project Will Help Cameras Send Alert at First Sign of a Threat

Security cameras deployed across the nation – at transit stops, sporting events and other places where people gather – can provide valuable clues to law enforcement investigating trouble after the fact. A University of Houston researcher is developing a system that would allow the cameras to recognize and send an alert at the first sign of a threat to public safety.

Photo: Computer science professor Shishir Shah is working to develop “smart” cameras that can alert first responders at the first sign of trouble.

Shishir Shah, professor of computer science at UH’s College of Natural Sciences & Mathematics, received a $1.57 million grant from the National Institute of Standards and Technology Public Safety Innovation Accelerator Program to pursue research using video analytics to develop “smart” camera security systems that can automatically alert first responders to relevant issues and emergencies. NIST is a division of the U.S. Department of Commerce.

Shah works in image analysis and “computer vision,” a field related to artificial intelligence that analyzes large numbers of images to extract data that can guide real-world decisions.

He will collaborate with the city of Houston, working with Julie Stroup, program director of the Public Safety Video Initiative, to ensure the work addresses the needs of public safety/homeland security personnel and other first responders. The video initiative was started in 2007 by Dennis Storemski, director of the Mayor’s Office of Public Safety and Homeland Security. There are currently more than 850 city-owned cameras in public spaces, and the office collaborates and shares video with other regional public safety agencies.

Shah’s lab, the Quantitative Imaging Laboratory, has a network of cameras set up in and around a classroom building on the UH campus. There are thousands more spread across the city’s public spaces.

It’s not practical to have people watch the feeds from all those cameras in real time, he said, both because there is too much footage for live monitoring to be cost-effective and because nothing of note happens most of the time, raising the risk that viewers will miss the few seconds of meaningful information.

Instead, the project will seek to build mathematical models that reflect how people act or behave and, based on that, predict deviations from the norm.
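In broad terms, that approach amounts to anomaly detection: learn a statistical description of ordinary activity from recorded video, then flag observations that fall far outside it. The sketch below illustrates only that general idea; the motion features, the Gaussian-style model and the alert threshold are illustrative assumptions, not the project's actual models.

```python
# Minimal anomaly-detection sketch: fit a simple statistical model of "normal"
# motion features (e.g., per-track speed and direction), then flag new
# observations that deviate strongly from it. All values here are illustrative.
import numpy as np

def fit_normal_model(features):
    """Estimate the mean and (pseudo-)inverse covariance of normal features."""
    mean = features.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(features, rowvar=False))
    return mean, cov_inv

def deviation_score(x, mean, cov_inv):
    """Mahalanobis distance: how far x lies from the learned norm."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# Train on features extracted from ordinary footage, then score a new track.
normal_features = np.random.default_rng(0).normal([1.2, 0.0], [0.3, 0.5], size=(500, 2))
mean, cov_inv = fit_normal_model(normal_features)

new_track = np.array([4.0, 2.5])                      # unusually fast, odd direction
if deviation_score(new_track, mean, cov_inv) > 3.0:   # threshold is an assumption
    print("alert: behavior deviates from the learned norm")
```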

“The long-term plan is to try and move the needle in smart cameras, cameras that have some intelligence about what they are recording and how that can be used,” Shah said.

It will start with what he describes as “lower levels of intelligence,” including teaching the cameras to send an alert when something prevents optimal operation – a smudged or covered lens, a mechanical problem and even an external obstruction, such as a tree sprouting leaves that blocks the view.
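Checks like these can rest on fairly simple image statistics. The sketch below shows one way a covered or smudged lens might be caught, under the assumption that an obstructed camera produces unusually dark or low-detail frames; the thresholds are illustrative, not values from the project.

```python
# Minimal camera-health sketch using OpenCV: flag frames that are nearly black
# (possibly a covered lens) or that lack fine detail (possibly a smudged lens
# or an external obstruction). Thresholds below are illustrative assumptions.
import cv2

def camera_health_alerts(frame):
    """Return a list of possible problems for a single BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    alerts = []

    if gray.mean() < 15:                                  # frame nearly black
        alerts.append("lens may be covered or camera is failing")

    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()     # common blur heuristic
    if sharpness < 50:                                    # little fine detail
        alerts.append("lens may be smudged or the view obstructed")

    return alerts

# Example usage with a recorded feed (file name is hypothetical):
# cap = cv2.VideoCapture("camera_feed.mp4")
# ok, frame = cap.read()
# if ok:
#     for message in camera_health_alerts(frame):
#         print("ALERT:", message)
```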

Mid-level intelligence will involve things like recognizing when a vehicle is traveling the wrong way on a one-way street.
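A sketch of that kind of check appears below. It assumes a tracker already supplies each vehicle's recent positions and that the street's legal direction of travel is known for the camera view; both the direction vector and the distance cutoff are illustrative.

```python
# Minimal wrong-way check: compare a tracked vehicle's heading against the
# street's known legal direction of travel. Values are illustrative assumptions.
import numpy as np

LEGAL_DIRECTION = np.array([1.0, 0.0])     # unit vector for this camera view

def is_wrong_way(track_positions, min_displacement=2.0):
    """track_positions: (N, 2) array of a vehicle's positions over time."""
    motion = track_positions[-1] - track_positions[0]
    distance = np.linalg.norm(motion)
    if distance < min_displacement:        # ignore stationary or noisy tracks
        return False
    heading = motion / distance
    # A strongly negative dot product means travel against the legal direction.
    return float(heading @ LEGAL_DIRECTION) < -0.5

track = np.array([[100.0, 50.0], [90.0, 50.0], [75.0, 51.0], [60.0, 52.0]])
if is_wrong_way(track):
    print("alert: vehicle appears to be traveling the wrong way")
```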

More complex programs would determine whether people’s behavior has deviated from the norm, incorporating variables including the laws of gravity – people don’t walk on the ceiling – and such social norms as where people stand and walk. That has to account for situational and cultural cues, Shah said, noting that walking on the grass might be aberrant in some situations but normal at a backyard barbecue, and people of different cultural backgrounds have different ideas about personal space.
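The sketch below illustrates only the notion that "normal" depends on context, by keying an expectation (here, typical personal spacing) to the kind of scene being watched; the contexts, numbers and rule are assumptions made for illustration, not part of Shah's system.

```python
# Minimal context-dependent norm sketch: what counts as "unusual" spacing
# between people depends on the setting. Contexts and values are illustrative.
from dataclasses import dataclass

@dataclass
class SceneNorm:
    typical_spacing_m: float       # expected personal space in this setting

CONTEXT_NORMS = {
    "transit_platform": SceneNorm(typical_spacing_m=0.8),
    "backyard_event": SceneNorm(typical_spacing_m=0.5),
}

def spacing_is_unusual(context, observed_spacing_m):
    norm = CONTEXT_NORMS[context]
    # Flag only large departures from what is typical for this setting.
    return observed_spacing_m < 0.5 * norm.typical_spacing_m

print(spacing_is_unusual("transit_platform", 0.3))   # True: unusually close here
print(spacing_is_unusual("backyard_event", 0.3))     # False: ordinary at a barbecue
```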

Those are things that people watching a video feed automatically use to analyze what they are seeing. “How do we get cameras to that level of intelligence, which people bring to interpreting what is happening in a scene?” Shah asked. “For now, that analysis is really still dependent on people.”

- Jeannie Kever, University Media Relations
