When disaster strikes, a search website for first responders will save lives
Researchers are building a tool that searches real-time text, photo and video from many sources to help responders allocate resources
When Mount Vesuvius erupted almost 2,000 years ago, it took hours for a single message from Pompeii to reach rescuers 18 miles away. Today we have the opposite problem during disasters: a flood of rapid information from many sources, with consequences that can be just as fatal for some people.
Engineers at the University of California, Riverside are working to change this with a tool that searches real-time text, photos and video from social media and surveillance cameras alongside data from sensors, like fire detectors and security alarms. With the tool, for example, firefighters could search the terms “fire” and “crowds” in a particular location and time and receive data from multiple sources.
The research, supported by a $1.2 million National Science Foundation grant, aims to develop a single search interface across all potential sources. The group’s goal is to make a functional tool, a website similar to Google, that can search keywords.
But while Google can only search websites and images, the UC Riverside tool will also be able to search across space and time.
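The core idea of a spatio-temporal keyword search can be illustrated with a minimal sketch. Everything here is illustrative, not the project’s actual design: the `Record` format, the function names, and the flat in-memory scan are all simplifying assumptions (a real system would use indexes over streaming data).

```python
from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

@dataclass
class Record:
    source: str      # e.g. "twitter", "camera", "fire_sensor" (hypothetical labels)
    text: str        # caption, transcript, or sensor event description
    lat: float
    lon: float
    time: datetime

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))

def search(records, keywords, lat, lon, radius_miles, start, end):
    """Return records matching any keyword, within the radius and time window."""
    hits = []
    for r in records:
        if not (start <= r.time <= end):
            continue  # outside the time window
        if haversine_miles(lat, lon, r.lat, r.lon) > radius_miles:
            continue  # outside the spatial radius
        if any(k.lower() in r.text.lower() for k in keywords):
            hits.append(r)
    return hits
```

A firefighter’s query like “fire” and “crowds” near a given point would then be a single `search()` call with a location, radius, and time window, applied uniformly to records from every source.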
“If police or firefighters have drones to photograph an area, this technology could guide them to where they have to go,” said Vagelis Hristidis, a professor of computer science and engineering in the Marlan and Rosemary Bourns College of Engineering.
In addition to locating and analyzing information, the new tool will also collect it to constantly update databases. With a more integrated and holistic view of the situation, first responders can better allocate their resources.
Work has already been done on searching individual sources. But to date, there is no way to search multiple sources at the same time, said Hristidis, who leads a team that includes four of his Bourns College colleagues: Vassilis Tsotras, Amit Roy Chowdhury, Evangelos Papalexakis, and Konstantinos Karydis.
“The question is how to be more active in increasing the coverage,” Hristidis said. “We’re trying to get the best out of existing sources and cover gaps in order to get the big picture of an event.”
One challenge is how to represent all sources (text, images, social relationships) in a common format. Another challenge is how to convert the sources into vectors, or lists of numbers, so that computers can process them. The group has done some preliminary work using advanced mathematical concepts like tensors, an algebraic way to map objects across multiple dimensions, to convert data sources into numbers.
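One common way to turn text into vectors, shown below purely as an illustration and not as the team’s actual method, is a bag-of-words representation: each document becomes a vector of word counts over a shared vocabulary. Stacking such counts along extra dimensions (say, source and time) yields a tensor. The function names here are hypothetical.

```python
from collections import Counter

def build_vocabulary(docs):
    """Map each distinct word across all documents to a fixed index."""
    vocab = sorted({w for d in docs for w in d.lower().split()})
    return {w: i for i, w in enumerate(vocab)}

def to_vector(doc, vocab):
    """Convert one document to a word-count vector over the vocabulary."""
    counts = Counter(doc.lower().split())
    return [counts.get(w, 0) for w in vocab]

def to_tensor(docs_by_source, vocab):
    """Stack per-source vectors into a 2-D slice (source x term); adding a
    time axis would make this a 3-way tensor."""
    return [to_vector(d, vocab) for d in docs_by_source]
```

Once every source is expressed numerically like this, tensor factorization methods can look for patterns that cut across modalities, which is the kind of representation-learning problem the passage describes.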
“This project will require us to address a set of very challenging problems which will undoubtedly push the boundaries in representation learning forward,” said Papalexakis, an assistant professor of computer science and engineering.
The search tool could also provide better security at concerts, sporting events, commencements and other events that draw large crowds, and improve the safety of students on college campuses with live monitoring as potentially dangerous situations unfold.
Hristidis stressed that the search tool will only be able to access public information, like public Facebook posts or tweets, and security apparatus like surveillance cameras and sensors that police and fire departments can already access. It will not have access to social media posts with privacy restrictions, personal surveillance cameras, or other sensors in homes.