Live Video Makes Google Earth Cities Bustle
By Vijaysree Venkatraman | September 25th, 2009 | Category: New Scientist
Virtual globes such as Google Earth are usually static snapshots of the world. Now a system that can draw on real-time video from traffic and surveillance cameras, as well as weather sensors, is set to change that: it populates virtual towns with cars, people and realistic skies.
The video above shows how computer scientists at Georgia Institute of Technology in Atlanta put the idea into practice using video feeds from cameras around the city. Their augmented version of Google Earth incorporates sports scenes, traffic flows, the march of pedestrians and weather.
The system looks out for moving objects in a video feed. Anything moving along a street is classified as a car and replaced with a randomly chosen 3D car model. Moving objects on sidewalks are replaced with human figures, which are animated with stock motion-capture data to make them walk.
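The classify-and-replace step described above can be sketched in a few lines. This is a minimal illustration, not the Georgia Tech code: the region lookup, model file names and track format are all hypothetical, standing in for whatever map data and 3D assets the real system uses.

```python
import random

# Illustrative region labels and stock 3D assets (hypothetical names).
STREET, SIDEWALK = "street", "sidewalk"
CAR_MODELS = ["sedan.obj", "hatchback.obj", "suv.obj"]
PEDESTRIAN_MODEL = "walker.obj"  # animated with stock motion-capture data

def classify_and_replace(track, region_of, rng=random):
    """Map a moving-object track to a symbolic 3D stand-in.

    track     -- dict with the object's last observed (x, y) position
    region_of -- function mapping (x, y) to STREET or SIDEWALK
    """
    region = region_of(track["x"], track["y"])
    if region == STREET:
        # Anything moving along a street becomes a randomly chosen car
        # model, obscuring details such as colour and licence plates.
        return {"kind": "car", "model": rng.choice(CAR_MODELS)}
    # Movers on sidewalks become animated human figures.
    return {"kind": "pedestrian", "model": PEDESTRIAN_MODEL}

# Toy region lookup for the demo: x < 10 is roadway, otherwise sidewalk.
def demo_region(x, y):
    return STREET if x < 10 else SIDEWALK

print(classify_and_replace({"x": 4, "y": 2}, demo_region)["kind"])   # car
print(classify_and_replace({"x": 12, "y": 2}, demo_region)["kind"])  # pedestrian
```

Because every mover is rendered symbolically, the output scene carries a vehicle's position and motion but none of its identifying appearance.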
Although surveillance cameras are used, no one’s privacy is at stake because the models obscure identifying details such as a car’s colour and licence plates, says Kihwan Kim, who led the research. “Every moving object is rendered symbolically,” says Kim, who worked with colleagues Sang Min Oh and Jeonggyu Lee, and his doctoral supervisor Irfan Essa.
Sports action is recreated more directly, with less regard to privacy, using multiple camera views to create 3D models of the players.
Because cameras don’t cover every part of Atlanta’s streets, the system has to fill in the gaps. For example, the movement of cars is modelled to allow the virtual traffic to follow realistic paths once the real vehicles have left a camera’s field of view.
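As a rough illustration of that gap-filling idea, a virtual car's path can be continued after it leaves a camera's field of view. The constant-velocity assumption here is purely for demonstration; the actual system models traffic motion in a more sophisticated way.

```python
def extrapolate(last_pos, velocity, dt, steps):
    """Continue a vehicle's path once it leaves the camera's view.

    Assumes constant velocity for illustration only; the real system
    fits a motion model to observed traffic so virtual cars follow
    realistic paths between camera-covered stretches of road.
    """
    x, y = last_pos
    vx, vy = velocity
    path = []
    for _ in range(steps):
        x += vx * dt
        y += vy * dt
        path.append((x, y))
    return path

# Car last seen at (100 m, 50 m) heading east at 5 m/s,
# extrapolated for three one-second steps.
print(extrapolate((100.0, 50.0), (5.0, 0.0), 1.0, 3))
# [(105.0, 50.0), (110.0, 50.0), (115.0, 50.0)]
```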
The system involves a combination of computer vision, graphics and machine learning: a complex mix that means some situations are beyond the system for now, such as intersections with traffic lights, says Essa.
Adding action to virtual cities has a range of applications, say Essa and Kim. For example, adding local weather data could provide a new way to check current conditions in a city.
Sports broadcasts could be merged with aerial maps to offer viewers a different experience of the game. “Imagine watching the Champions League soccer in Google Earth,” says Kim. Already, the interface allows users to “shift” a game to a stadium of their choice in another part of the world.
This upgrade to the usually static virtual globes is an impressive first step in allowing users to “be” somewhere else in real time, says Ramesh Raskar, who leads the Camera Culture Group at the Massachusetts Institute of Technology’s Media Lab.
That experience could be further enhanced, he adds, as cellphone cameras capable of providing live audiovisual feeds proliferate and small devices that sense temperature and humidity become readily available. Tapping into those sources of data could gradually transform the static virtual globes of today into a live feed of all the action on Earth.
The Georgia researchers’ work will be presented at the International Symposium on Mixed and Augmented Reality in Orlando, Florida, next month.