You might have seen a science-fiction movie in which an actor explores some unknown place while his route is tracked on a computer and monitored by others. That fiction is now on its way to reality, and the road to it has been paved by scientists at MIT (the Massachusetts Institute of Technology).
The scientists claim that such a system would come in handy in helping emergency responders coordinate disaster response. It consists of a wearable sensor system that automatically creates a digital map of the environment through which the wearer is moving. The project is supported by the US Air Force and the Office of Naval Research.
In experiments conducted at MIT, a student wearing the sensor system wandered the halls of the campus while the sensors wirelessly relayed data to a laptop in a distant conference room, where observers were able to track his progress on a map created as he moved.
The system developed at MIT consists of several sensors performing different functions, attached to a sheet of plastic and worn on the chest: a laser range-finder, accelerometers, gyroscopes, a Kinect depth sensor and a barometer. The accelerometers track the wearer's motion, the gyroscopes track angular tilt or orientation, and the barometer helps estimate changes in altitude. The laser sweeps the vicinity in a 270-degree arc and measures how long the light takes to return, from which the system calculates the distance between the wearer and nearby physical structures.
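The time-of-flight principle behind the laser range-finder can be sketched in a few lines of Python. This is an illustrative back-of-the-envelope calculation, not code from the MIT system; the timing value below is invented for the example:

```python
# Time-of-flight ranging: the laser pulse travels out and back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an obstacle given the laser pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A return after roughly 33.36 nanoseconds corresponds to about 5 metres.
print(round(tof_distance(33.36e-9), 2))
```

Repeating this measurement across a 270-degree sweep yields the ring of range readings from which walls and obstacles around the wearer can be inferred.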
In addition, the system has a camera that takes snapshots of its surroundings every few meters. The system's software then extracts visual features from each image, such as patterns of colour, contours, or inferred three-dimensional shapes, and associates them with a particular location on the map.
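The idea of associating visual features with map locations can be sketched as follows. A crude colour histogram stands in here for the richer features (contours, inferred 3-D shapes) the article mentions, and the map entries and pixel data are invented for illustration:

```python
from collections import Counter

def color_signature(pixels, bins=4):
    """Quantize RGB pixels into a coarse histogram used as a place signature."""
    step = 256 // bins
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

def similarity(sig_a, sig_b):
    """Histogram overlap between two signatures (0 = disjoint, 1 = identical)."""
    shared = sum((sig_a & sig_b).values())
    total = max(sum(sig_a.values()), sum(sig_b.values()))
    return shared / total if total else 0.0

# Hypothetical map: location coordinates -> signature captured earlier.
map_sigs = {
    (0, 0): color_signature([(250, 10, 10)] * 8),   # reddish room
    (5, 0): color_signature([(10, 10, 250)] * 8),   # bluish corridor
}

# A new snapshot of a reddish scene is matched to the most similar location.
snapshot = color_signature([(240, 20, 20)] * 8)
best = max(map_sigs, key=lambda loc: similarity(map_sigs[loc], snapshot))
print(best)
```

A real system would use far more discriminative features, but the matching step, comparing a new snapshot's signature against those already tied to the map, follows the same pattern.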
There is also a push button that the wearer can use to flag important locations. The researchers believe that in the future the system could be extended to add voice or text tags to the map, indicating, say, structural damage or a toxic spill.
“The operational scenario that was envisioned for this was a hazmat situation where people are suited up with the full suit, and they go in and explore an environment,” says Maurice Fallon, a research scientist in MIT’s Computer Science and Artificial Intelligence Laboratory, and lead author on the new paper. “The current approach would be to textually summarize what they had seen afterward — ‘I went into this room on the left, I saw this, I went into the next room,’ and so on. We want to try to automate that.”
The system's software is also capable of correcting previous readings: when the wearer revisits a location, new snapshots update the corresponding portion of the map.
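The correction step can be illustrated with a toy example. When the wearer returns to a known spot, the accumulated drift at that spot can be spread back over the intermediate position estimates; the linear redistribution below is a simplified stand-in for the graph optimization a real mapping system would run, and the trajectory data is invented:

```python
def correct_trajectory(poses, true_end):
    """Spread the end-point error linearly across the recorded (x, y) poses."""
    n = len(poses) - 1
    err_x = true_end[0] - poses[-1][0]
    err_y = true_end[1] - poses[-1][1]
    return [(x + err_x * (i / n), y + err_y * (i / n))
            for i, (x, y) in enumerate(poses)]

# A drifted loop: the wearer walked a circuit, so the path should end
# back where it started at (0, 0), but sensor drift says otherwise.
drifted = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.2), (0.4, 0.3)]
corrected = correct_trajectory(drifted, (0.0, 0.0))
print(corrected[-1])
```

Early poses receive little correction and later ones receive most of it, reflecting that drift accumulates over time.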