Engineers Find a Way to See Through the Fog
It's a problem that affects human and autonomous drivers alike: both have trouble seeing through fog. Now, MIT researchers have developed a novel imaging system that can see through fog so thick that human vision can't penetrate it, and it can even tell how far away an object is. The work is an important step toward the development of reliable vision systems for autonomous cars.
In past studies, other imaging systems have performed worse than the human eye in fog. For their proof-of-concept, the MIT researchers used a time-of-flight camera, which shoots bursts of laser light and measures the time it takes the reflections to return. The system could see objects more than 50 percent farther away than anything the human eye could make out.
The difference is an algorithm the researchers developed that uses statistics on the way fog typically scatters light. Using the algorithm, the camera can separate the light reflected from an object from the light reflected by fog of any density.
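The CSAIL team's published approach models the arrival times of fog-scattered photons with a gamma distribution, whatever the fog density, so the fog's contribution to the photon-timing histogram can be estimated and subtracted. Below is a minimal illustrative sketch of that idea, not the researchers' code: the function name, bin sizes, and the method-of-moments fit are all our own assumptions.

```python
import math
import statistics

C = 299_792_458.0  # speed of light, m/s

def fog_depth_estimate(fog_times, all_times, bin_ps=56, n_bins=400):
    """Separate an object echo from fog backscatter in a photon
    arrival-time histogram, then convert the echo time to distance.

    fog_times: photon arrival times (ps) from a fog-only calibration frame
    all_times: photon arrival times (ps) with an object present
    """
    # Method-of-moments fit of a gamma distribution to the fog photons
    mu = statistics.mean(fog_times)
    var = statistics.variance(fog_times)
    k, theta = mu * mu / var, var / mu

    def gamma_pdf(t):
        return t ** (k - 1) * math.exp(-t / theta) / (math.gamma(k) * theta ** k)

    # Histogram of the measured frame
    hist = [0] * n_bins
    for t in all_times:
        b = int(t // bin_ps)
        if 0 <= b < n_bins:
            hist[b] += 1

    # Subtract the expected fog counts from each bin
    n = len(all_times)
    residual = [hist[b] - n * gamma_pdf((b + 0.5) * bin_ps) * bin_ps
                for b in range(n_bins)]

    # The largest leftover spike is the candidate object echo
    b_obj = max(range(n_bins), key=lambda b: residual[b])
    t_obj = (b_obj + 0.5) * bin_ps * 1e-12  # seconds, round trip
    return C * t_obj / 2                    # halve for one-way distance
```

Whatever survives the subtraction is treated as the object's reflection, and its arrival time converts to distance at half the round-trip speed of light.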
The camera takes a measurement every 56 picoseconds, building an image of what's in front of the car even in bad weather.
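For a sense of scale (our back-of-the-envelope arithmetic, not a figure from MIT), each 56-picosecond sample corresponds to roughly 8 millimeters of depth, since the laser light has to travel out and back:

```python
C = 299_792_458.0              # speed of light, m/s
dt = 56e-12                    # one sampling interval, in seconds
depth_per_sample = C * dt / 2  # halve the round trip
print(f"{depth_per_sample * 1000:.1f} mm per sample")  # → 8.4 mm per sample
```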
Now, if we could just figure out how to get cars to see pedestrians at night, we'd be on to something.
Eye Trackers Watch You While You Work
The Tobii Pro Glasses 2 Wearable Eye Tracker is worn just like a normal pair of glasses; the only difference is that it records exactly where the wearer is looking.
The glasses are pitched as an excellent training tool, a way to cut down training times and improve safety. The company did a case study with workers at the HH Castings metal foundry in Pennsylvania, and it was a fitting example since bad things can happen fast when you're working with molten metal.
The Tobii Pro captures video that no training tool before it could, because even if you're shadowing a person, you still have a different perspective and don't necessarily know what they're looking at.
It's a little scary, because it could also enable management and HR to essentially break down the tape and give you a play-by-play analysis of your work day. However, according to a company spokesperson, for now “the eye trackers are strictly for research purposes only within the set parameters of a scientific study. Anybody participating in a study would need to give their consent before being eye tracked.”
The University of Nebraska used eye trackers to study human error on construction sites. They were able to come up with a reliable model for predicting human error and preventing subsequent injuries on job sites.
Some sports teams are now using the technology to study player performance. You know what they would also be good for? YouTube tutorials, for basically anything.
MIT’s New Robot Swims with the Fishes
SoFi is a new soft robotic fish that can swim among real fish in the ocean. It was created by a team of researchers from MIT's Computer Science and Artificial Intelligence Lab (CSAIL).
Autonomous underwater vehicles (AUVs) typically need to be tethered to a boat or powered by large propellers. They are big and bulky, and can even pose a collision risk. SoFi could signal a softer, tether-free future once the researchers make it capable of automatically recognizing and following fish. Until then, the researchers steer the extremely high-tech sushi with a hacked Super Nintendo controller that communicates using ultrasonic signals. It’s actually a Buffalo Classic USB Gamepad, but you know, it’s cooler to say it’s an old Super NES controller.
Otherwise, the components make for a much lighter bio-mimicking vessel. Using a fisheye lens, SoFi can take high-res photos and videos at depths of more than 50 feet, and a small lithium polymer battery, similar to the one in your smartphone, powers it for up to 40 minutes.
Urethane foam chambers keep the fish buoyant, while a motor-driven hydraulic pump pushes water between two balloon-like chambers in its tail that work like a set of pistons, flexing the tail from side to side.
Next, the team will work to make SoFi faster by improving the pump system and tweaking the design of its body and tail. They also hope to find a way for the fish to use its camera to automatically follow real fish in the sea.
This is Engineering By Design.