EDN, May 26, 2011

available from Velodyne's HDL-64E (high-definition LIDAR) laser-sensor system, which uses 64 spinning lasers and gathers 1.3 million points/sec to create a virtual model of its surroundings. One reason to use LIDAR rather than radar is that the laser's higher-energy, shorter-wavelength light reflects better off nonmetallic surfaces, such as humans and wooden power poles. Google combines the LIDAR system with vision cameras and algorithmic vision-processing systems to construct and react to a 3-D view of the world through which it is driving (Reference 2).

The enabling sensor hardware lets the cars see everything around them and make decisions about every aspect of driving, according to Thrun. Although we are not yet close to a fully autonomous vehicle, the technology, including the sensor platform of radar, ultrasonic sensors, and cameras, is available in today's intelligent vehicle. It remains only to standardize the car's hardware platform and develop the software. Cars are approaching the point that smartphone platforms had reached just before the introduction of the Apple iPhone and the Motorola Android. As sensors decrease in price and increase in integration, they will become ubiquitous in all cars. Once users accept them as normal parts of a car, automotive-OEM companies can integrate more intelligence into them until they achieve the goal of an autonomous car.

AT A GLANCE
↘ Although self-driving cars are still five to 10 years in the future, they offer many benefits in traffic and fuel efficiency, safety, and time savings.
↘ Self-driving cars will require relatively few infrastructure changes, with cars relying on sensors and processors to "see" and react to their surroundings.
↘ Some of the key sensors will be radar or LIDAR (light-detection-and-ranging) and vision systems, such as visible-light and IR (infrared) cameras.
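To give a rough sense of how a spinning-laser system builds its virtual model: each laser return is effectively an (azimuth, elevation, range) triple, which can be converted to a Cartesian point in the sensor's frame; accumulating millions of such returns per second yields the 3-D point cloud the processing software works from. The sketch below shows only that geometric conversion; the function and field names are illustrative, not Velodyne's actual API.

```python
import math

def polar_to_cartesian(azimuth_deg, elevation_deg, range_m):
    """Convert one laser return (angles in degrees, range in meters)
    to a Cartesian (x, y, z) point in the sensor's own frame."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# Accumulate a few returns into a (tiny) point cloud as the head spins;
# a real unit produces on the order of a million such points per second.
returns = [(0.0, 0.0, 10.0), (90.0, 0.0, 5.0), (0.0, 90.0, 2.0)]
cloud = [polar_to_cartesian(az, el, r) for az, el, r in returns]
```

Downstream software then registers successive clouds against each other and against map data to produce the 3-D view of the road the article describes.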
Today's intelligent automobile can perform many driver-assistance tasks, such as avoiding and preventing accidents and reducing the severity of accidents. To perform these tasks, the vehicles have passive safety systems, such as air bags and seat belts; active safety systems, such as electronic stability control, adaptive suspension, and yaw and roll control; and driver-assistance systems, including adaptive cruise control, blind-spot detection, lane-departure warning, drowsy-driver alert, and parking assistance. These systems require many of the same sensors that the autonomous car requires: ultrasonic sensors, radar, LIDAR systems, and vision-imaging cameras.

Cars now use ultrasonic sensors to provide proximity detection for low-speed events, such as parallel parking and low-speed collision avoidance. Ultrasonic detection works only at low speeds because it senses acoustic waves; when the car is moving faster than a person can walk, the ultrasonic sensor is blind. Although ultrasonic-sensor technology is more mature and less expensive than radar, car designers who care about the aesthetics of the car's appearance are reluctant to have too many sensor apertures visible on the car's exterior. As a more powerful and more flexible technology, radar should begin to replace ultrasonic sensors.

Figure 1 Google's fleet of specially equipped cars has logged more than 140,000 miles of daytime and nighttime driving in California, including traversing San Francisco's famously crooked Lombard Street and the Los Angeles freeways.
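The ultrasonic proximity detection described above rests on a simple time-of-flight calculation: the sensor emits an acoustic pulse and halves the product of the round-trip echo delay and the speed of sound. A minimal sketch of that arithmetic, with illustrative values rather than figures from any particular sensor's datasheet:

```python
SPEED_OF_SOUND_M_S = 343.0  # in dry air at roughly 20 degrees C

def echo_to_distance_m(echo_delay_s):
    """Distance to an obstacle, given the round-trip echo delay in
    seconds; the delay is halved because the pulse travels out and back."""
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0

# An obstacle about 2 m away returns an echo after roughly 11.7 ms.
d = echo_to_distance_m(0.0117)
```

The slow propagation speed is also why the technique fails at driving speed: by the time a distant echo returns, the car has moved far enough that the reading is stale.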
