Addressing Challenges for Autonomous Vehicles on Winter Roads
Rajesh Rajamani, Professor, Mechanical Engineering
Areas of Expertise: Vehicles, Sensors, and Control Systems
Driver assistance technologies that improve safety and reduce driver burden are now available in many new cars. Examples of such technologies include blind-spot monitoring; adaptive cruise control (ACC), which automatically maintains speed and spacing from other vehicles on the road; and lane keeping, which performs automatic steering control to keep the car in its lane. Currently, these systems require the driver to remain fully engaged during automatic operation: the driver is responsible for monitoring the situation and for taking control whenever the automatic systems are unable to continue safe operation.
Several companies—including Tesla and Waymo—have held out the promise that future cars will be fully autonomous, able to operate entirely automatically without requiring driver engagement. Waymo’s autonomous driving technology grew out of Google’s self-driving car project, which began in 2009 and is reported to have been tested over millions of miles of driving. Its Waymo One ride-hailing service currently offers fully autonomous rides in the East Valley of Phoenix, Arizona. On some cars, Tesla offers a version of its Autopilot system, called Full Self-Driving Capability, that can perform a number of automated functions beyond ACC and lane keeping, but it is still described as requiring active driver supervision. Many other companies—including Apple, Uber, Ford, and GM—are also working on developing autonomous driving technology.
On the whole, the autonomous driving industry is highly focused on demonstrating and developing vehicles in sunny weather locations such as California, Arizona, Texas, and Florida. In states with significant winter/snow seasons, very little testing and deployment have occurred. All lane-keeping systems currently sold on cars in the market use cameras to measure the lateral position of the car with respect to the lane boundary markers. A fundamental challenge with driving in areas with winter weather is that the lane markers can be covered by snow and thus invisible to the cameras on the vehicle. The basic feedback measurement needed by the automatic steering-control system can therefore be missing, preventing the automatic lane-keeping system from engaging. Further, data obtained by our research team using the University of Minnesota’s MnCAV vehicle have shown that snow actually remains on lane boundary markers longer than in the interior of the lane, where vehicle tire paths tend to melt the snow earlier.
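The dependence of lane keeping on the camera's lateral-position measurement can be illustrated with a minimal sketch. This is not any manufacturer's actual controller; the gains and the simple proportional feedback law are illustrative assumptions. The key point is the first check: when snow hides the markers and the camera returns no lateral offset, the controller has no feedback signal and must disengage.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical gains for illustration only; real systems are tuned per vehicle.
K_LATERAL = 0.05   # steering response to lateral offset (rad per meter)
K_HEADING = 0.8    # steering response to heading error (rad per rad)

@dataclass
class CameraLaneMeasurement:
    lateral_offset_m: Optional[float]   # offset from lane center; None if markers not visible
    heading_error_rad: Optional[float]  # angle between car heading and lane direction

def lane_keeping_steering(meas: CameraLaneMeasurement) -> Optional[float]:
    """Return a steering-angle command, or None to disengage.

    If snow covers the lane markers, the camera cannot produce the
    lateral-offset feedback the controller needs, so lane keeping
    must hand control back to the driver.
    """
    if meas.lateral_offset_m is None or meas.heading_error_rad is None:
        return None  # missing feedback measurement -> system disengages
    # Steer opposite to the offset and heading error to re-center the car
    return -(K_LATERAL * meas.lateral_offset_m + K_HEADING * meas.heading_error_rad)
```

In this sketch, a car 0.5 m left of center with zero heading error gets a small corrective steering command, while a snow-covered lane yields `None` and a handover to the driver.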
In addition to this fundamental problem of covered lane boundaries, other perception problems of autonomous vehicles can become more challenging in winter: Programs that can work well in recognizing and tracking cars and pedestrians in sunshine may struggle to do the same when there is precipitation or fog, when vehicles are topped with piles of snow, or when people are bundled up in winter clothing.
While Tesla’s most recent system uses only cameras, Waymo and others use measurements from lidar and radar sensors in addition to cameras. With lidar-based driving, a digital map is often utilized to recognize the car’s surroundings and find its location with respect to lane boundaries and other road features. With lidar measurements, it is possible to find lateral distances from lane boundaries for automatic steering control, even if the road is covered with snow. However, creating a lidar digital map of a city or other limited domain requires a 3D scan of the roads, followed by engineers painstakingly labeling objects such as street signs, traffic lights, and buildings. Creating such digital maps takes significant effort, and keeping the maps updated and current is also very difficult. Digital maps are therefore often made only for small regions, like a city center or a specific route, and cars can then drive only within the areas for which they have digital maps. Few of the country’s roads have been mapped in 3D, and doing so for the entire country is a gigantic task. Another challenge for lidar sensors is that their laser pulses can reflect off rain or snow particles, degrading perception performance during active snow or rain events.
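A small geometric sketch shows why a map-based approach still works when paint is hidden under snow. Assume (hypothetically) that the vehicle's global position has already been estimated by matching lidar scans against the digital map; the lateral distance to a lane boundary then comes from the map itself, stored as a polyline of surveyed points, rather than from a camera seeing the paint.

```python
import math

# Minimal sketch, assuming the vehicle's (x, y) position has already been
# estimated by matching lidar scans to a prebuilt digital map. The lane
# boundary is a polyline of mapped points in the same coordinate frame.

def lateral_offset_to_boundary(vehicle_xy, boundary_polyline):
    """Perpendicular distance (meters) from the vehicle to the nearest
    segment of a mapped lane boundary. Works even when snow hides the
    painted markers, because the boundary comes from the map, not a camera."""
    vx, vy = vehicle_xy
    best = float("inf")
    for (x1, y1), (x2, y2) in zip(boundary_polyline, boundary_polyline[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len2 = dx * dx + dy * dy
        # Projection parameter of the vehicle onto this segment, clamped to [0, 1]
        t = max(0.0, min(1.0, ((vx - x1) * dx + (vy - y1) * dy) / seg_len2))
        px, py = x1 + t * dx, y1 + t * dy  # closest point on the segment
        best = min(best, math.hypot(vx - px, vy - py))
    return best
```

For example, a vehicle at (5, 1.75) next to a straight boundary running from (0, 0) to (10, 0) is 1.75 m from that boundary, a measurement the steering controller can use exactly as it would use a camera offset.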
Researchers at the University of Minnesota are exploring solutions for reliable autonomous vehicle operation in winter seasons using a combination of sensors on the MnCAV test vehicle. This work relies on sensor fusion—involving GPS, lidar, and inertial sensors—in addition to cameras. The solutions being explored will utilize maps that are less data-intensive and much easier to create. Using a combination of sensors can enable driving under conditions that pose significant disadvantages for any one type of sensor. Beyond winter driving, this type of technology might also be useful on rural roads, which often lack lane markings and therefore pose problems for autonomous driving similar to those of snow-covered roads.
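The sensor-fusion idea can be sketched in its simplest form: a one-dimensional Kalman filter that dead-reckons position from an inertial velocity measurement and corrects the estimate whenever a GPS fix arrives. This is a textbook illustration under simplifying assumptions (constant velocity, scalar state, made-up noise variances), not the MnCAV team's actual algorithm, which fuses full vehicle pose with lidar and cameras as well.

```python
# Minimal sketch of sensor fusion: blend dead reckoning from an inertial
# sensor with intermittent GPS fixes using a scalar Kalman filter.

def kalman_fuse(gps_readings, velocity_imu, dt, gps_var=4.0, process_var=0.1):
    """Track position from IMU-integrated velocity, corrected by GPS.

    `gps_readings` is a list of position fixes, with None where no fix
    is available (e.g., GPS dropout); the filter coasts on the IMU then."""
    x, p = 0.0, 1.0  # position estimate and its variance
    estimates = []
    for z in gps_readings:
        # Predict: integrate inertial velocity forward one time step
        x += velocity_imu * dt
        p += process_var
        # Update: correct with GPS when a fix is available
        if z is not None:
            k = p / (p + gps_var)   # Kalman gain: how much to trust GPS
            x += k * (z - x)
            p *= (1.0 - k)
        estimates.append(x)
    return estimates
```

With a 1 m/s velocity and a GPS fix arriving only on the third step, the filter coasts on the IMU for two steps and then blends in the GPS measurement; the same structure lets a lidar or camera measurement substitute for GPS when the latter is unreliable.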
University of Minnesota researchers are also learning from the ability of human drivers to drive in winter even when lane boundaries are not visible due to snow cover. Humans use only their eyes—the equivalent of optical cameras. But we humans rely on the tire paths of preceding vehicles, and we tend to drive in “pseudo-lanes”—the paths along which the other vehicles on the road are traveling. Autonomous driving using similar approaches has not been explored. Artificial intelligence systems that can figure out the right path on which to drive in the absence of clear lane markers have not been studied much, simply because the major autonomous driving companies have so far largely focused on driving in good weather and on clear roads.
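One simple way to picture the pseudo-lane idea is as follows. Suppose a perception module (not shown, and purely hypothetical here) has already extracted the left and right tire tracks of preceding vehicles as points in the car's coordinate frame; a drivable centerline then falls out by averaging the paired track points.

```python
# Hypothetical sketch of the "pseudo-lane" idea: when painted markers are
# snow-covered, derive a target path from the tire tracks left by preceding
# vehicles. Assumes paired left/right track points in vehicle coordinates.

def pseudo_lane_centerline(left_track, right_track):
    """Average paired left/right tire-track points into a drivable centerline."""
    return [((lx + rx) / 2.0, (ly + ry) / 2.0)
            for (lx, ly), (rx, ry) in zip(left_track, right_track)]
```

The resulting centerline could feed the same steering controller that normally follows painted lane markers; the hard, unsolved part is the perception step of reliably finding the tire tracks in the first place.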