How can people safely take control from a self-driving car?
Source: Justin Pritchard


This Friday, Nov. 13, 2015 file photo provided by Virginia Tech shows Virginia Tech Center for Technology Development Program Administration Specialist Greg Brown behind the wheel of a driverless car during a test ride showing the alert system handing over automation to the driver while traveling a street in Blacksburg, Va. New cars that can steer and brake themselves may lull drivers into a false sense of security. One way to keep people alert may be providing distractions that are now illegal, just one surprising finding from Stanford University research that studied the behavior of students in a self-driving car simulator. (Justin Fine/Virginia Tech)

New cars that can steer and brake themselves risk lulling people in the driver's seat into a false sense of security―and even to sleep. One way to keep people alert may be providing distractions that are now illegal.


That was one surprising finding when researchers put Stanford University students in a simulated self-driving car to study how they reacted when their robo-chauffeur needed help.

The experiment was one of a growing number that assess how cars can safely hand control back to a person when their self-driving software and sensors are overwhelmed or overmatched. With some models already able to stay in their lane or keep a safe distance from other traffic, and automakers pushing for more automation, the car-to-driver handoff is a big open question.

The elimination of distracted driving is a major selling point for the technology. But in the Stanford experiment, reading or watching a movie helped keep participants awake.

Among the 48 students, 13 who were instructed to monitor the car and road from the driver's seat began to nod off. Only three did so when told to focus on a screen full of words or moving images.

Alertness was particularly helpful when students needed to grab the wheel because a car or pedestrian got in the way.

There's no consensus on the right car-to-driver handoff approach: the Stanford research suggests engaging people with media could help, while some automakers are marketing vehicles with limited self-driving features that will slow down if they detect a person has stopped paying attention to the road.

Self-driving car experts at Google, which is pursuing the technology more aggressively than any automaker, concluded that involving humans would make its cars less safe. Google's solution is a prototype with no steering wheel or pedals―human control would be limited to go and stop buttons.



This May 13, 2015 file photo shows the front of Google's new self-driving prototype car during a demonstration at the Google campus in Mountain View, Calif. New cars that can steer and brake themselves may lull drivers into a false sense of security. One way to keep people alert may be providing distractions that are now illegal, just one surprising finding from Stanford University research that studied the behavior of students in a self-driving car simulator.(Tony Avelar, File)


Though research is ongoing, it appears that people need at least 5 seconds to take over―if they're not totally checked out.

One riddle automakers must solve: How to get owners to trust the technology so that they'll use it―but not trust it so much that they'll be lulled into a false sense of security that makes them slow to react when the car needs them.

Trust was on the mind of researchers who in August published an extensive report on self-driving cars funded by the National Highway Traffic Safety Administration. "Although this trust is essential for widespread adoption, participants were also observed prioritizing non-driving activities over the operation of the vehicle," the authors wrote.



Another wide-open question: How to alert the person in the driver's seat of the need to begin driving.

It appears that the car should appeal to several senses. Visual warnings alone may not suffice. Combine a light with spoken instructions or physical stimulation such as a vibrating seat, and people are quicker to reassume control.

"If it is done courteously and subtle and not annoying, it could be missed by someone that is distracted," said Greg Fitch, a research scientist at the Virginia Tech Transportation Institute. Then again, the way the car interacts with people will be one way automakers differentiate their product―and overbearing warnings may sour potential buyers.

Other issues Fitch cites include "mode confusion" (making sure the car clearly informs the person whether or not it is driving itself) and clear explanations to drivers of what the car can―and cannot―handle.




Cars with the right sensors are becoming really good at monitoring the outside world and have quicker response times than humans. People are much better at making decisions under uncertain circumstances.

One lesson from the Stanford study may be that master and machine are better viewed as collaborators.

"There's really a relationship between drivers and cars," said David Sirkin, who helped run the experiment at Stanford's Center for Design Research, "and that relationship is becoming more a peer relationship."


