Paving the Way for Automated Vehicles

Someday you’ll be able to get into a car, tell it where you want to go and arrive at your destination while doing anything you want other than driving. You could eat, lose yourself in your phone, watch a movie, play cards with other passengers or even sleep. That freedom could be life-changing when cars reach full automation, Level 5 on the Society of Automotive Engineers (SAE) automation scale.

However, before cars are able to perform all driving functions without human intervention, we will likely have vehicles with conditional automation. These Level 3 vehicles can perform all aspects of the driving task under some circumstances, such as freeway journeys, but they still rely on the driver to take control of the vehicle when needed. 

While drivers of Level 3 automated vehicles are expected to remain vigilant, in practice they are likely to engage in other tasks rather than continuously monitor the road, becoming disengaged from the driving task. Drivers who are not paying attention to the road will have limited or no awareness of what is happening around them. That leads to an important question: How do you smoothly and safely alert the driver and prepare them to take back control if automation cannot continue?

Through a new contract with the National Highway Traffic Safety Administration, Battelle will attempt to answer that question. We'll be studying the human factors involved when drivers must assume control from their automated vehicles after automation fails.

Using Battelle’s driving simulator in Seattle, the team will assess participants in multiple scenarios, varying both how long and how deeply drivers are disengaged. Some drivers may be asked to read a book, while others will be given nothing to do but watch the scenery. Participants will then be alerted to the need for a planned or unplanned takeover of the vehicle.

An eye tracking headset will be used to assess where drivers are looking. Different eye movement patterns are associated with defensive driving, viewing scenery, reading, watching a video and other tasks. Algorithms will detect these different patterns to determine if a driver is paying attention to the road and when they are ready to take over the driving task. 
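To make that idea concrete, here is a minimal sketch, in Python, of how a simple gaze-pattern heuristic could distinguish road monitoring from reading. The data structure, thresholds and labels are illustrative assumptions only, not the algorithms Battelle will use.

```python
# Illustrative sketch only: a simplified heuristic for labeling a short window
# of eye-tracker samples. All names and thresholds are hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class GazeSample:
    timestamp_s: float   # time of the sample, seconds
    x_deg: float         # horizontal gaze angle, degrees from straight ahead
    y_deg: float         # vertical gaze angle, degrees (negative = downward)

def classify_gaze_window(samples: List[GazeSample]) -> str:
    """Label a window of gaze data as road monitoring, reading, or another task."""
    if not samples:
        return "unknown"

    # Fraction of samples falling roughly in the forward road scene
    # (assumed angular thresholds, for illustration only).
    on_road = sum(1 for s in samples if abs(s.x_deg) < 20 and s.y_deg > -10)
    road_ratio = on_road / len(samples)

    # Reading or device use tends to hold gaze low and down.
    looking_down = sum(1 for s in samples if s.y_deg < -25) / len(samples)

    if road_ratio > 0.7:
        return "monitoring_road"
    if looking_down > 0.6:
        return "reading_or_device"
    return "other_task"

# Example: a driver looking down at a book for most of a 1-second window.
window = [GazeSample(t / 30.0, 2.0, -35.0) for t in range(30)]
print(classify_gaze_window(window))  # -> "reading_or_device"
```

In a real system, a classifier like this would run continuously over a sliding window of gaze data, feeding an estimate of driver readiness to the takeover-alert logic.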

“Our differentiator is our approach and tools for measuring where drivers are looking,” said Christian Richard, Battelle Senior Research Scientist. “By understanding how drivers pay attention and extract information from the environment, we can solve this human performance problem.” 

With the findings of the simulator study in hand, the team will move to the test track at the Transportation Research Center in East Liberty, Ohio, to evaluate options for alerting drivers based on their eye movement patterns. In-vehicle alerts and system messages could be used to intelligently guide drivers back into the driving task.
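As one illustration of how such alerts might be sequenced, the sketch below shows a hypothetical staged escalation from visual to auditory to haptic prompts, with a fallback if the driver never re-engages. The stages, timings and function names are assumptions for illustration, not the alerting system being tested.

```python
# Illustrative sketch only: a hypothetical staged takeover-alert escalation.
# Stage order, durations, and the fallback behavior are assumed.

import time
from enum import Enum, auto

class AlertStage(Enum):
    VISUAL = auto()     # icon or message on the dashboard
    AUDITORY = auto()   # chime plus spoken prompt
    HAPTIC = auto()     # seat or steering-wheel vibration
    FALLBACK = auto()   # begin a minimal-risk maneuver (e.g., slow and pull over)

def escalate_takeover_request(driver_attentive, stage_duration_s=4.0):
    """Step through alert stages until the driver re-engages or fallback triggers.

    driver_attentive: callable returning True once gaze patterns indicate the
    driver is monitoring the road again (e.g., output of a gaze classifier).
    """
    for stage in (AlertStage.VISUAL, AlertStage.AUDITORY, AlertStage.HAPTIC):
        print(f"Issuing {stage.name} alert")
        deadline = time.monotonic() + stage_duration_s
        while time.monotonic() < deadline:
            if driver_attentive():
                return stage  # driver re-engaged during this stage
            time.sleep(0.1)
    return AlertStage.FALLBACK

# Example: simulate a driver who re-engages about six seconds after the first alert.
start = time.monotonic()
final_stage = escalate_takeover_request(lambda: time.monotonic() - start > 6.0)
print(f"Driver re-engaged during the {final_stage.name} stage")
```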
Posted: October 02, 2019
Author: Battelle Insider