The Trump administration released new guidelines today designed to promote the development of self-driving cars.
"Our country is on the verge of one of the most exciting and important innovations in transportation history," Transportation Secretary Elaine Chao said at a press conference at the University of Michigan.
"We are motivated by the potential of automated tech to transform mobility, reshape transportation, and revolutionize safety," Chao said at a press conference at the University of Michigan.
Chao, flanked by auto industry representatives and the president of the National Federation of the Blind, said her goal was to prod progress from car and technology companies, while allowing that customers will be the final arbiters of how fast automation comes to America's roads.
She said self-driving technology could reduce accidents and improve mobility for the elderly, disabled and other restricted populations.
The new guidelines, which build on rules issued under the Obama administration, cover systems that go well beyond the self-parking and automated braking features now widely available, extending to more advanced self-driving tools, all the way up to full automation.
The guidelines clarify existing rules to permit more testing and address the division of regulatory authority between the federal government and the states.
They won praise from the Alliance of Automobile Manufacturers, whose members have been investing heavily in automated technologies.
"This federal guidance is helpful in advancing road safety and safe testing, while also providing more clarity on the role of states," the trade group said.
"The guidance provides the right balance, allowing emerging innovations to thrive while government still keeps a watchful eye over new developments."
But the nonprofit Consumer Watchdog warned of disaster.
"This isn't a vision for safety," said John Simpson, Consumer Watchdog's privacy project director.
"It's a roadmap that allows manufacturers to do whatever they want, wherever and whenever they want, turning our roads into private laboratories for robot cars with no regard for our safety."
The announcement came as the National Transportation Safety Board (NTSB) released a finding that a fatal May 2016 car crash in Florida was caused in part by the driver's overreliance on Tesla's Autopilot system and in part by the system's design limitations.
The driver's "overreliance" on the Tesla system -- designed as a semi-autonomous driving system to be used with a human operator -- permitted "prolonged disengagement" that led to the collision with a truck, the NTSB report said.
The system also allowed the driver to engage Autopilot on a road for which it was not intended, NTSB officials said.