Tesla crash triggers debate over safety and global competition
Thune had planned to attach driverless-vehicle development measure to China bill, but panel delays markup
The fatal April 17 crash near Houston of a Tesla believed to have been operating on Autopilot has reignited the debate over how the federal government should regulate autonomous vehicles, with lawmakers split between encouraging the development and testing of driverless technology on U.S. roads and moving more cautiously to avoid similar crashes.
That debate was on display Tuesday during a hearing of the Senate Committee on Commerce, Science, and Transportation’s Subcommittee on Surface Transportation, Maritime, Freight, and Ports. The head of the key trade association representing U.S. automakers warned lawmakers that the slow adoption of a regulatory framework means the U.S. risks falling behind competitors such as China, the European Union, Japan and South Korea.
“We have to do better,” said Alliance for Automotive Innovation President and CEO John Bozzella, saying the technology will ultimately help reduce accidents caused by driver error. “We have to work with a sense of urgency to reduce highway fatalities and injuries on America’s roadways. Automated vehicles hold tremendous promise. There is great opportunity here.”
The federal government’s reluctance to adopt a regulatory framework on autonomous vehicles has allowed other nations to move ahead competitively, he said.
“Our competitiveness is at stake,” he said, adding: “We need to be in that game. We enjoy a leadership position now, and we risk losing it if we don’t create this national framework to deploy and test highly automated vehicles at scale safely and effectively.”
The alliance on Tuesday unveiled safety principles for systems that keep vehicles centered in their lane and adjust their speed relative to other vehicles, a feature known as adaptive cruise control, while still requiring driver monitoring. Vehicles with that level of automation, known as Level 2, account for the bulk of automated driving technology in the U.S., Bozzella said.
There are six levels of automation, ranging from no automation to completely driverless vehicles, according to the Society of Automotive Engineers.
The new principles were a clear attempt to distance the organization from Tesla and its CEO, Elon Musk, who has disputed whether the Autopilot feature was even enabled on the car involved in the crash. Although Tesla has promised full self-driving capability, it acknowledged in a Dec. 28 letter from its associate general counsel, Eric C. Williams, to the chief of the California DMV’s autonomous vehicles branch, Miguel D. Acosta, that the technology was only at Level 2.
Bozzella spoke one day after Sen. John Thune, R-S.D., was forced to postpone plans to push a measure aimed at allowing automakers to test tens of thousands of self-driving automobiles on U.S. roads.
China bill
The Thune amendment would allow the National Highway Traffic Safety Administration to exempt 15,000 self-driving vehicles per manufacturer from safety standards written with human drivers in mind in the first year after enactment, rising to 80,000 exempt vehicles within three years. NHTSA currently caps such exemptions at 2,500 vehicles per manufacturer.
He had planned to attach it to a bill that would provide $100 billion in funding for science and technology research and development, legislation driven by concerns about maintaining U.S. competitiveness with China.
“I think that markup’s been pushed now,” Thune said late Monday. “It’s not going to happen until after the recess. But we had intended to offer the amendment.”
He said if the markup of the larger China bill is rescheduled, he may try to attach it, “but we may have to pull back now because there are problems cropping up.”
Groups including Advocates for Highway and Auto Safety and the American Association for Justice, a trial lawyers’ group, have warned lawmakers against backing the amendment, saying it would put autonomous-driving technologies on the road before adequate research has been done.
“Ultimately, the industry wants a license to use America’s roads as a test track and evade responsibility when its technology injures or kills Americans,” said Linda Lipsen, CEO of the American Association for Justice, who cited the Houston accident as evidence of the potential dangers of the technology. “America can and should lead the way on developing new technologies, but not by allowing the industry to skirt accountability and eliminating the critical safety incentives that have continuously led to auto safety improvements.”
On Tuesday, it became clear that her group may not be the only source of pushback against the bill.
Sen. Richard Blumenthal, D-Conn., asked what steps regulators should take to address concerns about the safety of such automated systems. He has expressed concern about Tesla and said Tuesday that the crash “highlights many unanswered questions about technology that purports to be automated.”
Bozzella said that, despite advances in automated technology, “I know of no vehicles in the U.S. marketplace today that are self-driving vehicles. Every vehicle requires a driver to be completely engaged in the driving task at all times.”
Sen. Amy Klobuchar, D-Minn., asked what safeguards are in place to prevent misuse of automated technology. “What can we do to ensure that consumers are getting that education so they can safely use this technology?” she asked. Bozzella pointed again to the driver-monitoring principles his group had introduced.
Thune, who has worked closely with subcommittee Chairman Gary Peters, D-Mich., on autonomous vehicle legislation, argued that putting more autonomous vehicles on the road would help the regulatory framework catch up with private sector innovation.
Bozzella said he supported expanding the number of such vehicles allowed on the roads in order to advance the technology to higher levels.
“We have to create a new regulatory framework for highly automated vehicles,” he said. “In order to do that we need more data,” but “the small current process doesn’t give us enough vehicles on the road or enough data over a long enough period of time to really get that insight and data.”
Ariel Cohen contributed to this report.