At a time of growing concern about the safety of self-driving vehicles, a leading expert is calling on the federal government to develop a national driver’s test that such vehicles would have to pass before they could travel on public roads.

Such a regulation would set minimum standards to ensure that the vehicles display basic skills and competence in traffic situations where their manufacturers want to use them, said Henry Liu, who leads the University of Michigan’s autonomous vehicle testing center.

“Ensuring safety is important for consumers, for autonomous vehicle developers, for the federal government as well,” Liu said in an interview. “The federal government has the responsibility to help set the minimum standard, to help provide guidance in terms of safety testing.”

In recent years, autonomous vehicles have been involved in a number of high-profile crashes, and surveys have revealed widespread public uncertainty about their safety. Successful testing of the vehicles’ ability to master a variety of traffic situations, Liu suggested, would strengthen the public’s confidence in them.

Liu said significant research is still needed before autonomous vehicles could be rolled out safely nationwide. But he said he agreed with their manufacturers that in the long run, self-driving vehicles could potentially save lives and improve the efficiency of the nation’s transportation system.

At present, no specific federal regulations cover self-driving vehicles, and only a few states have their own such requirements. The National Highway Traffic Safety Administration, which is part of the Department of Transportation, has been gathering data about crashes involving autonomous vehicles. But it has so far issued only voluntary guidelines that don’t include driving tests. Messages were left Tuesday seeking comment from the Transportation Department.
Self-driving cars still must meet federal safety standards that apply to all passenger vehicles, which means the government investigates them only after serious incidents.

“Our current safety regulation for vehicles is reactive, so we depend upon self-regulation,” Liu said.

At the University of Michigan testing center, Liu runs a mock town, called Mcity, containing a traffic light and a roundabout that is used by companies and the government to test self-driving vehicles. A regulation, or perhaps a voluntary test, is needed because “we don’t want to create a public hazard,” said Liu, who made his remarks Tuesday and announced that Mcity can now be used by researchers remotely.

Liu suggested that a driver’s test should be able to determine whether a self-driving vehicle can make a left turn at an intersection without the protection of a traffic light with a green arrow. He said it should also ensure that the vehicle will halt at a stop sign and detect and yield to a small pedestrian crossing a road.

A test, he said, would prevent a poorly performing robot vehicle from being turned loose on society, much as a human driver’s test would keep an incompetent driver off the road. But he acknowledged that no test could prevent all crashes involving self-driving vehicles.

The driver’s tests, Liu said, would help robot vehicle developers “so that when they are moving in to deploy into the U.S., into certain cities, they will face less resistance from the cities.”

Tesla CEO Elon Musk has long complained that federal regulation is impeding innovation. Tesla is developing a robotaxi system called “Full Self-Driving,” but the robotaxis cannot […]