These drivers knew they weren't using a foolproof system, and that there would be glitches: they had agreed to test early versions of Tesla's regularly updated "full self-driving" software. The company warned them of its limitations and of the need to stay attentive.
Experts worry that the feature's name implies greater functionality than Tesla is actually offering. But the risks of "full self-driving" don't appear to be holding Tesla back from a broad beta release. Tesla is preparing a wide rollout even as some of the Tesla loyalists testing the feature raise concerns about what will come next.
The police statement that there was no driver behind the wheel suggests that Autopilot, the widely available precursor to "full self-driving," may have been active and, if so, was being used inappropriately.
Tesla CEO Elon Musk said Monday that data logs recovered so far show Autopilot was not enabled. But Musk did not rule out that future findings could reveal Autopilot was in use. He also did not share an alternative theory for the crash.
Tesla did not respond to multiple requests for comment, and generally does not engage with the professional news media.
The long road to "full self-driving"
Teslas running a version of the "full self-driving" beta have at times attempted seemingly dangerous left turns, pulling out in front of fast-approaching high-speed traffic or turning so slowly that uneasy drivers pressed the accelerator to get out of harm's way.
Tesla’s full self-driving software, or FSD, is technically a driver-assist system, so American regulators allow beta versions of it to be tested on public roads. There are stiffer restrictions on driver-assist systems in Europe, where Tesla offers a more limited suite of autonomous driving features.
And even when the system does appear to be working as intended, Tesla says that drivers are supposed to remain attentive and be prepared to take over at any time. But some worry that these guidelines won’t be heeded.
Calling for caution
AI DRIVR, a YouTuber who posts Tesla videos and is already testing "full self-driving," has said on social media that he's nervous about a large population getting the feature, and that people are bound to abuse it.
Like other social media users who post frequently about Tesla's "full self-driving" software, AI DRIVR said he had signed an NDA and, when contacted by CNN, said he was not able to speak to the network directly.
"Please let's not screw this up and make Tesla regret their decision and the freedom that they are giving people," AI DRIVR said.
"The beta is at a point where it can behave amazingly well and then the next second does something very unpredictable," he said in a YouTube video. One shortcoming he said he experienced with the beta was his Tesla sometimes swerving around semi trucks on highways when there was no clear reason to do so. He speculated that one of the car's side cameras could be to blame, as it is obstructed by the trucks. AI DRIVR did not post footage of his Tesla behaving this way.
Raj Rajkumar, a Carnegie Mellon University professor who studies autonomous vehicles, told CNN Business that the camera on the side of the Tesla may essentially see a flat surface (the side of the truck) with the same color and texture, and incorrectly conclude that something is very close.
"Their side cameras very likely do not sense depth," Rajkumar said. "With this ambiguity, the Tesla software may be concluding that it is best to be conservative and swerve."
Teslas do have radar, but it faces forward and is not aimed at trucks alongside the car. Ultrasonic sensors surround the vehicle, but they are really only useful for parking, Rajkumar said.
Rajkumar said that because "full self-driving" has "a lot of problems," based on his assessment of beta testers' YouTube footage, Tesla will need to prioritize what problems it addresses first and may not have had time to fully address the issue yet. Rajkumar has not tested the beta version of "full self-driving" himself.
Rajkumar said that one of the problems with "full self-driving" is the name itself, which, like Autopilot, he says is extremely misleading. Drivers will get complacent, he said, and tragic crashes will happen.
"I have wondered for a long time why the Federal Trade Commission does not consider this as deceptive advertising, and why NHTSA has not forced Tesla to not use these names from a public safety standpoint," Rajkumar said.
The National Highway Traffic Safety Administration said that it will take action as appropriate to protect the public against risks to safety, but that it does not have authority over advertising and marketing claims and directed questions to the Federal Trade Commission, which does provide oversight of this kind. The Federal Trade Commission declined to comment.
James Hendler, who studies artificial intelligence at Rensselaer Polytechnic Institute, told CNN Business that another plausible explanation for Teslas allegedly swerving near semi trucks is that sunlight reflecting off the trucks at certain angles makes the cars think the semis are extremely close.
«These cars don’t think in terms we can understand. They can’t explain why they did it,» Hendler said.
Keeping an eye on drivers
An MIT study of 19 drivers last year found that Tesla owners were more likely to look away from the road when using Autopilot, the precursor to "full self-driving," than when driving manually. The researchers said more should be done to keep drivers attentive.
Rajkumar, the Carnegie Mellon professor, said that Tesla would be better off with a driver monitoring system similar to one used by GM, which uses an in-vehicle camera and infrared lights to monitor driver attention.
«[It would] avoid the many shenanigans that some Tesla vehicle operators do to circumvent paying attention,» Rajkumar said.
Teslas have a camera mounted in the passenger cabin that could theoretically monitor a driver. But Tesla does not appear to be using that camera to check if beta testers pay attention. Two beta testers of "full self-driving" have said that they have at times blocked their cameras: one, who posts on YouTube as "Dirty Tesla," and Viv, a Twitter-based Tesla enthusiast who has said she's testing "full self-driving."
"They're definitely not using it yet because I blocked mine, and they haven't said anything," the "Dirty Tesla" YouTuber, Chris, said in an interview last month. "If they want it, they'll let me know."
The feature will cost $10,000, but monthly subscriptions will be a more affordable way to use "full self-driving" for a short period of time, such as a summer road trip. Musk has said subscriptions will be offered by July.
Tesla Raj, another YouTuber with early access to "full self-driving," said in a recent video that there have been instances when he felt he was in danger of hitting another vehicle, or of another vehicle hitting him, and he needed to take control of the car.
Ricky Roy, who calls himself a huge Tesla fan and an investor in the company, recently posted a video called "the truth about Tesla full self-driving." He said that important questions were getting lost in the "crazy excitement about [a] future of robotaxis that will make people millions."
Roy alluded to Musk's 2019 prediction that there would be a million robotaxis operating in 2020. Musk has said that "full self-driving" would make Teslas appreciating assets. Roy said in his video that he feared people would mistake Tesla's "full self-driving," which still requires a human driver ready to intervene at any time, for a fully autonomous vehicle, which does not need human supervision.