Investors should ignore Elon Musk’s latest dance and focus instead on the growing issues Tesla is facing because of its chief executive’s exaggerated claims about his company’s technological capabilities.
At AI Day late Thursday, the self-named Technoking said the company is working on a humanoid robot because "Tesla is arguably the world's biggest robotics company because our cars are like semi-sentient robots on wheels."
After a white-suited human did a brief dance for the believers in the audience and on a livestream, Musk came on stage and showed only computer-generated images of a 5-foot-8 humanoid robot, a prototype of which he claimed Tesla will produce sometime next year. He implied it could be used for manufacturing or boring, repetitive tasks like grocery shopping, and said it will have a full self-driving computer.
As always with Musk and Tesla, the timeline is highly doubtful to anyone with basic knowledge of the technology in question. Fortunately, the antics did not fool everyone on Wall Street, some of whom may be getting tired of his shenanigans.
“Unfortunately, as we have seen with robotaxis and other future sci-fi projects for Musk, we view this Tesla Bot as an absolute head scratcher that will further agitate investors at a time the Street is showing growing concern around rising EV competition and safety issues for Tesla,” said Dan Ives, a Wedbush Securities analyst, in a note to clients early Friday.
The safety issues Ives mentions are what investors should be attuned to right now, because it appears the government is finally stepping up and taking note of a problem this column has long pointed out: Musk repeatedly oversells the current and near-term potential of his company's advanced driver-assistance technology.
Just a day before Thursday's "AI Day" spectacle, two U.S. senators asked the Federal Trade Commission to investigate both Tesla's and Musk's "repeated overstatements of their vehicles' capabilities" regarding the marketing of Tesla's "Full Self Driving" product. Tesla charges thousands of dollars at purchase (or as little as $100 a month) for software that is nowhere near full self-driving, a practice that has already led to a recent review by the California Department of Motor Vehicles and a German ruling that Tesla could not market the product as such.
“Language matters,” said Selika Talbott, a professorial lecturer in the department of public administration and policy at American University in Washington DC. “The use of this terminology is false and misleading and unsafe for the general public. The notions of assisted driving and autonomous vehicles and their differences are not fully understood by the general public.”
“Tesla has highly assisted technology in their vehicle, but at no point should anyone behind the wheel think that vehicle can drive itself, because it can’t,” Talbott said.
The week began with news of a federal investigation into Tesla’s Autopilot system after cars using the feature crashed into stopped emergency vehicles. The National Highway Traffic Safety Administration is looking into a series of crashes by Tesla cars that had the advanced driver-assistance system enabled. NHTSA said that it opened an inquiry into 11 Tesla crashes that involved emergency vehicles, while still investigating a series of collisions involving cars enabled with Advanced Driver Assistance Systems (ADAS) and tractor-trailers.
The latest outcry on Capitol Hill follows a stream of news reports, social media posts and YouTube videos of drivers engaging in extremely risky behavior while testing the so-called self-driving features of their Teslas. In May, Steven Michael Hendrickson, a 35-year-old father of two in Fontana, Calif., died when his Tesla hit an overturned semi truck. He had earlier posted videos of himself driving on the freeway without his hands on the wheel, and NHTSA is still investigating the role of Autopilot in the crash.
“The vehicles that Tesla is producing are driver-assisted systems,” said Bryan Reimer, a research scientist at the MIT Center for Transportation and Logistics. “They are assisting the driver, and the driver needs to maintain vigilance.”
It is important to note the difference between Tesla's two products with misleading names. "Autopilot" is an ADAS, a highly advanced version of cruise control meant for highway driving that enables "your car to steer, accelerate and brake automatically within its lane under your active supervision, assisting with the most burdensome parts of driving," according to Tesla's website. Tesla also offers the "FSD" package, now available by subscription for $99 to $199 a month, which it describes as "access to a suite of more advanced driver assistance features, designed to provide more active guidance and assisted driving under your active supervision."
If only Musk described these systems the way the official website does. In analyst conference calls and in Tesla's multi-hour presentations to its fan base, Musk has been proclaiming that with this software, full autonomy is around the corner.
“We basically have to solve real-world vision AI and we are,” he said in an earnings call in April. “And the key to solving this is also having some massive data set. So just having well over one million cars on the road that are collecting data… But I am highly confident that we will get this done.”
But for all of Musk's bluster and huge fan base, investors are starting to note that the company's tactics involving full self-driving technology are dangerous compared with the approaches of other companies testing autonomous vehicles.
For example, Alphabet Inc.'s Waymo, the company with the most hours of autonomous-vehicle driving, is currently operating a small-scale robotaxi service without human drivers in sparsely populated parts of the Phoenix area in Arizona. It is the only one of its kind in the U.S. In California, Waymo has permits from the DMV to conduct AV testing with a human driver behind the wheel.
“Waymo cannot just start selling their AVs to anyone, and they can’t just drive them on the roadway, our regulatory system does not allow for that,” Talbott of American University said. “You can test them but no publicly available self-driving car is on the market for purchase because it doesn’t exist.”
With FSD testing being done in the real world by untrained drivers, Tesla is conducting the equivalent of clinical trials of a new drug without any professional monitoring of the patients.
“They are calling it beta, it is a beta system, they are exposing people to substantive risk,” Reimer said.
Musk's latest bot is yet another distraction, much like the flamethrower his Boring Company sold in 2018, his unwanted attempt to help the boys stuck in a cave in Thailand, and other stunts. Investors should not let these distractions get in the way of the real issues that Musk seems to refuse to acknowledge as he continues to oversell his company's technological abilities.