Owners may purchase or subscribe to Full Self-Driving (FSD), which adds semi-autonomous navigation that responds to traffic lights and stop signs, lane change assistance, self-parking, and the ability to summon the car from a garage or parking spot.
The company's stated intent is to offer fully autonomous driving (SAE Level 5) at a future time, acknowledging that technical and regulatory hurdles must be overcome to achieve this goal.
By 2016, the Mobileye-based Autopilot had added automatic emergency braking (AEB), adaptive cruise control (ACC), and lane centering capabilities,[13] but Tesla and Mobileye dissolved their partnership that July.
"[19] During a January 2019 earnings call, Elon Musk reiterated "full self-driving capability is there", referring to "Navigate on Autopilot", an EAP feature limited to controlled-access highways.
[18] HW2 vehicles were updated in January and February 2017 with software version 8.0, which included Traffic-Aware Cruise Control and Autosteer (lane-centering) on divided highways and 'local roads' up to speeds of 45 miles per hour (72 km/h).
[20] In a November 2018 test drive, The Verge reporter Andrew J. Hawkins called the beta of Navigate on Autopilot "the feature that could give Tesla an edge as it grows from niche company to global powerhouse".
[60] In September, Tesla released software version 10 to Early Access Program (EAP) testers, citing improvements in driving visualization and automatic lane changes.
The package also includes minor features such as "Green Light Chime" and standard safety systems such as automatic emergency braking (AEB), lane and roadway edge departure warning and correction, and blind spot indicators.
Highlights included reduced photon-to-control latency, integrated unpark and reverse capabilities, dynamic routing around road closures, and the ability to start FSD (Supervised) from Park.
[190] Some news reports in 2019 stated that "practically everyone views [lidar] as an essential ingredient for self-driving cars"[198] and that "experts and proponents say it adds depth and vision where camera and radar alone fall short.
"[203] In September 2021, legal scholars William Widen and Philip Koopman argued that Tesla's advertising of FSD as an SAE Level 2 system was misleading to "avoid regulatory oversight and permitting processes required of more highly automated vehicles".
Musk responded to the article with a statistical argument in an email to the reporter, saying "Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available.
[222] Additionally, a statistical analysis first published as a preprint in 2021[218] and in final form in 2023[223] criticized Tesla's self-reported crash rate data for failing to account for vehicle owner demographics or for the types of roads on which Autopilot was being operated.
[237] In addition to failing to recognize the side of a trailer, Autopilot crashes have been blamed on driver distraction,[238] inability to detect stationary emergency vehicles,[239] and misuse outside the stated operational design domain of "controlled-access highways [...] with a center divider, clear lane markings, and no cross traffic".
[255] A review of the in-cabin camera-based monitoring system by Consumer Reports found that drivers could still use Autopilot even when looking away from the road or using their phones, and could also enable FSD beta software "with the camera covered.
Anecdotal evidence suggests the module is effective only in Tesla vehicles sold in the United States and Canada, leading to speculation that the driver-monitoring software differs by region.
[271] Autopilot users have also reported the software crashing and turning off suddenly, collisions with off ramp barriers, radar failures, unexpected swerving, tailgating, and uneven speed changes.
Tesla vehicles permanently record this data as "gateway log" files onto a microSD card in the Media Control Unit, at a rate of approximately five times per second (5 Hz).
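The recording scheme described above can be illustrated with a minimal sketch: a loop that samples vehicle state at roughly 5 Hz and appends one record per sample to a log file. The field names, file layout, and JSON encoding here are hypothetical illustrations, not Tesla's actual gateway-log format.

```python
import json
import time

LOG_INTERVAL_S = 0.2  # one sample every 0.2 s ≈ 5 Hz, the approximate rate described above


def read_vehicle_state():
    """Stand-in for a real telemetry source.

    The fields below are hypothetical examples, not Tesla's actual
    gateway-log schema.
    """
    return {
        "timestamp": time.time(),
        "speed_mph": 0.0,
        "autopilot_engaged": False,
    }


def log_telemetry(path, samples):
    """Append one JSON record per line, pacing writes at roughly 5 Hz."""
    with open(path, "a") as f:
        for _ in range(samples):
            f.write(json.dumps(read_vehicle_state()) + "\n")
            time.sleep(LOG_INTERVAL_S)
```

At 5 Hz, a drive of one hour would produce about 18,000 such records, which is why append-only flat files on removable media are a plausible storage choice for this kind of continuous logging.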
[287] The Nairobi team grew to 400 workers by 2016, but Karpathy later stated the "quality [of their work] was not amazing" and Tesla began hiring employees for data labeling in San Mateo, California instead.
[287] In April 2023, it was revealed that San Mateo labeling employees had shared clips internally among themselves, including recordings of privately owned areas such as garages, as well as crashes, road-rage incidents, and meme videos annotated with "amusing captions or commentary".
In one case, the submersible Lotus Esprit prop featured in the James Bond film The Spy Who Loved Me, which had been purchased by Elon Musk in 2013 and stored in his garage, was recorded and shared by labeling team members.
[323] The groups renewed their appeal to the FTC and added the California DMV in 2019,[324] noting that "Tesla continues to be the only automaker to describe its Level 2 vehicles as 'self-driving' and the name of its driver assistance suite of features, Autopilot, connotes full autonomy.
[326] A 2019 IIHS study showed that the name "Autopilot" leads more drivers to misperceive behaviors such as texting or taking a nap as safe, compared with similar Level 2 driver-assistance systems from other car companies.
[333][334] In July 2022, the California DMV filed two complaints with the state Office of Administrative Hearings that alleged Tesla "made or disseminated statements that are untrue or misleading, and not based on facts" relating to both "Autopilot and Full Self-Driving technologies".
[338] In September 2022, California governor Gavin Newsom signed state bill SB 1398,[339] which took effect on January 1, 2023, and prohibits any manufacturer or dealer of cars with partial driver automation features from using misleading language to advertise their vehicles as autonomous, such as by naming the system "Full Self-Driving".
[369] Data for PE 21-020 had been supplemented by prior information requests to Tesla (April 19, 2021) and Standing General Order (SGO) 2021–01,[384] issued June 29, 2021[385] and amended on August 5, 2021,[369] which required manufacturers of advanced driving assistance systems to promptly report crashes to NHTSA.
[394] Starting in October 2023, NHTSA conveyed its preliminary conclusions to Tesla during several meetings. While not concurring with NHTSA's analysis, Tesla conducted a voluntary recall on December 5, 2023, providing an over-the-air software update to "incorporate additional controls and alerts ... to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged ... [and providing] additional checks upon engaging Autosteer and while using the feature outside controlled access highways".
ODI also noted, based on vehicle telemetry, that "the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task": in approximately 80% of 135 incidents, braking and/or steering did not occur until less than one second before the collision.
[396] A comparison to similar Level 2 ADAS led ODI to conclude "Tesla [is] an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot's permissive operating capabilities".
"[400][401] Tesla issued a recall of 11,728 vehicles in October 2021 due to a communication error that could lead to false forward-collision warnings or unexpected activations of the automatic emergency braking system.