The US has raised concerns about the interaction between Tesla’s Autopilot system and its drivers – as part of a government investigation that started two years ago.

“There is a real concern that’s not limited to the technology itself but the interaction between the technology and the driver,” said US Transportation Secretary Pete Buttigieg on Tuesday.

The National Highway Traffic Safety Administration (NHTSA) has been leading a probe into multiple Tesla crashes involving Autopilot since August 2021.

Tesla’s Models Y, X, S, and 3 are all under investigation, covering around 900,000 vehicles on US roads.

The investigation is based on 16 crashes that took place between January 2018 and January 2022, which resulted in 15 injuries and one death.

The investigation is examining whether Tesla misled customers by claiming its vehicles have “full self-driving capabilities”.

Tesla urges all drivers to remain aware of their surroundings when using Autopilot, although some Tesla users have admitted to riding drunk in the back seat while the car was on the road.

“The question is not are they absolutely free of problems or 1000% foolproof,” Buttigieg said.

He added: “The question is, how can we be sure that they will lead to a better set of safety outcomes … This technology has a lot of promise. We just have to make sure it unfolds in a responsible fashion.”

Musk pursues autonomous vision despite probe

Elon Musk, Tesla CEO and SpaceX founder, said the company may proceed with a widespread release of its self-driving technology in 2023.

Speaking at a press conference in April, Musk said: “I hesitate to say this but I think we will do it this year.”

Last month, Tesla won a separate lawsuit, filed in 2020 by Los Angeles driver Justine Hsu, who alleged her Tesla Model S veered into a curb while in Autopilot mode.

She demanded $3m in damages from the electric vehicle manufacturer.

The lawsuit claimed the airbag was deployed “so violently [that] it fractured the plaintiff’s jaw, knocked out teeth, and caused nerve damage to the plaintiff’s face.”

Tesla denied responsibility for the crash, arguing that Autopilot was being used on city streets when the company advises drivers against doing so.

A Los Angeles Superior Court jury awarded Hsu no damages and found the airbag had performed as it should.

According to research firm GlobalData, the autonomous vehicle industry will not develop a fully self-driving car until 2035.

“We expect the timelines for deploying fully autonomous vehicles (Level 5) to be pushed back over the next few years,” the firm wrote in a report.

Level 5 autonomy refers to self-driving cars that require no human interaction at all – meaning they would be manufactured without steering wheels or pedals.