


Tesla Autopilot crash is a reminder that your car still can’t drive itself


(Image credit: Tesla)

Another Tesla has been involved in a high-profile accident, with a reportedly Autopilot-controlled Tesla Model 3 colliding with an emergency vehicle last week, specifically a Florida Highway Patrol cruiser near Orlando.

While nobody was seriously hurt, these kinds of stories highlight the inherent flaws in current autonomous car software, be it Autopilot or something else. And that's exactly why you're told to have an attentive driver at the wheel at all times.

  • Everything you need to know about the Tesla Model 3
  • Tesla Model 3 vs Tesla Model Y: Which 'cheap' Tesla is right for you?
  • Plus: Mercedes EQE is a luxury electric car with the range to beat Tesla

According to CNN, the incident happened on Interstate 4 just before 5 a.m. ET. The Orange County trooper had stopped to assist a broken-down vehicle, only for a Tesla Model 3 to hit the side of the patrol car and then crash into the broken-down Mercedes, narrowly missing the trooper in question. The patrol car did have its emergency lights flashing at the time.

The drivers of both the Tesla and the Mercedes were left with minor injuries, though nobody was seriously hurt. The Tesla's driver also confirmed to the trooper on the scene that the car was in Autopilot mode at the time of the crash.

Florida police said the crash would be reported to Tesla and the National Highway Traffic Safety Administration (NHTSA), the latter of which is currently investigating Tesla Autopilot.

The NHTSA claims that Teslas have collided with emergency vehicles, including police cars and ambulances, at least 11 times between January 2018 and July 2021. The incidents happened in nine different states, and most of them apparently took place at night.

What's more, the NHTSA said that the scenes had involved emergency vehicle lights, flares, illuminated arrow boards and road cones prior to each accident.

In this case, it's not clear whether the driver was misusing Autopilot or not. Nevertheless, it's another warning of why drivers shouldn't be too trusting of Autopilot, or any other semi-autonomous driver assistance tech. It may seem like the car is capable of driving itself, but it's not effective enough to completely replace the driver.

Autopilot is not a self-driving car system, no matter what it sounds like

Tesla itself has said that Autopilot could "do the wrong thing at the worst time," which is when the driver is needed to take control. If the driver isn't paying attention, or worse, has actually got out of the driver's seat, then the car is essentially left to its own devices when serious situations occur.

Semi-autonomous driver assistance tech is a massive help, especially on longer journeys, but it's not an alternative to actually driving, even if terms like 'Autopilot' and 'Full Self Driving' make it sound like the car is able to do everything for you.

Tesla CEO Elon Musk has consistently defended the name Autopilot, claiming it's based on the autopilot used in planes, which was built to assist an attentive pilot. But that hasn't stopped the automaker from landing in hot water.

German courts have ruled that the name Autopilot, alongside marketing that suggested Teslas could drive themselves, is misleading. Likewise, the NHTSA has asked the FTC to investigate Tesla's use of the name Autopilot as a form of false advertising, though it isn't clear what the FTC's response was.

Tesla also needs to do more to stop people being able to get out of the driver's seat while Autopilot is engaged. Currently the system uses sensors in the steering wheel to check whether the driver's hands are present, and will disengage if the seatbelt is unbuckled.

However, testing has shown these safety measures are terrifyingly easy to get around. Weights on the steering wheel can mimic the presence of hands, and drivers could, in theory, sit on top of a buckled seat belt to give themselves the freedom to leave the driver's seat.

This is not a problem exclusive to Tesla, with other tests showing that autonomous driving safety measures are just as easy to cheat. And the biggest problem these systems all share is the lack of weight sensors in the driver's seat to check whether someone is actually there or not.

Clearly, something needs to be done across the board to stop this happening. Keeping someone in the driver's seat isn't going to stop them from getting distracted or taking their eyes off the road, but it's a good start. In the meantime, just remember that your 'autonomous' car isn't. We still have a long way to go, at the very least several years, before your own car will be driving you around without needing any supervision.

  • More: Tesla hatchback: $25K price, release, possible range and more

Tom is Tom's Guide's Automotive Editor, which means he can usually be found knee deep in stats about the latest and best electric cars, or checking out some sort of driving gadget. It's a long way from his days as editor of Gizmodo UK, when pretty much everything was on the table. He's normally found trying to squeeze another giant Lego set onto the shelf, draining very large cups of coffee, or complaining that Ikea won't let him buy the stuff he really needs online.

Source: https://www.tomsguide.com/news/tesla-autopilot-crash-is-a-reminder-that-your-car-still-cant-drive-itself
