I find myself asking "how is this legal" several times a day lately. Why does "tech" get to stress-test long-standing rules and norms? It's an attitude of "I dare you to regulate us". I'm remembering a comment on HN a week or so ago. To paraphrase: "thing, but from the internet".
The article says it “speeds” but provides no sources or evidence; you might want to take that with a grain of salt.
Here’s a video of the new mode, it seems pretty normal, you wouldn’t bat an eye to a friend driving in the exact same way: https://www.youtube.com/watch?v=C8uIPsaF-yY
Not that I would trust it with my life at this point in time, but the claims do seem exaggerated.
There is effectively 0 traffic to contend with in that video. I do not see how you could reasonably claim that video should alleviate fears about unsafe behavior.
It does not seem to go over the speed limit, even on an empty road?
> Mad Max
Fellas, the "Torment Nexus" tweet was supposed to be a joke.
Either it's overhyped marketing, or Tesla has automated aggressive A-hole driving. I don't like either option.
In Sweden, Tesla is the new BMW. It attracts aggressive A-hole drivers, so no need to automate it.
Can you ship some over here? Seems like if I want to get through a light before it turns red, I'm better off betting on the lane with the work van or heavy truck in the stack rather than the one with the Tesla somewhere in the mix. In any sane world it'd be the opposite.
Soon enough Tesla will be omitting turn signals to save on manufacturing costs
I think torque is what attracts (or provokes) aggressive drivers.
It's both. And it's not the first time: Tesla recalled 50,000 cars because they programmed them to illegally roll through stop signs at up to 5.6 mph (9 km/h) [0].
[0] https://www.tesla.com/support/recall-rolling-stop-functional...
Good grief. Can't it just be whimsy? Must everything be a conspiracy these days? They have a tunable threshold for when the car will attempt a lane change. Set it too high and it makes too many and annoys the occupants. Set it too low and it fails to choose the right lane and spends more time in traffic. That's all it is.
They just named the low-threshold case after a fun driving movie as a joke. Is that really so hard to believe?
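For what it's worth, the tradeoff described above boils down to one tunable number. A minimal sketch, with all names, values, and the decision rule invented for illustration (nothing here reflects Tesla's actual logic):

```python
# Hypothetical sketch of a lane-change aggressiveness threshold.
# A low threshold changes lanes for small speed gains (more lane
# changes, more "aggressive"); a high one tolerates slower lanes.

def should_change_lane(current_lane_speed_kph: float,
                       adjacent_lane_speed_kph: float,
                       threshold_kph: float) -> bool:
    """Change lanes only if the adjacent lane is faster by at least
    threshold_kph. The threshold is the single tunable knob; a named
    driving profile could simply map to a value of it."""
    return adjacent_lane_speed_kph - current_lane_speed_kph >= threshold_kph

# Invented profiles: identical logic, different thresholds.
PROFILES = {"chill": 15.0, "standard": 8.0, "mad_max": 2.0}

# With a 5 km/h speed advantage, only the lowest threshold moves over:
# chill False, standard False, mad_max True.
for name, threshold in PROFILES.items():
    print(name, should_change_lane(95.0, 100.0, threshold))
```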
"Frunk" and the Telo's "monster tunnel" are whimsy. Playing DOOM on a parked Tesla is whimsy. Shipping a feature that implies the car will drive unsafely or illegally in an automated fashion -- from an automaker that has a history (see below) of shipping similar driver assist features with unsafe behavior and misleading or overly bombastic marketing -- that is irresponsibility, not whimsy. If it weren't directly governing driving behavior or if it didn't come from a company that keeps doing this, then maybe I'd give them the benefit of the doubt.
https://www.nbcnews.com/tech/tech-news/tesla-cars-traffic-la...
https://www.cbsnews.com/news/tesla-fsd-nhtsa-investigation-t...
https://www.wired.com/story/tesla-autopilot-risky-deaths-cra...
https://futurism.com/tesla-change-name-full-self-driving-chi...
https://www.jalopnik.com/1878905/ncap-downgrades-tesla-safet...
https://www.theverge.com/news/692639/tesla-robotaxi-mistake-...
Please don't Gish gallop on HN. None of those links are about AP tuning or "Mad Max" mode.
“Move fast and break things” is taking on a very literal meaning.
Does this mode comply with state traffic regulations like follow distance and lane change protocol?
Why would you expect it to? Tesla has had (still has?) settings that allow the user to define how much it is allowed to break the speed limit and whether it can run red lights.
How could they think that this is a good idea? Baffling.
Influencers will post about it and that will resuscitate their brand. Somehow.
Isn't it kind of dumb that only after something like this hits worldwide news does anyone investigate anything?
As far as I can see they rolled it out without any warning, and, well, presumably without asking the regulators.
The feature only became available this month.
This is one of those things where if we had a normal functioning regulatory environment, Elon Musk would be in jail long before he got this obvious. Having an "Autopilot" mode people get killed by because it doesn't actually drive the car should've been plenty.
Agreed but we are through the looking glass at this point.
Autopilot in planes also doesn’t actually pilot the plane gate to gate. That name actually seems consistent with the plane use of autopilot, as Autopilot in a Tesla merely follows the lane markings and keeps distance from the car in front.
Care to list the major functions of autopilot in airplanes and the Tesla equivalent?
I have no expertise in piloting or the details of autopilot in a plane, beyond knowing that it does not fly the plane gate to gate. It does, however, assist the pilot with mundane tasks, such as cruising at altitude and following a path. Maybe modern ones land/take off, I don’t know, but a pilot is still required for many crucial tasks as far as I know.
Which is what the Autopilot function does in a Tesla, so I find it to not be a misleading name.
> I only have a vague idea of what autopilot does but I can confidently say that Tesla autopilot is the same.
That is not what I wrote, but feel free to interpret it that way if it makes you feel better.
> Having an "Autopilot" mode people get killed by because it doesn't actually drive the car should've been plenty.
Just like you need a pilot paying attention even when a plane is using autopilot, you need a driver paying attention when a Tesla is on Autopilot.
Where is the incongruence?
> That name actually seems consistent with the plane use of autopilot [continues with talk of specific mechanics and nothing about monitoring]
...
> I have no expertise in piloting or the details of autopilot
You're changing your story from "they have the same functionality" to "they're both the same because they aren't fully autonomous".
> feel free to interpret it that way if it makes you feel better.
Have a good night.
I never wrote “they have the same functionality”, so why are you claiming I did and putting it in quotes?
I wrote:
> That name actually seems consistent with the plane use of autopilot,
Except that obviously the difference is that plane pilots are rigorously tested and certified, their every action including their voice is recorded while they operate the plane, and they are held accountable, with legally required self-reporting, for any mistakes no matter how innocent. There is no such scrutiny of car drivers; in fact, in some places in the world you can operate a car with zero formal training and a short-form test that covers the bare fundamentals and nothing else - to expect such a driver to take the same level of care and attention as a commercial pilot when operating a Tesla in autopilot mode is....wishful thinking at best.
But sure, Tesla's autopilot is the same as a plane autopilot in all the other respects.
>to expect such a driver to take the same level of care and attention as a commercial pilot when operating a Tesla in autopilot mode is....wishful thinking at best.
This is irrelevant to the branding. Just as autopilot in a plane assists pilots in ideal conditions, Autopilot mode in a Tesla assists drivers in ideal conditions.
As an aside, Autopilot mode in Tesla monitors the driver’s eyes to ensure they are looking at the road, and quite a few steps are taken to ensure drivers know that it is not a self driving feature, but merely assisted driving. Again, the broader point being that autopilot is not known to fly planes end to end, so there should be no confusion due to the name that Autopilot in a Tesla will drive end to end.
> Again, the broader point being that autopilot is not known to fly planes end to end
Is the public broadly aware of that?
There's a colloquial phrase in American English, "to be on autopilot", meaning when a person acts without awareness of what they're doing, often used when somebody makes a stupid mistake during a lapse of attention.
>Is the public broadly aware of that?
I don’t see why not. I didn’t go to pilot school or have any plane related interests, but from movies and tv shows and the fact that there are 2 or more pilots on every plane, it would be prudent to assume there are limitations.
The colloquialism of a person being on autopilot and making mistakes seems apt here, too. If you use the Autopilot function in the car and you don’t pay attention, then you will get in trouble.
>>As an aside, Autopilot mode in Tesla monitors the driver’s eyes to ensure they are looking at the road, and quite a few steps are taken to ensure drivers know that it is not a self driving feature
And as many, many, many videos on Pornhub attest, you can do plenty of other activities for a long time without autopilot giving a crap. Maybe that's the drivers messing with the sensors somehow, but it's obviously possible.
>>Autopilot mode in a Tesla assists drivers in ideal conditions.
That sounds like an absolute cop out if you don't mind me saying so. It's not how the feature is perceived, and again it goes back to what I said earlier - drivers should have to receive actual, real sit-down-with-a-book training to use this feature.
>And as many, many, many videos on Pornhub attest, you can do plenty of other activities for a long time without autopilot giving a crap. Maybe that's the drivers messing with the sensors somehow, but it's obviously possible.
Drivers messing with sensors is irrelevant to Tesla informing drivers of the limitations, which the car clearly does.
>That sounds like an absolute cop out if you don't mind me saying so. It's not how the feature is perceived, and again it goes back to what I said earlier - drivers should have to receive actual, real sit-down-with-a-book training to use this feature.
Doesn’t seem like you have used a Tesla. There is no way a reasonable person can perceive Autopilot as a feature where the car drives itself point to point. Tesla locks you out of Autopilot if you look away too much, and they make it clear how it is gimped in case you want to spend $200 per month for their “Full” Self Driving feature.
Also, plenty of other companies offer the same feature under a different name like lane assist and enhanced cruise control, and they don’t even monitor the driver’s eyes.
>>no way a reasonable person can perceive Autopilot as a feature where the car drives itself point to point
And I hope no one does. But I'm sure we both agree that any reasonable person should be able to expect a Tesla to drive itself on a straight road without driving into a truck stopped sideways on said road. Or not be confused in really weird and unusual situations like driving against the sun on a bright summer day.
>>Drivers messing with sensors is irrelevant
Which, again, I have no proof was done in those cases, but it's certainly a trend on social media and on other kinds of websites to show all the activities that you can do while the car is "clearly" driving itself. And even outside of things clearly done for attention, there's no shortage of reports of people being arrested for reading, watching films, playing games and, yes, being fully asleep behind the wheel of Teslas. We're not talking about influencers farming likes; we're talking normal people.
>>Also, plenty of other companies offer the same feature under a different name like lane assist and enhanced cruise control, and they don’t even monitor the driver’s eyes.
Uhm....good? That's great in fact?
> But I'm sure we both agree that any reasonable person should be able to expect a Tesla to drive itself on a straight road without driving into a truck stopped sideways on said road. Or not be confused in really weird and unusual situations like driving against the sun on a bright summer day.
No, which is why it tells you to keep your eyes on the road and pay attention. It’s literally a bunch of cheap cameras and some software trying to draw some lines and keep the car between them and a certain distance behind whatever is in front of it.
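The description above (draw some lines, keep the car between them, hold a gap to whatever is in front) is, at its core, two feedback loops. A toy sketch with invented gains and function names, nowhere near an actual implementation:

```python
# Toy illustration of lane-keeping plus distance-keeping as two
# proportional controllers. All gains and targets are made up;
# real driver-assist stacks are vastly more complex.

def steering_command(lateral_offset_m: float, k_steer: float = 0.5) -> float:
    """Steer back toward the lane center, proportional to how far
    off-center the camera pipeline estimates the car to be."""
    return -k_steer * lateral_offset_m

def accel_command(gap_m: float, target_gap_m: float = 30.0,
                  k_gap: float = 0.1) -> float:
    """Accelerate when the gap to the lead car exceeds the target,
    brake when it shrinks below it."""
    return k_gap * (gap_m - target_gap_m)

# Drifted 0.4 m right of center, lead car only 20 m ahead:
# steer left and brake to reopen the gap.
print(steering_command(0.4))   # -0.2
print(accel_command(20.0))     # -1.0
```

The point of the sketch is only that "follow the lane markings and keep distance" is a narrow control problem, not point-to-point driving.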
>Which, again, I have no proof was done in those cases, but it's certainly a trend on social media and on other kinds of websites to show all the activities that you can do while the car is "clearly" driving itself. And even outside of things clearly done for attention, there's no shortage of reports of people being arrested for reading, watching films, playing games and, yes, being fully asleep behind the wheel of Teslas. We're not talking about influencers farming likes; we're talking normal people.
And you can do the same with any other car that has lane assist or whatever feature name that keeps the car in a lane and automatically brakes and accelerates.
> Uhm....good? That's great in fact?
What is the logic here? You are complaining about Tesla Autopilot being unsafe, but also complaining about the thing that makes Tesla Autopilot safer than other automakers’ lane assist/braking feature?
>>What is the logic here? You are complaining about Tesla Autopilot being unsafe,
Sorry, I should have made it clearer perhaps 3 posts ago. My #1 issue is with Tesla calling it Autopilot and selling "full self driving" upgrades for real money even though they don't exist. Call it "smart lane assist" and I'll shut up.
If you care why - because while you and I might understand that "it's just a bunch of cheap cameras keeping your car in the lane" - the public clearly doesn't see it that way. People take naps behind the wheel of Teslas far more often than they do behind the wheels of Volvos or Peugeots, despite both of them sporting very advanced adaptive cruise systems. And we can say ok, but these people are dumb - sure, they definitely are, but I think pretending like Tesla's marketing isn't playing at least some part in it is just dishonest.
Full self driving obviously does not exist, but I think Autopilot is just as valid of a name for what it does as smart lane assist.
> People take naps behind the wheel of Teslas far more often than they do behind the wheels of Volvos or Peugeots, despite both of them sporting very advanced adaptive cruise systems.
Need a source on this one. You’d have to be a suicidal maniac to take a nap on Autopilot: it doesn’t change lanes, it doesn’t stop for traffic lights or construction signs, and the Tesla very clearly tells you it won’t.
>>You’d have to be a suicidal maniac to take a nap on Autopilot
And yet....
https://www.lakemchenryscanner.com/2025/10/26/barrington-hil...
https://abc7chicago.com/post/tesla-driver-caught-sleeping-au...
https://robbreport.com/motors/cars/canadian-police-arrest-sl...
I can keep going; searching Google for "tesla driver arrested while asleep" yields 20+ results, and I didn't keep looking past page 3.
As to whether that's more or less common with Volvo or any other brand... that I can't tell you. You got me there; I don't know if there are public stats on this. But (anecdotally) I have never seen a news article about non-Tesla drivers asleep behind the wheel on autopilot-like systems.
There has to be some acceptable maximum rate at which your products can kill people, and Tesla's autopilot is probably below that rate.
You would need more than one rate:

- percentage of product owner-operators who are killed by the product they own/operate per year
- number of other people killed by the median product owner per year
- inflation-adjusted property damage (belonging to other people, or to the public/govt) caused by the median product owner per year
Regulating products based on the potential to kill, maim, or injure, is not a terrible idea.
It’s why we require more training of people who fly 747s than of people who operate cars.
But if it were going to work, we’d have to do it without carve-outs - if it only applies to some products, then it’s really just politics.
If Tesla’s Autopilot is really so safe that it needs little or no regulation, then by definition regular cars are so dangerous that they should be banned or require much more regulation. But I only ever hear the first half of this argument, which makes me worry this is not really an argument about safety.
Things aren’t magically legal just as long as they don’t kill “too many” people.
Things aren't magically illegal just because they sometimes kill people
Driving dangerously is illegal in most places though, and the illegality isn’t tied to the fatality rate of your driving style.
I keep using this as an example - the Therac machines for radiotherapy undoubtedly saved lives. They also undoubtedly administered radiation treatment better, faster and more accurately than any manual operator could have done.
And yet, they all got recalled when we realized they "sometimes" administer a lethal dose of radiation by mistake. Or do you think they should have continued operating? What was the "acceptable maximum rate at which your products can kill people" for them? Because I'd argue it's zero. And it should be zero for Teslas or any cars that have something called "autopilot".
A surgeon who performed a voluntary operation which caused unpredictable complications leading to death shouldn't necessarily stop operating on other patients. There's a line that has to be drawn somewhere, I'm not going to draw it.
A self-driving car that kills fewer people per mile than a reasonably selected cohort of human drivers is probably a good thing.
Replace the surgeon with a robotic surgeon operating under some kind of autonomous mode and yes, I think every robot of its kind should be immediately pulled out of use.
>>A self-driving car that kills fewer people per mile than a reasonably selected cohort of human drivers is probably a good thing.
Hard disagree, and I honestly hate it when people make that argument. The number should be zero.
>Hard disagree, and I honestly hate it when people make that argument. The number should be zero.
Are you suggesting that we could use such a system but shouldn't be happy with it until it reaches zero deaths? In which case I couldn't agree more.
Or are you suggesting that we shouldn't use a technology that is vastly safer than human drivers, but still causes a nonzero amount of deaths?
>>Or are you suggesting that we shouldn't use a technology that is vastly safer than human drivers, but still causes a nonzero amount of deaths?
Absolutely this one. And the key word here is "causes" - if the deaths are being caused by mistakes of the algorithm (and, by extension, its creators) then every single one of these systems should be disabled and pulled from sale until it can be addressed, in the same way a plane autopilot would be. I suspect we will disagree on this.
Perhaps look at it this way - when I buy an automatic pressure cooker, I need it to explode exactly zero times, not "less often than manual pressure cookers". If my car drives into a concrete barrier because it thought it was actually a perfectly straight road - I really, really don't care that on average fewer people have died while using it than when driving themselves. It's unsafe and it should be forbidden from sale and use on public roads.