Post by sofamonkey on Dec 17, 2023 19:46:42 GMT -5
Can we discuss this? I’ve seen mentions here and there saying that they’re recalling ALL Teslas. How is this feasible/possible? I’m linking an article from The Hill, which I think is fairly middle of the road, but this seems kind of huge?
Post by wanderingback on Dec 17, 2023 19:54:17 GMT -5
Yes, the article explains it. Most Tesla vehicles in the US are being recalled, and they are pushing a software update next week.
"Tesla recalled nearly all of its vehicles sold in the U.S. to fix a flaw with its Autopilot self-driving feature Wednesday. About 2 million vehicles are impacted by the recall, consisting of all Tesla models Y, S, 3 and X produced between Oct. 2012 and last week."
If it's like the Rivian, they push over-the-air updates regularly, and after installing, it's fixed or updated. So I am guessing they are doing this to all Teslas, not necessarily taking them off the road like a typical recall.
Ok, that’s the piece I’m missing. LOL! It’s not a recall like I’m used to. It’s basically a software update.
Post by karinothing on Dec 18, 2023 6:15:21 GMT -5
I'm not a real fan of Tesla, but sometimes I think it's weird how much focus is on them whenever there's an accident. Regular humans still have a much higher rate of accidents and killing people than Tesla Autopilot. And yes, of course the system needs to be fixed so it properly recognizes objects/people, but aren't the drivers partially responsible? Our car has a similar feature ("autopilot lite," I call it) and it constantly yells at me if I take my hands off the wheel. I heard folks were tricking Tesla cars by putting weights on the wheel so it would think they were holding on.
Anyway, basically, of course it needs to be fixed, but I think sometimes Tesla gets criticism solely for being Tesla.
So, can people decline the updates?
Sometimes you can put them off for a little bit. Usually you need to be on wifi to download them, so we have to find somewhere the car can pick up a wifi signal; our house signal doesn't reach far enough. They can push updates through without it in theory, but I don't think I've seen it done yet.
Autopilot is something you have to pay extra for, at least in the model we got. We didn't want to pay extra for it since we were never going to use it; neither one of us trusts the technology/software yet. I assume we still have to update the software, though.
Watch the most recent John Oliver episode. He covered this. Is it better than humans? Yeah, sure. But are there engineering errors that shouldn't have happened? Yup.
It will help some. It won't solve the entire problem, but will move the technology in the right direction.
So, theoretical question: there are roughly 42,000+ deaths a year on US highways alone, not even counting injuries or property-damage-only collisions. If autonomous vehicle technology in general, by all manufacturers, could reduce those deaths by 10K, do you think we as a society should move forward with that technology? What if that technology was still responsible for crashes that killed 2K people, but in the end saved 10K lives? If we could do something that would potentially save 5K or 10K or 20K lives, isn't it irresponsible NOT to do that thing?
ETA: to be clear, this isn't my POV, but a common sentiment I see from people defending this technology (or AI) who are very utilitarian and/or libertarian (a la Musk). All technology comes with risks, but trying to alter the risk calculus of crash causation is different from injury causation (seatbelts, airbags). People are still hurt every day by seatbelts & airbags, but the benefits greatly outweigh the risks when used properly. Trying to develop any autonomous vehicle will require a very tight balance between progress and safety.
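To make the utilitarian framing concrete, here's a quick back-of-the-envelope sketch in Python using only the hypothetical numbers from the question above; these are the post's illustrative figures, not real crash statistics.

```python
# Back-of-the-envelope version of the hypothetical above.
# All numbers are the illustrative figures from the post, not real crash data.

baseline_deaths = 42_000   # rough annual US highway deaths cited above
deaths_with_av = 32_000    # hypothetical total if the tech saves 10K lives a year
av_caused_deaths = 2_000   # hypothetical deaths the tech itself is responsible for

net_lives_saved = baseline_deaths - deaths_with_av
print(f"Net lives saved per year: {net_lives_saved:,}")             # 10,000
print(f"Deaths actively caused by the tech: {av_caused_deaths:,}")  # 2,000
print(f"Share of remaining deaths tied to the tech: {av_caused_deaths / deaths_with_av:.1%}")
```

The tension, of course, is that the 2K are deaths the technology actively causes even though the overall total drops, which is the active-vs-passive distinction raised further down the thread.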
tangent, have you watched The Good Place? I feel like you should watch it.
Tesla requires that you keep your hands on the wheel and pay attention, even when using autopilot. This recall is saying that whatever controls they had in place to make sure people were doing this were not adequate. So this is supposed to be tightening the requirements on what the driver is doing while using autopilot. It's not an issue with the autopilot per se, but the user. Tesla pushes updates to the software all the time, so most cars will get "fixed" without the owners hardly noticing it.
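As a rough illustration of what "tightening the requirements" on the driver could mean in practice, here's a toy escalation sketch; the thresholds, names, and actions are invented for illustration and are not Tesla's actual implementation.

```python
# Toy sketch of an escalating driver-attention check; purely illustrative.
# The thresholds and responses are invented and are NOT Tesla's actual logic.

HANDS_OFF_WARN_S = 10       # hypothetical: visual warning after 10 s hands-off
HANDS_OFF_ALARM_S = 20      # hypothetical: audible alarm after 20 s
HANDS_OFF_DISENGAGE_S = 30  # hypothetical: disengage the assist after 30 s

def attention_action(seconds_hands_off: float) -> str:
    """Map how long the driver has been hands-off to an escalating response."""
    if seconds_hands_off >= HANDS_OFF_DISENGAGE_S:
        return "disengage assist and lock it out for the rest of the drive"
    if seconds_hands_off >= HANDS_OFF_ALARM_S:
        return "audible alarm"
    if seconds_hands_off >= HANDS_OFF_WARN_S:
        return "visual warning on the screen"
    return "no action"

for t in (5, 12, 25, 35):
    print(t, "->", attention_action(t))
```

In that framing, the recall fix presumably amounts to things like shorter thresholds, checks that are harder to fool with a wheel weight, and locking the feature out after repeated violations.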
Right, I mentioned above that I heard folks were using weights to trick the sensors. I mean, the car manufacturers can only do so much if people are going to try to game the system.
I look at this like any life-safety issue, like building codes or other car safety regulations. When accidents happen as a result of new technology, safety regulations increase in response to prevent loss of life/injury. I don’t trust a company like Tesla to do, on its own, all the things the gov’t might make it do, because safety is not its only concern and it is weighing safety against other factors. Sadly, about the only market where the U.S. doesn’t have much regulation like this is the gun industry.
On a side note, I was driving home during rush hour on the freeway Friday night and a cop car with lights/sirens passed me in the fast lane. A little further ahead there was a Tesla in the fast lane, and it took so long to notice and get over that it was holding the cop up. I was wondering if it was on autopilot and the delay was either due to not sensing the lights or because it needed so much room to get over a lane (a normal driver would have more than enough room to move into the gap, but the fast lane was clear ahead of them while the other lanes were not, so it didn’t really make sense for the cop to pass them on the right). I was curious how the Tesla autopilot responds in that scenario.
Like neverfstop posted above, it can be complicated. In addition to the total number of deaths, there's also the question of whether some people are disproportionately killed (e.g., pedestrians vs drivers), and "active" vs passive killings (think about the railway scenario where you have to decide whether to force the train to change tracks and kill just one person instead of multiple, but of course if you do, you've killed that one person vs just being witness to a tragic accident).
All of that said, I do think autonomous car manufacturers generally have an obligation to update their vehicles to prevent more accidents/deaths, even in the scenario where they're already reducing the total number. I think that of other technology, too.
Tesla gets the most attention because they have BY FAR the most crashes, and they're the ones marketing it as an "autopilot" when it's not actually allowed to be used like that.
NHTSA has been collecting detailed data on crashes involving driver-assistance technology since 2021. Almost all of the 807 automation-related crashes in this data set involved a Tesla vehicle. Subaru came in second with 23. The Post discovered that four of the 17 Tesla-linked fatalities involved a motorcycle, and one involved an emergency vehicle.
I mean, can't the driver just take control? Autopilot doesn't mean the driver doesn't have control of the vehicle if they need it. Like in the cop car situation above, the driver should have been aware and moved? All it takes to get out of autopilot is for the driver to move the wheel or brake.
To elaborate further - I just looked at a handful of company websites about their driver-assist tech and every single one of them has a name that implies it's an extra set of eyes, or helps notice things or sense things. Not that it drives your car for you. Except Tesla.
AND they get extra heat (and lawsuits...) because Elon Musk himself doesn't actually GAF if people are ignoring the instructions and just SAYS SHIT because he thinks what he thinks is cool is more important than how the world actually works. www.cnn.com/2023/12/14/tech/teslas-autopilot-recall-elon-musk/index.html
There are reports that, no, the driver can't always take over quickly enough: sudden hard braking on a highway, or the car and driver steering in opposite directions.
So you combine normal human reaction delay with the added delay of recognizing and correcting what the "autonomous" system is doing, and you can end up in pretty awful situations.
My understanding is that a fair amount of the criticism isn't the technology per se, even though that's certainly an element, but largely because of the hyped up and misleading marketing.
Driver reaction time is a real bitch, particularly at highway speeds, but regardless - yes, of course the driver can take control. The whole point of the recall is to increase the warnings to make sure the driver is ready to do just that. It's not taking away the feature.
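To put rough numbers on that, here's a quick stopping-distance sketch; the 1.5-second reaction time and friction coefficient are common textbook assumptions, not figures from the thread.

```python
# Rough stopping-distance math at highway speed; illustrative assumptions only.

MPH_TO_MS = 0.44704    # miles per hour to meters per second
REACTION_TIME_S = 1.5  # common textbook assumption for driver reaction time
FRICTION = 0.7         # assumed tire/road friction coefficient (dry pavement)
G = 9.81               # gravity, m/s^2

def stopping_distance_m(speed_mph):
    """Distance covered while reacting, plus distance to brake to a stop."""
    v = speed_mph * MPH_TO_MS
    reaction_distance = v * REACTION_TIME_S
    braking_distance = v ** 2 / (2 * FRICTION * G)
    return reaction_distance, braking_distance

for mph in (30, 55, 70):
    react, brake = stopping_distance_m(mph)
    print(f"{mph} mph: ~{react:.0f} m just reacting, ~{react + brake:.0f} m total to stop")
```

At 70 mph the car covers roughly 47 meters before the driver even touches the brake, so an extra second spent working out what the assist system is doing can eat the whole margin.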
Yeah, interesting. We have an electric car (not a Tesla) and it drives with very minimal interaction from me, but the second I move the wheel even a tiny amount it shuts off. I only drove a Tesla once with Autopilot and it shut off quickly, but I know there have been issues with quality control. Ideally, if a car is on autopilot the driver should be paying JUST as much attention as if it weren't, but I understand that isn't the case (whether intentionally or not).
It's just a different sort of Trolley Problem, isn't it? WWC(hidi)D?
Hm, I don't know whether you can decline them. We have to manually click "install" in our app and there are rules (can't be plugged into a fast charger, needs a certain percentage of battery, etc.), but we've always been eager to get our updates because they come with improvements, so we can't hit install fast enough, lol. I remember being SO surprised the first time we got a software update. It improved our range by like 20 miles... I was like, how?!
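For what it's worth, the gating described above amounts to a simple set of pre-install checks. Here's a toy sketch; the battery threshold is invented, and this is not any manufacturer's actual code.

```python
# Toy pre-install check mirroring the gating described above; illustrative only.

MIN_BATTERY_PCT = 20  # hypothetical threshold; the actual value isn't stated above

def can_install_update(on_wifi, fast_charging, battery_pct, user_clicked_install):
    """Return (ok, reason) for whether an over-the-air update may install now."""
    if not user_clicked_install:
        return False, "waiting for the owner to hit 'install' in the app"
    if not on_wifi:
        return False, "vehicle needs a wifi connection to download the update"
    if fast_charging:
        return False, "can't install while plugged into a fast charger"
    if battery_pct < MIN_BATTERY_PCT:
        return False, f"battery below the {MIN_BATTERY_PCT}% minimum"
    return True, "ok to install"

print(can_install_update(on_wifi=True, fast_charging=False, battery_pct=65, user_clicked_install=True))
```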
Do I think it'll make a difference? Probably, eventually. I know for us, the Rivian has constant noises and movements of the wheel if you're not touching it, even with Driver+ on. But ours doesn't allow adaptive cruise control or lane changes without driver input. It's not touted as "self driving" though.