Comments

Thegarbagegamer97 t1_j954b19 wrote

This is the problem with AI driving. It knows the rules, and it knows how to respond to things that follow the rules, but it struggles to compensate when something breaks those rules. That requires on-the-spot thinking, and we seemingly still have a long way to go before AI can replace humans there.

68

Thegarbagegamer97 t1_j95684x wrote

Likely so, but in cases of mistakes like this, if self-driving was in use, I have to side with human drivers, simply for their ability to make sense of the scene and at least attempt to navigate, slow down, or stop as needed. The AI seems to be at the stage of development where it says “hmm, that looks like road, so full speed ahead,” whereas most people, assuming they aren’t distracted or under the influence of some substance, will see flashing lights and a big object blocking the road and try to steer around the firetruck, not plow through it like they’re in some action film. Some day I’m sure self-driving will get there, but we are a LONG way off.

7

kuahara t1_j956wrq wrote

AI was not navigating this vehicle; a human was.

Also, we didn't need to know the make of the vehicle. The outcome would have been the same in a Chevy, Ford, BMW, etc. The reporting agency is trying real hard to provoke a specific reaction with a completely irrelevant detail.

−29

Perfect-Height-8837 t1_j957xuy wrote

This is the main reason I want to buy a Tesla. It doesn't even need to be in self-driving mode; the media will report my death to the world just because I was driving a Tesla.
No other car manufacturer can offer this level of post-mortem notoriety.
You never read headlines such as "some nobody dies when his Skoda crashes into a firetruck."
But put Mr Nobody in a Tesla and he's worth reporting about.

90

ViciousNakedMoleRat t1_j958sau wrote

With automated driving, the question is simply: how much more do we value an overall reduction in crashes versus having to live with crashes that a human driver would've easily avoided?

On a societal level, we should theoretically be in favor of self-driving cars as soon as they cause fewer crashes than human-operated cars – even if it's just a couple of percent.

However, on a personal level, it probably takes a much more significant margin to convince many individual drivers. That's because the vast majority of drivers think of themselves as above average.

The perceived stupidity of automated-driving accidents, like driving straight into objects or stopping in the middle of the road, makes them particularly likely to be picked up by the media, which raises people's exposure to these failures. The hundreds of daily crashes caused by inattention and other human error just slide by unnoticed.

This creates a situation similar to the fear of flying. Flying is much safer than driving, but plane crashes become huge news stories, which leads some people to develop an irrational fear of flying while having no issue with driving.

−9

Thegarbagegamer97 t1_j959416 wrote

Self-driving will be a wonderful thing one day, but while it has the potential to break laws by ignoring the rules of certain stretches of roadway, and to plow straight into a stalled or stopped vehicle like there's nothing there, I think I'll hold off a little longer and keep my personal judgment. Humans aren't perfect drivers, and I don't expect AI to be either. But I'd rather not have to babysit the entire time simply because it has a tendency to suicide-rush a fire truck or go straight from a turn-only lane.

12

Chippopotanuse t1_j95dj4t wrote

Weird how Teslas keep crashing into emergency vehicles. It happens so frequently that NHTSA launched an investigation:

https://www.autoweek.com/news/green-cars/a37425353/another-tesla-hits-police-car/

> The latest Tesla crash into a first responder vehicle comes just two weeks after the National Highway Traffic Safety Administration opened an investigation into 11 instances of Tesla drivers hitting parked emergency vehicles while using the Autopilot driver-assist system in the US. The incidents date back several years and follow a surprisingly common pattern: First responder vehicles, including police cars and fire trucks, stop to assist disabled vehicles on road shoulders or traffic lanes, using emergency lights to direct traffic around them, as Tesla vehicles with Autopilot engaged collide with them either with or without attempts by the driver to brake in the seconds prior to impact. Some crashes have resulted in serious injuries, as they happened at highway speeds.

195

wolf-bot t1_j95f2g2 wrote

Could have been avoided if there was a child standing on the side of the road.

87

Neospecial t1_j95gjji wrote

Isn't it always "OFF"? As in, intentionally turning itself off seconds before a crash to avoid liability? I don't know and don't care to find out; it's just something I read or heard somewhere at some point.

I'd not trust an AI driving me regardless.

18

hwangjae45 t1_j95gyg8 wrote

From what I know, Tesla cars keep a record of when Autopilot turns on and off, and from what I've seen, it records as on in these cases. With that said, I think Tesla had a recall due to Autopilot, so it does seem to be a real problem.

2

Sol_Invictus t1_j95lec1 wrote

Maybe the Head Tweeter will adjust the algorithm.

21

LogicisGone t1_j95o6o7 wrote

I mean, my local news covers pretty much every fatal car accident, and I don't remember a single instance of anyone hitting a parked fire truck, ever, let alone also killing someone and injuring four firefighters. I would actually call this newsworthy.

12

Mr_Mons_of_Nibiru t1_j95pery wrote

And we come to it. The necessary sacrifices that need to be made in the name of progress.

Can't wait for all those automated semis to hit the road.

−10

Chippopotanuse t1_j95psyk wrote

How many of those cars claim to have “full self driving“ and have people doing everything from reading the newspaper on the highway to taking a nap while the car is going 70 miles an hour?

Sure, the fine print says “but there has to be a human ready to take control at all times!“

But there’s only one automaker that brags about its self-driving hardware. And so, yeah, it is newsworthy when the one company that claims it can do it is failing spectacularly at it. And getting people killed.

38

SporkofVengeance t1_j95q0qn wrote

As the training set basically comes from using real cars as alpha/beta testers, that’s likely true for Tesla. Most other companies now use synthetic data to train their AVs, and so test a far wider range of scenarios.

19

buttergun t1_j95rotz wrote

The Florida State Legislature is debating a bill that would replace construction zone traffic cones with 8 year old children. It seems like they would work for emergency responders too.

32

jenkinsleroi t1_j95tq03 wrote

Your statistic is meaningless unless it's normalized by the number of miles driven. And I suspect the number of miles driven by non-Tesla cars is way higher.

Plus, autopilot should make them much less likely to crash into a stopped vehicle on the side of the road.

−17

Agent_Angelo_Pappas t1_j95u8sw wrote

Except other automated systems, like Super Cruise and ProPilot and whatnot, don’t seem to have this same issue. Tesla’s automation is hitting emergency vehicles disproportionately often relative to how many such systems are on the market.

77

thisismynewacct t1_j95upo6 wrote

Terrible situation, but at least the only fatality was the driver, who should’ve been paying attention, and not the passenger or a firefighter just out doing their job.

3

Chippopotanuse t1_j95uvbc wrote

It’s not my statistic. It’s the NHTSA investigating crashes.

If you think your hand waving is enough to overcome the legitimate concerns the NHTSA has here…please see if Tesla will hire you as their general counsel in charge of regulatory oversight.

18

diezel_dave t1_j95x4o2 wrote

Mine would dangerously decelerate when it saw emergency lights flashing. It didn't matter where they were; any lights flashing anywhere vaguely ahead would cause immediate, unexpected hard braking. I always feared someone would rear-end me or road-rage me for brake-checking them. So glad I sold that thing.

7

razorirr t1_j95x698 wrote

Are they though?

In 2019, an estimated 2,500 vehicles crashed into firetrucks parked as blockers (6.8 crashes every day, or 16% of all firetruck collisions).

https://www.workzonebarriers.com/emergency-response-firetruck-collision-crash-facts.html#:%7E:text=In%202019%2C%20an%20estimated%202%2C500,of%20all%20traffic%20fatalities%20nationwide

Tesla has had around a dozen, but that's over 5 years.

There are around 2 million Teslas, and they all have AP at this point, out of 248 million cars total: 0.8% of the fleet. Tesla's roughly 2 firetruck crashes a year out of those 2,500 is 0.08%. So Tesla is 10x better than everyone else.

Also, I feel you don't hear about the others because their systems are in an insignificant number of cars and usable in an insignificant number of places. Once they scale to "yeah, it works everywhere," that number will go up.
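
A rough back-of-envelope check of that fleet-share math (a sketch in Python; every input is this comment's estimate, not an official figure):

```python
# All inputs are the rough estimates from the comment above.
tesla_fleet = 2_000_000            # Teslas on the road, all AP-capable
total_fleet = 248_000_000          # total US vehicles
firetruck_hits_per_year = 2500     # parked-blocker firetruck crashes per year
tesla_hits_per_year = 2            # ~a dozen over 5 years, call it 2 a year

fleet_share = tesla_fleet / total_fleet                       # Teslas' share of cars
crash_share = tesla_hits_per_year / firetruck_hits_per_year   # Teslas' share of hits

print(f"fleet share: {fleet_share:.2%}")                          # 0.81%
print(f"crash share: {crash_share:.2%}")                          # 0.08%
print(f"under-representation: {fleet_share / crash_share:.0f}x")  # ~10x
```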

−30

Chippopotanuse t1_j95zckj wrote

You are less informed than you think.

Elon talks about Autopilot and Full Self-Driving interchangeably.

Go listen to him in 2019 when speaking with Cathie Wood’s Ark Podcast:

“My guess as to when we would think it is safe for somebody to essentially fall asleep and wake up at their destination: probably toward the end of next year. I would say I am certain of that. That is not a question mark.”

It’s 2023 now. Not 2020.

And Tesla isn’t anywhere near capable of having someone fall asleep and arrive safely at their destination.

Elon wasn’t hedging back in 2019. He didn’t say, “hey, we are working on something cool that might happen someday.”

He guaranteed it: “I would say I’m certain of that. It’s not a question mark.”

He has promised consumers for years that they can do these things.

He’s a liar and you’re being duped. And more than one person who believed his statements is now dead.

17

ryan_m t1_j95zsyt wrote

Read the claim I responded to fully, and then read what you posted. The first half is true (it does turn off), but the core of the claim (that this is done to shift blame away) is entirely bullshit, because the reporting cutoff is 30 seconds and Tesla counts a minute before that.

It makes sense that Autopilot will shut off before a crash if you think about it for more than a couple of seconds. What behavior do you want such a system to have when it encounters a situation it can’t handle? It should alert the driver and disengage. If you’re being a responsible driver, you should be paying attention the entire time anyway, ready to take control specifically to avoid things like this.

The anti-Musk circlejerk has gotten so insane at this point that people are no longer thinking about what they’re saying.

6

razorirr t1_j963cf0 wrote

How much do you want to actually read? I can answer this.

NHTSA has a standing order on ADAS crashes. All manufacturers are required to provide telemetry and report any crash that occurred with ADAS either on, or on at any point in the previous 30 seconds. This reporting started in July 2021 and is still current.

You can read their June 2022 findings here; the next report comes next June.

In that whole period, only 2 crashes into first responder vehicles were confirmed, total, across all brands.

So every other article you have seen from June 2021 through May 15th, 2022 (the cutoff date in that report) is bullshit. It's the press going "oh, it's a Tesla and a responder vehicle, let's accuse AP/FSD, get a shitload of clicks from people on Reddit, then not release a retraction months later when it's found not to have been the cause."

As to my "significantly insignificant" bit: yeah, both crashes might have been Teslas (the report does not break it down to that detail). But Tesla's system works everywhere and is on way, way more cars than Ford's or GM's. Ford was happy when they hit 16 million miles driven total; Tesla's system does north of a billion a year. If Tesla had both crashes over 1 billion miles and Ford had 0, you could claim "well, Ford is perfect." No, Ford just has not driven enough miles to be statistically relevant.

The only other brand with a significant number of vehicles is Honda, with about 5 million. Their system, however, does not function everywhere, so there's the question of whether they are better at not crashing or simply don't have a crashes-per-mile figure out there, since they have not released usage-mileage figures. I can't do apples to apples with them: Tesla has shown their apple, and the others all hold up a black box they say may or may not have a fruit in it.
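
For anyone who hasn't read the Standing General Order: the reporting rule works roughly like this (a minimal sketch; the field names are mine, not NHTSA's):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Crash:
    adas_on_at_impact: bool
    seconds_since_adas_off: Optional[float]  # None if ADAS never engaged

def reportable(crash: Crash, window_s: float = 30.0) -> bool:
    """A crash must be reported if ADAS was engaged at impact
    or at any point in the 30 seconds before it."""
    if crash.adas_on_at_impact:
        return True
    return (crash.seconds_since_adas_off is not None
            and crash.seconds_since_adas_off <= window_s)

print(reportable(Crash(False, 3.0)))    # True: disengaging 3 s out still counts
print(reportable(Crash(False, 300.0)))  # False: off for 5 minutes before impact
```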

−10

razorirr t1_j964ohq wrote

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

This report does. Manufacturers have been required since July 1 to have the cars monitor when the systems are on and off, and to report any crash where the system was on at impact or at any point in the prior 30 seconds.

From July 1, 2021 to May 15, 2022, only 2 crashes total were into first responder vehicles. The report does not specify which brand, but even if both were Teslas, it's probably inevitable: Tesla reports about 1 billion miles a year where the car is driving; Ford reported 16 million total in a press release.

If the average turns out to be 1 crash per 500 million miles, Ford's 16 million miles is only 3.2% of that distance. It's not that Ford never crashes; it's that they have not done enough driving to reach the point where a crash becomes statistically probable.
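
To make that last step concrete (a sketch; the 1-per-500-million-miles rate is this comment's hypothetical, not a measured figure):

```python
import math

rate_per_mile = 1 / 500_000_000   # hypothetical average crash rate

ford_miles = 16_000_000
tesla_miles = 1_000_000_000

print(rate_per_mile * ford_miles)    # 0.032 expected crashes
print(rate_per_mile * tesla_miles)   # 2.0 expected crashes

# Under a Poisson model, Ford would see zero such crashes ~97% of the time:
print(math.exp(-rate_per_mile * ford_miles))  # ~0.969
```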

4

razorirr t1_j965fji wrote

Weird how you brought that up when this article does not even attempt to blame Autopilot/FSD.

Frankly, there are 2,500 crashes like this a year across the whole vehicle fleet, 6.8 of them per day. So if this was 1 article, where are the other 5.8 articles? Oh wait, they don't involve a Tesla, so they won't make the news.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

From July 2021 to May 2022, there were 2 crashes proven to have been the "fault" of any brand's ADAS. And by fault I mean under NHTSA's order: even if the system was off at impact but had been on up to 30 seconds beforehand, it counts.

Tesla does about a billion miles a year right now. Ford was happy to put out a press release about their cars having hit 16 million total. So even if both of the 2 crashes in that report were Teslas, and the average turns out to be 1 per 500 million miles driven, 16/500 = 3.2%. Ford simply has not had enough usage for their 1-in-500,000,000 to happen yet.

11

razorirr t1_j966747 wrote

From June 2021 through May 15th, 2022, there have been 2 crashes into emergency vehicles with ADAS of any type, from any brand.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

So you can either believe the government, which is forcing the makers to publish this data showing this is a non-issue, or admit you are just here talking about it because you don't like autonomous vehicles, in which case your comment is dishonest anyway.

0

razorirr t1_j9685bf wrote

Can't read it, paywall.

How much of a minority, and how many miles do the others have? Ford, for example, is all happy their system has been active for 16 million miles recently; Tesla is around 3 billion since AP came out, adding about a billion a year. So, per car, every 1 Tesla is worth about 97 Fords in miles driven.

−17

Velocity_LP t1_j968du1 wrote

> A car crashing itself and killing its driver is much more newsworthy

Are you confusing this article with some other incident? Or did you find another source I must’ve missed? There’s zero mention of Autopilot use, or anything else in this article suggesting the car caused the accident.

11

razorirr t1_j96c9l7 wrote

Nah. NHTSA requires reporting of all accidents up to 30 seconds after the system turns off.

So if you think it's turning itself off to avoid being counted, that means you think it can't avoid crashing, yet can realize it's going to crash half a mile up the road, turn itself off (notifying you as it does so), and then the driver ignores the Minority Report self-shutoff, doesn't take over, and crashes.

2

tapac333 t1_j96ehdu wrote

Emergency vehicles run red lights. There are more Teslas than any other brand of self-driving car on the streets, so the probability of a Tesla hitting a vehicle that doesn't abide by traffic signals would be higher.

−8

iamaredditboy t1_j96g1bh wrote

Teslas need to be banned, period, until they turn off self-driving on all their vehicles. No one knows when self-driving is engaged, and Tesla drivers are worse than "break my windows" drivers…

−1

bobjoylove t1_j96hyfd wrote

Nevertheless, with ADAS this exact collision type should be 100% avoidable absent extenuating circumstances (ice on the road, impact from another, human-driven vehicle). The reason it isn’t is Tesla’s refusal to use ranging technology like radar, insisting instead on cheaper visible-light cameras.

17

Chippopotanuse t1_j96mwhw wrote

Okay…but these are emergency vehicles that are on the side of the road and stopped.

Literally the excerpt from the NHTSA report I pasted says these emergency vehicles were stopped on the side of the road to help folks.

The flashing lights on emergency vehicles confuse the Tesla AI. It’s been a known problem for years. Elon and his fanboys try to gloss over it or play whataboutism games to avoid addressing it with any substance.

4

razorirr t1_j96oqwi wrote

Your statement shows you don't know how car radar works.

Cars use radar to measure Doppler shift. That is how they tell whether the car in front of you is moving faster or slower than you. Because the speed of the signal is a known constant, it can also give you distance.

In driving conditions, you have to throw out any measurement of something that is not moving, such as that parked firetruck, and mark it as invalid. This sounds ridiculous, but there is a simple reason.

Pretend you are in a car with radar, driving down into a valley. The car will see the bottom of the valley, where you would start driving up the other hill, as a static object, and the car would stop. With radar, you can't tell that valley floor from a police car.
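
A toy sketch of the filtering logic being described (hypothetical structure, not any vendor's actual stack): a radar return whose closing speed matches your own speed is "stationary world" and gets discarded, which throws out parked trucks and valley floors alike.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float           # distance to the reflector
    closing_speed_ms: float  # how fast the gap is shrinking (from Doppler)

def track_targets(returns: list[RadarReturn], ego_speed_ms: float,
                  tol: float = 0.5) -> list[RadarReturn]:
    """Keep only returns that are themselves moving.

    A stationary object closes at exactly our own speed, so returns with
    closing_speed ~ ego_speed are classified as stationary world (road
    surface, bridges, parked vehicles) and dropped. This is why a parked
    firetruck and a valley floor look identical to the filter.
    """
    return [r for r in returns
            if abs(r.closing_speed_ms - ego_speed_ms) > tol]

# At 30 m/s, a parked truck and the road at the bottom of a hill both
# close at 30 m/s -- the filter cannot distinguish them:
ego = 30.0
parked_truck = RadarReturn(range_m=120, closing_speed_ms=30.0)
slower_car = RadarReturn(range_m=80, closing_speed_ms=10.0)
print(track_targets([parked_truck, slower_car], ego))  # only slower_car survives
```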

−3

KnucklesMcGee t1_j96pet0 wrote

Miraculously, FSD disengaged seconds before impact.

1

razorirr t1_j96w1v9 wrote

Miraculously, NHTSA requires any disengagement up to 30 seconds before impact to be reported as if the system was on.

So good on you for thinking Tesla can predict it will hit a parked firetruck 31 seconds out, yet can't figure out how not to hit said truck in those 31 seconds.

2

bobjoylove t1_j96wdpy wrote

Your statement shows you don’t know how software works.

You augment the camera with the radar. When the two diverge significantly, the system errors out and hands control back to the driver.

11

razorirr t1_j96xtfb wrote

So, we don't have all the official numbers, but we can take a crack at this.

https://lexfridman.com/tesla-autopilot-miles-and-vehicles/ Lex Fridman, an MIT research scientist, has sat down with the sales figures and the AP miles-driven numbers Tesla has occasionally given out; as of his last update, roughly 1.8 billion miles had been driven on AP between 4-22-2020 and 1-1-2021. Annualized, that is 2.662 billion miles, or 221.9 million per month.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

NHTSA says that over the course of 10 months, there were two confirmed ADAS-related crashes into first responder vehicles.

https://smartfinancial.com/average-miles-driven-per-year

If you take the 12,785 miles-per-driver 2020 average, which they calculated from the 2020 FHWA report, and the 228 million drivers, that is about 243 billion miles a month.

https://www.workzonebarriers.com/emergency-response-firetruck-collision-crash-facts.html

That report shows that about 2,500 trucks a year parked as blockers get hit, or roughly 208 per month.

208 accidents per month / 243,000 million miles ≈ 0.00086

0.2 accidents per month / 222 million miles ≈ 0.0009

So if you take all the different reports in context of each other: the overall fleet hits about 0.00086 parked firetrucks per million miles driven, and Tesla AP about 0.0009 per million miles. Given how rough these inputs are, the two rates are essentially a wash, which is hard to square with the claim that AP is disproportionately hitting emergency vehicles.
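
Reproducing that arithmetic (a sketch; every input is an estimate pulled from this thread, so the takeaway is "a wash," not a ranking):

```python
# All inputs are the thread's estimates (Fridman's AP-mileage extrapolation,
# NHTSA SGO counts, FHWA averages), not official per-cause statistics.

# Human-driven fleet
miles_per_driver_yr = 12_785
drivers = 228_000_000
fleet_miles_per_month = miles_per_driver_yr * drivers / 12  # ~243 billion
firetruck_hits_per_month = 2500 / 12                        # ~208

# Tesla Autopilot
ap_miles_per_month = 2.662e9 / 12       # ~222 million
ap_hits_per_month = 2 / 10              # 2 confirmed ADAS hits in 10 months

human_rate = firetruck_hits_per_month / (fleet_miles_per_month / 1e6)
ap_rate = ap_hits_per_month / (ap_miles_per_month / 1e6)

print(f"fleet: {human_rate:.5f} hits per million miles")  # ~0.00086
print(f"AP:    {ap_rate:.5f} hits per million miles")     # ~0.00090
```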

−1

VegasKL t1_j96xz98 wrote

>SuperCruise

That's also different technology, AFAIK. I think GM maps various roads with lidar vehicles, and those maps then get loaded into the cars for cross-referencing against their position; it's done this way so they don't need a bunch of lidar units on the vehicle processing in real time. They likely have some forward-facing lidar or radar (or both) as well.

Elon wants to be cheap and do it solely with cameras.

8

VegasKL t1_j96ymhz wrote

It shouldn't be an issue of training data anymore. Tesla uses a lot of synthetic (3D-generated) data now, so they can train on the same exact scenario with a ton of variables swapped out, over and over again. Nvidia (IIRC) did a presentation on the tech.

Remember, they also had this issue with box trucks, if I remember correctly.

3

razorirr t1_j96yztn wrote

That wouldn't do anything.

Like I explained, the radar in this case would give "all clear."

The camera in this case should have given "firetruck" but gave "all clear."

Erroneous camera all clear + radar by-design all clear = all clear = crash.

Camera firetruck + radar all clear = stop.

The radar "all clear" in this case adds nothing, since against a static object it will never be anything but all clear, and the divergence stop is not needed because the firetruck stop would already apply.

From a QA guy telling the probably-developer guy his logic is bad: you could program the radar to always return "blocked" when it sees any static object, but then that causes a problem.

  1. If the radar says blocked and the camera sees something, that is a stop due to agreement.
  2. If the radar says blocked but the camera does not see anything, that is a stop due to divergence.

Your car would never be able to go anywhere under the system you proposed, other than on an unblocked flat surface.
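
A toy truth table for the two policies at issue (a sketch, not anyone's actual stack): discard static radar returns and radar contributes nothing to this scenario; keep them and the car stops for every hill and overpass.

```python
def decide(camera: str, radar: str, policy: str) -> str:
    """Combine camera and radar verdicts ('clear' or 'blocked').

    'discard_static': static radar returns were filtered out, so radar
        is always 'clear' and the camera alone decides.
    'keep_static': any static return reads 'blocked'; stopping on both
        agreement and divergence reduces to stopping whenever radar says so.
    """
    if policy == "discard_static":
        return "stop" if camera == "blocked" else "go"
    if radar == "blocked":
        return "stop"  # agreement-stop or divergence-stop, either way
    return "stop" if camera == "blocked" else "go"

# The crash case: camera wrongly says clear, radar filtered the truck out.
print(decide("clear", "clear", "discard_static"))  # go -> crash
# Keep static returns instead: the valley floor also reads 'blocked',
# so the car stops on every downhill approach.
print(decide("clear", "blocked", "keep_static"))   # stop (false positive)
```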

Love all the instant downvotes all my posts are getting. Seems a lot of people don't know what they are talking about but think they do.

−3

bobjoylove t1_j971ijk wrote

The radar is used for ranging. It provides a distance and a rate of change over a reasonably narrow aperture. The bottom of the valley never gets close enough to warrant emergency intervention from the braking system.

The fact that most cars with dynamic cruise control and automated pedestrian-braking systems use 60 GHz radar as the detection method should tell you it is possible, and it is shipping already.

8

razorirr t1_j9731y8 wrote

>The Radar is used for ranging

Correct

>The bottom of the valley does not get close enough to warrant emergency intervention from the braking system.

Incorrect

You are driving down the hill; it's a 1-mile slope from top to bottom, then it curves and goes up the next hill.

You are right that while it's far away, you can ignore the reading: the ranging says "yeah, I see something, but it's 3,000 feet off, who cares," or the radar simply sees nothing because it isn't looking that far out.

But since it's not moving and you are, eventually you will be 200 feet from the bottom. Radar sees this as an object blocking your path, and it's now close enough that the car goes "yeah, I see something, it's 200 feet away, let's stop."

Since the ground is never going to move, radar will always say stop. A camera with sufficient data-labeling ability can overcome this because it can read context; radar never can, as it is a binary blocked/clear.

Also, here's Chevy explaining how they do their pedestrian braking: https://www.chevrolet.com/support/vehicle/driving-safety/brakes/front-pedestrian-braking. It's using cameras, not radar.

5

bobjoylove t1_j977j8u wrote

Ok, let's agree to disagree on the technical aspects of a known working collision-avoidance system that is shipping on millions of cars, including my own.

It's good to have a secondary system to cross-check the cameras. I have noted that many (not this one) of the cases of Tesla systems failing have been at night. Adding radar or lidar augments the cameras. BTW, the answer in the back of the book is that Tesla has realized they actually do need radar and have begun adding it: https://electrek.co/2022/12/06/tesla-radar-car-next-month-self-driving-suite-concerns/

6

razorirr t1_j978ckm wrote

No. This is a technical conversation about how a technical system works. You can't agree to disagree on those aspects, or it's impossible to come to any agreement at all. The only way to settle it would be for you to show that the car would not stop forever on that hill once the radar and the camera diverged, given divergence = stop, or, in the pure-radar case, blockage = stop.

I agree augmenting is good. Radar can measure the range of an object better than camera vision can guesstimate it. But what I was talking about is a known limitation of radar. You cannot "augment" around it; you have to throw the data out, and if you are throwing it out 100% of the time, you don't need it for this case.

0

MidwestAmMan t1_j978fza wrote

It’s a sticky wicket, tbh. The Tesla-over-the-cliff story, where everyone survived, was incredible. Teslas are clearly much safer on average. But sudden braking, battery fires, and “FSD” striking emergency vehicles are woeful concerns.

If humans are a greater risk than FSD, maybe FSD can be modified to require the driver to take over when approaching emergency vehicles. But we need to know whether FSD was engaged here.

1

smoke1966 t1_j9790e7 wrote

If it is programmed to learn correctly… I've done programming, and there's always the one thing you forgot. If you don't believe that, this is just a prime example of the problems with these cars.

1

TenderfootGungi t1_j97ac0b wrote

>In the conditions you have driving, you have to throw out any measurement of something not moving, such as that parked firetruck and mark it as invalid.

That should depend on where it is. On the side of the road? Not an issue. In my lane? Real issue.

It is telling that no other self-driving tech is having trouble with this. Everyone else has it figured out.

3

TenderfootGungi t1_j97bg88 wrote

They were caught turning it off a split second before many crashes and then stating something like "the autopilot was not engaged." In many of those cases it had been engaged less than a second before the crash, though. NHTSA has since started asking whether it was engaged within some number of seconds before a crash (e.g., 10 seconds; I can't find the exact time).

−1

bobjoylove t1_j97ch2n wrote

Do you ever think that, when you're provided with a link proving me right (specifically, Tesla adding radar to fix this issue) and you still argue that isn't the resolution, you might just be stubbornly wrong?

3

razorirr t1_j97dmf4 wrote

Did you ever think that they could be adding the radar to augment all the other situations where radar is helpful, but that, due to the limitations of radar, this is not one of those situations?

Actually read and comprehend that article. The original radar my car has was insufficient compared to cameras alone. The one they are putting in can see much farther, but it will still have the issues I've explained above, because that is a fundamental limitation of radar.

So now, instead of seeing the bottom of the valley at 200 feet, it sees it at 400 feet. All the same problems occur, and the car still cannot proceed to the bottom of the hill if it is programmed to always stop on a radar blockage or a radar-vs-camera divergence. Radar will never be helpful for static objects in the path, but it is really helpful for letting the car know something is in motion 400 feet away.

0

Raspberries-Are-Evil t1_j97fv7z wrote

> But sudden braking, battery fires and “FSD” causing striking of emergency vehicles are woeful concerns.

As a Tesla owner myself, I understand that I am in control of the car at all times. This is no different from some idiot on cruise control slamming into a stopped car in front of him.

FSD requires your hands to be on the wheel. In fact, every 30 seconds or so it reminds you, and if it doesn't detect your hands on the wheel via a slight movement of the wheel, it will disengage.

So even IF the driver was using FSD, it's his fault for not slowing down when approaching a fire truck.

3

razorirr t1_j97g29v wrote

I will freely admit radar can help in some situations. This is not one of them, because of how radar works. You have convinced yourself otherwise and now refuse to correct your incorrect opinion.

Have a good one.

1

xnago_tyr_sires t1_j97wug0 wrote

Why is it always in Walnut Creek that someone does something stupid in a Tesla?

1

ariceli t1_j98k3en wrote

I guess no other make of car ever crashes, only Teslas.

−1

WirelessBCupSupport t1_j98kopd wrote

I watched this on the news. They couldn't determine the cause, as the driver died at the scene, but the passenger was alive and airlifted to the hospital. The firefighters who were there said this isn't the first time they've been hit. And while dealing with this crash, they almost got hit again!

3

razorirr t1_j98uvkc wrote

Hahaha. That report is a news article talking about the NHTSA report I got my 2 AP crashes from.

If you take the estimated miles driven on AP and the estimated miles driven by everything else, AP has a crash rate of about 0.0009 per 1,000,000 miles into all first responder vehicles, and that assumes both crashes in that report were Teslas. All cars overall work out to about 0.00086 per 1,000,000 miles into parked firetrucks alone.

So the two rates are essentially a wash. If you want to use that article as a reason against AP, feel free, but it's really a reason to stop singling AP out.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

It's page 7 of nine, second chart, which shows ADAS crashes by the type of vehicle struck. Further, the data in that report may include double counts: if Tesla reported a crash and the police reported it separately, it's counted twice. It also does not mean "the Tesla crashed into me." If you rear-end a Tesla that was on AP, it goes in the report even though it was not the Tesla's fault.

1

razorirr t1_j9an80n wrote

You would think that, but no. If you treat anything stationary in front of you as a reason to stop, then on a hill your radar is pointing down the slope, so as you approach the bottom your vision will say "I'm OK to proceed; it's just the slope leveling off," while radar says "oh shit, there's a stationary object, brake now." Stationary objects in the path are a limitation of using radar, which can't tell what the object is.

1

Osama_bin_laughin t1_j9awrxj wrote

Okay, but was the Tesla self-driving or not? The article doesn't say. If not, then this article is some anti-Tesla nonsense.

2