💾 Archived View for dioskouroi.xyz › thread › 24919354 captured on 2020-10-31 at 00:50:40. Gemini links have been rewritten to link to archived content

-=-=-=-=-=-=-

Consumer Reports: Cadillac's Super Cruise Outperforms Tesla's Autopilot

Author: totalZero

Score: 101

Comments: 134

Date: 2020-10-28 14:49:54

Web Link

________________________________________________________________________________

yumraj wrote at 2020-10-28 16:54:55:

So Cadillac is calling it what it really is, a _Super Cruise_, not self-driving, not auto pilot, not AI-_magic_.

That, IMHO, is in itself commendable, since it tells drivers what to expect and helps them keep themselves safe by being aware of the limitations of what the car can and cannot do.

elisharobinson wrote at 2020-10-28 21:43:29:

IMHO driver monitoring should be separate from any test which is meant to evaluate L2 self-driving features. Any trained pilot knows autopilot does not do everything and that you should be prepared to take over if needed; if you perceive autopilot to be "AI-magic", you should not blame others for your own ignorance.

542354234235 wrote at 2020-10-29 17:41:27:

> Any trained pilot knows autopilot does not do everything and that you should be prepared to take over if needed; if you perceive autopilot to be "AI-magic", you should not blame others for your own ignorance.

A huge part of airplane safety is the direct opposite of that. It isn’t “don’t blame others for your ignorance” it is “the system should be designed so that it is harder to make a mistake than to do the right thing”. Design and testing doesn’t put the onus on pilots, it puts the onus on the design and the system. How can this documentation be clearer? Is this warning easy to ignore in certain situations and how can we change that? If a pilot gets confused in this situation, how can we design the system to help? How can we train our pilots so they can handle this? How can we test our pilots so we know they truly understand the system? They don’t just say “well the manuals were available, you should have known so that’s on you”. Saying “it’s the individual’s fault” doesn’t improve the system.

content_sesh wrote at 2020-10-28 22:26:33:

There is a large difference in the degree of training needed for pilot certification vs. getting a driver's license, and Tesla has already gotten in trouble for misleading claims about its "Autopilot" feature.

So I agree with the parent comment that it is good Cadillac is not overselling their capabilities.

Tarragon wrote at 2020-10-28 17:55:37:

I prefer Lane Keeping Assist to Lane Centering.

The Ford system is the only LKA I have experience with. It has me drive but warns and intervenes when I get close to the line without signalling. Before it nudges the steering it shakes the wheel and it feels just like rolling over a rumble strip.

In cases where I do legitimately get distracted and start to drift that rumble snaps me back to driving before things go awry.

False positives come almost entirely from changing lanes or exiting the highway without signaling. The rumble-strip feeling and then the steering input isn't strong enough to prevent the lane change, but it does give me that reminder to signal next time.

The nudge is strong enough to keep the car in lane but no one in their right mind would let the car go down the road bouncing off the lines like this.

To me there's no question that I'm driving. I stay engaged and alert without ever having to be monitored and nagged. It helps me when I slip but doesn't make me complacent.

With a Lane Centering system the car is doing all of the subtle corrections. I don't feel like I'm driving and I totally drift off. I feel much more checked out, and thinking about it after the fact I feel much less safe over time.
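The distinction above between an assist that only intervenes near the lane boundary and one that steers continuously can be made concrete with a small sketch. Everything below is a hypothetical illustration in Python: the thresholds, gains, and function names are invented assumptions, not Ford's or anyone else's actual control logic.

```python
# Minimal, purely illustrative sketch of the two control philosophies.
# Lateral offset is normalized: 0.0 = lane center, 1.0 = lane boundary.

WARN_THRESHOLD = 0.80    # rumble-style warning near the line
NUDGE_THRESHOLD = 0.95   # corrective nudge just before crossing


def lka_command(lateral_offset, turn_signal_on):
    """Lane Keeping Assist: the driver steers; the system only warns or nudges
    near the boundary, and stays quiet during signaled lane changes."""
    if turn_signal_on:
        return ("none", 0.0)
    if abs(lateral_offset) >= NUDGE_THRESHOLD:
        return ("nudge", -0.3 * lateral_offset)   # small corrective torque
    if abs(lateral_offset) >= WARN_THRESHOLD:
        return ("rumble", 0.0)                    # haptic warning only
    return ("none", 0.0)


def lane_centering_command(lateral_offset):
    """Lane centering: the system continuously steers back toward the center."""
    return ("steer", -1.0 * lateral_offset)


if __name__ == "__main__":
    for offset in (0.10, 0.85, 0.97):
        print(offset, lka_command(offset, turn_signal_on=False),
              lane_centering_command(offset))
```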

skrtskrt wrote at 2020-10-28 18:28:20:

The Honda system works almost exactly as you describe, and it's pretty great.

Sometimes it's a tiny bit too aggressive for me as I like to sit towards the very outside of the lane (if I am in furthest left or right lane) to keep max distance between myself and other cars. More distance buys you more precious reaction time.

Otherwise, a fantastic system.

opwieurposiu wrote at 2020-10-29 16:07:07:

I have the same experience with Honda Sensing. The Honda drives much closer to the double yellow than I like. Apparently it is possible to recalibrate the system, but from this video it looks like a rather involved process.

https://www.youtube.com/watch?v=RsnHOgFnJLw

BoorishBears wrote at 2020-10-28 18:25:49:

Tesla's initial Autopilot run was literally LKA running on a loop.

Mobileye had put that exact sensor suite in dozens of production vehicles but left it to LKA duties due to the inherent danger of expecting the user to take over at any moment.

To that end, the system actually provided _more input_ when it detected _wrong_ input than when it detected _no_ input. For example, you'd feel a strong nudge when you tried to change lanes without signaling, _before_ you left the lane, but with no input at all the car would be riding the line before correcting, essentially preventing hands-free usage.

It focused on correcting behavior without causing sudden, hard-to-correct errors... like driving into a concrete barrier when your hands aren't ready to provide counter-input.

philsnow wrote at 2020-10-28 18:47:04:

I wish the Honda LKA were tunable, it's much too noticeable when it nudges you back into the lane.

I typically disable it on long car drives because in California even long car drives often have a lot of other cars that I keep my distance from, and I have a kid who gets motion sick easily. If I keep the LKA on, on a typical drive it results in a bunch of jerky movements that trigger his motion sickness.

If it were paired with driver monitoring it could know that my eyes are indeed on the road and nudge less or not at all.

pedrocr wrote at 2020-10-28 20:34:50:

Teslas work exactly like this without Autopilot. It's one of the assistance features, just like warning if someone stops in front of you and you don't seem to be reacting. These features also get better as Autopilot gets improved, with fewer of the nasty edge cases. So even if you don't trust Autopilot, Tesla's improvement of the system over time is still useful in improving those other assistance features.

Solocomplex wrote at 2020-10-28 22:14:41:

Hyundai/Genesis have limited hands-off lane keep. I did a road trip across SD with minimal intervention. No need to use the pedals, and only the system-required steering wheel nudges.

Tarragon wrote at 2020-10-28 18:02:28:

I totally want to study this and see if it's just me and then publish.

Oh well, if I ever jump to academia.

RobLach wrote at 2020-10-28 18:06:41:

You can publish without an academic affiliation.

Tarragon wrote at 2020-10-28 18:10:43:

Sure, I can, but available time and its relation to paying tasks keep it on the wish list rather than the to-do list.

greatjack613 wrote at 2020-10-28 17:27:43:

This is a very misleading title; even CR agrees that Tesla's Autopilot outperforms Super Cruise in terms of features, accuracy, and performance. They just don't like the amount of freedom Tesla's system gives the driver, while Super Cruise forces the driver to stare straight ahead the entire time.

gamblor956 wrote at 2020-10-28 18:17:40:

No, it's pretty accurate.

The Tesla Autopilot _just barely_ beats out Super Cruise in terms of features and performance.

But Super Cruise absolutely blows Autopilot out of the water on safety. It's not even close.

And so, objectively, Super Cruise won.

BoorishBears wrote at 2020-10-28 18:32:15:

It's funny that you got downvoted for this very accurate take based on their own conclusion.

Tesla barely bests GM and co on capabilities: 9/10 vs 8/10

But then loses on one of the most important features to ensure people aren't killed by what should be safety features: 3/10 (!) vs 7/10

Nothing misleading about it, it's a reflection of the fact GM isn't going for flashiness over function.

A comment above yours is bragging about party tricks like taking an on-ramp, but ensuring the driver is attentive for the bulk of the trip that is highway does more for driver safety than being able to beta test non-self-driving "full self driving" on public roads at the expense of everyone else on the road.

jiofih wrote at 2020-10-28 19:37:41:

That's if you consider “safety” to mean requiring more human input. Tesla is aiming for the car/computer to be safer than the driver.

ars wrote at 2020-10-28 20:16:32:

It's good for them to aim that way, but at present the car/computer is much less safe than the driver.

jiofih wrote at 2020-10-29 16:12:18:

It is not. It is either as safe[1] or vastly safer[2] than humans depending on how you look at it, and even without AP, Tesla vehicles are some of the safest ever made. It’s all publicly available data.

[1]

https://www.forbes.com/sites/bradtempleton/2020/10/28/new-te...

[2]

https://cleantechnica.com/2020/08/01/tesla-autopilot-acciden...

gamblor956 wrote at 2020-10-29 20:10:40:

No, if you actually read the report, they note that Teslas using Autopilot actually had more than 2x the number of accidents as cars driving without Autopilot engaged (but with other advanced driving features engaged), and almost 3x the number of accidents as cars that weren't using Autopilot _or_ the other advanced driving features.

In other words, Autopilot was 2-3x worse than not using Autopilot at all.

In terms of comparing Teslas to the entire US automobile market, they're including luxury cars versus every other type of vehicle, including big rigs, and car models spread over several decades. Of course Tesla is going to be safer than older cars; NHTSA regulations _require_ that.

Notably, Tesla refuses to release comparisons of Tesla to cars of similar age or type. You know why? Because their competitors have the same low accident statistics as Teslas driven without Autopilot or "safety" features enabled, and admitting that would kill Tesla's safety marketing claims for Autopilot.

andbberger wrote at 2020-10-29 10:30:55:

Our regulators must be driving Teslas, because they are clearly asleep at the wheel.

It's obscene that Tesla is still allowed to pull this bullshit. They've already killed several people, and I hope that their recklessness catches up with them before more die. Their approach is transparently negligent and incredibly dangerous, which is immediately obvious to anyone who knows anything about machine learning.

Musk and co should be serving time for manslaughter.

jiofih wrote at 2020-10-29 16:09:16:

The safety records for all car manufacturers are public. You can check on your own and stop repeating such uninformed blabbering.

gamblor956 wrote at 2020-10-29 20:04:59:

His "uninformed blabbering" is simply the truth.

Tesla has more Autopilot-related fatalities than the entire rest of the industry does for their self-driving or advanced cruise features _combined._

And to date, Tesla is still the only car manufacturer whose self-driving system can't tell the difference between the sky and a gigantic semi trailer.

Shivetya wrote at 2020-10-28 18:58:11:

Yes, I own a Tesla Model 3, but that safety rating is just pure bullshit. They failed it because it can drive anywhere, and they don't agree it should allow a user to use it on residential streets or on streets that are not clearly marked. Better yet, the car shows that it recognizes the driving lanes even with bad markings. Clearly CR is not judging on ability but instead judging against their own preferences.

Wow, just wow. That is just as contrived as can be.

As for the amount of torque required to deactivate the self-driving, it's not a lot, and why would you assume you want the car to retake control immediately afterward? If I took control, I want control until I state otherwise. I don't want to suddenly find the car off doing what it wants after I disengaged it through a method they defined as turning it off.

So no, I do not agree with their safety assessment, because most of it ignores the fact that the system was designed to drive in those conditions and not to require rigidly mapped roads to operate. There is no real point in a self-driving system that requires perfect roads and markings.

Hell, the scariest part about Tesla's FSD is how well it does work; it makes you wonder why so many others cannot get even partway there.

Is it perfect? Not on your life. However, it is damn better than CR implies, and the only real danger is that some people might trust it too much regardless of how much Tesla says not to. Well, guess what: they do that with all LKA systems and even Super Cruise.

pandaman wrote at 2020-10-30 01:27:58:

Consider a compiler that compiles some malformed programs (e.g. replaces misspelled identifiers with the closest match, ignores missing ';'s, etc.). Would you want your Tesla's software to be compiled with such a compiler, or would you prefer they used the strictest compiler available, with extra linting from something like Coverity?

The same sentiment applies to critical systems in a car: them working when they should not be working is not an amazing feat of engineering but a defect.

vessenes wrote at 2020-10-28 18:02:21:

I agree, I was reminded of the days of custom anti-Tesla news sites funded by short sellers, actually.

"We assume these systems need full driver attention as their only important baseline, and regardless of how they perform, score them on that assumption."

There is no way Cadillac can take an on-ramp to a highway, safely change lanes for an exit and take the exit without drama, but my Model 3 does that very nicely right now.

m463 wrote at 2020-10-29 05:40:47:

Yes, there's another dimension to this test (literally). Time.

It might be like comparing an iPhone to an Android phone: a review at the time of release vs. a review three years later might show different results if one keeps updating the software and the other stops.

bluu00 wrote at 2020-10-28 17:56:39:

> even CR agrees that tesla's autopilot outperforms supercruise

where?

> They just don't like the amount of freedom tesla's system gives the driver while supercruise forces the driver to stare straight ahead the entire time.

That's L2 sir. L2.

jiofih wrote at 2020-10-28 19:39:09:

In both the “capabilities” and “ease of use” sections. Cadillac wins all the others because it has a camera watching your eyes.

ebg13 wrote at 2020-10-28 18:08:08:

> > _even CR agrees that tesla's autopilot outperforms supercruise_

> _where?_

In the section titled "Capabilities and _Performance_".

deweller wrote at 2020-10-28 15:34:53:

Capability and Ease of Use: Tesla wins

Safety Features: Super Cruise wins

Tesla beats Super Cruise in the 2 capability categories, but Super Cruise wins by a large margin in the 3 safety categories.

Therefore Super Cruise wins overall.

Robotbeat wrote at 2020-10-28 15:42:06:

There are some problems with Consumer Reports’ methodology. They prioritize premapping as if it’s essential to being considered safe (even though relying on premapping for safety is potentially problematic!). The safety assessment is based on CR’s opinion of features, not empirical results.

I will say that warning drivers of potentially hazardous and/or uncertain conditions ahead is a good feature of SuperCruise that ought to be emulated widely.

BoorishBears wrote at 2020-10-28 18:42:22:

That's an extremely misleading slant lol.

They call out the premapped data as helping to effectively communicate upcoming hard situations to the driver.

Super Cruise also ends up winning in safety categories that have nothing to do with their mapped data.

In no way shape or form do they imply that it's essential for being safe.

jiofih wrote at 2020-10-28 19:40:50:

What’s misleading about it? A system that is inherently designed to deal with the unknown, and not rely on hardcoded world info is bound to be much better. And safer.

BoorishBears wrote at 2020-10-28 20:52:31:

This is just another misleading claim that neither the above comment nor the article made.

Using premapped data doesn't mean the system isn't designed to deal with the unknown... how do you think Super Cruise would be able to function without being able to deal with unknowns?

Rather, the standard GM has set for a working feature is simply higher than the extremely low standard that Tesla has continuously settled for (which is why Tesla originally misapplied Mobileye's LKA sensor suite for Lane Centering and got people killed due to known defects in stationary-object handling).

Super Cruise requires premapped data as a data point, in the absence of which it reverts to simpler ACC, which it can do reliably.

This avoids situations like my Model S test drive in White Plains, NY, where the not-a-salesperson had to warn me about a stretch where AP likes to split the difference between the lane and an exit (towards a Jersey barrier) due to poor road conditions.

Super Cruise can use mapping data not just to handle something like that, but to flag a problem area in advance and let the user take over _before_ the customers become guinea pigs.
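To make the described fallback behavior concrete, here is a hypothetical sketch of that kind of logic, assuming a simple premapped-route flag and a distance-to-problem-area field. The data structure, names, and thresholds are invented for illustration; this is not GM's actual implementation.

```python
# Hypothetical sketch of map-gated assistance with an ACC fallback.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional


@dataclass
class MapInfo:
    premapped: bool                        # is this stretch in the mapped database?
    problem_area_ahead_m: Optional[float]  # distance to a flagged trouble spot, if any


def choose_assist_mode(map_info, handoff_warning_distance_m=500.0):
    if not map_info.premapped:
        return "ACC_ONLY"             # no map confidence: speed/distance control only
    if (map_info.problem_area_ahead_m is not None
            and map_info.problem_area_ahead_m <= handoff_warning_distance_m):
        return "WARN_AND_HANDOFF"     # alert the driver before the tricky stretch
    return "HANDS_FREE_ASSIST"        # mapped divided highway: full assist


print(choose_assist_mode(MapInfo(premapped=False, problem_area_ahead_m=None)))   # ACC_ONLY
print(choose_assist_mode(MapInfo(premapped=True, problem_area_ahead_m=300.0)))   # WARN_AND_HANDOFF
print(choose_assist_mode(MapInfo(premapped=True, problem_area_ahead_m=None)))    # HANDS_FREE_ASSIST
```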

jiofih wrote at 2020-10-29 16:07:30:

Have you tried Super Cruise under the same poor road conditions?

You didn’t counter my claim; in fact, you reinforced the fact that SC will _rely_ on pre-mapped data for safety.

There is no conclusion to be taken anyway, as Tesla has over 3 billion miles driven on Autopilot with a great safety record, vs a mere 5M for Cadillac (AP’s average was 4.5M miles per accident at some point).

BoorishBears wrote at 2020-10-29 19:29:32:

Maybe you didn't consider the meaning of your own words:

> A system that is inherently designed to deal with the unknown, and not rely on hardcoded world info is bound to be much better.

This sentence strongly implies that Super Cruise is _not_ inherently designed to deal with the unknown due to its use of hardcoded data.

I counter that claim by pointing out that to function at all, it would have to be able to deal with the unknown; the map isn't going to include the most important dynamic element... other drivers.

Those will always be unknowns, so the system can handle unknowns.

-

And for the record, at the point that AP was at 4.5 million miles, GM had already placed 2 or 3 generations of Mobileye's EyeQ hardware in cars (the same hardware exploited for AP1).

It never killed anyone by driving them into a firetruck because GM used it as intended.

There's a big focus on X miles per accident with AP vs all other cars, but today a base-model Corolla will assist a driver in keeping their lane and alert on a drowsy driver who is being over-assisted.

What articles like this are showing is what many have suspected all along: AP does have safety benefits, but the additional convenience features tacked on are not implemented safely.

Therefore, cars that take a more conservative approach to convenience, sacrificing some points with people clamoring to try the bleeding edge on public roadways while maintaining the safety features, end up being safer overall.

blinding-streak wrote at 2020-10-28 16:10:32:

Personally I'd like to use the one that is less likely to kill me. So I think a focus on safety is appropriate.

m463 wrote at 2020-10-28 16:34:48:

I think that's probably appropriate for consumer reports.

Growing up I read consumer reports and was wondering why the corvette didn't get the best score in the world. I mean, wasn't it the best car to a kid?

But the people who buy consumer reports are the kind of folks who would buy the best rated washer and dryer even if they came from two different companies and didn't match. (my parents, for example). And we all managed to survive with clean laundry that was fluffy and dry.

ponker wrote at 2020-10-28 17:26:14:

I find Consumer Reports reviews to be very conservative. I bought their top rated washer in a rental apartment where the landlord told me to "buy whatever." It was a decent washer, but nothing too incredible, and got mildewy smells (I know, I could have cleaned it more assiduously).

For my next washer when I moved into my own house, I did my own research, and found a washer that 1) stored an entire bottle of detergent and dispensed it automatically 2) ran a venting fan to prevent mildew after a load is done, and keeps it bone dry, 3) Sends me a phone notification when my laundry is done and when it's been sitting for 30 minutes, 4) Can wash a small load and dry it in one shot with the venting fan. Massive upgrade and nowhere on Consumer Reports' radar.

totalZero wrote at 2020-10-28 18:37:24:

How many years passed between the two purchases, and what was the ventilation situation in your apartment versus your house?

(Congratulations on your home purchase, btw)

ponker wrote at 2020-10-28 18:46:40:

3 years. Not much of a difference. But the old washer was always wet for a long time (even with the door left open) and the new one is always dry.

totalZero wrote at 2020-10-28 21:01:22:

Interesting; I guess you're right. I will take this as a data point indicating that Consumer Reports doesn't necessarily have its reviewers live with a product long enough to really understand how it performs under typical usage, even if they do get a chance to install/use it firsthand.

ponker wrote at 2020-10-28 22:23:13:

I think they also fall into the "Seeing like a State" fallacy where they like to rank things that are measurable. So they look at things heavily like water consumption and cleaning ability, which are easy to measure, but not "long-term stinkiness of the gasket" which are hard to measure. But having a stink when you open the door makes you feel disgusting about your clothes, regardless of the stain removing ability or how little water it used.

mhh__ wrote at 2020-10-28 16:51:48:

I think if Tesla ends up getting outplayed on any of these fronts (in the long term) it wouldn't be good for their image.

A part of me would like to see it happen (regardless of my opinion of Musk, I think the company could use being knocked down a peg for the sake of consumers).

SloopJon wrote at 2020-10-28 16:14:45:

I'm in the market for a new car with decent driver assistance, so a round up like this is just what I've been looking for. One of the first things that stands out is what Mazda told Consumer Reports about its Lane Departure Warning System with Lane-keep Assist:

Mazda responded that its system was not designed or intended to center the vehicle in a lane and that the automaker doesn’t believe its vehicles are equipped with technology that meets the criteria CR is using in its evaluation.

Not necessarily something I would have understood from its website or brochure. This probably takes the CX-5 off of my short list.

Incidentally, the submitted title casts this as Cadillac vs. Tesla, but the actual title is "Cadillac's Super Cruise Outperforms Other Driving Assistance Systems." In fairness, their 2018 article did have a similar title: "Cadillac Tops Tesla in Consumer Reports' First Ranking of Automated Driving Systems."

NovemberWhiskey wrote at 2020-10-28 16:40:10:

My car has the current-edition Toyota/Lexus system. First of all, it's worth saying there's a whole slew of ancillary safety systems beyond the 'nominal case driving' automation, like rear-crossing detection, collision brake assistance, blind-spot warning and so on.

For the self-driving stuff, I really only use the radar cruise control. The car does have lane self-centering but it's not great at detecting lane boundaries, especially when in bad weather or when there's a lot of traffic.

I have that system turned to an advisory mode so it'll beep if I start moving across a lane boundary without a turn signal on.

The radar cruise control is very solid. My only real complaint about it is that it leaves a large gap at low speed; even when in its tightest-follow mode. You can end up getting people merging in front of you, which then causes it to generate an even larger gap. It will take off from a stop, but if it's a long stop it wants you to hit a resume button.

I understand how the systems work at this point, they don't surprise me, and they help with fatigue. I'm generally an attentive driver (I know, we all say that) so I'm OK with the limitations. I don't really want any more.

Very anecdotally, the last time I tried a lane-following system (an Audi rental about two years ago), it tried to kill me by following the off-ramp into a gas station at 65mph.

wil421 wrote at 2020-10-28 16:54:56:

Is adaptive cruise control considered self driving? I always thought it was driver assist feature like the original cruise control and blind spot monitoring.

Surprisingly it’s been around for almost 30 years and Mitsubishi was using Lidar way back in ‘91.

https://www.autonomousvehicleinternational.com/features/adas...

Rebelgecko wrote at 2020-10-28 17:08:42:

Because the car throttles itself up and down appropriately, which at least for me is the main annoyance when driving in traffic. IIRC, rear-end collisions due to inattentive driving are also the most common type of car accident.

NovemberWhiskey wrote at 2020-10-28 17:23:13:

The other thing to bear in mind is that the capability is way higher now than in the past, despite the feature having the same name. It works perfectly with multiple lanes of closely-spaced traffic, through bends, bad weather etc. It actually uses the brakes rather than just engine-braking when necessary. It works well enough in stop-and-go traffic. It doesn't lose its mind if people merge in front of you.

chris11 wrote at 2020-10-28 18:04:33:

I wouldn't consider it self-driving. But the hardware for LKA and adaptive cruise control can do a lot. That's all that Comma AI's Open Pilot needs, and a Toyota with TSS 2.0 is considered the best platform for Open Pilot. I definitely wouldn't say it's self-driving, but using Open Pilot gives you a really nice hands-free experience driving on the highway.

andrewia wrote at 2020-10-28 17:01:54:

I believe Mazda does have lane centering, but they remove the feature for the US. It can be restored by using diagnostic tools to change the car's region. Specifically, it's "early" vs "late" departure intervention.

I have personally had experience with Hyundai's systems. Even cheap Elantras have a camera-based system with amazingly good lane centering and some form of auto braking, but they decided to make radar cruise only available for top trims. I got an older 2015 Genesis Sedan with radar cruise (natural but imperfect, and no detection of cars cutting in front of you) and lane keep assist (not quite centering except for every fifth curve where it's too aggressive). The blind spot warning is good but a bit too sensitive. Auto braking is radar based so it warns you appropriately but can't detect pedestrians. The car's driving assistance feels like a beta, which makes sense given its age. The NVH of the Genesis is stellar and the rest of the tech works well.

I also tried Honda's radar cruise in a Clarity and it worked well. When I test drove a TLX, the lane keep assist in the first generation TLX wasn't quite centering. Its braking alert worked fine and I appreciate that since it didn't have the HUD, they put a small LED to project "BRAKE" on the windshield instead.

totalZero wrote at 2020-10-28 18:45:01:

The full heading is "Cadillac's Super Cruise Outperforms Other Driving Assistance Systems: Other automakers close in on Tesla's Autopilot, now a distant second, in Consumer Reports' new ratings of 17 systems," but that seemed long and wordy. CR used the phrase "distant second" for Tesla's DAS, calling Autopilot out by name, so it seemed fair to consolidate the headline as submitted.

odshoifsdhfs wrote at 2020-10-28 18:31:23:

I'm not sure if anything changed, but my CX-5 from 2015 had this. Maybe they don't consider it to be full hands off and thus have made this statement? It could center itself and drive within the same lane in a highway going around 100-120km/h without any involvement from me. It also nudged me to the lane whenever I tried to change lanes without using the blinker.

mensetmanusman wrote at 2020-10-28 16:59:58:

Main bit:

“ Even after two years, Cadillac’s Super Cruise remained our top-rated system because, when turned on, it uses direct driver monitoring to warn drivers that appear to have stopped paying attention to the road. “

Basically, CR wants the vehicle to aggressively warn the user who is not paying attention. Tesla already has driver-facing cameras/eye tracking, so they could enable this with a software upgrade to easily take the #1 spot with the broader definition of ‘autopilot’ (CR is only focusing on two aspects: lane centering and vehicle distance spacing).
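For reference, the kind of camera-based escalation CR rewards can be sketched in a few lines. The time thresholds and actions below are invented assumptions for illustration, not Cadillac's or Tesla's actual tuning.

```python
# Hypothetical driver-monitoring escalation based on how long the driver's gaze
# has been off the road. Thresholds and actions are illustrative only.

def monitoring_action(gaze_off_road_seconds):
    if gaze_off_road_seconds < 3:
        return "ok"
    if gaze_off_road_seconds < 6:
        return "visual_warning"      # e.g. flashing light bar or dashboard icon
    if gaze_off_road_seconds < 10:
        return "audible_warning"     # chime, seat vibration
    return "slow_and_disengage"      # treat the driver as unresponsive


for t in (1, 4, 8, 12):
    print(t, monitoring_action(t))
```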

cma wrote at 2020-10-28 18:14:13:

Tesla's interior camera isn't IR with illumination so it isn't capable of driver monitoring at night.

mensetmanusman wrote at 2020-10-28 19:11:41:

Would be interesting to see what the night time data look like.

Also, people make these camera covers:

https://www.greendrive-accessories.com/en/products/cache-cam...

Ironically, Cadillac sells these to cover laptop cameras:

https://cadillaccollection.com/CAD21/Cadillac/Office+Tech/Pr...

vernie wrote at 2020-10-28 17:06:20:

Right now that system just prepares an affidavit blaming you if the car veers into a median.

drewrv wrote at 2020-10-28 17:27:33:

I understand that people should evaluate products as they're delivered, vs. promises of a vague wonderful future product. And Tesla has made some wild promises that it has taken payment for but that I really doubt it can fulfill.

That being said, one thing that I like about my spouse's Tesla is that it reliably gets OTA updates that add features and improve functionality. For example, they recently added a video feed from the blind spot cameras when I'm in reverse. I'm not aware of any other car brand that would do that. My experience with other brands is that, best case scenario, it would take a trip to the dealership and hundreds of dollars to add a feature like that. More likely they'd try to upsell me on a 2021 model. Meanwhile the Tesla adds that stuff on a regular cadence while I sleep.

I'd bet the 2018 Tesla they tested would perform competitively with today's models. And in two more years the 2018 model will compete well against 2022 models. CR shouldn't speculate this way, but it's probably worth consideration if you're looking for a new car.

vkou wrote at 2020-10-28 17:29:17:

> That being said, one thing that I like about my spouse's Tesla is that it reliably gets OTA updates that add features and improve functionality.

As someone who builds software for a living, and has introduced regressions once or twice into production, I would not like _my_ safety-critical system to live on the bleeding edge of functionality.

It's great if other people are interested in being testers for every X.0 release, but I'd rather not be one of them.

SomeHacker44 wrote at 2020-10-28 20:15:10:

Tesla has this option. You can opt in to early releases (I did for a while) or just the main line releases.

digital-cygnet wrote at 2020-10-29 12:38:48:

The pro-Tesla view is that this is like comparing an iPhone to feature phones in 2008 on the basis of sound quality and keyboard action (unfortunately I couldn't find their actual review from that year in a quick google). The question being: do you compare products on the basis of _what they aspire to be_, or _what they offer today_?

Clearly the iPhone's touch centric design and App Store turned out to be a paradigm shift in computing, and Tesla's Autopilot hopes to be the same for driving. The difference as I see it is that the App Store not living up to its hype wasn't going to kill anybody, and choosing an iPhone is a much less weighty financial decision than choosing a Tesla.

That being said, I think Consumer Reports could have done a better job of comparing the actual systems (or explaining why it chose not to), along with the exhaustive analysis of the ways in which use of the systems is curtailed for safety. What a lot of people care about is "how good is it at self-driving", even though the more important short term question, which they tackle, is "how well do its driver assist features holistically contribute to the safety of the car". Under this mode of analysis my decade-old Prius actually would do pretty well -- it has no driver assist at all, but at least it doesn't let you turn on driver assist at dangerous times, and tells you clearly when it's off (which is always).

nethunters wrote at 2020-10-28 18:31:48:

I've had the current gen Toyota Corolla for just over a year and use the adaptive cruise control and lane tracing ~95% of the time I'm driving, whether that is city driving in busy tight roads or motorway driving.

After using it for so long I won't be able to drive a car without it. I've personally had no issues with the lane tracing, as I instinctively understand where the system works and where it doesn't, and I've adapted my driving as such. The buttons for the controls are situated on the steering wheel and are simple to understand and easy to activate. Accelerating doesn't disable the lane trace assist or adaptive cruise control. The system also doesn't get confused by other lines on the road, bright sunlight, glare, wet roads or driving at night. My only gripe is that the speed sign recognition system is a pain in cities where signs for side roads get wrongly applied to the main road, but thankfully the speed of the adaptive cruise control isn't limited by it; there's just an annoying beep that you can switch off if required.

Toyota do give a warning not to use the adaptive cruise control behind a tilt-and-slide recovery truck, or a lorry trailer that is higher than the car itself; however, I haven't found any issues with that.

The settings for the lane tracing can be adjusted to change the sensitivity with which it responds to the driver's turning force. I find that warning you with different beeps for different things is a pretty good system, and I can easily tell the difference between the different beeps and what they're for (even with loud audio blaring through my speakers).

Comparisons like these need to use the findings of drivers who actually use these systems in real life rather than made up scenarios.

Unrelated (but adding this after seeing those who are in the market for a new car): it is a sporty, nice, fun-to-drive car, a 2-litre hybrid with 200 Nm of torque from each engine and a combined 186 bhp (could be ~300 if remapped, but you'll lose linear acceleration) with good mpg, MacPherson suspension at the front, multi-link individual suspension at the back, a lower centre of gravity, and 52.5:47.5 weight distribution for better cornering. In relation to the "self-driving", the system takes a little learning, but once you understand it and are used to it, there's no going back to conventional cars.

maxharris wrote at 2020-10-28 20:28:02:

A reminder to people: do not buy or sell stocks on the basis of headlines, or of any price movement that spans a period of months. The incentives to mislead you are incredibly powerful.

aphextron wrote at 2020-10-28 17:53:51:

Super Cruise is geofenced to the US interstate highway system on routes that have been pre-mapped by GM with lidar. It's not a comparable system at all.

jedberg wrote at 2020-10-28 17:37:36:

This seems like it was rigged for Cadillac to win.

In the Tesla, during autopilot, if you hit your blinker it will change lanes for you. I don't think SuperCruise does that. In fact, I think only Tesla does that.

Why did they not include that in their tests?

pandaman wrote at 2020-10-29 02:57:04:

BMW does that on demand: if you hold the signal it will change lanes even while driving normally.

nojito wrote at 2020-10-28 16:10:19:

Until liability switches away from the driver to the manufacturer, no driving system should be considered "safe"

criddell wrote at 2020-10-28 17:03:50:

I was thinking the exact opposite - that manufacturers working on self-driving cars should have some legal immunity.

In the US, about 100 people die every day in car accidents. Say self-driving systems could cut that number in half. As we all know, every complex system has bugs, so imagine that bugs in the self-driving cars are responsible for 1,000 deaths every year. If we don't shield manufacturers from big legal bills for those 1,000 deaths, we might not get the systems that could save 10,000 lives.

dragonwriter wrote at 2020-10-28 18:48:45:

If they are safer on net, the total liability costs will be less, even if they are concentrated in the manufacturers, so the total cost of ownership will be less (as the operators will have lower insurance costs, which will offset the additional costs the manufacturers will need to charge in purchase prices to insure, most likely self-insure, for defect liability).

I suppose that if you want to keep the sticker prices of vehicles low, you could transfer liability to a government entity in exchange for stronger government oversight and pre-approval of design and manufacturing, and tax vehicle owners (which still should be lower than current costs when added to their insurance costs) to pay for the liability pool.

If you just shield the manufacturers without transferring liability, you are essentially transferring the costs to the injured.

criddell wrote at 2020-10-28 19:40:18:

> you could transfer liability to a government entity

That sounds like a good idea. Legislate a fixed payout schedule and manage it centrally. Society in general benefits and so it makes sense that society in general should be responsible for the harms.

dllthomas wrote at 2020-10-28 19:05:20:

I think a risk is that courts make a habit of awarding significant penalties because the mistakes the cars are making are less understandable, and so seen as worthy of more blame even though there is less damage actually being done.

dragonwriter wrote at 2020-10-28 19:23:28:

There are well-established rules for what a driver is required to do, and they clearly apply to machines that replace drivers. In general, liability for failure to follow traffic laws and for manufacturing defects is strict liability, so how easy it is to understand the error doesn't really have much room to play a factor in assigning liability, nor is it normally a legal component of the assessment of damages once liability has been assigned.

If we were talking about liability in a negligence/recklessness regime, “hard to understand innocent errors => more liability awards” is a reasonable model, but in a strict liability regime, the fact that an error is innocent (a finding that being easy to understand makes more likely for a trier of fact, perhaps especially a jury) isn't a mitigation.

dllthomas wrote at 2020-10-28 20:08:15:

That's somewhat reassuring :)

dragonwriter wrote at 2020-10-28 17:58:14:

> Until liability switches away from the driver to the manufacturer, no driving system should be considered "safe"

Strict liability for manufacturing defects _already_ rests with the manufacturer of the defective product (and everyone in the chain of commerce for the defective product, though with Tesla's direct-to-consumer sales model, there are fewer parties there than is typically the case with autos.)

derekp7 wrote at 2020-10-28 16:16:36:

I don't think that really will, or should, happen if you compare to similar systems.

For example, the owner of an aggressive dog breed will be held liable for attacks from that dog, even though the dog is acting "autonomously". But the dog breeder typically isn't held liable, even if they are responsible for selecting for certain aggressive traits.

So the question is: is a manufacturer of (semi-)autonomous systems really much different? Now this can be different if the autonomous car is owned by an entity other than the primary user (for example, an auto cab -- the passengers in a regular cab aren't responsible for the cab driver, so they wouldn't be liable for a computerized cab driver either).

jmpman wrote at 2020-10-28 16:29:42:

Until Elon is willing to let X Æ A -12 cross the freeway in front of a bunch of Teslas driven on Autopilot by drunk frat boys, no driving system should be considered “safe”.

Der_Einzige wrote at 2020-10-28 17:54:01:

Tesla tries to pretend that they're somehow so far ahead of other automakers technologically. Aside from the electric power-trains and battery tech, they're just not ahead of other manufacturers.

A few months ago, I tried Tesla's autopilot and found it extremely poor. It frequently took more time than expected for possible objects (that I saw) to appear on the head-unit. Further, the autopilot didn't detect a garbage can on the side of the road and almost tried to drive me into it.

Why does Tesla still refuse to put in blind-spot monitors? Why does Elon still refuse to give us HUDs? Where is Android Auto / Apple CarPlay?

jiofih wrote at 2020-10-28 19:45:22:

Have you tried other cars in the same driving conditions? There is no commercially available vehicle that will navigate around a trash can right now besides a Tesla.

elisharobinson wrote at 2020-10-28 21:54:00:

There is nothing to pretend. The facts are there for everyone to see: no other system is able to perform lane changes and operate outside of the US.

danbr wrote at 2020-10-28 16:16:33:

Rather than praising some of these systems and lambasting others, CR should reach out to manufacturers, documenting what their testing results were and how to improve the systems. We have CVE reporting; why not something similar for vehicle automation testing?

It’s quite disappointing to see [unbiased] testers promoting some systems over others, rather than trying to up the safety/capability level across the board.

shajznnckfke wrote at 2020-10-28 16:20:53:

This seems like a general argument against product reviews.

The manufacturers are surely going to track the results of these tests and try to improve their rankings. Publishing the results doesn’t prevent that, it just gives consumers that information too when they are making choices about what kind of car to buy today.

Keeping test results private wouldn’t be consistent with the Consumer Reports mission of informing consumers.

agumonkey wrote at 2020-10-28 16:31:22:

Quite true. Gaming the metrics is a well-known issue. And it's also linked to blind studies somehow.

totalZero wrote at 2020-10-29 21:59:58:

College rankings like USN&WR are a great example.

kenjackson wrote at 2020-10-28 16:46:31:

CR does reach out to manufacturers. They even give some of the feedback the manufacturers gave back to them in the article:

"In response to requests for comment, GM said the system on its Buick Encore GX offers an “impressive amount” of standard active safety features that have been proved to reduce crashes. Mazda responded that its system was not designed or intended to center the vehicle in a lane and that the automaker doesn’t believe its vehicles are equipped with technology that meets the criteria CR is using in its evaluation. "

FireBeyond wrote at 2020-10-28 16:36:12:

It's not a reviewer's responsibility to improve a product. These companies are quite capable of figuring out partnerships and determining where there's a shared good to come from things.

ebg13 wrote at 2020-10-28 15:11:56:

According to the charts and descriptions in the article, it literally doesn't. The article shows Tesla firmly in the lead in capability/performance and ease of use and that Cadillac just nannies the driver more and turns itself off completely when not on pre-mapped divided highways with restricted access, which is antithetical to making the world safer because it's only active away from where people currently die by the thousands due to human negligence.

CR's idea of goodness is that autopiloting should only ever be allowed on pre-mapped divided highways with restricted access. Since Cadillac's only works on pre-mapped divided highways with restricted access, they get high marks. But that's not better performance. That's worse performance with a geofence to hide the details.

_“Active driving assistance systems should only be able to be activated in low-risk driving environments, void of pedestrians and tricky situations, such as intersections and complicated traffic patterns,” Funkhouser says._

But this ignores the fact that most traffic fatalities happen from human negligence in the places where Cadillac's Super Cruise explicitly disables itself: rural roads and intersections. If Cadillac's system only ever activated while parked in your garage, they could give it a 10/10 for safety, and it would not make the world safer.

mdorazio wrote at 2020-10-28 15:21:45:

You must have missed their #1 requirement for these systems: safety.

"Even after two years, Cadillac’s Super Cruise remained our top-rated system because, when turned on, it uses direct driver monitoring to warn drivers that appear to have stopped paying attention to the road."

Autopilot doesn't "nanny" drivers, as you call it, and is therefore not a safe system so they ranked it lower. If you're fine with your ADAS system killing multiple people [1], this is not the ranking system for you.

[1]

https://apnews.com/article/ca5e62255bb87bf1b151f9bf075aaadf

loudmax wrote at 2020-10-28 16:00:05:

Tesla's term "Autopilot" is misleading and it's gotten people killed. Tesla and Musk deserve opprobrium for giving their driver assist system a name that implies that the car can drive without constant supervision.

But Consumer Report's title isn't entirely clear either. They rank Cadillac ahead of Tesla because Super Cruise makes stronger attempts than Autopilot to make sure the driver is engaged. That isn't a bad thing, but it should be clear in the title that the "performance" they're measuring heavily weighs toward safety. It's like saying that a Volvo station wagon outperforms a Ferrari sports car because your family is safer in the Volvo. Sure, I'd rather my kids ride in a Volvo but that's not what I think of when I read the word "outperforms".

elisharobinson wrote at 2020-10-28 22:02:27:

There exists a total of 0 pilots in the world who believe an autopilot system will fly the plane without any need for intervention. If your ignorance gives meaning to words, it does not make them true.

ghaff wrote at 2020-10-28 16:10:35:

>You must have missed their #1 requirement for these systems: safety.

Not really surprising from CR. They've always emphasized safety--along with maintenance, cost, etc. The target audience for CR is mostly a conventional middle-class suburban family (understanding that Cadillac is still a relatively high-end car). They don't really care about car enthusiasts who are prioritizing other types of capabilities.

toomuchtodo wrote at 2020-10-28 15:37:32:

Roughly 38k people a year die in human-caused car accidents. ADAS systems will kill people, as all systems can and will fail. Six people (the number killed while Tesla vehicles had Autopilot engaged) seems a tolerable amount if we expect progress in reducing human-caused vehicular deaths. NHTSA and adjacent regulatory bodies allow Tesla to continue to sell Autopilot functionality, and allow it to remain active on sold vehicles. Their opinion counts for far more than that of Consumer Reports.

moduspol wrote at 2020-10-28 17:23:50:

This isn't even a choice we have to make. Why can't SuperCruise make sure the driver is paying attention even when it's _not_ doing the driving?

The numbers are pretty clear. If we want to save lives by making sure drivers are paying attention, it should be on the cars _without_ Autopilot.

beervirus wrote at 2020-10-28 15:39:48:

That's hardly a fair comparison, given how many people are driving vs. how many automated driver systems are out there.

toomuchtodo wrote at 2020-10-28 15:42:01:

Propose an alternative comparison that is not so outlandish as zero deaths ever. We're currently at six deaths with almost a million Tesla vehicles sold and 3+ billion Autopilot miles travelled [1].

[1]

https://electrek.co/2020/04/22/tesla-autopilot-data-3-billio...
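As a rough check on the per-mile framing that follows in this thread, here is the arithmetic using the figures cited above plus one external assumption: total US vehicle travel of roughly 3.2 trillion miles per year. This is back-of-the-envelope only; it ignores differences in vehicle age, road type, and driver population.

```python
# Back-of-the-envelope deaths-per-mile comparison. The ~3.2 trillion US vehicle
# miles per year is an approximate external assumption; the other figures are
# the ones cited in this thread.

us_deaths_per_year = 38_000
us_miles_per_year = 3.2e12        # approx. annual US vehicle miles travelled

autopilot_deaths = 6
autopilot_miles = 3e9             # "3+ billion" Autopilot miles

print("US fleet:  one death per %.0f million miles"
      % (us_miles_per_year / us_deaths_per_year / 1e6))
print("Autopilot: one death per %.0f million miles"
      % (autopilot_miles / autopilot_deaths / 1e6))
```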

wtallis wrote at 2020-10-28 15:46:54:

The fact that you've edited your comment to reference number of vehicles sold and miles traveled shows that you don't need any help coming up with an alternative comparison that is more reasonable. You already know that the numbers you initially cited need to be divided by their share of miles driven.

toomuchtodo wrote at 2020-10-28 15:52:44:

The fact that I've edited my comment reflects that I want it to be historically accurate after the edit window times out. Hacker News participants aren't going to have any impact on regulators, so while I try to persuade people that this automation development is necessary (because I genuinely believe that, and have lost loved ones [plural] to vehicular accidents), it's moot beyond discussion (and there are _so many discussions_ [1]). Some will argue that any deaths are too many; others will argue that many deaths are reasonable. I argue that some deaths will occur, which will lead to fewer deaths overall over the long term (which I would also argue is a goal we could all agree on), and I don't believe this is an unreasonable position to hold.

[1]

https://hn.algolia.com/?q=tesla+autopilot

YeGoblynQueenne wrote at 2020-10-28 16:41:23:

I don't know if the OP asked for "zero deaths ever", but:

_Given that current traffic fatalities and injuries are rare events compared with vehicle miles traveled, we show that fully autonomous vehicles would have to be driven hundreds of millions of miles and sometimes hundreds of billions of miles to demonstrate their safety in terms of fatalities and injuries._

From "Driving to Safety - How Many Miles of Driving Would It Take to Demonstrate Autonomous Vehicle Reliability?"

beervirus wrote at 2020-10-28 15:45:10:

The fair comparison would obviously look at deaths per mile traveled for each category, not total deaths.

totalZero wrote at 2020-10-28 15:57:09:

I think deaths per hour driven gives a more appropriate weighting to city driving, which presents different complications and hazards from those encountered on a highway.

beervirus wrote at 2020-10-28 16:09:02:

That also seems reasonable, although city driving will be biased toward more accidents, but fewer fatalities.

Retric wrote at 2020-10-28 15:47:10:

That gets into fun with statistics.

On a pure deaths per mile basis Autopilot is already safer than the average car driven by the average driver. However, people downplay that by adjusting for various things making them relatively equivalent with not enough data to really say one way or another.

Basically, there is simply not enough data to show a statistically significant improvement or detriment vs. drivers, but there is enough data to say they are safe enough to be used.

amanaplanacanal wrote at 2020-10-28 21:57:45:

How can you say Autopilot is safer when the human driver has to take over whenever they see the Autopilot isn’t up to the task? Isn’t that prima facie evidence that the human driver is safer?

Retric wrote at 2020-10-29 05:42:06:

Different areas of expertise: a computer doesn’t get bored, drunk, or text people while driving. Safety is just about not crashing; actually getting somewhere is a different story.

In some ways having a human take over in difficult conditions is stacking the deck, but computers are also taking over in a crash to slow down before impact, and people are adjusting how much Autopilot speeds. So it’s kind of a grey area, but at a minimum we know Autopilot isn’t dramatically worse than humans.

amanaplanacanal wrote at 2020-10-29 14:21:49:

Of course it is. Count every time the human has to take over as a crash for the computer, and then see what your statistics look like. The computer is _much_ worse.

Retric wrote at 2020-10-29 18:00:24:

Every time the computer is handing off control, that’s a multi-second process where it could have come to a complete stop instead. Essentially that’s the same thing as vehicle stability assistance kicking in, which very much favors the machines.

Instead you need to count the number of times someone has actively taken control to avoid a collision, which is rare enough that it’s hard to find examples.

amanaplanacanal wrote at 2020-10-29 18:27:35:

You seem to be claiming that the computer really is good enough _today_ for fully autonomous driving, and that the need for the driver to pay attention and keep hands on the wheel is really unnecessary. Is that right?

Retric wrote at 2020-10-29 18:58:31:

I don’t think fully autonomous driving is ready; fully autonomous, human-equivalent accident avoidance is a much simpler problem that seems to be largely solved.

Having an alert driver paying attention _is_ a net benefit, but in practice people don’t continuously pay full attention for long stretches with autonomous driving. If you assume people are paying attention, say, 90% of the time in practice, then they’re only catching less than 90% of the potential accidents they could avoid. Further, assuming identical accident rates to the average person, the other car is at fault in half of them, so the increase is further reduced.
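A rough, purely illustrative version of that arithmetic, under the stated assumptions (90% attentiveness, other car at fault in roughly half of crashes):

```python
# Illustrative arithmetic only, using the assumptions stated above.

attention_fraction = 0.9   # share of time the supervising driver is actually attentive
other_car_at_fault = 0.5   # share of crashes an attentive driver couldn't prevent anyway

preventable_share = attention_fraction * (1 - other_car_at_fault)
print(f"Share of crashes the supervising human can still prevent: {preventable_share:.0%}")
```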

As such, I think the regulation is a good idea for now, as the net benefit of fully autonomous driving goes to the individual while the risks include other drivers.

Retric wrote at 2020-10-28 15:45:15:

6 is also a total, not a per-year rate like 38,000. The rate of self-driving car deaths is roughly identical to the rate of non-self-driving car deaths. However, unlike drivers, these systems can significantly improve over time.

roland35 wrote at 2020-10-28 16:10:53:

I think drivers can improve over time as well, especially with assistive technology such as lane departure warning, blind-spot detectors, etc. becoming more prevalent. Luckily self-driving isn't an all-or-nothing deal!

Retric wrote at 2020-10-28 16:20:32:

People are already comparing individual models with and without self-driving features to adjust for things like visibility. As these cars already have those driver-assist features, having them become more commonplace isn’t going to change the relevant statistics.

That said, unknown future forms of driver assistance might be relevant, but I would lump that in with better cars.

vmception wrote at 2020-10-28 15:43:56:

The fair comparison will show the same thing lol.

Each car with the automated system will provide data to update all the other cars with automated systems.

I trust that process way more than individual unconnected human drivers. I don't trust Elon or Tesla corporation to care about human lives but I do trust that it will eventually get better and very quickly.

So, thank you for the altruism everyone, it will be fine.

labcomputer wrote at 2020-10-28 15:53:04:

They have a funny definition of safety.

Specifically, they insist that the system should only be usable on limited-access highways. CR docked _all_ manufacturers (except GM) 4 points for that.

CR also docked 3 points from every company (except GM, again) for "unresponsive driver" without saying what, exactly, GM is doing differently from everyone else.

Finally, under "keeping the driver engaged", GM somehow got 3 more points than anyone else, despite being subject to the same complaints.

Sadly, CR is not the publication it used to be. While this isn't a "Tesla hit-piece", it is quite clearly a "GM submarine."

ebg13 wrote at 2020-10-28 15:35:08:

If your idea of safety is explicitly that it should be disabled in the places where most traffic fatalities occur (rural roads + intersections) due to rampant human negligence, then I think you should reconsider your definition of safety.

itsoktocry wrote at 2020-10-28 15:30:54:

>_That's worse performance with a geofence to hide the details._

It's brilliant that Tesla has managed to convince people that things like mapping and geofencing and LIDAR are all crutches, despite the fact that they improve performance and safety.

neuronexmachina wrote at 2020-10-28 15:37:20:

IMHO, Tesla seems to be stuck in a pretty unfortunate local maximum.

arcticfox wrote at 2020-10-28 17:01:37:

This might have been true before the latest FSD rewrite beta; not sure how that can be claimed at this point.

Obviously the beta is still a long way from true FSD, but it is clearly massively more capable than it used to be, so I don't think there's any indication at all that Tesla is stuck.

ebg13 wrote at 2020-10-28 15:54:22:

I think they will reconsider adding LIDAR at some point. It's just another sensor after all, and fusing another sensor would be great because having more different sensing modalities is usually better. But their performance is still better than any of the other commercial options even without it.

labcomputer wrote at 2020-10-28 15:39:11:

I really don't understand how they can be viewed as anything but crutches.

1. What happens when there is unexpected construction or an accident? The geofence isn't going to be updated fast enough. Same-day updates are considered _extremely fast_ in the mapping world.

2. What happens when there is rain or snow? LIDAR becomes useless, so the system can't be used in rain or snow. If you want an all-weather self driving car, you can't depend on LIDAR.

Edit: Humble request. Instead of _just_ downvoting (I don't mind the downvotes), why don't you write a comment to tell me why this is wrong?

gamblor956 wrote at 2020-10-28 17:20:41:

That's not how the geo-fence works. If the car solely used geo-fence it wouldn't be able to handle simple things like other cars on the road. Obviously, that's not the case.

Geo-fence lets the car know that the road has been mapped so that it can correlate its sensory data against fixed data.

Rain reduces the operational distance of a LIDAR system but does not render the system useless. On the other hand, tests have shown that more than moderate amounts of rain (i.e., more than a sprinkle) will render a camera-based system useless at both recognizing objects and determining distances.

https://ouster.com/blog/lidar-vs-camera-comparison-in-the-ra...

(Most car manufacturers have already stated they will not let their advanced driving functions work in snow for safety reasons.)

labcomputer wrote at 2020-10-28 21:38:13:

> Geo-fence lets the car know that the road has been mapped so that it can correlate its sensory data against fixed data.

Yes, I get that, and my point is that updating the geo-fence fast enough to account for changes in the ground truth which invalidate your map is an unsolved problem.

gamblor956 wrote at 2020-10-29 20:50:14:

No, you're not getting it.

Geo-fence is the _backstop_ safety mechanism for SuperCruise, not the primary mechanism.

They don't _need_ to update the geo-fence to account for changes to "ground truth." The whole point of the geo-fence is to confirm sensor data and assist in identifying anomalous situations _like_ accidents and construction. If sensor data doesn't match "truth", then the self-driving system can engage safety mechanisms, engage additional processing, or disable self-driving as appropriate.
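A hypothetical illustration of that backstop idea, comparing one sensed quantity against its mapped value; the field names and tolerances are invented for the sketch and are not GM's actual design.

```python
# Hypothetical map-vs-sensor cross-check. Names and tolerances are illustrative.

def reconcile(sensed_lane_width_m, mapped_lane_width_m, tolerance_m=0.5):
    deviation = abs(sensed_lane_width_m - mapped_lane_width_m)
    if deviation <= tolerance_m:
        return "normal_operation"    # sensors and map agree
    if deviation <= 2 * tolerance_m:
        return "extra_caution"       # e.g. widen the gap, prompt the driver
    return "handoff_to_driver"       # likely construction or an accident: disable assist


print(reconcile(3.6, 3.7))   # agreement
print(reconcile(3.0, 3.7))   # moderate mismatch
print(reconcile(2.2, 3.7))   # large mismatch -> hand off
```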

fancyfredbot wrote at 2020-10-28 16:01:09:

A safety feature which improves safety in some but not all cases is a crutch? It doesn't seem a very strong argument.

labcomputer wrote at 2020-10-28 16:08:44:

Can you expand on that? I'm having trouble understanding what you're arguing here: Is "self driving" the safety feature?

FireBeyond wrote at 2020-10-28 16:40:04:

Your argument seems to be along the lines of "an improvement that is not a panacea is to be dismissed as not worthy". "Oh, that works and helps in conditions a, b, and c, but not as much in x, y, and z, therefore it's not any improvement".

1234letshaveatw wrote at 2020-10-28 16:15:46:

#2 seems like an argument in bad faith. Nobody is suggesting that LIDAR is the "only" solution, just that it can/should be part of the solution: augmentation.

Similar to airbags: you can't depend only on them, and nobody does. That is why cars still have seatbelts.

labcomputer wrote at 2020-10-28 21:30:49:

> Nobody is suggesting that LIDAR is the "only" solution, [...]

Sure, that's a fun sleight of hand. But plenty of people are arguing that vision-only (or vision + RADAR) systems cannot possibly work, and that you _must have_ LIDAR in addition to vision to do self-driving. If you _must have_ LIDAR, then you can't make an all-weather system that has the same performance in good weather and bad.

It also seems odd that you would accept degraded performance during driving conditions which are especially challenging... It's like saying that you'd turn off your airbag in the rain because you still have seat belts. If anything, I would think most people would want better performance during bad driving conditions.

bpodgursky wrote at 2020-10-28 15:41:50:

At the end of the day... WE can drive with only vision, in unfamiliar terrain, so it's a bit silly to suggest that LIDAR etc. and geofencing are strict requirements for full self-driving.

DennisP wrote at 2020-10-28 17:07:15:

True, but the human brain is vastly more powerful than any self-driving computer hardware.

derekp7 wrote at 2020-10-28 16:40:32:

However, WE can't drive when looking directly into a setting sun. So we have sunglasses and sun visors to help. Is wearing sunglasses a crutch when driving?

throwaway0a5e wrote at 2020-10-28 15:42:08:

Not being able to turn on autopilot on some random rural state highway seems like a fair trade for knowing a regression won't put me into a barrier on a major interstate I commute on.

ghaff wrote at 2020-10-28 16:24:39:

I don't necessarily disagree but "While many people expect a larger number of accidents to occur in urban areas, the reality is that far more fatal accidents take place on rural two lane roads. In fact, reliable data indicates that as many as 57 percent of all fatal car crashes occur on rural roads. Another surprising statistic is that 62 percent of fatal accidents do not occur on curves, but on straightaways, especially at night."

So, while very good autonomous driving on interstates is definitely a convenience win and doubtless could save lives, accidents on those roads are not the leading cause of auto fatalities. (Which sort of makes sense. You're often going almost as fast on those two lane roads and there are far more opportunities to have a head-on collision or to veer off the road and road maintenance/plowing/etc. is often quite a bit worse.)

[1]

https://www.sutliffstout.com/faqs/where-car-accidents-happen...

YeGoblynQueenne wrote at 2020-10-28 16:36:06:

No, actually what you say makes sense. Self-driving cars could be widely adopted _right now_, and the only thing that's stopping this adoption is a bunch of regulation that demands that they be at least as safe as human drivers and at least as reliable.

If this kind of regulation were to be removed, self-driving cars like Tesla's or Cadillac's would be free to go everywhere and the technology would quickly reach massive adoption.

Of course, there is the little detail that regulation that requires cars to be safe and reliable is sane and logical and makes sense. But, if it were to be removed, there's nothing stopping self-driving cars from making a left turn into traffic at a busy intersection near you!