That year's Model X didn't have a driver-facing camera. If he managed to sleep while keeping a little pressure on the wheel, it might have just kept going.
TBH, the whole "what to do when the driver is unresponsive" problem is being solved in a lot of different ways right now. A Tesla will usually (eventually) recognize it and stop in the middle of the road.
Some other systems turn off lane centering completely when they realize this.
Those sound like terrible ways of solving it. I would rather the car attempt to keep driving to my destination than stop in the middle of a busy road, or stop holding the wheel mid-turn and send the car into the guardrail or onto the sidewalk.
"How to bail out of any situation" should really be self-driving step 0. If your car can't figure out how to pull off onto the shoulder it shouldn't be driving down the highway.
Now, how that applies to cruise control or lane centering, damned if I know.
It absolutely could do that, but that would likely transfer legal liability once it deviated from operator inputs. The lawsuits around automatic braking and traction control are still about how the operator's commands were interpreted, and they were bad enough. This would be explicitly deviating from operator commands. These are silly problems from an engineering standpoint, but there are other constraints.
Stopping on the highway (and alerting the passenger and engaging the warning lights) sounds way better than whatever would've happened if the car wasn't able to act at all.
Highways are the least accident-prone exactly because everyone is behaving mostly the same: driving at the maximum speed (left lane), near maximum speed (right lane), or on/off-ramping.
Self-driving cars should either off-ramp and find parking space, or at the very worst park in an emergency lane with hazard lights on. Stopping in the middle of the highway is insanity.
What??? No! Driving straight is much safer. A potential collision of two cars at roughly the same speed is much less dangerous than a nearly guaranteed collision of a stopped car and a car at 120 km/h (75 mph). BTW Sweden has a low speed limit - it's 140 km/h (85 mph) where I live, but people regularly drive 160-180 km/h (~110 mph).
It must pull over past the highway's edge line if it's going to stop (and keep going if that's not possible!), but it'd be much better to pull off the highway at the first opportunity. Definitely never just stop in the middle of a highway.
Not sure if this is some kind of a US thing but here in Europe people don't expect you will stop in the middle of a highway. That's really extremely dangerous.
It's also dangerous to drive on the shoulder. Perhaps the safest thing is to keep going while it's obviously just highway, perhaps a bit under the speed limit (low enough to get the attention of other drivers but high enough not to cause collisions). It could navigate to the slowest possible road and just park. Maybe drive with one wheel on the shoulder at 80 km/h on a 90 km/h road, with warning lights and headlights blinking, horn sounding, etc. Worst case, if the car approaches the end of the highway and there is a shoulder, maybe it could stop there. After all, that's what any driver would have to do with a flat tire anyway (though the self-driving car can't run out and place a warning triangle 100 m behind the car).
What are you even comparing (and what are you even implying that my point is)? Sounds like you're comparing an automatic stop system to an ideal autonomous driving system. I was comparing stopping in a controlled way to what would happen if you don't have any system that controls the car when you fall asleep. Obviously driving the vehicle like a human being would be the best option...
The car is evidently fully capable of just going straight, so that's what it should do, never ever should it just stop in the middle of a highway lane.
It would be great if it tried to pull in to a gas station/rest stop, or, after a long period of very loud alerts and vibration of the driver's seat, and only if no other option is available, pull into the emergency stop lane and slowly come to a stop there. But if these are not options (the emergency lane is not wide enough, for example), it should just keep driving straight ahead at 80 km/h (the minimum allowed highway speed in much of the EU).
If my car doesn't have autopilot and can't keep driving straight, I'd prefer a light crash into the right side over just stopping in the middle of a highway lane, which is a near-certain death sentence - and not just for myself.
Thanks for the clarification. I was wondering about that driver-facing camera; I didn't know the 2019 model didn't have one. Other OEMs use infrared sensors for that. A camera is a bit spooky to me from a privacy standpoint. Nevertheless, Tesla's newer models should be able to prevent this situation with it.
The article mistranslates the Swedish term "mil" (1 mil == 10 km) into English/US miles. According to the summary and the map, the car drove around 250 km.
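The arithmetic behind the mistranslation can be sketched quickly (a minimal example; the constants are the standard unit definitions):

```python
SWEDISH_MIL_IN_KM = 10        # 1 Swedish "mil" = 10 km (not an English mile)
KM_PER_US_MILE = 1.609344     # 1 US mile = 1.609344 km

def mil_to_km(mil):
    """Convert Swedish mil to kilometres."""
    return mil * SWEDISH_MIL_IN_KM

def km_to_us_miles(km):
    """Convert kilometres to US miles."""
    return km / KM_PER_US_MILE

# "25 mil" is 250 km, i.e. roughly 155 US miles -- not 25 miles.
print(mil_to_km(25))                          # → 250
print(round(km_to_us_miles(mil_to_km(25))))   # → 155
```

So reading "mil" as "mile" shrinks the reported distance by a factor of about 16.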
That would be Norsholmen right before Stockholm, but there is a small town Norsholm between Linköping and Norrköping that the author put in the image in the article.
I'd love to read the original Swedish article but it's paywalled. It's possible they could tell that driver fell asleep south of Mjölby but was noticed by the police there.
Marketing like this should be banned. This implies that the car handles everything you throw at it and only requires supervision due to outdated regulations. Of course, people will ignore the current regulations to rely on, you know, their fully self-driving car to drive for them—the reason they bought the car in the first place? A fully self-driving car has the potential to be an order of magnitude better and safer driver than the average person, so yeah, obviously.
No one who has driven a Tesla with Full Self-Driving for 15 minutes is under the delusion that it is a level 5 autonomous driving system.
It is amazing and useful, and assuredly makes me a better driver than I am without it.
I am boggled at the scolds who think the "deceptive" branding is sufficient to override the lived experience of driving with this fantastic (if flawed) tool.
Not sure what you mean - that's just what the word autopilot means. It doesn't mean the plane does everything by itself, it means you don't need to hold the stick.
If you look at additional options of a car and one of them is "autopilot" and the other one is "full self driving", which one is going to give you full self driving?
Imagine if instead he had been using a car without autopilot and had killed himself and or others. That’s how these “falling asleep at the wheel” stories usually go.
He certainly could have, but he also might not have. If he was actively driving he may have remained more alert, or he may have chosen to pull over. It's a counterfactual so we don't really know for sure either way.
Yeah, sure. I guess you can say that a driver who hadn't slept the night before and was super stressed out about work could've fallen asleep while driving on a monotonous and boring highway, but he also might not have. Yet, at the end of the day, it's far more likely that he would've fallen asleep either way.
I feel like this distracts from the fact that the self-driving feature was probably safer than falling asleep at the wheel of a manually driven car, for everyone involved.
In general, I think people are more likely to fall asleep when supervising an autonomous machine than when operating a manual machine. However, falling asleep when operating any machine is obviously possible, and it may be that this driver was so tired that he would have slept in either scenario. We don't know.
That's not an interesting story though. We've always had tired/drunk/reckless drivers, and we accept having death and injury due to those. But we don't accept death/injury due to poor FSD, regardless of whether on average it is a better driver. There are several reasons for this: a) self-driving cars don't have a personal interest or "skin in the game" in avoiding accidents; only business reasons are involved. b) It's new tech. We accept old risks, but not new ones. Just as we'd never accept the consumption of alcohol or tobacco if they were invented today, regardless of how dangerous they are compared to other substances. They are bad enough that they'd be banned as new tech, no doubt.
So self driving cars need to not only be safer than human drivers for acceptance, but significantly safer. They might be, and especially further into the future when the number of human drivers is smaller and there are good protocols for collaborating among self driving vehicles. But while they are the novelty the stories will look like this, and for good reason.
"Accept" was used in the loosest sense: We still drive, knowing that it can happen. We might petition for safer roads or stricter alcohol limits, but we don't suggest banning driving.
Imagine if instead the driver had to concentrate on something like, possibly, the road, and then not fall asleep out of boredom. IMO this is one of the bigger issues with these kinds of systems: they bore the driver while still forcing him to take the responsibility.
I find adaptive cruise and reversing cameras really useful, whereas I find lane-assistance features quite dangerous. The steering wheel jumping under your hands is exactly what you want while going around a corner, making you jump/panic.
Although - on the other hand, there's nothing quite like having to fight the car for control of the steering wheel to solve drowsiness...
Emergency braking is another one that I tend to find is almost entirely false positives. Overtaking a wagon on the motorway? Let's trigger the emergency brake! Steep ramp angle in the multi-storey car park? Let's trigger the emergency brake!
Seatbelts and backup cameras existed for decades before they became mandatory. Fully autonomous driving systems do not exist and are not likely to for many years.
Three-point seat belts were not mandatory in 1966. The first law was in 1968 and it required only lap and shoulder seat belt assemblies in each front outboard seating position. Other seating positions could have lap or lap and shoulder seat belt assemblies. https://web.archive.org/web/20140529033515/http://www.nhtsa....
And that was only required as part of the standard equipment. People were not required to use them. https://en.wikipedia.org/wiki/Seat_belt_laws_in_the_United_S... says "Seat belt use was voluntary until New York became the first state to require vehicle occupants to wear seat belts, as of December 1, 1984."
> The first modern three-point seat belt (the so-called CIR-Griswold restraint) commonly used in consumer vehicles was patented in 1955 U.S. patent 2,710,649 by the Americans Roger W. Griswold and Hugh DeHaven.
before later saying:
> The three-point seat belt was developed to its modern form by Swedish inventor Nils Bohlin for Volvo, which introduced it in 1959 as standard equipment.
Agreed, but I don't think it needs to be perfect to be useful enough to be mandated. I would suggest a mechanism whereby a car would be able to identify whether its driver is no longer in proper control of the vehicle (e.g. asleep), then assess the road conditions, and if it isn't confident in its ability to continue self-driving to the designated destination, bring itself to a safe stop. I think that on its own could massively reduce road accidents.
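The fallback logic being debated in this thread can be sketched as a small decision function. This is purely hypothetical: the action names, the 10-second alert window, and the inputs (`shoulder_available`, `confident_to_continue`) are illustrative assumptions, not anything a real system exposes.

```python
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()         # keep driving normally
    ESCALATE_ALERTS = auto()  # loud alerts / seat vibration to rouse the driver
    SAFE_STOP = auto()        # pull onto the shoulder, stop with hazards on
    CONTINUE_SLOW = auto()    # hold the lane at reduced speed until a stop is safe

def choose_action(driver_responsive: bool,
                  alert_seconds: float,
                  shoulder_available: bool,
                  confident_to_continue: bool) -> Action:
    """Hypothetical decision logic for an unresponsive-driver fallback."""
    if driver_responsive:
        return Action.CONTINUE
    if alert_seconds < 10:            # first, try to wake the driver
        return Action.ESCALATE_ALERTS
    if confident_to_continue:         # e.g. can reach an off-ramp or rest stop
        return Action.CONTINUE
    if shoulder_available:            # worst acceptable case: emergency lane
        return Action.SAFE_STOP
    return Action.CONTINUE_SLOW       # never stop dead in a live lane
```

The key design choice, echoing the comments above, is that stopping in a live lane is never an output: the system either continues, stops on a shoulder, or slows while hazards run.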
This is about the dozenth Tesla software update I have heard this about, and so far none of them have lived up to the hype. Youtube videos are way too easy to cherry pick and/or fake for this kind of analysis.
Brace yourself for longer. I used to think 2030 would be the year FSD became common. Then I moved it to 2035. Then I realized I'll probably retire before it happens, and it became a largely irrelevant topic for me.
My opinion is 5 years to "perfect". Qualified because nothing is perfect.
I tell people that in 10 years, the insurance companies will charge 2x more to drive a car yourself. And in 20 years, you just won't be allowed to drive.
> Researchers in the 1960s and the 1970s were convinced that their methods would eventually succeed in creating a machine with general intelligence and considered this the goal of their field.[269] Herbert Simon predicted, "machines will be capable, within twenty years, of doing any work a man can do".[270] Marvin Minsky agreed, writing, "within a generation ... the problem of creating 'artificial intelligence' will substantially be solved".[271] They had, however, underestimated the difficulty of the problem.
Cars last 25+ years, so ~20% of people will be driving at least 20 years after self driving gets perfected and then standardized.
Insurance companies raise and lower their premiums in response to risk. You might get a discount for self driving car eventually, but people who keep driving will just keep paying the same reasonably affordable rates as they do now.
> Cars last 25+ years, so ~20% of people will be driving at least 20 years after self driving gets perfected and then standardized.
That's a bit of a leap. If and when FSD becomes sufficiently reliable to enable you to legally relinquish control to it, including have it take you anywhere even when you're intoxicated, without any personal liability, I expect adoption to ramp up very quickly.
Also, depending on what technical solution ends up winning this race, it might be possible to convert existing old cars to FSD.
> including have it take you anywhere even when you're intoxicated, without any personal liability
People get arrested for sleeping in their cars while drunk. It’s going to be a long time before a legal self driving exception exists for drunk people.
People who have 20 year old cars generally can’t afford a new one and installing a self driving kit isn’t going to be cheap or worth it on a car that old.
> If and when FSD becomes sufficiently reliable [...] I expect adoption to ramp up very quickly.
There's basically zero chance of that. Working people are not going to buy very expensive new cars just because they have a working FSD feature. A handful (relative to the population) of rich people will, but that's it.
For FSD to be everywhere, first it needs to become reliable, then it needs to trickle down into every Toyota Corolla, and then it needs to trickle down into the used market of people who won't buy anything newer than 10+ years old.
> take you anywhere even when you're intoxicated, without any personal liability
That probably won't happen in the next 50 years, regardless of technical advances. The law will refuse to catch up.
I don't think most people buy cars for new features like that - most of America doesn't have that luxury. I'm also unsure about conversion - many people can't/won't buy winter tires in environments that could use them, and there's no way I could imagine that an FSD conversion would be that cheap.
You suggested the difference in total risk is 50%, not just a drop in rates.
Self-driving cars can't impact the risk of someone bashing the windshield to get stuff out, hail or a tree falling on your car, it getting stolen, an uninsured motorist rear-ending you at a stoplight, etc. But they will mean replacing your car is more expensive should it get stolen, etc.
Further, if you assume more self-driving cars are on the road, the human drivers' risks also drop.
Not on average, and per person it's highly dependent on the policy, a person's driving history, their car, etc. Minimum-liability coverage averages $627/year; the average policy runs $2,008/year.
In terms of individual drivers, for an antique show car driven very limited miles per year, the policy's costs have little to do with the driver's accident history. Meanwhile, someone with multiple DUIs is paying for the risks associated with their behavior.