Hacker News
Sleeping Tesla driver caught on Swedish highway – after 25 miles (vibilagare.se)
42 points by eriksdh 17 days ago | 79 comments



That year of Model X didn't have a driver-facing camera. If he managed to sleep while keeping a little bit of pressure on the wheel, it might have just kept going.

TBH, the whole "what to do when the driver is unresponsive" problem is being solved in a lot of different ways right now. A Tesla will usually (eventually) recognize it and stop in the middle of the road.

Some other systems turn off lane centering completely when they realize this.


Those sound like terrible ways of solving it. I would rather the car attempt to keep driving to my destination than stop in the middle of a busy road, or stop holding the wheel mid-turn and send the car into the guardrail/sidewalk.


Yeah, stopping on a highway sounds like a terrible idea.

Having the car make more and more insistent annoying noises when there’s no response, and then signal the police sounds like a better one.
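The "escalate, then hand off to authorities" idea above can be sketched as a simple policy function. This is purely illustrative - the names, levels, and thresholds are invented, not from any real driver-monitoring system:

```python
from enum import Enum, auto

class Escalation(Enum):
    CHIME = auto()        # gentle reminder tone
    LOUD_ALARM = auto()   # insistent audio, seat vibration
    CALL_POLICE = auto()  # notify authorities, begin controlled slowdown

def escalation_step(seconds_unresponsive: float) -> Escalation:
    """Map time without driver response to an escalation level.
    Thresholds are made up for illustration."""
    if seconds_unresponsive < 15:
        return Escalation.CHIME
    if seconds_unresponsive < 60:
        return Escalation.LOUD_ALARM
    return Escalation.CALL_POLICE
```

The point is just that the response is graduated rather than a single abrupt stop.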


"How to bail out of any situation" should really be self-driving step 0. If your car can't figure out how to pull off onto the shoulder it shouldn't be driving down the highway.

Now, how that applies to cruise control or lane centering, damned if I know.


It absolutely could do that, but that would likely transfer legal liability once it deviated from operator inputs. The lawsuits around automatic braking and traction control are still about how the operator's commands were interpreted, and they were bad enough. This would be explicitly deviating from operator commands. These are silly problems from an engineering standpoint, but there are other constraints.


Stopping on the highway (and alerting the passenger and engaging the warning lights) sounds way better than whatever would've happened if the car wasn't able to act at all.


Highways are the least accident prone exactly because everyone is behaving mostly the same: driving at the maximum speed (left lane), near maximum speed (right lane) or on/off-ramping.

Self-driving cars should either off-ramp and find parking space, or at the very worst park in an emergency lane with hazard lights on. Stopping in the middle of the highway is insanity.


I'm not saying what self-driving cars should do; I'm just saying that a dumbed-down self-driving system is safer than not having one.


What??? No! Driving straight is much safer. A potential collision of two cars at roughly the same speed is much less dangerous than a nearly guaranteed collision of a stopped car and a car at 120 km/h (75 mph). BTW Sweden has a low speed limit - it's 140 km/h (85 mph) where I live, but people regularly drive 160-180 km/h (~110 mph).

It must go beyond the highway border line if it's going to stop (and keep going if that's not possible!), but it'd be much better to pull off the highway at the first opportunity. Definitely never just stop in the middle of a highway.

Not sure if this is some kind of a US thing, but here in Europe people don't expect you to stop in the middle of a highway. That's extremely dangerous.


It's also dangerous to drive on the shoulder. Perhaps the safest thing is to keep going while it's obviously just highway, perhaps a bit under the speed limit (low enough to get the attention of other drivers but high enough not to cause collisions). It could navigate to the slowest possible road and just park. Maybe drive with one wheel on the shoulder at 80 km/h on a 90 km/h road, with warning lights and headlights blinking, car horn sounding, etc. Worst case, if the car approaches the end of the highway and there is a shoulder, maybe it could stop there. After all, it's what any driver would have to do with a flat tire anyway (though the self-driving car can't run out and place a warning triangle 100 m behind the car).


What are you even comparing (and what are you even implying my point is)? Sounds like you're comparing an automatic stop system to an ideal autonomous driving system. I was comparing stopping in a controlled way to what would happen if you don't have any system that controls the car when you fall asleep. Obviously driving the vehicle like a human being would be the best option...


The car is evidently fully capable of just going straight, so that's what it should do, never ever should it just stop in the middle of a highway lane.

It would be great if it tried to pull in to a gas station/rest stop or, after a long period of very loud sound, vibration of the driver's seat, etc., and only if no other option is available, move to the emergency stop lane and slowly come to a stop there - but if these are not an option (the emergency lane is not wide enough, for example), it should just keep driving straight ahead at 80 km/h (the minimum allowed highway speed in the EU).

If my car doesn't have autopilot and can't keep driving straight, I'd prefer a light crash into the right side to just stopping in the middle of a highway lane, which is a nearly sure death sentence - and not just for myself.


> stop in the middle of the road

I hope that means safely pull over to the side of the road and stop.


Thanks for the clarification. I was wondering about that driver-facing camera; I didn't know that the 2019 model didn't have it. Other OEMs use infrared sensors for that. A camera is a bit spooky to me from a privacy standpoint. Nevertheless, Tesla's newer models should be able to prevent this situation with it.


The article mistranslates the Swedish term "mil" (1 mil == 10 km) into English/US miles. According to the summary and the map, the car drove around 250 km.
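The arithmetic behind the mistranslation: a Swedish "mil" is 10 km, while a statute mile is about 1.609 km, so "25 mil" is 250 km, not 25 miles (~40 km). A minimal sketch of the two conversions:

```python
MIL_TO_KM = 10.0       # 1 Swedish mil = 10 km
MILE_TO_KM = 1.609344  # 1 statute mile in km

def mil_to_km(mil: float) -> float:
    """Convert Swedish mil to kilometres."""
    return mil * MIL_TO_KM

def miles_to_km(miles: float) -> float:
    """Convert statute miles to kilometres."""
    return miles * MILE_TO_KM
```

So reading "25 mil" as 25 English miles shrinks the distance by a factor of about six.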


Mjölby to Norsholm is about 56km, and they stopped a bit before that.


Check your map, it's around 210km by the closest route (E4).


That would be Norsholmen right before Stockholm, but there is a small town Norsholm between Linköping and Norrköping that the author put in the image in the article.

Here is a non-paywall version of the same event that says "4 mil" (40km): https://nyheter24.se/motor/1273809-tesla-forare-sov-pa-e4-me...



I'd love to read the original Swedish article, but it's paywalled. It's possible the driver fell asleep south of Mjölby but was only noticed by the police there.


Keep on allowing them to be called FULL SELF DRIVING(tm), and this will keep happening


Marketing like this should be banned. This implies that the car handles everything you throw at it and only requires supervision due to outdated regulations. Of course, people will ignore the current regulations to rely on, you know, their fully self-driving car to drive for them—the reason they bought the car in the first place? A fully self-driving car has the potential to be an order of magnitude better and safer driver than the average person, so yeah, obviously.


No one who has driven a Tesla with Full Self-Driving for 15 minutes is under the delusion that it is a level 5 autonomous driving system. It is amazing and useful, and assuredly makes me a better driver than I am without it.

I am boggled at the scolds who think the "deceptive" branding is sufficient to override the lived experience of driving with this fantastic (if flawed?) tool.


Or even worse: "Autopilot".


It's the other way around, though. Aircraft autopilot is a dumb thing that holds the stick for you, nothing else - you need to configure everything.


Ah yes, all of those pilots confused and tricked by a name, on a machine they were trained hundreds of hours to use.

That is surely an apt comparison.


Not sure what you mean - that's just what the word autopilot means. It doesn't mean the plane does everything by itself, it means you don't need to hold the stick.

If you look at additional options of a car and one of them is "autopilot" and the other one is "full self driving", which one is going to give you full self driving?


I didn't think Knight Rider would be prophetic.

https://www.youtube.com/watch?v=HHOTtoNHYO0


Car succeeds at illegal self-driving, but fails to negotiate with police.

If I were a runaway autonomous vehicle, I wouldn't stop for the police either!


Imagine if instead he had been using a car without autopilot and had killed himself and/or others. That's how these "falling asleep at the wheel" stories usually go.


That's assuming he would have fallen asleep without autopilot.


Happens all the time, those split-second sleeps that destroy people's lives.


He certainly could have, but he also might not have. If he was actively driving he may have remained more alert, or he may have chosen to pull over. It's a counterfactual so we don't really know for sure either way.


Yeah, sure. I guess you can say that a driver who hadn't slept the night before and was super stressed out about work could've fallen asleep while driving on a monotonous and boring highway, but he also might not have. Yet, at the end of the day, it's far more likely that he would've fallen asleep either way.

I feel like this distracts from the fact that the self-driving feature was probably safer for everyone involved than falling asleep in a manually driven car.


In general, I think people are more likely to fall asleep when supervising an autonomous machine than when operating a manual machine. However, falling asleep when operating any machine is obviously possible, and it may be that this driver was so tired that he would have slept in either scenario. We don't know.


That's not an interesting story though. We always had tired/drunk/reckless drivers, and we accept having death and injury due to those. But we don't accept death/injury due to poor FSD, regardless of whether on average it is a better driver. There are several reasons for this: a) self-driving cars don't have a personal interest or "skin in the game" in avoiding accidents; only business reasons are involved. b) It's new tech. We accept old risks, but not new ones. Just like we'd never accept the consumption of alcohol or tobacco if it were invented today, regardless of how dangerous they are compared to other substances. They are bad enough to be banned as new tech, no doubt.

So self-driving cars need to be not only safer than human drivers for acceptance, but significantly safer. They might be, especially further into the future when the number of human drivers is smaller and there are good protocols for collaboration among self-driving vehicles. But while they are a novelty, the stories will look like this, and for good reason.


> We always had tired/drunk/reckless drivers, and we accept having death and injury due to those

FWIW, as an individual I don't accept those things. If I had my way, we'd have more effective solutions to those problems.

But my views on the issue are apparently not widespread enough to change the status quo.


"Accept" was used in the loosest sense: We still drive, knowing that it can happen. We might petition for safer roads or stricter alcohol limits, but we don't suggest banning driving.


This is both a very encouraging and very discouraging news story about AI.


Imagine if instead the driver had to concentrate on something like, possibly, the road, and then not fall asleep out of boredom. IMO this is one of the bigger issues with these kinds of systems: they bore the driver while still forcing him to take responsibility.


When I was in college, sometimes I foolishly drove between my parents' house and school when I was way too tired to drive.

It's possible that the only reason I never crashed was pure luck.

I only bring this up to say that needing to focus on the road might not be enough to keep a driver alert.


You need to speed and flip off other drivers in order to get the adrenaline pumping.

"Officer, it was for the best!"


> Imagine if instead the driver had to concentrate on something like, possibly, the road, and then not fall asleep of boredom.

In monotonous structures (e.g. long alleyways, long straight roads through nowhere) humans already have that problem - boredom tires them out.


Just like seatbelts and backup cameras, autonomous driving systems should be mandatory in vehicles within the next 10 years.


Some features are better than others...

I find adaptive cruise and reversing cameras really useful, whereas I find lane assistance features quite dangerous. It's exactly what you want while going around a corner - the steering wheel jumping under your hands, and making you jump/panic.

Although - on the other hand, there's nothing quite like having to fight the car for control of the steering wheel to solve drowsiness...

Emergency braking is another one that I tend to find is almost entirely false positives. Overtaking a wagon on the motorway? Let's trigger the emergency brake! Steep angle in the multi-story car park? Let's trigger the emergency brake!


Seatbelts and backup cameras existed for decades before they became mandatory. Fully autonomous driving systems do not exist and are not likely to for many years.


Did they though?

The three-point seatbelt was invented in 1959 and became mandatory in the United States in 1966.


Three-point seat belts were not mandatory in 1966. The first law was in 1968 and it required only lap and shoulder seat belt assemblies in each front outboard seating position. Other seating positions could have lap or lap and shoulder seat belt assemblies. https://web.archive.org/web/20140529033515/http://www.nhtsa....

And that was only required as part of the standard equipment. People were not required to use them. https://en.wikipedia.org/wiki/Seat_belt_laws_in_the_United_S... says "Seat belt use was voluntary until New York became the first state to require vehicle occupants to wear seat belts, as of December 1, 1984."

Nor were all passengers required to use three-point seat belts. https://www.consumerreports.org/car-safety/anatomy-of-the-mo... says it wasn't until 1989 that "Cars are required to have three-point lap-and-shoulder belts in the outboard rear seats."

From 1959 to 1989 is 30 years.

FWIW, https://en.wikipedia.org/wiki/Seat_belt notes:

> The first modern three-point seat belt (the so-called CIR-Griswold restraint) commonly used in consumer vehicles was patented in 1955 U.S. patent 2,710,649 by the Americans Roger W. Griswold and Hugh DeHaven.

before later saying:

> The three-point seat belt was developed to its modern form by Swedish inventor Nils Bohlin for Volvo, which introduced it in 1959 as standard equipment.

The 1955 US patent can be seen at https://patents.google.com/patent/US2710649A/en .


Lap belts existed well before that


In the EU a lot of driving assistance systems are already mandatory for new cars today [1].

[1] https://ec.europa.eu/docsroom/documents/50774/attachments/2/... [pdf]


While it's a step in the right direction and some of the pieces are coming into place, there's still a long way to go toward a holistic solution.


It will take longer than 10 years to perfect FSD. We are in the infancy of self driving cars.


Agreed, but I don't think it needs to be perfect to be useful enough to be mandated. I would suggest a mechanism whereby a car can identify whether its driver is no longer in proper control of the vehicle (e.g. asleep) and then assess the road conditions; if it isn't confident in its ability to continue self-driving to the designated destination, it would self-drive into a safe stop. I think that on its own could massively reduce road accidents.
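The proposed mechanism - detect loss of driver control, assess conditions, then either continue or make a safe stop - can be sketched as a tiny decision function. Everything here (the inputs, the action names, the priority order) is a hypothetical illustration, not a description of any shipping system:

```python
def fallback_action(driver_responsive: bool,
                    confident_to_destination: bool,
                    shoulder_available: bool) -> str:
    """Pick a fallback behavior for a hypothetical driver-monitoring system.

    Priority: normal driving if the driver responds; otherwise continue
    autonomously only if confident; otherwise prefer a safe pull-over
    to stopping in a live lane.
    """
    if driver_responsive:
        return "normal_driving"
    if confident_to_destination:
        return "continue_to_destination"
    if shoulder_available:
        return "pull_over_and_stop"
    # Last resort: stay in lane, slow down, hazards on.
    return "slow_in_lane_with_hazards"
```

The design choice the thread is arguing about is exactly the ordering of the last two branches.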


In 2015 I went for dinner with a guy who worked on one of the big self driving projects, and he assured me that we'd all have FSD by 2018.


You should watch people driving with 12.3.5 FSD. Plenty of videos on YouTube. I'd say it's already safer than an average human driver.


This is about the dozenth Tesla software update I have heard this about, and so far none of them have lived up to the hype. Youtube videos are way too easy to cherry pick and/or fake for this kind of analysis.


Well, if you refuse to learn, we can't help you.


10 years is a long time. Ten years ago we barely had the first functional iPhone.


No. The iPhone launched in '07, the app store in '08, and the iPhone 4 in '10. Ten years ago was '14.


The iPhone launched 17 years ago. The correct rough approximation is now "20 years ago" :)


In the field of AI, AGI has always been 20 years away since the '70s and that continues to hold true today.


Brace yourself for longer. I used to think 2030 would be the year common FSD started. Then I moved it to 2035. Then I realized I'll probably retire before it happens, and it became a largely irrelevant topic for me.


Why irrelevant? Having access to a robust self-driving car when I retire sounds like it would be really useful to me.


> Ten years ago we barely had the first functional iPhone.

The iPhone 6 was released almost ten years ago...


The current iPhone isn't all that different; I guess now we have copy/paste, tap to pay, and encrypted drives.


My opinion is 5 years to "perfect". Qualified because nothing is perfect. I tell people that in 10 years, the insurance companies will charge 2x more to drive a car yourself. And in 20 years, you just won't be allowed to drive.


> Researchers in the 1960s and the 1970s were convinced that their methods would eventually succeed in creating a machine with general intelligence and considered this the goal of their field.[269] Herbert Simon predicted, "machines will be capable, within twenty years, of doing any work a man can do".[270] Marvin Minsky agreed, writing, "within a generation ... the problem of creating 'artificial intelligence' will substantially be solved".[271] They had, however, underestimated the difficulty of the problem.


Your timeline makes no sense.

Cars last 25+ years, so ~20% of people will be driving at least 20 years after self driving gets perfected and then standardized.

Insurance companies raise and lower their premiums in response to risk. You might get a discount for self driving car eventually, but people who keep driving will just keep paying the same reasonably affordable rates as they do now.


> Cars last 25+ years, so ~20% of people will be driving at least 20 years after self driving gets perfected and then standardized.

That's a bit of a leap. If and when FSD becomes sufficiently reliable to enable you to legally relinquish control to it, including have it take you anywhere even when you're intoxicated, without any personal liability, I expect adoption to ramp up very quickly.

Also, depending on what technical solution ends up winning this race, it might be possible to convert existing old cars to FSD.


> including have it take you anywhere even when you're intoxicated, without any personal liability

People get arrested for sleeping in their cars while drunk. It’s going to be a long time before a legal self driving exception exists for drunk people.

People who have 20 year old cars generally can’t afford a new one and installing a self driving kit isn’t going to be cheap or worth it on a car that old.


> If and when FSD becomes sufficiently reliable [...] I expect adoption to ramp up very quickly.

There's basically zero chance of that. Working people are not going to buy very expensive new cars just because they have a working FSD feature. A handful (relative to the population) of rich people will, but that's it.

For FSD to be everywhere, first it needs to become reliable, then it needs to trickle down into every Toyota Corolla, and then it needs to trickle down into the used market of people who won't buy anything newer than 10+ years old.

> take you anywhere even when you're intoxicated, without any personal liability

That probably won't happen in the next 50 years, regardless of technical advances. The law will refuse to catch up.


I don't think most people buy cars for new features like that - most of America doesn't have that luxury. I'm also unsure as to conversion - many people can't/won't buy winter tires in environments that could use it, and there's no way I could imagine that a FSD conversion would be that cheap.


"FSD conversion" - sounds so cyberpunk


> Insurance companies raise and lower their premiums in response to risk.

So we're saying the same thing


You suggested the difference in total risk is 50%, not just a drop in rates.

Self-driving cars can't impact the risk of someone bashing the windshield to get stuff out, hail or a tree falling on your car, it getting stolen, an uninsured motorist rear-ending you at a stoplight, etc. But it will mean replacing your car is more expensive should it get stolen, etc.

Further, if you assume more self-driving cars are on the road, the human drivers' risks also drop.


> Most insurance risk and cost is for liability

Not on average, and per person it's highly dependent on the policy, a person's driving history, their car, etc. Minimum-liability coverage averages $627/year; the average policy runs $2,008/year.

In terms of individual drivers: for an antique show car driven very limited miles per year, the policy's cost has little to do with the driver's accident history. Meanwhile, someone with multiple DUIs is paying for the risks associated with their behavior.


It may happen but your timeframe is too short. A lot of people can't even afford cars that are less than 10 years old.


They'd be taking robotaxis


