• UberPeople.NET - Independent community of rideshare drivers.

The first death resulting from a crash involving a self-driving car!!

SafeT

Well-Known Member
So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?

A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?
 

LAuberX

Well-Known Member
Moderator
I thought you couldn't take your hands off the wheel while in autopilot, not to mention that they call it AUTOPILOT when clearly it isn't.

This will be a HUGE lawsuit that Tesla settles fast.
The story mentions a warning that comes up when you engage the autopilot, telling you to keep your hands on the wheel.
How about a warning to keep your attention OUTSIDE the car when driving!
 

LAuberX

Well-Known Member
Moderator
So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?

A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?
And here is the problem: they argue the cars are "safer" or "better drivers" than humans... I think it will be great for ambulance-chasing lawyers!
 

painfreepc

Well-Known Member
So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?

A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?
This question was asked on the George Noory show. Are you a George Noory listener? If you are, good for you.

This is a very serious question, and the people behind the technology are not answering it.

Your self-driving car will have to make this decision. Who will it make the decision in favor of: the driver, or the idiot who walked out in front of your car?
 

DriverX

Well-Known Member
  • Thread Starter
  • #25
And here is the problem: they argue the cars are "safer" or "better drivers" than humans... I think it will be great for ambulance-chasing lawyers!
This sort of thing is always the folly of people who think they are smarter than they are. I doubt the guys who coded it wanted to release it yet, but marketing always trumps common sense.
 

Bart McCoy

Well-Known Member
they call it AUTOPILOT when clearly it isn't.

This will be a HUGE lawsuit that Tesla settles fast.


edit
Mr. Brown apparently posted videos of himself riding in autopilot mode. “The car’s doing it all itself,’’ he said in one, smiling as he took his hands from the steering wheel.

DOH famous last youtubes

Yeah, they will probably file suit, even though the driver clearly was partially at fault. Somehow he didn't even see a big 18-wheeler, since no brakes were applied. Seems the family could sue the trucking company as well.

But let's say the computer realized a tractor trailer jumped into the road. What would the computer do:

1) Swerve to avoid it, but possibly cause another accident with somebody/something else?
2) Jam on the brakes and come to a complete stop on the highway?

How would Google's automated cars have saved the day??
 
Last edited:

Jermin8r89

Well-Known Member
It's gonna get implemented, but people like control, and it's just gonna fail in a couple of years. I'm already seeing people going back to taxis because of privacy issues, and because they get a bigger fare than they expected, including me. Taxi drivers in Boston are already using Priuses.
 

DriverX

Well-Known Member
  • Thread Starter
  • #29
So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?

A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?
This question was asked on the George Noory show. Are you a George Noory listener? If you are, good for you.

This is a very serious question, and the people behind the technology are not answering it.

Your self-driving car will have to make this decision. Who will it make the decision in favor of: the driver, or the idiot who walked out in front of your car?

George Noory kinda blows. Coast to Coast was better when it was mostly conspiracy stuff. All the bigfoot and ghost or psychic crap is boring as hell.

I digress though. I think the obvious answer to the hypothetical is that you program the car to brake and not swerve, regardless of whether or not it can stop in time. Swerving, while it could sometimes save a human who is in control of the swerve, would always be considered more dangerous in a binary system. Braking is really the only option for the machine to consider, and this is why these systems will never be better than humans. They will just be more organized, and potentially safer once all the other vehicles are controlled by the same system and the roads have been designed for autonomous vehicles. So that'll happen in like 40+ years.
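For what it's worth, the brake-first policy described above fits in a few lines. This is a toy sketch only, not any real vendor's logic; the function names, the fixed 8 m/s² deceleration, and the distances are all made up for illustration:

```python
# Toy sketch of a "brake, don't swerve" collision policy.
# All names and numbers are illustrative, not from any real system.

def stopping_distance_m(speed_mps: float, decel_mps2: float = 8.0) -> float:
    """Distance needed to brake to a stop at constant deceleration (v^2 / 2a)."""
    return speed_mps ** 2 / (2 * decel_mps2)

def respond(obstacle_distance_m: float, speed_mps: float) -> str:
    """Always brake in-lane; never swerve into unknown space."""
    if obstacle_distance_m > stopping_distance_m(speed_mps):
        return "brake_to_stop"   # enough room to stop in time
    return "brake_max"           # can't stop in time, but still don't swerve

print(respond(obstacle_distance_m=50.0, speed_mps=20.0))  # brake_to_stop (needs 25 m)
print(respond(obstacle_distance_m=20.0, speed_mps=20.0))  # brake_max
```

The point of the binary version is exactly what the post says: swerving needs a judgment about the unknown space you're swerving into, so a simple system only ever has one safe verb.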
 

ExpendableAsset

Active Member
There are going to have to be some really huge leaps forward in AI programming and sensory technology to make this even close to viable. Anyone who really understands how complicated driving is and how unbelievably stupid the smartest AI is knows that self driving cars are pure science fiction. We will likely cure cancer and old age, build orbital shipyards, and achieve FTL communication before self driving cars could be smart enough to do something like drive in a city and not be a deathtrap.
 

MattyMikey

Well-Known Member
Working in insurance, this is going to be a nightmare. The good thing is this is not even in the 10-year plan for big companies, so don't think drivers being replaced is going to happen anytime soon.

There would be multiple people responsible, so who would be primary? The driver, the car manufacturer, or the company that programmed the software?

With so many variables and multiple responsible parties, my guess is insurance would still be around. Then the insurance companies would start the subrogation process to seek reimbursement from the others that should be responsible. I would guess that vehicle owners would still pay for it.

I'm not looking forward to this when it comes. Luckily I'm going to retire in about 15-16 years and likely won't have to deal with it much, if at all.
 

Rat

Well-Known Member
I'm pretty sure if he was actually looking out the windshield he would have seen the tractor trailer in front of him.

the autopilot in high end cars is amazing... but you can't fall asleep or have your head down in your phone!
But that is exactly why people want them.
 

observer

Well-Known Member
Moderator
Why would you put autopilot into a car and then expect the customer to pay 100% attention to the road?
There are drivers out here now, driving normal cars, that don't pay 100% attention to the road.

Isn't the point of having autopilot so you don't have to pay 100% attention to the controls?

This technology is stupid; it's illogical and makes absolutely no sense whatsoever.

I first saw this car autopilot in one of the Arnold Schwarzenegger movies.
No, I'm not talking about Johnny Cab. It was that movie where he was cloned; he and his friend were sitting in the truck facing each other.
I was watching the movie saying to myself, there's no way in hell that's going to happen for real, no one's watching the road. And here we are in 2016, and it's happened for real. How ironic.
Always follow manufacturer instructions. Especially when driving a two-ton deadly machine.

When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
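The behavior in that quote (check for hands, alert, then gradually slow the car) is basically an escalation loop. A rough sketch of the idea, with invented timing thresholds since the quote doesn't give Tesla's actual values:

```python
# Rough sketch of the hands-on-wheel escalation described in the quote
# above. The 5 s / 10 s thresholds are invented for illustration.

def autopilot_step(hands_on: bool, seconds_hands_off: float) -> str:
    """Return the system action for one monitoring tick."""
    if hands_on:
        return "normal"          # driver is holding the wheel; reset
    if seconds_hands_off < 5:
        return "visual_alert"    # first nag on the instrument panel
    if seconds_hands_off < 10:
        return "audible_alert"   # escalate to chimes
    return "slow_down"           # gradually reduce speed until hands-on

print(autopilot_step(False, 12.0))  # slow_down
```

The whole design assumes the nags actually get the driver's eyes back on the road, which is exactly the assumption this crash calls into question.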

He had already almost been killed for NOT PAYING ATTENTION WHILE DRIVING.

The Verge also notes that Tesla referred to Brown as a friend of the company, and that he recorded a somewhat viral video of his Model S nicknamed ‘Tessy’ having a near crash earlier. He said he had not been watching the road and that his car saved his life.

In this guy's case, he got a second chance and didn't learn from his first mistake.
 
Last edited:

tohunt4me

Well-Known Member
And so it begins.

http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s

https://www.teslamotors.com/blog/tragic-loss

Clearly Tesla's QA dept. isn't up to the challenge of ensuring the safety of driverless vehicles. Why would it not occur to someone at Tesla that scanning the roadway more than 3 feet off the ground would be a vital safety requirement?

The first death in a series to come. I'd expect a recall soon, and I'm sticking to my guns on the advent of truly autonomous vehicles being about 40 years out.
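The "3 feet off the ground" complaint is essentially a vertical field-of-view problem: a forward sensor aimed low can report the lane "clear" while the beam passes underneath a high-clearance trailer. A toy geometry check of that idea (all mounting heights, angles, and ranges are illustrative, not real sensor specs):

```python
import math

# Toy check: can a forward sensor mounted mount_h meters up, sweeping up
# to max_elev_deg above horizontal, see an obstacle whose lowest point is
# obstacle_bottom_m off the ground at range_m ahead? Numbers illustrative.

def sees_obstacle(mount_h: float, max_elev_deg: float,
                  obstacle_bottom_m: float, range_m: float) -> bool:
    beam_top = mount_h + range_m * math.tan(math.radians(max_elev_deg))
    return beam_top >= obstacle_bottom_m

# Trailer floor ~1.2 m up; sensor at 0.5 m with a flat (0 degree) beam:
print(sees_obstacle(0.5, 0.0, 1.2, 30.0))  # False: beam passes underneath
print(sees_obstacle(0.5, 5.0, 1.2, 30.0))  # True: 5-degree sweep covers it
```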

Fear not, drivers, they will need us for quite some time to come.
I saw this on the news.
Can you imagine the horror of a car full of paying customers being DRIVEN INTO AN ACCIDENT ?

DRIVEN INTO DEATH BY A ROBOT ?

can you imagine this case before a jury ?

Not saying a human driver would have or could have done any better.

Just the thought of this, perishing under ROBOT SUPERVISION!
TERRIBLE.
 

tohunt4me

Well-Known Member
Some wild stuff here
Detecting the street as being "clear" because it didn't anticipate trailers with high road clearance
Well . . . . . now they know.

Congress will probably legislate skirting around 18-wheeler trailers for "safety".

Just like the airbags being recalled for killing people.

Just like the asbestos that was legislated for schools and hospitals.

Wal-Mart and a few other companies are already doing it.
It aids in fuel economy. Some designs will push cars out on a blind-spot left turn.
 

Attachments

Last edited:

tohunt4me

Well-Known Member
Ah yes, human error still blueprints itself in the rise to the thinking age.
It was human error that caused the accident.
Now they will address this issue.

(More children died between crib rails before improvements were made than the number of people who will die this way. More Pinto gas tanks exploded than driverless cars will hit trucks.)
 
Last edited:

MattyMikey

Well-Known Member
It was human error that caused the accident.
Now they will address this issue.
Exactly. Learn from these types of unfortunate events. This guy's death will make sure the future is safer. He may have been partly at fault in this, but luckily, he will make many more people safer in the future.

With the millions of miles Tesla Autopilot had driven before this first death, it is, statistically speaking, MUCH safer than human drivers.
 

tohunt4me

Well-Known Member
they call it AUTOPILOT when clearly it isn't.

This will be a HUGE lawsuit that Tesla settles fast.


edit
Mr. Brown apparently posted videos of himself riding in autopilot mode. “The car’s doing it all itself,’’ he said in one, smiling as he took his hands from the steering wheel.

DOH famous last youtubes
" Hold my beer"
"Watch This !"

Compare this guy to the X-plane pilots and the NASA-recruited test pilots.
Testing new systems is risky.
Learning is gained from success and Failure.
They learned a Big one out of this.
More will be lost.
More will be learned.
 
Last edited: