
The first death resulting from a crash involving a self-driving car!!

observer

Well-Known Member
Moderator
Not surprised. I still refuse to believe you can program a computer to account for an infinite number of driving scenarios. If they can't account for making sure the roadway is clear more than 3 feet off the ground, then we are still a decade away from this being used for livery.
I'm not too sure the 3 foot height had anything to do with this accident.

The car may not "see" the trailer from a few feet away but it definitely should have seen it from 100, 200, 500 feet away where the three foot height should not have been a factor.

Apparently the Tesla comes with 12 ultrasonic sensors that "see" objects up to sixteen feet around the vehicle. It also has forward-facing cameras and radar units.

Even if the cameras couldn't distinguish the white trailer against the brightly lit sky, the radar units still should have detected it.
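A rough back-of-the-envelope check of what those distances would mean (the 65 mph speed is my assumption, not a figure from the article):

```python
# Rough time-to-obstacle arithmetic at an assumed highway speed of 65 mph
# (the speed is an assumption for illustration, not from the article).
speed_ft_per_s = 65 * 5280 / 3600  # 65 mph is about 95.3 ft/s

for detection_range_ft in (100, 200, 500):
    seconds = detection_range_ft / speed_ft_per_s
    print(f"Seen at {detection_range_ft} ft -> about {seconds:.1f} s to react and brake")

# Seen at 100 ft -> about 1.0 s to react and brake
# Seen at 200 ft -> about 2.1 s to react and brake
# Seen at 500 ft -> about 5.2 s to react and brake
```

Even at the shortest of those distances there is around a second to act, and at 500 feet there are several.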
 

tohunt4me

Well-Known Member
It's gonna get implemented, but people like control, so it's just gonna fail in a couple of years. I'm already seeing people going back to taxis because of privacy issues and fares that turn out bigger than they expected, including me. Taxi drivers in Boston are already using Priuses.
I wouldn't be so sure.
Demographics.
America has an aging population.
Industry wants shipping by truck without drivers.
Shipping lines want crewless ships.
It will advance.
 

painfreepc

Well-Known Member
Always follow manufacturer instructions, especially when driving a two-ton deadly machine.

When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
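That escalation (periodic hands-on checks, then visual and audible alerts, then a gradual slowdown) is essentially a small state machine. A minimal sketch of the general idea, with invented thresholds; this is not Tesla's actual logic:

```python
# Minimal sketch of the escalation described above: periodic hands-on checks,
# then alerts, then a gradual slowdown. The thresholds are invented for
# illustration; this is not Tesla's actual logic or timing.
ALERT_AFTER_S = 15      # hypothetical: alert after this long hands-off
SLOWDOWN_AFTER_S = 30   # hypothetical: start slowing after this long hands-off

def assist_action(hands_off_seconds: float) -> str:
    """Pick the driver-assist response for a given hands-off duration."""
    if hands_off_seconds >= SLOWDOWN_AFTER_S:
        return "gradually_slow_until_hands_detected"
    if hands_off_seconds >= ALERT_AFTER_S:
        return "visual_and_audible_alert"
    return "drive_normally"

print(assist_action(5))    # drive_normally
print(assist_action(20))   # visual_and_audible_alert
print(assist_action(45))   # gradually_slow_until_hands_detected
```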

He had already almost been killed for NOT PAYING ATTENTION WHILE DRIVING.

The Verge also notes that Tesla referred to Brown as a friend of the company, and that he recorded a somewhat viral video of his Model S nicknamed ‘Tessy’ having a near crash earlier. He said he had not been watching the road and that his car saved his life.

In this guy's case, he got a second chance and didn't learn from his first mistake.
So that's the way it's going to be: every time one of these cars kills somebody, it's somehow going to be the driver's fault.
But the fact is, this technology should not be out for the general public to begin with; it's not ready.
 

DriverX

Well-Known Member
  • Thread Starter
  • #45
I'm not too sure the 3 foot height had anything to do with this accident.

The car may not "see" the trailer from a few feet away but it definitely should have seen it from 100, 200, 500 feet away where the three foot height should not have been a factor.

Apparently the Tesla comes with 12 ultrasonic sensors that "see" objects up to sixteen feet around the vehicle. It also has forward-facing cameras and radar units.

Even if the cameras couldn't distinguish the white trailer against the brightly lit sky, the radar units still should have detected it.
Ever seen the radar display on a boat? Not very accurate. The military-grade stuff is probably better, but that ain't in a Tesla. I'd guess for far-away vision they rely mostly on the camera, tracking the lane lines, road edges, and things entering the frame. So maybe the truck was sitting there, not moving, with a brightly lit trailer that blended into the sky or something, unless it was at night, of course. Image recognition and tracking is still in its tween stage, so there will be lots of unexpected scenarios that don't get considered and tested for when applying it to autonomous driving.
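For what it's worth, a commonly cited failure mode for that era of forward radar (an assumption on my part, not something confirmed for this crash): the pipeline throws away returns that look stationary relative to the road so the car doesn't brake for signs and overpasses, and a trailer parked across the lane has exactly that signature. A toy illustration:

```python
# Toy sketch of stationary-return filtering in a forward radar pipeline.
# Assumption for illustration only: returns whose estimated ground speed is
# near zero get discarded so the car doesn't brake for signs and overpasses.
EGO_SPEED_MPH = 65.0  # assumed own speed

def ground_speed_mph(closing_speed_mph: float) -> float:
    """Target's speed along our travel axis; 0 means 'fixed object'."""
    return EGO_SPEED_MPH - closing_speed_mph

def keep_target(closing_speed_mph: float) -> bool:
    # Discard anything that looks like a fixed roadside or overhead object.
    return abs(ground_speed_mph(closing_speed_mph)) > 2.0  # mph threshold

# A trailer sitting across the lane closes at our own speed, so its ground
# speed along our axis is ~0 -- the same signature as an overhead sign.
print(keep_target(65.0))  # False: filtered out like a sign or bridge
print(keep_target(40.0))  # True: a slower-moving car ahead is kept
```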

Bottom line, this stuff is beta at best, and you'd have to be nuts to trust your life to a beta release unless you're getting paid a lot to do it.
 

DriverX

Well-Known Member
  • Thread Starter
  • #46
There are going to have to be some really huge leaps forward in AI programming and sensory technology to make this even close to viable. Anyone who really understands how complicated driving is and how unbelievably stupid the smartest AI is knows that self driving cars are pure science fiction. We will likely cure cancer and old age, build orbital shipyards, and achieve FTL communication before self driving cars could be smart enough to do something like drive in a city and not be a deathtrap.
unless they are limited to 15 mph
 

observer

Well-Known Member
Moderator
So that's the way it's going to be: every time one of these cars kills somebody, it's somehow going to be the driver's fault.
But the fact is, this technology should not be out for the general public to begin with; it's not ready.
I do think it was the driver's fault, but I agree the technology is not ready for consumer use. Some of us give computers and technology too much credit and put too much faith in them.
 

observer

Well-Known Member
Moderator
Ever seen the radar display on a boat? Not very accurate. The military-grade stuff is probably better, but that ain't in a Tesla. I'd guess for far-away vision they rely mostly on the camera, tracking the lane lines, road edges, and things entering the frame. So maybe the truck was sitting there, not moving, with a brightly lit trailer that blended into the sky or something, unless it was at night, of course. Image recognition and tracking is still in its tween stage, so there will be lots of unexpected scenarios that don't get considered and tested for when applying it to autonomous driving.

Bottom line, this stuff is beta at best, and you'd have to be nuts to trust your life to a beta release unless you're getting paid a lot to do it.
From the article:

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.
 

DriverX

Well-Known Member
  • Thread Starter
  • #49
From the article:

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.
Slipping that past the regulators must have taken one helluva lobby.
 

MattyMikey

Well-Known Member
I do think it was the driver's fault, but I agree the technology is not ready for consumer use. Some of us give computers and technology too much credit and put too much faith in them.
I disagree with you. Though the technology is not perfect, Autopilot has logged over 30% more miles per fatality than the US average for human drivers (and over 100% more than the worldwide average).

Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.

So, basing things on statistics and not emotions: if it is safer now, and we know it will only get safer as they make changes to avoid this happening again, I see no problem using it.
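Checking the quoted numbers at face value (one fatality is a very thin sample, so treat this as arithmetic, not proof):

```python
# Miles per fatality, using the figures quoted above at face value.
autopilot_miles = 130e6   # Autopilot miles so far, with one fatality
us_avg = 94e6             # US average miles per fatality
world_avg = 60e6          # worldwide average miles per fatality

print(f"vs US average:    {autopilot_miles / us_avg - 1:.0%} more miles per fatality")
print(f"vs world average: {autopilot_miles / world_avg - 1:.0%} more miles per fatality")
# vs US average:    38% more miles per fatality  (the "over 30%" claim)
# vs world average: 117% more miles per fatality (the "over 100%" claim)
```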

It should, however, be used as a tool and not completely replace the driver's override capability. At least at this time.
 

observer

Well-Known Member
Moderator
Slipping that past the regulators must have taken one helluva lobby.
I don't think Tesla believes this needed to be passed by regulators.

Since they require drivers to keep their hands on the steering wheel and stay in control, it's kind of an enhanced cruise control.
 

Djc

Well-Known Member
So if this driverless car myth ever happens 100 years from now... how would the computer handle this?

A 6-year-old runs out in front of the car. The only two options are to run over and kill the 6-year-old kid, or run into an oncoming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?
Technically, a proper autonomous car should be able to scenario-cycle in a split second just like a human driver and take the path of least damage, except the computer will have more accurate speed, geometry, and physics data (things the brain does automatically by learning from experience).

The problem is the computer would have to be programmed to recognize all types of objects and situations. E.g., what if the object that jumps in front is a moose and the other option is to slightly crash into the metal barrier on the side of the road (the car can squeeze between the moose and the barrier but will hit it along the way)? The car would need to know that going off the road was safe and that hitting the barrier is better than going head-on into a moose (knowing there is not enough room to brake).

Unfortunately, self-driving cars will always need a manual override for these situations until we can create an AI as smart as humans at learning, evolving, and making decisions, as it is impossible to hard-code all these types of scenarios into a program. Also, if a self-driving car kills someone on the street, who is liable for damages: you as the owner/driver, or the car manufacturer? And whose insurance premiums will go up?
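A toy sketch of that scenario-cycling idea: enumerate the candidate maneuvers, score each outcome, and pick the least bad one. Every object type, probability, and harm score below is invented, which is precisely the hard part the post is pointing at.

```python
# Toy "path of least damage" chooser. All numbers are invented for
# illustration; a real system would need calibrated probabilities and a
# defensible harm model, which is the hard (and contested) part.
scenarios = {
    # maneuver: (probability of collision, harm score if it happens)
    "brake_straight":          (0.9, 10.0),  # likely hit the moose head-on
    "swerve_into_barrier":     (1.0, 3.0),   # certain but survivable scrape
    "squeeze_moose_and_rail":  (0.4, 4.0),   # thread the gap, may clip either
}

def expected_harm(p_collision: float, harm: float) -> float:
    return p_collision * harm

for maneuver, (p, h) in scenarios.items():
    print(f"{maneuver:>24}: expected harm {expected_harm(p, h):.1f}")

best = min(scenarios, key=lambda m: expected_harm(*scenarios[m]))
print("choose:", best)  # -> squeeze_moose_and_rail (0.4 * 4.0 = 1.6)
```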
 

observer

Well-Known Member
Moderator
I disagree with you. Though the technology is not perfect, Autopilot has logged over 30% more miles per fatality than the US average for human drivers (and over 100% more than the worldwide average).

Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.

So, basing things on statistics and not emotions: if it is safer now, and we know it will only get safer as they make changes to avoid this happening again, I see no problem using it.

It should, however, be used as a tool and not completely replace the driver's override capability. At least at this time.
Even though it may be safer, I don't think it is prudent to just roll out a system like this without more extensive testing or finding some way to make absolutely sure that the driver is paying attention at all times.

All it will take is a couple more freak accidents like this and the public will become very wary of driverless vehicles.
 

Jbeck

Well-Known Member
With that guy dying in the self-driving Tesla in Florida today... I see reluctance among passengers to get into driverless cars in the future. More people will die, and so will self-driving cars... What do you think?
 

MattyMikey

Well-Known Member
Even though it may be safer, I don't think it is prudent to just roll out a system like this without more extensive testing or finding some way to make absolutely sure that the driver is paying attention at all times.

All it will take is a couple more freak accidents like this and the public will become very wary of driverless vehicles.
This is how you get the truly extensive testing. Though I will say I'm shocked they didn't notice this 3-foot-height glitch prior to road tests.

But no; if it is safer now and gets even safer through real use, I disagree, because it is actually saving lives now. It would be statistically more dangerous to stop its use.

So yes, I agree that the public may freak out, because their rationale is emotional and not statistical. If they truly looked at the numbers and took emotion out of the picture entirely, they shouldn't.

I do not disagree with you about having them add more safeguards as that would make it even safer.

I only disagree with saying it's not ready to be in the field now; it is. Make a few modifications and add safeguards now, and go back to racking up fatality-free mileage.

Now, if there is another fatality in the US and it happens within 60 million Autopilot miles, then we can revisit this topic, as my opinion would change.
 

observer

Well-Known Member
Moderator
I disagree with you. Though the technology is not perfect, Autopilot has logged over 30% more miles per fatality than the US average for human drivers (and over 100% more than the worldwide average).

Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.

So, basing things on statistics and not emotions: if it is safer now, and we know it will only get safer as they make changes to avoid this happening again, I see no problem using it.

It should, however, be used as a tool and not completely replace the driver's override capability. At least at this time.
BTW, if this technology was available today at a reasonable cost, I would use it. As it proved to be safer, I would trust it more and more. It will SOME DAY become the norm.
 

Zoplay

Member
BTW, if this technology was available today at a reasonable cost, I would use it. As it proved to be safer, I would trust it more and more. It will SOME DAY become the norm.
When will this technology be available in the market? Technology can make a big change, and nowadays it is advancing at a rapid speed.
 

MattyMikey

Well-Known Member
BTW, if this technology was available today at a reasonable cost, I would use it. As it proved to be safer, I would trust it more and more. It will SOME DAY become the norm.
Tesla is still my dream car and I would use Autopilot myself without question.

In Seattle there was an Uber Select driver whose passenger was saved from a nasty head-on collision.

So I'm sure this Autopilot has saved many, many lives, but unfortunately one person perished.

I know in the article with the Seattle driver, the car stopped before he even had a chance to react, meaning the driver's involvement is not always required.

If you haven't seen the video, I encourage you to watch it; it puts Autopilot in a different perspective:

http://www.cbsnews.com/news/teslas-autopilot-helps-seattle-uber-driver-avoid-car-crash/
 

OC Lady Uber Driver

Well-Known Member
The ride you take in a driverless vehicle will only be as safe as all of the situations programmers allow for. (In that same article, they talked about how Uber wants to use driverless cars in the future.)

The apply-brakes code is probably based on the distance from the front bumper to the other vehicle, not the distance from other points of the vehicle, like the windshield or the roof, which were sheared off as the car traveled under the semi-truck trailer, killing the human driver.
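If the apply-brakes logic really did key off a single bumper-height measurement (pure speculation on my part), the failure mode is easy to picture: an obstacle whose lowest edge sits above the sensor's line of sight registers as clear road. A toy sketch with invented heights:

```python
# Toy illustration: checking clearance at one height vs. the whole car
# profile. All heights are invented for the example.
TRAILER_BOTTOM_FT = 3.5  # lowest edge of the trailer above the road

# Points on the car that all have to pass under the obstacle.
car_profile_ft = {"bumper": 1.5, "hood": 3.0, "windshield": 4.5, "roof": 4.8}

def bumper_only_says_clear() -> bool:
    # A sensor looking out at bumper height sees nothing in its plane.
    return car_profile_ft["bumper"] < TRAILER_BOTTOM_FT

def whole_car_clears() -> bool:
    # The car only truly fits if its highest point passes under too.
    return max(car_profile_ft.values()) < TRAILER_BOTTOM_FT

print(bumper_only_says_clear())  # True  -> "road ahead looks clear"
print(whole_car_clears())        # False -> windshield and roof collide
```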
 