
Driverless travel


Recommended Posts

The Uber car was driving at 40 mph when visibility through its video camera was only TWO car lengths. It made NO effort to avoid the accident. In my opinion, if a human driver had been at the wheel, they would be facing potential IMPAIRED or INTOXICATED MANSLAUGHTER charges.

 

The role of automation was thrust into the spotlight after a driverless Uber car in the US killed a pedestrian, but experts told a panel discussion in Hong Kong that the technology will work out.

MTR Corporation chairman Frederick Ma Si-hang, who chaired the discussion, said automation would play a key role in his industry in tackling a shortage of train drivers as the network expanded.
“It’s increasingly difficult to hire young people to be train drivers. Our drivers are not allowed to carry iPhones on duty. These days, some of our young people do not like that. As a result, we have a problem hiring train drivers,” Ma said.

 

 

 

The subway system in Guangzhou's Flower City Square is also driverless. That's an entirely different issue from driverless cars.

 

Link to comment

I saw the video. Another article says Uber has the least advanced self-driving system and may have been pushing tests beyond its known ability.

 

This has to at least throw the usefulness of the human monitor into doubt. By the way, I thought navigation and the machine vision used the LIDAR system on the roof, not information from the video camera - but I could be wrong.

Link to comment

Quote
I saw the video. Another article says Uber has the least advanced self-driving system and may have been pushing tests beyond its known ability.

This has to at least throw the usefulness of the human monitor into doubt. By the way, I thought navigation and the machine vision used the LIDAR system on the roof, not information from the video camera - but I could be wrong.

 

 

The LIDAR was apparently non-functional - if not, we should be seeing some pictures from it. It should have detected the pedestrian, but failed to do so, up THROUGH the moment of impact.

 

EVERY installation with different cars, different hardware, more to the point - DIFFERENT SOFTWARE is different, much as every human driver is different. These things need regulation at LEAST equivalent to what we put human drivers through.

 

Too often, the reaction is that computers don't make mistakes or are somehow better equipped than humans to make these decisions.

 

There is discussion about what to do in an "unavoidable collision". Think about what that means - some kid is sitting at his desk writing software that, in a few years time, might decide that you are about to be in an "unavoidable collision", and choosing whether to run into a brick wall, or something that LOOKS like it MIGHT be a pedestrian. What could go wrong? And how many of us are willing to hand over our keys to this kid?

 

To me, it seems like a non-starter to have these things in general use - even with Tesla's "auto-pilot". It can only serve to allow the driver to divert his attention elsewhere, and ADD (my estimate) FIVE SECONDS to a human reaction time that needs to be well under ONE second.

 

I think there IS a niche for them, and that THAT is what they need to be working for, NOT taking over driving duties in everyone's cars.

Edited by Randy W (see edit history)
  • Like 1
Link to comment

We are now testing robotic software for our fulfillment process, up through entry into the accounting system. The outsourcers who own the software that the robot was developed with are absolutely stone-faced when it comes to listening to the issues. As Randy indicated, there are multiple software systems involved here, as well as hardware in this case. When the robot is doing its thing, it is "talking" with one kind of config and then with another, at the same rate. The other systems work at a different rate over time, since they are also handling other duties.

 

Over time, those other systems slow down, so that when the robot is looking for a certain cursor position, for instance, the system has not had time to paint the screen and a failure in input occurs. It's a mapping problem at that point. The problem cascades as newer virtual users try to continue operations - with no error checking by the robot to allow for a soft landing. What happens? Catastrophe. An outage. In the case of the Uber, a death. The scenario is a bit different, but the factors are similar.
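The failure mode described above can be sketched in a few lines. This is a minimal illustration with hypothetical names (`SlowScreen`, `naive_robot`, `checking_robot` are inventions for the example, not the actual software discussed): a robot that waits a fixed delay before typing fails when the target system slows down, while one that checks readiness with a timeout degrades gracefully.

```python
# Minimal sketch (hypothetical names) of why fixed-rate screen-scraping
# robots fail when the target system slows: the robot assumes the screen
# is painted after a fixed delay instead of checking that it actually is.

import time

class SlowScreen:
    """Stand-in for a UI that takes a while to paint."""
    def __init__(self, paint_time: float):
        self.paint_time = paint_time
        self._started = time.monotonic()

    def cursor_ready(self) -> bool:
        return time.monotonic() - self._started >= self.paint_time

def naive_robot(screen: SlowScreen, fixed_delay: float) -> bool:
    """Waits a fixed delay, then types blindly: fails if the paint is late."""
    time.sleep(fixed_delay)
    return screen.cursor_ready()

def checking_robot(screen: SlowScreen, timeout: float) -> bool:
    """Polls until the cursor is ready or times out: degrades gracefully."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if screen.cursor_ready():
            return True
        time.sleep(0.01)
    return False  # soft landing: report failure instead of corrupting input

print(naive_robot(SlowScreen(0.2), fixed_delay=0.05))  # False: input lost
print(checking_robot(SlowScreen(0.2), timeout=1.0))    # True: waited it out
```

The "soft landing" return value is the point: the checking robot reports the failure instead of cascading it to the next virtual user.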

Link to comment

Very true.

 

With multi-processor systems, it must by necessity be a distributed processing architecture. I deal with this all the time in the systems that I develop for Police, and it's imperative that the command and control architecture between systems be asynchronous, due to speed mismatches. Not only that, but the interrupt structure has to be configured so that it's very flexible in handling unexpected messages, or messages that arrive out of time. It can't be a polling scenario either, as that's resource-intensive; embedded systems can't afford to have the clock cycles tied up that way.

 

In the realm of error-checking, the structure gets out of control very easily unless it's mapped well beforehand. And I'm talking about a smaller embedded system, like a cop car video system. There are so many inputs - emergency lights, siren, brakes, accelerometers, radar, GPS, ignition, body-worn video activation, other trigger voltages, temperature, back-up battery systems, etc. - and they all need to be handled in a timely manner. If one of them is malfunctioning, or is giving a bogus input, it needs to be recognized and addressed right away. In the case of a car, that means it's time to stop and pull over. Unless, of course, the discrepancy isn't recognized or captured... or reacted to.
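The fail-safe idea above can be sketched as a plausibility check per input: any reading outside its physically sensible range escalates to a safe action instead of being silently consumed. The input names and limits here are invented for illustration.

```python
# Minimal sketch (hypothetical inputs and thresholds) of per-sensor
# plausibility checking with a fail-safe escalation path.

def plausible(name: str, value: float, limits: dict) -> bool:
    """True if the reading falls inside its physically sensible range."""
    lo, hi = limits[name]
    return lo <= value <= hi

LIMITS = {
    "speed_mps": (0.0, 60.0),       # ~135 mph ceiling
    "battery_v": (11.0, 15.0),      # 12 V system under charge
    "cabin_temp_c": (-30.0, 60.0),
}

def check_inputs(readings: dict) -> str:
    """Escalate the first implausible reading to a safe-stop action."""
    for name, value in readings.items():
        if not plausible(name, value, LIMITS):
            return f"SAFE_STOP: {name}={value} out of range"
    return "OK"

print(check_inputs({"speed_mps": 25.0, "battery_v": 13.8, "cabin_temp_c": 21.0}))
print(check_inputs({"speed_mps": 25.0, "battery_v": 3.2, "cabin_temp_c": 21.0}))
```

A real system would also cross-check sensors against each other and rate-limit readings, but the principle is the same: a bogus input triggers a defined safe behavior, never silence.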

 

When you get into a much bigger and interactive embedded system, the variables can easily spiral out of control. My hat is off to the guys who are developing these systems, but it's clear that the architecture isn't quite there yet.

Link to comment
  • 3 weeks later...

In the China Daily

 

My own opinion is that we need MUCH more stringent testing - the equivalent of a driver's exam for these vehicles. Include a VISION/visibility test to see what it can recognize AND in what level of detail, and an OBSTACLE course that is NOT pre-defined.

 

By using an Autopilot function, you may as well add FIVE SECONDS in an emergency to a reaction time that needs to be well under ONE SECOND.
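To put the post's figures in distance terms (the five-second penalty is the post's own estimate, not a measured value):

```python
# Illustrative arithmetic only: distance covered before the driver even
# touches the brake, at 40 mph, for an alert vs. a distracted driver.
# The 5-second added delay is the post's estimate, not a measured figure.

MPH_TO_MS = 0.44704  # metres per second per mile per hour

def distance_travelled(speed_mph: float, reaction_s: float) -> float:
    """Metres travelled during the reaction time alone."""
    return speed_mph * MPH_TO_MS * reaction_s

alert = distance_travelled(40, 1.0)       # attentive driver, ~1 s reaction
distracted = distance_travelled(40, 6.0)  # 1 s plus the post's added 5 s

print(f"alert: {alert:.0f} m, distracted: {distracted:.0f} m")
```

At 40 mph, each extra second of reaction time is roughly another 18 metres of travel before braking even begins.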

 

After 50 years of development, general speech recognition software is still somewhere not much above 80% accuracy.

 

Guidelines to ensure safe self-driving vehicle tests

According to the regulation, which will take effect on May 1, test vehicles should be able to switch between self-driving and conventional driving in order to ensure the test driver can quickly take over in case of a malfunction.

Moreover, test applicants must be independent legal entities registered in China, and have to first complete tests in designated closed zones before conducting road tests.

 

Link to comment

This accident occurred in Tempe, not far from where I live. The news had a special on it and showed the complete film, including the face of the driver. She was momentarily diverted just long enough to allow a jaywalking woman walking her bike across the street at night to be hit by a car traveling within the speed limit. Nothing can take into account a pedestrian operating under diminished capacity.

 

When I saw the playback, I realized that if the robot had stopped in time to avoid the accident (assuming a brake lockup and enough stopping time), the driver would have had cuts from the seat belt on the lap and chest. I think the designers of these cars have the view that the passengers can ride in comfort while the machines do the work. Unfortunately, if the car really does its job of avoiding crashes, the passengers will have to wear even more safety equipment to avoid injury. Injuries from airbags are already a problem.

 

I think the answer to this problem (and I am not a fan of driverless vehicles anyway) is to require a license for pedestrians. Just drive through the front lane of the average large grocery store. Pedestrians think that since they have the right of way (often they don't, if they are on the sidewalk and not in the crosswalk), the car is required to get out of the way. Well, it is - to avoid an accident. Watch the pedestrians. They don't even look. Well, pedestrian, meet reality.

Link to comment

Quote
This accident occurred in Tempe, not far from where I live. The news had a special on it and showed the complete film, including the face of the driver. She was momentarily diverted just long enough to allow a jay walking woman walking her bike across a street at night to get hit by a car traveling within the speed limit. Nothing can take into account a pedestrian operating under diminished capacity.

When I saw the playback, if the robot had stopped in time to avoid the accident (assuming a brake lockup and enough stopping time), the driver would have had cuts from the seat belts on the lap and chest. I think the designers of these cars have the view that the passengers can drive in comfort while the machines do the work. Unfortunately, if the car really does its job to avoid crashes, the passengers will have to wear even more safety equipment to avoid injury. Injuries from blow up bags are already a problem.

I think the answer to this problem anyway, and I am not a fan of driverless vehicles, is to require a license for pedestrians. Just drive through the front lane of the average large grocery store. Pedestrians think that since they have the right of way (often they don't if they are on the sidewalk and not the crosswalk) that the car is required to get out of the way. Well, it is to avoid an accident. Watch the pedestrians. They don't even look. Well, pedestrian, meet reality.

 

 

Back in the days prior to anti-lock brakes, the police would have looked for skid marks. As it is, the auto-pilot did not detect the pedestrian at all, and did not even slow down, or change its course. The pedestrian was about 50 meters directly in front of the vehicle. I don't know, but I would guess the auto-pilot continued without reaction even after the collision.
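A rough check of the 50-meter figure supports this. Using the thread's 40 mph speed and an assumed emergency deceleration of 7 m/s² (a typical dry-pavement figure, not from the source), a braking response at 50 m out would have stopped the car with room to spare:

```python
# Rough check using the thread's figures (40 mph, pedestrian ~50 m ahead)
# and an ASSUMED 7 m/s^2 emergency deceleration (typical dry pavement;
# not a figure from the source).

MPH_TO_MS = 0.44704

def stopping_distance(speed_mph: float, reaction_s: float, decel: float) -> float:
    """Reaction distance plus braking distance v^2 / (2a), in metres."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v * v / (2 * decel)

d = stopping_distance(40, reaction_s=1.0, decel=7.0)
print(f"{d:.0f} m needed, ~50 m available")
```

Even allowing a full second of reaction time, the stop completes in roughly 41 metres, so a system that detected the pedestrian at 50 metres had enough distance to avoid the collision entirely.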

 

It may have been "unavoidable", it most likely WOULD have been determined as the fault of the pedestrian, but the fact that the auto-pilot didn't detect the presence of the pedestrian is a MAJOR failure, in my opinion.

 

SOMEONE needs to set up a non-predetermined obstacle course to demonstrate how EASY it is to trigger a major accident.

 

I'm a believer that autonomous vehicles will have a niche, but to put the auto-pilot function in the hands of ordinary drivers is not it.

 

In a similar accident in Houston with no auto-pilot involved, the driver wasn't even issued a citation when he ran over a bicyclist on a feeder road.

  • Like 1
Link to comment

Now THIS, in my opinion, is a valid use for the auto-pilot technology - 15 km/h.

 

In the Sixth Tone

 

 

Alibaba’s Logistics Arm Tests Delivery Drones

 


 

Quote
Dubbed the “G-Plus,” the drone travels at around 15 kilometers per hour and can plan routes in real time using built-in GPS. It recognizes and responds to traffic lights, and can detect oncoming vehicles from a distance of up to 100 meters. Cainiao says it hopes to be mass-producing the models by year’s end. (Image: Weibo)

 

Edited by Randy W (see edit history)
Link to comment

Maybe start by marking a "track" with paint or signs showing where the driverless cars will drive, so everyone knows to be alert for them there. People might pay more attention.

 

I know I slow down whenever I see a sign "Asian Market" as there are usually some distracted drivers in the area. (temporarily avoids mop slap) :)

Link to comment
  • 1 month later...

You drive the car. No, YOU drive the car.

 

No one's even bothered to come up with a driver's test for these things.

 

from Bloomberg

 

Uber Self-Driving Car in Crash Wasn't Programmed to Brake

 

 

Sensors on an Uber SUV being tested in Tempe detected the woman, who was crossing a street at night outside a crosswalk, eventually concluding “an emergency braking maneuver was needed to mitigate a collision,” the National Transportation Safety Board said in a preliminary report released Thursday.
But the system couldn’t activate the brakes, the NTSB said.
“According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior,” investigators said.
The responsibility for hitting the brakes was left with a safety driver, who sits behind the wheel monitoring the autonomous test vehicle. Even though Uber’s computers concluded the car would hit the pedestrian 1.3 seconds before impact, or about 82 feet (25 meters) away, it also didn’t alert the driver and she was looking away from the road at the time, according to NTSB.
The NTSB’s preliminary report raises multiple questions about the company’s autonomous system as well as the actions of the safety driver and the pedestrian felled in the crash. The pedestrian had drugs in her system and didn’t look for traffic, according to the NTSB. The report does not establish what caused the collision.
. . . On Wednesday, it permanently shut down its self-driving car program in Arizona, adding that it planned to restart testing in Pittsburgh this summer. That irked Pittsburgh's mayor, who said the company needed to make serious changes to its autonomous program before it could get back on the road.
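The NTSB's numbers can be worked through directly. Taking the quoted 1.3 seconds and 25 meters, and assuming a 7 m/s² emergency deceleration (a typical dry-pavement figure, not in the report), full braking at the moment of detection could not quite have stopped the car, but would have cut the impact speed to a small fraction:

```python
import math

# Working through the NTSB figures quoted above (1.3 s to impact, ~25 m out),
# with an ASSUMED 7 m/s^2 emergency deceleration (typical dry pavement;
# this value is not in the report).

t, d, a = 1.3, 25.0, 7.0
v = d / t                  # implied closing speed: ~19.2 m/s (~43 mph)
braking = v * v / (2 * a)  # distance needed to stop from that speed: ~26.4 m

# Full braking 25 m out cannot quite stop in time, but the residual
# impact speed is a small fraction of the original.
impact = math.sqrt(max(0.0, v * v - 2 * a * d))
print(f"speed {v:.1f} m/s, stop needs {braking:.1f} m, impact {impact:.1f} m/s")
```

Under these assumptions the residual impact speed is around 4.5 m/s (roughly 10 mph) instead of 43 mph, which is why disabling emergency braking under computer control is so hard to defend.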

 

Link to comment
  • 1 month later...
  • 3 weeks later...

from the Shanghaiist - https://www.facebook.com/shanghaiist/videos/10155473267146479/

 

This Chinese tech giant's latest gadget is... a bus

Chinese cities will soon get thousands of self-driving buses
Shanghaiist
20 hours ago ·
Quote
This is the 427 service to the future of transport. Read more: https://wef.ch/2N9ZOvg

 

 


Edited by Randy W (see edit history)
Link to comment
  • 2 years later...

from the WSJ

Trying to get these things to cover EVERY possible driving situation that a human driver might come across is a big, wasted effort, in my book.

Start out with the driverless delivery vehicles and other autonomous/drone vehicles that DON'T need to behave the way a human driver would. Restrict them to defined routes and VERY controlled speeds. That is, they need to be able to STOP when they come across a situation that they weren't pre-programmed to handle.

The goal of these autopilot functions is to behave like or BETTER than a human driver - NOT safety first.

Autopilot Draws Investigation
Accident is latest seen linked to the electric-car company’s driver-assistance feature

1b0509888246268ff0a54e0135e2b012

Quote

 

“For multiple years now, Tesla has been making available on its vehicles an active driving assistance system that does not put safety first,” said William Wallace, Consumer Reports’ manager of safety policy. 

Tesla’s Autopilot came in first for capability and performance in a 2020 Consumer Reports ranking of 17 advanced driver-assistance systems, receiving a score of 9 out of 10. But the system performed poorly in the category of keeping the driver engaged, earning a score of 3.

 

 

Edited by Randy W (see edit history)
Link to comment

The biggest issue seems to be mixing driverless cars with human-driven vehicles, which can, unfortunately, be almost infinitely unpredictable. Throw in high speeds, inclement weather, pedestrians, random obstructions like cones and fallen trees, cyclists, missing or faded road markings... and it only gets worse.

If slow-moving, widely spaced freight trains traveling solely on dedicated tracks are still run with two- or three-man crews, I don't see how driverless cars will be a thing in the next 10-20 years.

Link to comment
