“They don’t drive like people. They drive like robots.”

Why self-driving cars get into accidents

#Driving

Tue, Oct 17th, 2017 11:00 by capnasty NEWS

According to the Seattle Times, self-driving cars get rear-ended because they drive in ways human drivers don't expect: while robots "obey the letter of the law," humans "violate the rules in a safe and principled way," causing problems when the two share the road.

What researchers have found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law — unlike humans.

Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy.
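
The mismatch is easy to picture as two competing decision rules at a stop sign. Below is a minimal, hypothetical Python sketch of that contrast; the Approach class, both policies, and every threshold are invented for illustration and are not taken from the article or from any real autonomous-driving stack.

    # Toy illustration (all names and numbers are assumptions): a rule-strict
    # "robot" policy versus a norm-following "human" policy at a stop sign.
    from dataclasses import dataclass

    @dataclass
    class Approach:
        speed_mph: float            # current speed approaching the sign
        cross_traffic_gap_s: float  # time gap to the nearest cross-traffic car

    def robot_policy(a: Approach) -> str:
        # Letter of the law: always come to a complete stop, then creep out
        # only once the gap is very comfortably large.
        if a.speed_mph > 0:
            return "brake to a complete stop"
        return "creep forward" if a.cross_traffic_gap_s > 6.0 else "wait"

    def human_policy(a: Approach) -> str:
        # Norm-following: a rolling stop, then go on a merely adequate gap,
        # which trailing drivers tend to anticipate.
        if a.speed_mph > 5:
            return "slow to a rolling stop"
        return "go" if a.cross_traffic_gap_s > 3.0 else "wait"

    if __name__ == "__main__":
        situation = Approach(speed_mph=0, cross_traffic_gap_s=4.0)
        # The robot waits where a human would already be moving -- the kind of
        # mismatch the article blames for rear-end collisions.
        print("robot:", robot_policy(situation))  # -> wait
        print("human:", human_policy(situation))  # -> go

In this toy setup, the robot's extra margin of caution is exactly what the driver behind it does not expect, which is the interaction developers like Iagnemma say needs smoothing out.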

