Monday, January 21, 2019

Top Ways People Could Prank A Fully Driver-Attended Vehicle

Phoenix New Times ran an article entitled "Top 12 Ways People Could Prank a Fully Driverless Vehicle." While taking great care to explicitly NOT advocate that people do these things (I guess they have lawyers on retainer), Phoenix New Times makes the point that driverless car algorithms are fairly new, and that if you throw something at them that they don't expect, the algorithms could be fooled into acting incorrectly. (Of course, these are examples in which society agrees on what the correct behavior should be.)

[Image by Grendelkhan - own work, CC BY-SA 4.0]

So I read the 12 ways, and it hit me that while these could fool driverless cars, many of them could also fool driver-attended cars.

Examples:

Get in, but not out. Autonomous vehicles at intersections presumably have their doors locked, but when an autonomous taxi is changing passengers, that's an opportunity for an intruder to get inside. Maybe the person is a rude prankster, or maybe just drunk. Maybe the person tries to take over driving operations. What happens next?

It's easy to show that this isn't unique to autonomous vehicles - especially in South Africa.

A Toyota Corolla Quest, belonging to a Pakistan national who was hijacked at Jambila on the R38 between Barberton and Badplaas on Wednesday afternoon, was rediscovered in Johannesburg later that afternoon.

Unfortunately the R27 500 that was stolen from Ridwane Patel (30) has not yet been recovered and the suspects are still at large.

According to Capt Jabu Ndubane, police spokesman, Patel was hijacked by five men on Wednesday at around 10:00.


And Patel didn't have the benefit of a camera in his car or a Waymo employee waiting to assist.

Back to Phoenix New Times.

Getting punked. If no driver or riders are around, who will take the banana out of the tailpipe or help catch a prankster?

And what if a driver is around but doesn't know what's going on? The car could still stall.

OK, here's another one.

Alter road signs to fool computers, but not humans. University of Washington computer-security researcher Yoshi Kohno showed in 2017 that if you know the algorithms that help the computers in driverless cars process their detection data, the computers can be easily fooled. In a spooky demonstration of the potential weakness in self-driving cars, strips of black-and-white tape on a stop sign caused a lab-based autonomous system to see it as a 45-mph speed-limit sign.
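
For the technically curious, here's a minimal sketch of the general idea behind this kind of attack. To be clear, this is NOT the UW team's actual method (they crafted physical stickers); it's a generic gradient-based ("FGSM-style") perturbation written in Python/PyTorch, and the classifier, class indices, and variable names in the usage comment are hypothetical placeholders.

    # Minimal FGSM-style targeted attack sketch (illustrative only).
    import torch
    import torch.nn.functional as F

    def fgsm_targeted(model, image, target_class, epsilon=0.03):
        """Nudge `image` so `model` leans toward predicting `target_class`.

        model:        any torch.nn.Module image classifier (assumed pretrained)
        image:        tensor of shape (1, 3, H, W), values in [0, 1]
        target_class: the label the attacker wants the model to output
        epsilon:      maximum per-pixel change (small, so humans barely notice)
        """
        image = image.clone().detach().requires_grad_(True)
        logits = model(image)
        # Loss is low when the model already predicts the attacker's target...
        loss = F.cross_entropy(logits, torch.tensor([target_class]))
        loss.backward()
        # ...so step *against* the gradient to push the prediction toward it.
        adversarial = image - epsilon * image.grad.sign()
        return adversarial.clamp(0.0, 1.0).detach()

    # Hypothetical usage: if class 14 were "stop sign" and class 7 were
    # "speed limit 45," an attacker who knows the model could run
    #   fooled = fgsm_targeted(sign_classifier, stop_sign_image, target_class=7)
    # and the perturbed image may be misread by the model while still looking,
    # to a human, like an ordinary stop sign.

The point is the same one the researchers made: if you know the model, you can compute exactly which small changes will fool it.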

Yes, this is a wonderful, ingenious method that specifically targets the car's algorithms. But there's a much easier way to do this that will fool not only autonomous cars, but every car.

A vigilante has secretly been protecting parking spaces by creating fake road signs - for a zone that doesn't exist.

Around 18 of the realistic "Zone F" signs have been placed on lampposts in Bath, Somerset.

They are professionally made and put up on lamp posts but the local council has confirmed there is no Zone F.


And if the vigilante doesn't want to park, she could just as easily replace a real stop sign with a fake 45 mph sign and watch the fun begin.

And if you go through the entire list, many of the tricks aimed at autonomous cars could easily be applied to all cars.

Well, except one:

Hack them.

While you can hack computer software, you can't hack an actual driver.

Or can you?

What if the driver is instructed by a bozo instructor?

One spring afternoon, my daughter was scheduled for her second in-car lesson. Just as school let out, the driving instructor pulled up in a bus lane to collect her. He insisted that she get behind the wheel and directed her to proceed through the bus lanes.

Luckily, no one was killed (although the daughter was shaken). But if the girl hadn't been subsequently instructed by someone with a brain (her parent), what could have happened?
