Maytinee Kramer / Staff Writer
opinion@fiusm.com
Google has gone above and beyond as a leader in the effort to create driverless cars. However, it has run into one obvious, yet odd, problem: humans. Google’s autonomous test cars are programmed to abide by the rules of the road, and though they take the most cautious approach, this can put them out of step with other vehicles.
Just last month, one of Google’s self-driving cars approached a crosswalk and, as it was supposed to do, slowed down to allow a pedestrian to cross, prompting its “safety driver” to apply the brakes. The pedestrian safely crossed the road, but Google’s car wasn’t so lucky: a human-driven sedan hit it from behind. In another test, back in 2009, a Google car could not get through a four-way stop because its sensors kept waiting for the other vehicles to come to a complete stop and let it go. The human drivers kept inching forward, looking for an advantage, leaving Google’s robot frozen in place.
This is one of the biggest challenges facing automated cars. As humans, we are taught to follow the rules of the road, but in reality we don’t actually drive by the book.
“The real problem is that the car is too safe,” says Donald Norman, director of the Design Lab at the University of California, San Diego. “They have to learn to be aggressive in the right amount and the right amount depends on the culture.”
Every day the roads see wrecks and deaths, yet human drivers remain aggressive and short-sighted. It’s common to see drivers on their phones, eating, talking or otherwise distracted, losing track of their surroundings. Others fall asleep behind the wheel or try to race another car at high speed.
Autonomous cars, unlike human drivers, don’t do any of these things.
Dmitri Dolgov, head of software for Google’s Self-Driving Car Project, said that one thing he’s learned from the project is that human drivers need to be “less idiotic.”
It seems as if humans and machines are an imperfect mix. Knowing this, Google is trying to smooth out the relationship between its software and human drivers, programming its cars to be a little more aggressive.
Though driverless cars may sound boring for those who like to drive, that doesn’t mean we can’t take a page out of Google’s book. The road is hazardous at all times, and drivers need to be safer and more aware. Rather than switching completely to driverless cars, maybe drivers should actually act as they are supposed to. Put down the phone, don’t rage and don’t speed. If humans could follow the rules of the road, driving would be much safer and there would be fewer accidents and deaths. In today’s world, somebody always has to be liable, so we should start by taking responsibility for ourselves.