Wednesday, December 19, 2018



Bilgin ÜNAL
Electrical-Electronics Engineer

According to the Turkish Language Association, a driver is the "one" who drives the car. This "one" means a person. In other words, we are talking about a living being who can see, hear and act, using eye-hand-leg coordination.

Our eyes let us see the world. Thanks to them, we see what is happening around us while we walk, run, talk and, of course, while we drive. Although each eye covers moving objects within about a 125-degree field, and both eyes together about 140 degrees, we can take in at most about 45 degrees at a glance. This angle decreases to 25-30 degrees as we get older, and to 10-20 degrees in the dark. Bleary eyes, a tired body, careless driving and the like all cause traffic accidents.

Traffic accidents cause loss of life and property. According to the Turkish Statistical Institute, 1,202,716 traffic accidents happened in Turkey in 2017. "Approximately 1.35 million people die each year as a result of road traffic crashes in the world" [5]. "The law assumes that we each know basic facts that are widely understood in the community. For example: kids might run into the street, drinking can impair you as a driver, snow and ice can be slippery, and train tracks can be dangerous. If you’re driving on the road, even if you don’t have a driver’s license, the law imputes everything a licensed driver should know to you." [1] Some 2.6% of the people in prison are there because of such accidents, which suggests that penalties alone do not prevent traffic accidents.

The driver is charged if he/she is responsible. But what if the car is "driverless"? I think the least responsible one would be the owner of the car; a person "travelling" in his autonomous car is no different from a person travelling in a bus. So who will be responsible? Designers, programmers, standards bodies, hackers? Here we can already see how crime-scene investigation will have to evolve. Such accidents will not be solvable with classical traffic-police methods; we will need cyber-traffic police in our lives, and we will need new laws.

“One of the basic questions is: How reliable is the cell network? What if there is no mobile network available? What if sensor(s) fail? Should there be redundancy for everything? Is there a threshold that determines when the car is reliable, e.g., when two out of four sensors fail?”[2]
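The redundancy question above ("when two out of four sensors fail") can be made concrete with a minimal sketch. Everything here is an illustrative assumption, not a real vehicle's logic: suppose a hypothetical car declares itself reliable only while at least some fixed number of its sensors still work.

```python
# Minimal illustration of an m-of-n sensor redundancy threshold.
# All names and numbers are hypothetical; real vehicles use far more
# sophisticated sensor-fusion and fault-detection logic.

def sensors_reliable(sensor_ok, required):
    """Return True if at least `required` sensors are still working."""
    working = sum(1 for ok in sensor_ok if ok)
    return working >= required

# Example: a car with four sensors that we (arbitrarily) declare
# reliable only while at least three of them work.
statuses = [True, True, False, True]   # one sensor has failed
print(sensors_reliable(statuses, required=3))   # True: 3 of 4 still work

statuses = [True, False, False, True]  # two sensors have failed
print(sensors_reliable(statuses, required=3))   # False: below threshold
```

Even this toy version shows why the question is hard: the threshold itself is a design choice, and someone has to take responsibility for choosing it.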

We will also need black boxes to understand how an accident happened and which data, equipment or decision failed. "In aircrafts “black boxes” are used to determine what happened after a crash. Should this be also a part of a self-driving car?" [2]
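To make the black-box idea tangible, here is a minimal sketch of an event recorder that keeps only the most recent events, the way a flight recorder retains a bounded history. The class name, event fields and capacity are hypothetical assumptions for illustration only.

```python
from collections import deque
import time

class BlackBox:
    """Hypothetical accident recorder: keeps the last N events in memory.

    Illustrative only; a real automotive recorder would write to
    crash-survivable, tamper-evident storage.
    """
    def __init__(self, capacity=1000):
        # deque with maxlen silently drops the oldest event when full
        self.events = deque(maxlen=capacity)

    def record(self, source, message):
        """Log one event: which subsystem said what, and when."""
        self.events.append((time.time(), source, message))

    def dump(self):
        """After an accident, investigators would read these records."""
        return list(self.events)

# Example: a tiny 3-event capacity so the oldest entry falls away.
box = BlackBox(capacity=3)
box.record("lidar", "object detected at 12 m")
box.record("brakes", "emergency braking engaged")
box.record("decision", "swerve rejected: pedestrian on left")
box.record("impact", "collision sensor triggered")
print(len(box.dump()))  # 3: capacity bounds the retained history
```

The point of the sketch is the forensic question from the article: if such a log exists, it records which data, equipment or decision failed — and becomes evidence in assigning responsibility.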

This brings us to five ethical questions on risk, safety and trust that we still need to answer [3]:

1. Which risks are worth taking?
2. Is the car making a choice, or are we?
3. Is there a moral code we can all agree on?
4. Or should we choose our own car's moral code?
5. Can we ever learn to trust our self-driving cars?

In fact, “the arrival of the self-driving car presents a challenging new dilemma: Whom should the vehicle save – and whom should it harm – when an accident is unavoidable?” [4]