Driverless Cars

Ethical Decision-Making for Driverless Autonomous Cars

Who decides that knocking a lamp post off its base is no big deal, but striking a human being is? Who will teach an autonomous car the ethical value of life?

Questions like these have dogged the makers of autonomous cars. They have been asked to explain how a car will decide, on the spot, which choice to make when facing a dilemma. According to reports, autonomous cars are being trained to make ethical decisions on the road just as a human would.

According to researchers from the University of Osnabruck in Germany, teaching a driverless car to be ethically equipped using algorithms is not as difficult as it might seem. How will this work?

In one study, participants were placed in virtual reality simulations of intense traffic scenarios. How they reacted on seeing an animal, a person or an inanimate object determined the priority assigned to each. These priorities are then encoded as values in the machine, so that an autonomous car can make a human-like decision on the road.
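As a rough sketch of the idea, the priorities derived from human reactions could be stored as numeric values and used to rank candidate maneuvers. The category names, numbers, and data format below are illustrative assumptions, not the researchers' actual model.

```python
# Hypothetical priority values (higher = more worth protecting),
# imagined as derived from how participants reacted in VR simulations.
PRIORITY = {
    "person": 10,
    "animal": 4,
    "inanimate_object": 1,  # e.g. a lamp post
}

def least_harmful_path(paths):
    """Pick the maneuver whose obstacles carry the lowest total priority.

    `paths` maps each candidate maneuver to the list of obstacle
    types it would strike (assumed format, for illustration only).
    """
    return min(paths, key=lambda p: sum(PRIORITY[o] for o in paths[p]))

choice = least_harmful_path({
    "stay_in_lane": ["person"],
    "swerve_right": ["inanimate_object"],
})
print(choice)  # -> swerve_right: hitting the lamp post outranks hitting a person
```

The point of the sketch is only that once human reactions are turned into numbers, "make a human decision" reduces to comparing those numbers across the available maneuvers.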

But there are situations where even a human is unsure whether to brake or accept getting hurt. What do you do if a dog suddenly runs in front of your vehicle: keep moving, or swerve and cause an accident that hurts someone else? Who is responsible for such decisions, and if something goes wrong, whom do we blame?

According to researchers at the Massachusetts Institute of Technology, "The algorithms that control [autonomous vehicles] will need to embed moral principles guiding their decisions in situations of unavoidable harm." In cases where the car must choose between swerving into pedestrians or harming its own passengers, the decision should be not to harm the pedestrians, even if that means hurting its own passengers. Lawmakers will have to face the obligation of writing laws for such situations.

The cars are well equipped with sensors, LIDAR and cameras to inspect road conditions and act on them. An autonomous car's assessment of the situation depends on the speed of the object that comes into its path. The cars must not only be ethically capable but also fast enough to calculate the danger and apply the result.
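A minimal sketch of that speed-dependent calculation: given the gap to a detected object and the closing speed, the car can check whether braking alone will stop it in time. The deceleration and latency figures here are illustrative assumptions, not specifications of any real vehicle.

```python
def braking_avoids_collision(distance_m, closing_speed_mps,
                             max_decel_mps2=8.0, reaction_s=0.1):
    """Compare required stopping distance against the gap to the obstacle.

    distance_m: gap to the object reported by LIDAR/cameras
    closing_speed_mps: relative speed toward the object
    """
    # Distance covered during sensing/processing latency...
    latency_dist = closing_speed_mps * reaction_s
    # ...plus the distance to brake to a stop: v^2 / (2a)
    braking_dist = closing_speed_mps ** 2 / (2 * max_decel_mps2)
    return latency_dist + braking_dist <= distance_m

print(braking_avoids_collision(30.0, 15.0))  # ~15.6 m needed, 30 m available -> True
print(braking_avoids_collision(10.0, 15.0))  # -> False: braking alone is not enough
```

Only when braking cannot avoid the collision does the harder ethical question of which way to swerve arise, which is why the check has to run fast.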

The problem doesn't end here. One more aspect comes into the picture: what if the controls are hacked by an anonymous attacker who takes over the car and uses it for his own ends? This could create yet another difficulty for maintaining road safety.

What can possibly be done?

This situation is no piece of cake. Whether a car is driven by a human or by software, its decision-making can fail at times. The best we can do is equip cars to make the most optimized decision: one that stays within the law while causing minimum damage to both property and human life.

If cars are programmed ethically, the risk of ending up in complex situations while depending on the car will be greatly reduced. Let's see what solutions come forth!
