New York: Driverless cars are no longer a futuristic idea. They will arrive sooner or later, but an important ethical question about safety remains.
According to researchers at the Massachusetts Institute of Technology (MIT), driverless cars pose a social dilemma: should they hit a pedestrian, or swerve in a way that crashes the car and harms its passengers?
Although driverless cars are programmed with safety rules, the public is left wondering what the software would do under such circumstances.
People want to live in a world where driverless cars minimise casualties. At the same time, every person wants to own a car that will protect them at all costs.
Autonomous vehicles generally come programmed with their own set of safety rules, but the problem arises in scenarios where those rules cannot achieve what they are ultimately intended for.
A researcher said, “Suppose a driverless car must either hit a pedestrian or swerve in such a way that it crashes and harms its passengers. What should it be instructed to do?”
Taken together, the study brings an important point to light: people remain conflicted about the prospect of having driverless cars on the road.
People want the car to act in a way that would save, say, ten people in its path, but they would not be entirely comfortable riding in such a vehicle when their own personal safety is at stake.
According to the study, 76 per cent believe it ethical for the cars to sacrifice one passenger for the greater number on the road, yet they would not actively want to be inside such a vehicle. Approval dropped by around 33 per cent when respondents were asked whether they would ride in such a programmed car, the study says.
Respondents were also generally opposed to the government regulating these self-driving cars, particularly regulations mandating utilitarian programming of the vehicles.
The study was published recently in the journal Science.