Tuesday, December 9, 2014

In the article "The Robot Car of Tomorrow May Just Be Programmed to Hit You," Patrick Lin discusses the autonomous cars expected to be released in 2016 and tries to determine whether or not they are an ethical solution. The topic is so complicated that even after reading the entire article, it still isn't clear whether he believes they are.

The first example he provides is a car that has to choose between hitting an SUV or a Mini Cooper. The SUV can absorb the impact more easily and less damage will be done, so if the car were programmed to make the smart decision, it would hit the SUV. The issue is that people might start buying less safe cars so that they don't get targeted, which would hurt companies that are known for their safety. Despite this, I feel like people would still buy safer cars, because if they get hit by a normal car they still need to be protected. It could also encourage people to ride bikes for short distances, which is better for the environment. I don't believe it will become an issue, and people will continue to buy SUVs even after autonomous cars are everywhere.

The next example he talks about is whether the car should hit a motorcycle rider with or without a helmet. He says that this would encourage people to take their helmets off when they ride. That scenario requires two motorcycles to be driving near each other, though, which doesn't happen very often. It is much more likely that a motorcycle will be driving next to a car, in which case a helmet makes the rider much safer. I don't think the cars will influence whether or not riders wear helmets.

Lin then proposes having the target chosen at random. I think this is a good idea, but it should only be used when the targets are similar. If the car has to choose between denting a bus and running over a biker, it should hit the bus; when it is deciding which of two comparable cars to hit, however, I think the choice should be random. He suggests that this would not be better than human driving, but I disagree. An autonomous car has a faster reaction time, so it is more likely to avoid the crash entirely, and it makes better decisions than a drunk driver would, so it is still better than a human driver.
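To make my point about randomness concrete, here is a rough sketch of the rule I have in mind. This is my own hypothetical illustration, not anything from Lin's article, and the harm scores and threshold are made-up numbers: the car takes the clearly less harmful target when there is one, and only falls back to a random choice when the estimated harms are roughly equal.

```python
import random

def choose_target(targets, similarity_threshold=0.1):
    """Pick which unavoidable target to hit.

    `targets` maps a target name to an estimated harm score
    (0 = harmless, 1 = worst case). The scores and the threshold
    are invented for illustration only.
    """
    lowest_harm = min(targets.values())
    # Treat any target whose harm is close to the minimum as a tie.
    candidates = [name for name, harm in targets.items()
                  if harm - lowest_harm <= similarity_threshold]
    # If one option is clearly less harmful, it is the only candidate;
    # otherwise the choice among the tied candidates is random.
    return random.choice(candidates)

# Bus vs. biker is not a tie, so the bus is always chosen.
print(choose_target({"bus": 0.2, "biker": 0.9}))
# Two similar cars: a random pick between them.
print(choose_target({"suv": 0.35, "sedan": 0.4}))
```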

After the article was published, someone commented asking whether the person inside the car or the dealership should be responsible for a crash. I feel like the driver should not be held responsible, because he did not cause the crash and shouldn't be penalized for an accident that was out of his hands. I also feel like the dealership should not be penalized unless something went wrong with the programming and the car hit a target it wasn't supposed to hit. Otherwise, the car did exactly what it was supposed to do, so if the accident was unavoidable, nobody should be penalized.

Overall, I feel like these cars are one hundred percent ethical. They will help save thousands of lives and prevent billions of dollars in damages. Based on these outcomes, I can conclude that the cars should be produced and that there is nothing ethically wrong with them.