
Pressure On Self-Driving Car Companies To Ensure Safe Use

By Alexander Hunkin

(Image: Wikimedia)

In the last couple of years, self-driving cars have become a huge trend, but a lot of questions have been raised about their safety on the road. The companies behind these vehicles are working toward making them safer. Still, the fact is that it is difficult to meet the crucial regulations that fall under the Federal Motor Vehicle Safety Standards (FMVSS), especially for more advanced designs.

Safety cases

In truth, there is a tension between the effort to build safe self-driving cars and the urge to push them onto the market, and in some cases that rush has contributed to accidents. Put simply, if you try to rush progress, you are bound to run into safety issues. Safety cases look like a solid solution to this: they keep development from outpacing safety measures by requiring the company to write down why the system is safe. A safety case consists of a safety goal, a strategy for meeting it, and the evidence that the strategy works in practice.
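
To make that structure concrete, a safety case can be thought of as a simple record with those three parts. The sketch below is only an illustration in Python; the class, field names, and example entries are hypothetical and not taken from any standard or company document.

```python
from dataclasses import dataclass, field

@dataclass
class SafetyCase:
    """Minimal sketch of a safety case: a goal, a strategy, and evidence."""
    goal: str        # the hazard the system must handle safely
    strategy: str    # how the system is meant to meet that goal
    evidence: list = field(default_factory=list)  # tests/results showing the strategy works

# Purely hypothetical example record.
lane_keeping = SafetyCase(
    goal="Keep the vehicle in its lane on highways",
    strategy="Fuse camera lane detection with map data; alert the driver if confidence drops",
    evidence=["Closed-track lane-keeping tests", "Simulation runs across weather conditions"],
)
```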

For example, one major safety goal is monitoring tire pressure so that a blowout doesn't happen. The company can then come up with a proper safety strategy, such as making sure the car adjusts its following distance and speed limits according to the state of the tires and, ideally, pulls off the road and waits for service. The safety case would then need to specify tests that prove the strategy really works.
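
As a very rough sketch of what the strategy part of such a case might look like in software, here is an illustrative Python fragment that maps tire pressure to a driving mode. The thresholds, mode names, and numbers are assumptions made up for this example, not any vehicle's actual logic.

```python
# Hypothetical degraded-mode logic driven by tire pressure.
# All thresholds and numbers below are made-up assumptions for illustration.

LOW_PRESSURE_PSI = 28.0       # below this, limit speed and increase following distance
CRITICAL_PRESSURE_PSI = 22.0  # below this, pull off the road and wait for service

def plan_response(tire_pressures_psi):
    """Choose a driving mode based on the lowest measured tire pressure."""
    lowest = min(tire_pressures_psi)

    if lowest <= CRITICAL_PRESSURE_PSI:
        return {"mode": "pull_over", "max_speed_mph": 0}
    if lowest <= LOW_PRESSURE_PSI:
        return {"mode": "degraded", "max_speed_mph": 45, "following_gap_s": 4.0}
    return {"mode": "normal", "max_speed_mph": 70, "following_gap_s": 2.0}

# One underinflated tire triggers the degraded mode.
print(plan_response([34.5, 27.0, 35.1, 34.8]))
```

In a real safety case, each branch of logic like this would be backed by the tests and evidence the case promises.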

A safety case can be evaluated up front, or revisited as feedback after an unfortunate accident. Either way, it would be wise to make the cases public, because then people would understand how the cars are being tested for safety, rather than just reading about how an autonomous car's complex algorithms work.

The accidents

As previously mentioned, several accidents have happened which have spurred heavy discussion about the safety of self-driving cars and the companies behind them.

The first case happened on May 7, 2016, when a driver turned on a Tesla Model S's Autopilot and drove at 74 mph down a highway near Williston, Florida. At one point, a tractor-trailer crossed from an intersecting road. Even though Tesla's Autopilot has radar and cameras that scan the road, hold the lane, brake, speed up, and pass other cars without an issue, it failed to recognize the trailer. The investigation showed that Autopilot was designed to avoid hitting vehicles ahead of it, but was not prepared for a vehicle coming from an intersecting highway.

Another critical accident happened on March 18, 2018, when an Uber self-driving test car driving down a road near Tempe, Arizona, hit a 49-year-old woman who was crossing the street with her bicycle. The car had a safety driver in it, but footage showed that they were not watching the road.

Who is to blame?

Looking at cases like these, the big question is who is to blame for the accidents. In the first case, the deceased driver was held responsible. In the case of the Uber vehicle, many tried to blame both the victim and the safety driver, but it was determined that Uber had rushed the release of its self-driving cars without a clear safety goal that would fully cover such situations.

The discussion about who had been negligent, and to what degree, led to the conclusion that it has not really been determined who or what is liable when such a situation happens. For now, the law basically says that people cannot rely completely on the car itself: they must always mind the road and be prepared to take control if necessary.

However, it has also been argued that the law needs to be updated because there are various degrees of car automation. If the car still depends heavily on the driver, then the driver is the one who takes responsibility. With the emergence of fully automated cars, however, it has been proposed that the system itself be treated as the driver, and therefore be liable for anything that happens.

Companies behind self-driving cars therefore need not only to comply with existing safety standards, but also to ensure the safety of fully automated cars. After all, this technology is still being developed, and the fact remains that self-driving vehicles aren't perfect. Furthermore, to be able to properly present their case, every company that produces or rents automated vehicles should have legal counsel at hand, because, as discussed above, the legal side of the story is still quite complicated.

In summation

Self-driving cars are definitely a part of the future. However, the systems are still not perfect, and accidents tend to happen. Still, people must not forget that human error causes a huge number of accidents on a daily basis as well.

So, the goal of each company should be to keep perfecting its systems and to have legal backup in case something goes wrong. And, of course, they should all avoid rushing a release to market before the vehicle has been tested as thoroughly as the manufacturer can manage and covered by a safety case.