Self-driving cars may seem attractive, and whether we like them or not, they are on their way to UK roads. Imagine, though, the thrill of one day being able to take the back seat and put your feet up while your car drives you to your destination. Could a commute fraught with traffic, perhaps taking up to three hours, suddenly become traffic-free and take only an hour?

The technicalities of self-driving cars

Not having to get behind the wheel on the way to work sounds appealing to many of us, especially during stressful times. For some, it would mean being able to catch up on work; for others, catching up on sleep after a night out on the town with friends. Whatever your reason, self-driving cars will be prone to error, and crashes are one of the risks.

We still have to be aware that nipping around on the road poses legal obligations, even in self-driving cars. Just who is to blame if one accidentally crashes? Will it be you, the car, the programmer or someone else? We all know that technology has a habit of failing us sometimes, so imagine you're reading your newspaper, fully dependent on your car to function well, when suddenly the car begins drifting out of its lane and its computer systems fail. Who is to blame?

Although the thought of kicking off our shoes and putting our feet up makes self-driving cars seem awesome, in reality this would be impossible, at least in the initial stages; we will still need to keep our eyes peeled on the road at all times. Should your new driverless car lose control for whatever reason, you will still need to be in a position to take over immediately.

The legalities of self-driving cars

Self-driving cars will be equipped with alert systems to warn drivers. As the technology matures, so will these systems. We can expect car manufacturers to eventually become giant software companies, pushing out programming upgrades and updates.

This obviously means that we will need to stay alert even while the car is driving itself. It is possible that self-driving cars will be programmed to alert passengers who are not paying attention to their surroundings, which would mean our attention still needs to cover both the inside and the outside of the car.

We all know that computers can fail at times, so there will more than likely be an alert telling a passenger to take over. It is also possible that self-driving cars will not be able to cope with certain terrain, or will judge it too dangerous, so again an alert system may well be implemented.

A driverless car may be able to collect data, much like the black box on an aeroplane. This system would be in place to provide information to the manufacturer and to insurance companies should there be any fatal accidents.

The legal questions are an incredibly grey area, which could be enough to put anybody off self-driving cars. The truth is, however, that there are thousands of accidents every day with today's cars, and if self-driving cars reduce that number significantly, it will all be worthwhile for humans and autonomous agents working together.

For years, the military has been using robots to disarm bombs and the like, so this question of 'who's to blame' has long been debated. Experiments found that the more human-like a robot was, the more trust humans placed in the autonomous agent. This doesn't solve the issue, however: there is in fact a great deal at stake between the law, the human operator, the manufacturer, the software, the technicians and mechanics, the installer of the program and more. In the event of an accident, each party would have to prove that their part of the operation was not defective.

So, who do you think should be to blame if a self-driving car goes wrong?