As of April 2018, California will allow fully autonomous cars to be tested without safety drivers on public roads. One interesting question in this context is who is to blame for damages caused by artificial intelligence or machine learning systems. We took a look at this under Austrian law:

The question of liability is complex when it comes to autonomous systems or systems with artificial intelligence. A central principle of the right to compensation, namely the fault of the party causing the damage, is already questionable in the case of a driver whose self-driving car has caused an accident. Here, the issue may be whether the driver could have intervened to prevent the accident. Fault on the part of the manufacturer (for example, in the person of the software programmer) will generally be technically difficult and expensive to prove.

However, since the injured party usually has no agreement with the manufacturer, direct contractual liability will not apply. This leaves liability under product liability legislation (it is questionable whether this also covers software), liability based on a contract with protective effects in favour of third parties, or tort liability. What is clear, however, is that only persons can be liable, not machines.

For damages caused by self-driving cars, in contrast to other systems with artificial intelligence, the so-called "car owner's liability" (Halterhaftung) might be a valid basis for a claim. This special liability attaches to the mere fact that operating a car poses a risk to the general public; it is not based on the principle of culpability. How this will ultimately affect manufacturers, especially with regard to recourse claims by the car owner's insurance company, cannot yet be seriously assessed.

Another interesting aspect of self-learning systems is the question of to whom the rights to the work products of such systems belong. Why? Because only natural and legal persons can be owners of rights and duties, not machines.

Hence one always has to look at the person behind the system when attributing rights in creative endeavours or inventions. If a work (including software code) created by an autonomously learning system would generally qualify for copyright or patent protection, under Austrian law it would initially have to be attributed to a natural person (ie a human being). Authors within the meaning of the Austrian Copyright Act and inventors within the meaning of the Austrian Patent Act must always be natural persons. Of course, such authors or inventors can grant third parties, including legal persons, rights to the protected work results.

But what is the author's or inventor's position when a self-learning system autonomously produces a work? One could take the position that a work created by a self-learning system is merely a consequence of the creative or inventive efforts of the person who created the logic behind that system, and that this person is therefore also to be credited with the end result. On the other hand, it can be argued that the rights must be attributed to the person who provided the impetus for creating the concrete work result – eg by entering certain data. Perhaps both persons are co-authors or joint inventors. Or perhaps nobody can claim rights to such work results, because each person's contribution to the end result was so small that a "creative" or "inventive" effort can hardly be seen, the system having developed the work result almost fully autonomously.

Further reading | Click here to see what a claim for restitution of machine-generated data looks like

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.