Significant developments on both sides of the Atlantic shine a light on emerging regulatory issues for autonomous vehicles.

The first notified insurance claim against a (semi-)automated vehicle manufacturer has been brought in the US. Reports have also arrived from California indicating that drivers may be becoming over-reliant on partially automated vehicles in a manner inconsistent with the vehicles' actual capabilities.

Such issues are not unique to the US and are likely to present a problem for UK regulators and insurers, who must ensure they get ahead of the development curve. The Automated and Electric Vehicles Bill has been passed from the House of Commons to the House of Lords ahead of a debate at the end of February. It is clear that whilst this piece of legislation may have a smooth journey, those using autonomous vehicles may not have the same experience, particularly as driver awareness of the limitations of the technology may be lacking.

The Government has also recently commenced consultations on amending the Highway Code to ensure that drivers are aware that they must exercise 'full control' over advanced driver assistance systems (ADAS). It has also been proposed that a change be made to the Road Vehicles (Construction and Use) Regulations 1986 (Regulations) to allow the use of hand-held communications devices as a means of performing parking manoeuvres via remote control.

First US claims

The first claim against an autonomous vehicle manufacturer has been filed against General Motors, following a collision in San Francisco between a Cruise Automation 2016 Chevrolet Bolt and a motorcyclist.

The motorcyclist alleged that the car had swerved into his path, after failing to complete a move into another lane. GM has denied this, stating that the motorcyclist had in fact caused the collision.

There are several interesting questions which arise out of this incident:

  • If the claim proceeds to trial and liability is found to rest with the vehicle, whose responsibility is the crash? The vehicle owner's? The vehicle was believed to be operating with a driver assistance feature at the time, and the manoeuvre was within the stated capabilities of that feature;
  • In the alternative, is it GM's responsibility that the car did not respond correctly to the presence of the motorcyclist?

The progression of this claim will be watched closely by the industry, as the judgment is likely to determine the parameters of stakeholders' respective liabilities in relation to the use of increasingly advanced driver assistance systems.

Manufacturers asleep at the wheel?

A lack of driver awareness is increasingly seen as an impediment to the safe operation of this technology.

Tesla's website advertises their 'Autopilot' feature as "the hardware needed for full self-driving capability at a safety level substantially greater than that of a human."

What is less clear is the accompanying warning that local regulatory approval and substantial software updates need to be in place before fully autonomous driving can function. The use of such a provocative name arguably gives a false impression of the vehicle's capabilities, particularly if the technology delivers the claimed benefits 99% of the time. It is perhaps unsurprising that repeated successful use of these features can lead to over-reliance by users, with potentially catastrophic consequences.

Indeed, there have been two recent incidents, again in California, which raise concerns that drivers, when presented with feature names such as "Autopilot", fundamentally misunderstand the limitations of the automation available within their vehicles. Both drivers, on being questioned about the collisions, referred to the vehicles being on autopilot. The use of this term should give manufacturers and regulators cause for concern about the efficacy of marketing materials and instruction manuals.

It should be made clear that the assistance functions available within these vehicles operate as designed. However, it is perhaps inevitable that some users will become reliant on those functions and misjudge situations, assuming that the assistance function will remain in control.

In the first incident, a Level 3 Tesla travelling at an estimated 65 mph collided with a stationary fire truck on a freeway. The driver reported the vehicle was on 'autopilot'.

The Tesla's vehicle manual states "Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead". Drivers are therefore expected to have detailed knowledge of the capabilities of a Level 3 vehicle to enable them to take control of the vehicle in such circumstances.

Whilst no injuries were sustained, it is not clear whether the driver was unaware of the gap in the Tesla's functionality, or whether he was aware but was afforded insufficient time to take control and avoid the collision.

In a second incident, a man was arrested for drunk driving in California, having been found asleep at the wheel of his stationary Tesla in the middle lane of a freeway. He advised officers that his vehicle was on autopilot.

Such issues are likely to be fundamental to the developing legal liability position globally. Given the inevitability of users overestimating the capabilities of their vehicles, a steady stream of further incidents is expected.

UK Regulatory developments

With an eye on the situation unfolding in the US, it is heartening to see that UK regulators are taking steps to make prudent alterations to the Highway Code, emphasising driver responsibility in automated vehicles while also allowing the safe use of assistance systems.

The changes to the Highway Code will stress that "you as the driver are still responsible for the vehicle and MUST exercise full control over these systems at all times," when using systems such as motorway assist or remote controlled parking. It is hoped this counters many of the bold claims made by manufacturers and avoids the incidents seen in the US.

The changes to the Highway Code place the onus squarely on the driver to ensure their full attention is on the road, irrespective of the systems available to them. The proposed changes also make clear that further amendments will be made upon the introduction of more advanced autonomous vehicles, when driver attention can legitimately be transferred away from the road.

Furthermore, there is a proposed exemption to the Road Vehicles (Construction and Use) Regulations 1986 to allow remote control parking using hand-held devices. The current regulations, as amended in 2003, prohibit a driver from using a hand-held mobile device while driving. Given the frequency of parking accidents involving manually driven vehicles, it is hoped this change will reduce accident levels.

It is evidently sensible that efforts should be made to ensure that the far-reaching effects of advanced driver assistance systems are made clear to drivers, and that appropriate amendments to the regulations are made to accommodate these new systems.

As a general point, whilst ultimate control lies with the driver, there is clearly a responsibility on the part of manufacturers to make clear the capabilities of their vehicles, and a failure on their part to do this – as seen in the US – can lead to serious consequences and potential liability challenges. Insurers must be alive to this developing area when assessing claims involving these vehicles.

Important questions must be resolved concerning how manufacturers are mandated to ensure the necessary information is brought to the attention of the driver. It is clear that regulatory intervention will be an important safeguard in this area, to prevent manufacturers (and drivers) from being asleep at the wheel.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.