Data analytics could be the key to making sense of principles-based regulations.

Principles-based regulation was pioneered in the UK in 1990 and championed by the Financial Services Authority, the UK's financial services regulator of the day (since superseded by the FCA and PRA). The approach has since found widespread support among regulators elsewhere, adopted to varying degrees by authorities from Hong Kong's SFC to France's AMF.

Even the U.S., home of rules-based regulation, is not immune. Many of the country's legislative rules are, in practice, applied in a principles-based manner by the SEC and CFTC. Hence, instruments that act like derivatives or securities are regulated as such, just as they are by regulators elsewhere in the world.

There are obvious benefits to preferring broadly stated principles over prescriptive, technical rules. The approach leaves less scope for legal loopholes to be exploited by the unscrupulous. It also provides greater flexibility to adapt to market developments, such as new financial instruments, without the need for fresh regulation.

But it comes at the cost of clarity. As regulators move away from detailed rules towards broad principles, the exact requirements become harder to determine. They can only truly be understood by examining outcomes, and it is no coincidence that regulators such as the FCA and SFC also talk about an "outcomes-based" approach.1

Walking the walk

Examining outcomes means looking at the enforcement action taken and its regulatory impact. Financial penalties and other sanctions provide evidence of regulatory priorities. Furthermore, regulators applying broadly drafted rules are often extremely detailed in the enforcement notices they issue, frequently specifying the precise controls, procedures and governance they expect firms to put in place.

Modern analytical tools can help make sense of this data. They employ quantitative analysis to clarify enforcement priorities and the potential costs of non-compliance: not just penalties, disgorgement and restitution levels, but investigation costs too, which can be enormous. Textual analytics and natural language processing can then sift through enforcement notices to identify the requirements left unstated in the rules, revealing how the regulator wants to see the principles applied in practice.
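To make the idea concrete, here is a minimal Python sketch of the kind of aggregation involved. The notices, penalty amounts and control-related keywords are hypothetical stand-ins for the full published enforcement notices and richer NLP that a real pipeline would use:

import re
from collections import Counter, defaultdict

# Hypothetical enforcement notices: (regulator, penalty, notice text).
# A real analysis would ingest the full published notices.
notices = [
    ("FCA", 12_500_000, "Failings in transaction reporting controls and governance arrangements."),
    ("FCA", 4_200_000, "Inadequate anti-money laundering procedures and senior management oversight."),
    ("SFC", 2_800_000, "Deficient internal controls over client asset segregation."),
]

# Keywords standing in for the control themes a fuller NLP model would extract.
themes = ["controls", "governance", "procedures", "oversight", "segregation"]

penalty_by_regulator = defaultdict(int)
theme_counts = Counter()

for regulator, penalty, text in notices:
    penalty_by_regulator[regulator] += penalty          # total cost of non-compliance per regulator
    for theme in themes:
        if re.search(rf"\b{theme}\b", text, re.IGNORECASE):
            theme_counts[theme] += 1                    # which expectations recur across notices

print(dict(penalty_by_regulator))    # enforcement priorities expressed in money
print(theme_counts.most_common())    # the unstated requirements, ranked by frequency

Even this toy version shows the shape of the output: a ranking of the control expectations regulators keep returning to, alongside the financial consequences of ignoring them.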

This analysis offers some comfort to firms. Back-testing shows that most regulators are consistent in their approach, which makes such analytics, machine learning and artificial intelligence a powerful suite of tools for managing current regulatory risk.

But the approach also has predictive power. We can assess regulators' future intent by combing through their business plans, speeches, regulatory notices and Dear CEO letters. Similar analysis of notices and announcements shows how much each type of output affects regulatory risk, and speeches stand out: if the head of enforcement or supervision highlights an issue in a speech, enforcement action usually follows in time.
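As an illustration, the lead-lag idea can be sketched in a few lines of Python. The dates, themes and output types below are hypothetical placeholders for real regulator publications and enforcement records:

from datetime import date

# Hypothetical regulatory outputs and later enforcement actions (illustrative only).
outputs = [
    {"type": "speech", "date": date(2016, 3, 1), "theme": "market abuse"},
    {"type": "business plan", "date": date(2016, 4, 1), "theme": "financial crime"},
]
enforcement = [
    {"date": date(2017, 2, 15), "theme": "market abuse"},
    {"date": date(2018, 6, 10), "theme": "financial crime"},
]

# For each output, find the first subsequent enforcement action on the same theme
# and record the lag in days – a crude proxy for how predictive that output type is.
for out in outputs:
    later = [e for e in enforcement
             if e["theme"] == out["theme"] and e["date"] > out["date"]]
    if later:
        first = min(later, key=lambda e: e["date"])
        lag = (first["date"] - out["date"]).days
        print(f"{out['type']} on '{out['theme']}': enforcement followed after {lag} days")

Run over a full history of regulator outputs, this kind of matching is what allows speeches to be ranked as high-impact signals relative to other publication types.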

Used intelligently, therefore, technology can help us not only clarify what regulators are looking for today, but also stay one step ahead of their demands for tomorrow. For that, however, we need to take them at their word.

 - By John Byrne, CEO, Corlytics

Footnotes 

1 https://www.fca.org.uk/business-plan-2016-17/1-our-role and http://www.sfc.hk/web/EN/regulatory-functions/intermediaries/supervision/supervisory-approach.html

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.