The terms "revolution" and "disruption" are bandied about more liberally than they should be in the context of technological innovation. Technological revolution and disruption imply upheaval and a systemic reevaluation of the way that humans interact with industry and even with each other. Actual technological advancement, however, moves at a much slower pace and tends to augment our current processes rather than outright displace them. Oftentimes, we fail to realize the ubiquity of legacy systems in our everyday lives – sometimes to our own detriment.

Consider the keyboard. The QWERTY layout of keys is the standard for English keyboards across the world. Even though the layout remains a mainstay of modern office setups, its origins trace back to the mass popularization of a typewriter manufactured and sold by E. Remington & Sons in 1874.1 Urban legend has it that the layout was designed to slow typists down so that they would not jam the typing mechanism, yet the reality reveals otherwise – the layout was actually designed to assist those transcribing messages from Morse code.2 Once typists took to the format, the keyboard as we know it today was embraced as a global standard – even as the use of Morse code declined.3 Like QWERTY, our familiarity and comfort with legacy systems have contributed to their rise. These systems are varied in their scope, and they touch everything: healthcare, supply chains, our financial systems and even the way we interact at a human level. However, their use and value may be tested sooner than we realize.

Artificial intelligence (AI) and blockchain technology (blockchain) are two novel innovations that offer the opportunity to move beyond our legacy systems and streamline enterprise management and compliance in ways previously unimaginable. However, their potential is often clouded by their "buzzword" status, with bad actors taking advantage of the hype. When one cuts through the haze, it becomes clear that these two technologies hold significant transformative potential. While each innovation can certainly function on its own, AI and blockchain also complement one another in ways that offer businesses not only the ability to build upon legacy enterprise systems but also the power to eventually upend them in favor of next-level solutions. Getting to that point, however, takes time and is not without cost. While humans are generally quick to embrace technological change, our regulatory frameworks take longer to adapt. The need to address this constraint is pressing – real market solutions built on these technologies have started to come online, while regulatory opacity abounds. As innovators seek to exploit the convergence of AI and blockchain, they must take care to overcome both the technical and the regulatory hurdles that accompany them. Do so successfully, and the rewards promise to be bountiful.

AI + Blockchain, Defined

First, a bit of taxonomy is in order.

AI in a Nutshell:

Artificial intelligence is "the capability of a machine to imitate intelligent human behavior," such as learning, understanding language, solving problems, planning and identifying objects.4 More practically speaking, however, today's AI is mostly limited to simple tasks of the if X, then Y variety. AI is "trained" through supervised learning, a process that requires an enormous amount of data. For example, IBM's question-answering supercomputer Watson was able to beat Jeopardy! champions Brad Rutter and Ken Jennings in 2011 because Watson had been trained to understand simple questions through countless iterations and had access to vast knowledge in the form of digital data. Likewise, Google DeepMind's AlphaGo defeated the Go champion Lee Sedol in 2016 because AlphaGo had played through countless Go scenarios and collected them as data. As such, most implementations of AI involve simple tasks, assuming that relevant information is readily accessible. In light of this, Andrew Ng, the Stanford computer scientist, noted that "[i]f a typical person can do a mental task with less than one second of thought, we can probably automate it using AI either now or in the near future."5

Moreover, a significant portion of AI currently in use or being developed is based on "machine learning." Machine learning is a method by which AI adapts its algorithms and models based on exposure to new data, thereby allowing the system to "learn" without being explicitly programmed to perform specific tasks. Developing high-performance machine learning-based AI therefore requires substantial amounts of data: the higher the quality and the greater the quantity of the data provided, the further the AI can refine and improve its algorithms. For example, an AI that visually distinguishes Labradors from other breeds of dogs will become better at its job the more it is exposed to clear and accurate pictures of Labradors.
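The training process described above can be sketched in a few lines of Python. The toy nearest-centroid classifier below "learns" from labeled examples and then predicts labels for new inputs; all of the features, measurements and breed data are invented for illustration, and real machine learning systems use far larger datasets and far richer models.

```python
# A minimal sketch of supervised learning: a nearest-centroid classifier
# "trained" on labeled examples. All data are invented for illustration.

def train(examples):
    """examples: list of (features, label). Returns per-label centroids."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign the label whose learned centroid is closest to the input."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Toy data: (weight_kg, ear_length_cm) pairs for two invented dog profiles.
training_data = [
    ((30.0, 10.0), "labrador"), ((32.0, 11.0), "labrador"),
    ((8.0, 6.0), "terrier"),    ((9.0, 5.5), "terrier"),
]
model = train(training_data)
print(predict(model, (31.0, 10.5)))   # prints "labrador"
```

More (clean, accurate) labeled examples shift the centroids closer to the true profile of each class, which is the sense in which exposure to better data makes the model better at its job.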

It is in these data amalgamations that AI does its job best. Scanning and analyzing vast sets of data is something that a computer can do far more rapidly than a human. However, AI is not perfect, and many of its pitfalls stem from the difficulty of reconciling how humans process information with how machines do. One example of this phenomenon that has dogged the technology is AI's penchant for "hallucinations." An AI algorithm "hallucinates" when it interprets an input in a way that seems implausible to a human looking at the same thing.6 Case in point: AI has interpreted an image of a turtle as a gun, and a rifle as a helicopter.7 This occurs because machines are hypersensitive to, and interpret, the tiniest of pixel patterns that we humans do not process. Because of the complexity of this analysis, developers are only now beginning to understand such AI phenomena.
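Though real vision models are vastly more complex, a toy linear classifier can illustrate the mechanics of this hypersensitivity: because the model sums thousands of tiny per-pixel contributions, a nudge far too small for a human to notice on any single pixel can accumulate across all of them and push the total score over the decision boundary. Every number below is invented purely for illustration.

```python
# A toy illustration (not a real vision model) of why imperceptibly small
# pixel-level changes can flip an AI's output. All numbers are invented.

def classify(pixels, weights, bias):
    # Linear model: the decision is a weighted sum over every "pixel."
    score = sum(p * w for p, w in zip(pixels, weights)) + bias
    return "turtle" if score > 0 else "rifle"

n = 1000                     # a 1000-"pixel" image
weights = [0.01] * n         # each pixel barely matters on its own
bias = -4.9
image = [0.5] * n            # total score: 1000 * 0.5 * 0.01 - 4.9 = 0.1
print(classify(image, weights, bias))        # prints "turtle"

# Darken every pixel by just 0.011 (about 1% of the brightness scale),
# a change no human would register. Summed over 1000 pixels, the score
# shifts by -0.11 and crosses the decision boundary.
perturbed = [p - 0.011 for p in image]
print(classify(perturbed, weights, bias))    # prints "rifle"
```

The human and the machine see "the same" image, but the machine's decision rests on an accumulation of pixel-level signals that human perception simply never processes.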

When one moves beyond pictures of guns and turtles, however, AI's shortfalls can become much less innocuous. AI learning is based on inputted data, yet much of this data reflects the inherent biases and behaviors of everyday individuals. As such, without proper correction for bias and other human assumptions, AI can, for example, perpetuate racial stereotypes and racial profiling.8 The ethical deployment of AI therefore requires proper care over what goes into the system and who gets access to the outputs – but therein lies an additional problem: who has access to enough data to really take full advantage of, and develop, robust AI?

Not surprisingly, because large companies are better able than individuals or smaller entities to collect and manage increasingly large amounts of data, such companies have remained better positioned to develop complex AI. In response to this tilted landscape, various private and public organizations, including the U.S. Department of Justice's Bureau of Justice Statistics, Google Scholar and the International Monetary Fund, have launched open data initiatives to make publicly available the vast amounts of data that such organizations have collected over many years.

Blockchain in a Nutshell:

Blockchain technology as we know it today came onto the scene in 2009 with the launch of Bitcoin, perhaps the most famous application of the technology. Fundamentally, blockchain is a data structure that makes it possible to create a tamper-resistant, distributed, peer-to-peer system of ledgers containing immutable, time-stamped and cryptographically connected blocks of data. In practice, this means that data can be written only once onto a ledger, which is then read-only for every user. Many of the most utilized blockchain protocols – for example, the Bitcoin and Ethereum networks – maintain and update their distributed ledgers in a decentralized manner, which stands in contrast to traditional networks' reliance on a trusted, centralized data repository.9 By structuring the network in this way, these blockchain mechanisms remove the need for a trusted third party to handle and store transaction data. Instead, data are distributed so that every user has access to the same information at the same time. To update a ledger's distributed information, the network employs pre-defined consensus mechanisms and strong cryptography to prevent malicious actors from retroactively editing or tampering with previously recorded information. In most cases, networks are open source, maintained by a dedicated community and made accessible to any connected device that can validate transactions on a ledger; such a device is referred to as a node.
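The chained, time-stamped structure described above can be sketched in a few lines of Python. This is a minimal illustration only – it omits consensus, peer-to-peer networking and digital signatures, and the transaction strings are invented – but it shows why editing an old block breaks every later cryptographic link.

```python
# A minimal sketch of blockchain's core data structure: time-stamped
# blocks chained by cryptographic hashes, so tampering with an earlier
# block invalidates everything after it. Consensus and networking omitted.
import hashlib
import json
import time

def make_block(data, prev_hash):
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    serialized = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(serialized).hexdigest()
    return block

def chain_is_valid(chain):
    for prev, curr in zip(chain, chain[1:]):
        # Each block must point at its predecessor's hash, and that hash
        # must still match the predecessor's current contents.
        body = {k: v for k, v in prev.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if curr["prev_hash"] != prev["hash"] or recomputed != prev["hash"]:
            return False
    return True

chain = [make_block("genesis", prev_hash="0" * 64)]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))
print(chain_is_valid(chain))              # prints True

chain[1]["data"] = "Alice pays Bob 500"   # attempt to rewrite history
print(chain_is_valid(chain))              # prints False
```

On a real network, the consensus mechanism determines which nodes may append blocks, which is precisely the step a malicious editor cannot replay for every copy of the distributed ledger.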

Nevertheless, the decentralized nature of blockchain comes with significant resource and processing drawbacks. Many blockchain-enabled platforms run very slowly and have interoperability and scalability problems. Moreover, these networks use massive amounts of energy: the Bitcoin network, for example, consumes about 50 terawatt-hours per year – equivalent to the energy needs of the entire country of Singapore.10 To ameliorate these problems, several market participants have developed enterprise blockchains with permissioned networks. While many of these may be open source, the networks are led by known entities that determine who may verify transactions on the blockchain, and their required consensus mechanisms are therefore much more energy efficient.

Not unlike AI, a blockchain can also be coded with certain automated processes to augment its recordkeeping abilities, and, arguably, it is these types of processes that contributed to blockchain's rise. That rise, some may say, began with the introduction of the Ethereum network and its engineering around "smart contracts" – a term used to describe computer code that automatically executes all or part of an agreement and is stored on a blockchain-enabled platform. Smart contracts are neither "contracts" in the sense of legally binding agreements nor "smart" in the sense of employing AI. Rather, they consist of coded, automated parameters responsive to what is recorded on a blockchain. For example, if the parties in a blockchain network have indicated, by initiating a transaction, that certain parameters have been met, the code will execute the step or steps triggered by those parameters. The input parameters and the execution steps for smart contracts need to be specific – the digital equivalent of if X, then Y statements. In other words, when the required conditions have been met, a particular specified outcome occurs; in the same way that a vending machine sells a can of soda once change has been deposited, smart contracts allow title to digital assets to be transferred upon the occurrence of certain events. Nevertheless, the tasks that smart contracts are currently capable of performing are fairly rudimentary. As developers figure out how to expand their networks, integrate them with enterprise-level technologies and develop more responsive smart contracts, there is every reason to believe that smart contracts and their decentralized applications (dApps) will see increased adoption.
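The if X, then Y character of a smart contract can be illustrated with a toy vending-machine-style escrow. Ordinary Python stands in here for on-chain code (real smart contracts are written in languages such as Solidity and executed by the network itself), and the asset names, parties and price are invented.

```python
# A toy sketch of smart-contract logic: title to a digital asset
# transfers automatically once the recorded payments meet the price.
# Ordinary Python stands in for on-chain code; all values are invented.

class TitleEscrow:
    def __init__(self, asset, seller, price):
        self.asset = asset
        self.owner = seller
        self.price = price
        self.paid = 0

    def deposit(self, buyer, amount):
        """Record a payment. Like a vending machine, the contract acts
        only once the full price has been deposited."""
        self.paid += amount
        if self.paid >= self.price:   # if X (the coded condition is met)...
            self.owner = buyer        # ...then Y (title transfers)
        return self.owner

escrow = TitleEscrow(asset="deed-123", seller="alice", price=100)
escrow.deposit("bob", 60)
print(escrow.owner)   # prints "alice" (condition not yet met)
escrow.deposit("bob", 40)
print(escrow.owner)   # prints "bob" (parameters satisfied; step executes)
```

Note that nothing in the code interprets intent or context: the outcome follows mechanically from the coded parameters, which is why the inputs to a smart contract must be so precisely specified.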

AI + Blockchain, Together at Last

AI and blockchain technology may appear to be diametric opposites. AI is an active technology – it analyzes its surroundings and formulates solutions based on the history of what it has been exposed to. By contrast, blockchain is agnostic with respect to the data written into it – the technology bundle is largely passive. It is precisely in that distinction that we find synergy, for each technology augments the strengths and tempers the weaknesses of the other. For example, AI requires access to big data sets in order to learn and improve, yet many of the sources of these data sets are hidden in proprietary silos. With blockchain, stakeholders are empowered to contribute data to an openly available and distributed network with immutability as a core feature. With a potentially larger pool of data to work from, the machine learning mechanisms of a widely distributed, blockchain-enabled and AI-powered solution could improve far faster than those of a private-data AI counterpart. On their own, these technologies are more limited. Blockchain technology, in and of itself, is not capable of evaluating the accuracy of the data written into its immutable network – garbage in, garbage out. AI, however, can act as a learned gatekeeper for what information may come on and off the network and from whom. Indeed, the interplay between these complementary capabilities will likely lead to improvements across a broad array of industries, each with unique challenges that the two technologies together may overcome.


Footnotes

1 See Rachel Metz, Why We Can't Quit the QWERTY Keyboard, MIT Technology Review (Oct. 13, 2018), available at: https://www.technologyreview.com/s/611620/why-we-cant-quit-the-qwerty-keyboard/.

2 Alexis Madrigal, The Lies You've Been Told About the Origin of the QWERTY Keyboard, The Atlantic (May 3, 2013), available at: https://www.theatlantic.com/technology/archive/2013/05/the-lies-youve-been-told-about-the-origin-of-the-qwerty-keyboard/275537/.

3 See Metz, supra note 1.

4 See Artificial Intelligence, Merriam-Webster's Online Dictionary, Merriam-Webster (last accessed Mar. 27, 2019), available at: https://www.merriam-webster.com/dictionary/artificial%20intelligence.

5 See Andrew Ng, What Artificial Intelligence Can and Can't Do Right Now, Harvard Business Review (Nov. 9, 2016), available at: https://hbr.org/2016/11/what-artificial-intelligence-can-and-cant-do-right-now.

6 Louise Matsakis, Artificial Intelligence May Not Hallucinate After All, Wired (May 8, 2019), available at: https://www.wired.com/story/adversarial-examples-ai-may-not-hallucinate/.

7 Id.

8 Jerry Kaplan, Opinion: Why Your AI Might Be Racist, Washington Post (Dec. 17, 2018), available at: https://www.washingtonpost.com/opinions/2018/12/17/why-your-ai-might-be-racist/?noredirect=on&utm_term=.568983d5e3ec.

9 See Shaanan Cohney, David A. Hoffman, Jeremy Sklaroff and David A. Wishnick, Coin-Operated Capitalism, Penn. Inst. for L. & Econ. (No. 18-37) (Jul. 17, 2018) at 12, available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3215345##.

10 See Bitcoin Energy Consumption Index (last accessed May 13, 2019), available at: https://digiconomist.net/bitcoin-energy-consumption.

Keywords: Artificial Intelligence + Robotics, Blockchain, Fintech

Mofo Tech Blog - A blog dedicated to information, trend-spotting & analysis for science & tech-based companies

Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Morrison & Foerster LLP. All rights reserved