Simplicity is proportional to efficiency: the simpler a system, the more efficiently it tends to operate. At their best, decentralized systems embody this simplicity, keeping complexity and (inter)connectivity at viable levels.
In the context of blockchain platforms, it would then seem paradoxical to question the significance of interoperability and interconnectivity. In practice, however, end users, developers and businesses are still compelled to navigate multiple layers of gratuitous complexity to address even the simplest scenarios that span multiple blockchains.
Arguably, as an industry, we will eventually get this right. However, it is worth looking at the big picture by asking: “What happened? What has led the industry down these detours? And where can we focus on getting back on track?”
The global economy’s settlement layer and associated distractions
The ambitious vision of the “world computer,” a Turing-complete state machine to end all that came before it, catalyzed a wave of innovation the likes of which can only be compared to the early days of the internet. The mission to build a platform where public economic consensus attempts to provide the fabric for decentralized governance has taken decentralization to the masses — cryptoeconomics is now just “economics.”
However, the flip side of this extraordinary innovation has been an overt commercial attempt to try to establish a single, so-called “global economy’s settlement layer.” The basis for this tactic is that one blockchain must serve as the global settlement layer for all transactions, no matter on which blockchain or chains they execute. The self-serving argument is that this one settlement layer provides an “anchor” for the industry, establishing finality if arbitration is required.
“There can be only one” is the belief and the motto among the immortals in the Highlander saga. In the real world, however, this effort has been an unfortunate maximalist diversion for much of the industry. Decentralized settlement cannot be a zero-sum game; otherwise, we have lost the plot.
A standards-based approach to interoperability
Historically, in the technology domain, standards have enabled varied technology capabilities to work seamlessly together. Standards have served to establish and reinforce compatibility and compliance among diverse entities so that ecosystems may operate efficiently and effectively. They serve as the substrate behind the “Lego” blocks of products and services by instituting coherent protocols (as well as metadata, schemas, ontologies, etc.) that can be widely understood, tested, analyzed, applied and validated.
Standards enable and constrain requirements, specifications and attributes that are typically used to ensure that products and services meet their pre-established purpose and deliver outcomes as expected. Importantly, standards provide a “neutral” layer of abstraction, obviating the risk that one or more entities may attempt to control or subvert the ecosystem’s longer-term goals.
Without standards, the internet as we know it could not exist; without standards, our ability to connect, communicate and collaborate using tools such as email and messaging could not exist; and without standards, the plumbing that powers blockchain protocols would be so primitive that much of the cryptographic engineering we take for granted would be well-nigh impossible.
Toward a model for blockchain interoperability
When we look at the history of interoperability and the internet, it is apparent that while it took multiple decades for much of the core infrastructure to be built out, innovation and large-scale usage exploded in the late ’90s with what is now referred to as the web. Subsequently, with so-called Web 2.0 and the cloud wave of technologies, there was another uptick in innovation and standards circa 2006–2010, in the consumer and commercial domains, respectively.
What can we surmise? First, a core layer of interoperable standards is required to establish the underlying foundation. Interestingly enough, this is where the blockchain industry has focused: protocols (infrastructure), including layer two; smart contracts (processing); and oracles (data).
Second, while connectivity is the bedrock, standards for e-commerce are what it took to transform the staid internet into the explosive web as we know it. This is where we have work to do — wrapped tokens are a tactic, not a strategy.
The industry needs to focus on digital assets: asset definitions and templates, asset swaps, ledger and inter-ledger transactions, and more — built on a foundation of standards-based interoperability.
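To make the swap point concrete: many cross-chain asset swaps today rest on the hashed time-lock contract (HTLC) pattern. The following is a minimal, illustrative sketch of that pattern; the class and method names are hypothetical, and real implementations live in on-chain contract code, not application-level Python.

```python
import hashlib
import time

class HTLC:
    """Illustrative sketch of a hashed time-lock contract (HTLC).

    One party locks funds behind the hash of a secret; the counterparty
    claims them by revealing the secret (preimage). Revealing the secret
    on one chain enables the claim on the other chain, which is what
    makes the two-leg swap atomic.
    """

    def __init__(self, hashlock: bytes, timelock: float):
        self.hashlock = hashlock  # sha256 digest of the secret preimage
        self.timelock = timelock  # Unix time after which a refund is allowed
        self.claimed = False

    def claim(self, preimage: bytes) -> bool:
        # Claim succeeds only if the revealed preimage hashes to the lock.
        if not self.claimed and hashlib.sha256(preimage).digest() == self.hashlock:
            self.claimed = True
            return True
        return False

    def refund(self, now: float) -> bool:
        # The original sender can recover unclaimed funds after expiry.
        return not self.claimed and now >= self.timelock

# One leg of a swap: lock funds, reject a wrong secret, accept the right one.
secret = b"correct horse battery staple"
lock = HTLC(hashlib.sha256(secret).digest(), timelock=time.time() + 3600)
print(lock.claim(b"wrong secret"))  # False
print(lock.claim(secret))           # True
```

The point of the sketch is the asymmetry the article alludes to: the pattern works, but it is a pairwise tactic that each pair of chains must re-implement, which is exactly why it needs to be subsumed under shared standards rather than serve as the strategy itself.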
Leading with a lexicon for digital assets
It has been said that language is the prerequisite for “extended trains of thought.” Language is a genetic capability common to humans and distinguished by the characteristic of “discrete infinity” — i.e., the capacity for essentially unbounded composition of simpler objects into complex structures. Without debating the validity of this hypothesis, it can safely be said that, at a minimum, language is the underpinning of shared communication and collaboration.
This summer, under the leadership of Microsoft and other major players in the technology and financial sectors, the InterWork Alliance was launched. Its key focus is the development and evangelization of the Token Taxonomy Framework, an early attempt to create a lexicon and a language for digital assets.
The Token Taxonomy Framework was designed to bridge the gap between developers, business analysts and managers, and policymakers and public regulators, enabling them to work together to model, architect, design, validate, create and deploy new business models and networks based on digital assets.
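The “discrete infinity” idea carries over directly: a token lexicon composes simple, reusable behaviors into complex asset definitions. The sketch below is inspired by, but not identical to, the Token Taxonomy Framework’s actual artifacts and notation; all names and the formula syntax are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Behavior:
    """A reusable property a token can exhibit (illustrative)."""
    name: str    # e.g., "divisible", "transferable"
    symbol: str  # shorthand used when rendering a template formula

@dataclass
class TokenTemplate:
    """A token definition composed from a base type plus behaviors."""
    base: str  # "fungible" or "non-fungible"
    behaviors: list = field(default_factory=list)

    def add(self, behavior: Behavior) -> "TokenTemplate":
        self.behaviors.append(behavior)
        return self  # allow chained composition

    def formula(self) -> str:
        # Render a compact, human-readable formula for the template.
        prefix = "tF" if self.base == "fungible" else "tN"
        return prefix + "{" + ",".join(b.symbol for b in self.behaviors) + "}"

# Example: a fungible token that is divisible and transferable.
template = (TokenTemplate(base="fungible")
            .add(Behavior("divisible", "d"))
            .add(Behavior("transferable", "t")))
print(template.formula())  # tF{d,t}
```

A shared notation like this is what lets a business analyst, a developer and a regulator point at the same formula and agree on what the asset is, before a line of contract code is written.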
A common lexicon for digital assets provides a shared basis and a starting point for mutual understanding and enables the development of tools to support communication, collaboration, and commerce.