By Paul Rapino, Head of Growth, InterWork Alliance
The 2021 Token Taxonomy Act is not the same as the Token Taxonomy Framework, but both are in high demand because tokenization is here to stay!
What is a token? A token is simply an indication, proof, or expression of something else. Tokens have virtually no value on their own; they are only useful because they represent something more significant. Examples include the game pieces — the race car, boot, and the plastic hotels — you use to play Monopoly. Except for nostalgia, these pieces do not mean much outside of the value they represent in a Monopoly game. A digital token, however, is a piece of data that stands in for, or represents, a valuable piece of information.
What is tokenization? Tokenization is the process of turning something with value into a unique representation of that value. The tokenization process removes sensitive data from your business systems (i.e., game, ecosystem, marketplace, value chain) and replaces it with an undecipherable token. The original data is then stored in a secure cloud data vault. Unlike encrypted data, which can be decrypted with the appropriate key (i.e., rules), a token cannot be reversed, because there is no mathematical relationship between the token and the original data.
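The vault-based process above can be sketched in a few lines of Python. This is a minimal, illustrative sketch, not a production design: the in-memory dictionary stands in for the secure cloud data vault, and the class and method names are invented for the example.

```python
import secrets


class TokenVault:
    """Illustrative in-memory token vault. A real deployment would use
    a secured, access-controlled cloud data vault, not a dict."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, value: str) -> str:
        # The token is random, so there is no mathematical relationship
        # between it and the original value -- it cannot be "decrypted",
        # only looked up by a party with access to the vault.
        token = secrets.token_urlsafe(16)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only a lookup in the vault recovers the original data.
        return self._vault[token]
```

The contrast with encryption is the key point: an attacker who steals a token and all the computing power in the world still cannot recover the original value, because nothing about the token is derived from it.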
And finally, a taxonomy is simply a way to classify a thing or concept.
The Token Taxonomy Act
In March 2021, Rep. Warren Davidson (R-OH) re-introduced his signature Token Taxonomy Act. This bill establishes much-needed clarity for businesses, consumers, and regulators operating in the growing U.S. blockchain ecosystem.
According to the National Law Review, “As introduced, H.R. 1628, known as the Token Taxonomy Act, would define a ‘digital token’ as a token that is created pursuant to rules for which the creation and supply are not controlled by a central group or single person, among other requirements.” To qualify as a “digital token” under the bill, the transaction history must be able to resist modification or tampering by a single person or group of persons under common control. Moreover, the digital token must be capable of being transferred between persons without an intermediate custodian.
“This is an important first step to promoting innovation and maximizing the potential of virtual currencies for the U.S. economy, all while protecting customers and the financial well-being of investors,” said Congressman Darren Soto (D-FL). “The strong support for this bipartisan legislation from U.S. businesses and stakeholders is clear indication that our friendly, light-regulatory proposal will propel the United States to be at the forefront of this industry.”
While the Token Taxonomy Act has many hurdles to clear, the InterWork Alliance (IWA) supports the work of Congress and industry to understand tokenization and clarify the regulation of digital tokens.
Now let’s look at the IWA Token Taxonomy Framework.
The Token Taxonomy Framework
The IWA is a member-led, non-profit organization, not a governmental organization. We are technologically and politically neutral. Although highly aligned with what’s happening in the tokenization space, the IWA focuses on how organizations can define tokenization as a valuable business term and use tokens to work better together at the business level. Our intention when we launched in 2020 has remained constant: to empower all organizations to adopt and use token-powered services in their day-to-day operations across use cases and networks, bringing inclusivity to globally distributed applications.
Tokens will disrupt global economics and radically change how commerce is transacted. While various token implementations exist today for numerous blockchain platforms, the industry lacks a venue where all participants can collaborate on a shared description and approach – resulting in a lack of interoperability, reuse, and common ground to address regulatory issues. The IWA is this venue, developing a clear definition and scope of the token concept – including use cases, taxonomy, and terminology – and a specification neutral to the underlying technology.
The Token Taxonomy Framework (TTF) is a fundamental building block that bridges the gap between developers, line of business executives, and regulators, allowing them to work together to model existing and define new business models based on tokens. Our open-source Framework’s purpose is to:
- Clearly define common token concepts and terms in non-technical and cross-industry language using real-world, everyday analogies so that business, technical, and regulatory participants can understand them.
- Produce token definitions that have clear and well-understood requirements for token properties and behaviors that are implementation neutral for developers to follow and standards organizations to validate.
- Establish a base Token Classification Hierarchy driven by metadata that is simple to understand and navigate for anyone interested in learning and discovering tokens and underlying implementations.
- Deliver tooling metadata that enables the generation of visual representations of classifications and modeling tools to view and create token definitions mapped to the taxonomy.
- Produce standard artifacts and control message descriptions mapped to the implementation neutral taxonomy and provide base components and controls that consortia, startups, platforms, or regulators can use to work together.
- Encourage differentiation and vertical specialization while maintaining an interoperable base.
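To make the metadata-driven classification idea concrete, here is a deliberately simplified sketch of what an implementation-neutral token definition might look like. The field names and the classification string are hypothetical illustrations, not the official TTF artifact schema, which defines its own formal structure for token bases, behaviors, and properties.

```python
from dataclasses import dataclass, field


@dataclass
class TokenDefinition:
    """Hypothetical, simplified token definition for illustration only;
    the real TTF specifies its artifacts in a formal schema."""

    name: str
    token_type: str                                 # e.g. "fungible" or "non-fungible"
    behaviors: list = field(default_factory=list)   # e.g. ["transferable", "divisible"]
    properties: dict = field(default_factory=dict)  # implementation-neutral metadata

    def classify(self) -> str:
        # Derive a classification label purely from metadata, so any
        # platform can place the token in the same hierarchy.
        return f"{self.token_type}[{','.join(sorted(self.behaviors))}]"
```

Because the classification is driven entirely by metadata rather than by platform-specific code, two tokens with the same base type and behaviors land in the same branch of the hierarchy regardless of which blockchain implements them, which is the interoperability goal the bullets above describe.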
IWA deliverables in our first year
The IWA has grown to 60 organizations and 1300 members across the globe, working on the Token Taxonomy Framework and these key projects:
- InterWork Framework (IWF): Defines implementation-agnostic protocols for contracts (also called “smart contracts”) to be composed of clauses mapped to the TTF tokenization standard.
- Analytics Framework (AF): Delivers the ability to instrument multi-party contracts, preserving privacy where applicable, and create shared data streams to support value-add services and industry-driven data reporting.
- Certification: Analyzes the standards output of the IWA and recommends, executes, and maintains an optimal IWA Certification Program that validates conformance to the IWA specifications and promotes interoperability.
Our working groups are currently using these frameworks in the following use cases, with many more to follow as our membership grows.
- Sustainability Business Working Group: This group is focused on defining standardized tokenization of critical elements, standardized clauses for ledgering templates, and market-driven multi-party analytics – initially in the areas of emissions, offsets, and trade contracts.
- Global Trade & Supply Chain Business Working Group (GT&SC BWG): This group’s mission is the advancement of tokenization through creating standards for tokenization, contracts, and analytics. The initial area of focus will be electronic bills of lading.
- DLT Security Business Working Group: The mission of this working group is to facilitate the continuous development of standards and guidelines that will contribute to the advancement of leading best practices for tokenized platforms of distributed ledger technologies.
- Debt & Equity Working Group: The goal of this working group is to establish the standards for tokenization, contractual extensions, workflows, and analytics for creating, issuing, and trading digital bonds and equities.
By focusing on these real-world projects, market requirements, and performance metrics, the IWA will define tokenization and interworking standards to drive business-level interoperability, multi-party interchange, and trust across applications and networks. Our global membership includes leaders, adopters, innovators, developers, and businesses representing the best practices for every use case the token-powered ecosystem has to offer. Members pay an annual fee, and contribute their knowledge and insight to support the open-source frameworks.
Learn how you can be involved on the InterWork Alliance website.