Taxonomy Q&A: Marley Gray, TTI Part I


Standards have always been a growing pain for nascent technologies. In the first part of a two-part interview, IntelAlley spoke with Marley Gray, chair of the Token Taxonomy Initiative, board member of the Enterprise Ethereum Alliance, and a principal architect at Microsoft, about the Initiative’s goals.

What deliverables should the digital currency and tokenized asset markets expect from the Token Taxonomy Initiative?

Marley Gray, TTI

Token-technology standards are advancing more rapidly than token policy. As new laws and regulations emerge around this nascent technology, token definitions – both technical and policy – are critical. It is essential that the staff leading the technology and policy initiatives – the EEA with its Token Taxonomy Initiative and the Chamber’s Token Alliance, respectively – come together to define what a digital asset is from both a technical and a regulatory perspective.

The Chamber of Digital Commerce will become an EEA Associate–Collaborative Member, and the EEA will become a strategic partner of the Chamber of Digital Commerce. This formal partnership enables staff at both organizations to access meetings held by the EEA’s Token Taxonomy Initiative and the Chamber’s Token Alliance, as well as the joint resources of both organizations, so they can provide input collaboratively.

This partnership is a prime example of how a technology standards group like the EEA and an advocacy group such as the Chamber of Digital Commerce can bridge the gap between the technological interoperability and policy aspects of tokenization and push the industry forward. The key is to ensure that, across the entire token ecosystem, those producing tokens meet both the industry’s regulatory and interoperability requirements.

This cross-pollination will be crucial to the growth of this ecosystem. With the technology and policy communities coordinated, the ecosystem will end up with standards and regulations that are interoperable.

Our focus is to create a common vernacular between technology and policy.

The public’s understanding of a token, for better or worse, is likely rooted in cryptocurrency and publicly traded, often very speculative ‘investments’ based on the idea that we can and should change the way money is stored or exchanged. That is a myopic view of what the power of tokenization actually is: the ability to represent value, not just money. We are surrounded by tokens; we just don’t think of them that way. Every business process between parties involves tokens today – it’s just that these tokens are not shared between the parties but duplicated for each, requiring negotiation between the parties over the true state of the business process. The Token Taxonomy Framework (TTF) is a way to expand the world’s view of tokens to the potential they have to change the way we all do business with each other, by agreeing on standard terms and sharing easy-to-understand new ways of doing business.

Should the financial services industry expect something as granular as a CUSIP number or a Legal Entity Identifier?

The TTF encourages existing standard data formats and industry-accepted concepts to be expressed as part of the TTF standard. A CUSIP, for example, is represented via a property set in the TTF that includes the industry description and prescribes its format when implemented in a token. The CUSIP property set in the TTF will also have links, called maps, to existing CUSIP specifications, regulatory background, and even source code for implementing the property set on various platforms and in various programming languages.
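
As a rough illustration – and only an illustration, since the names and shapes below are hypothetical rather than the TTF’s published schema – a CUSIP property set might pair an industry description and a prescribed format with “maps” to external references, along these lines:

```typescript
// Hypothetical sketch of a CUSIP "property set": an industry description, a
// prescribed value format, and "maps" linking out to specifications and code.
// The names and shapes here are illustrative, not the TTF's published schema.

interface ReferenceMap {
  name: string;
  url: string;
}

interface PropertySet {
  name: string;
  description: string;   // industry description of the property
  valueFormat: RegExp;    // prescribed format when implemented in a token
  maps: ReferenceMap[];   // links to specs, regulatory background, reference code
}

const cusipPropertySet: PropertySet = {
  name: "CUSIP",
  description:
    "Nine-character identifier for North American securities, assigned by CUSIP Global Services.",
  valueFormat: /^[0-9A-Z*@#]{8}[0-9]$/, // eight base characters plus a numeric check digit
  maps: [
    // Placeholder reference; a real property set would map to the actual specification.
    { name: "CUSIP specification", url: "https://www.cusip.com/" },
  ],
};

// Standard CUSIP modulus-10 "double-add-double" check-digit validation.
function isValidCusip(cusip: string): boolean {
  if (!cusipPropertySet.valueFormat.test(cusip)) return false;
  const special: Record<string, number> = { "*": 36, "@": 37, "#": 38 };
  const charValue = (c: string): number => {
    if (c >= "0" && c <= "9") return c.charCodeAt(0) - 48; // '0'..'9' -> 0..9
    if (c >= "A" && c <= "Z") return c.charCodeAt(0) - 55; // 'A'..'Z' -> 10..35
    return special[c];
  };
  let sum = 0;
  for (let i = 0; i < 8; i++) {
    let v = charValue(cusip[i]);
    if (i % 2 === 1) v *= 2;              // double every second character
    sum += Math.floor(v / 10) + (v % 10); // add the digits of the result
  }
  return (10 - (sum % 10)) % 10 === charValue(cusip[8]);
}

// Example: 037833100 (Apple Inc. common stock) passes the check.
console.log(isValidCusip("037833100")); // true
```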

This practice should occur across industries and frameworks, from ISDA’s Common Domain Model to the healthcare Continuity of Care Document.

Is the TTI targeted more toward the coders of tokenized assets or their issuers?

Wherever you choose to build your token, you will be using the same well-defined behaviors and characteristics as everyone else, leading to the interoperability advantages that Ron outlined. I’m expecting to see all sorts of token types defined via the TTF: tokens that have digital value only, tokens that represent real-world assets, utility tokens for new ecosystems, tokens for the social good (e.g., digitalID), and more. Some may be listed on exchanges, some may not, but application developers will understand how to interact with them uniformly.
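
A minimal sketch of that composition idea – using hypothetical names rather than the published TTF artifact format – would describe each token as a base type plus a set of behaviors and property sets that any application can inspect uniformly:

```typescript
// Hypothetical sketch of composing token definitions from a shared base type,
// reusable behaviors, and property sets, in the spirit of the TTF approach.
// Names and shapes are illustrative, not the published TTF artifact format.

type BaseType = "fungible" | "non-fungible";
type Behavior = "transferable" | "mintable" | "burnable" | "divisible" | "roles";

interface TokenDefinition {
  name: string;
  baseType: BaseType;
  behaviors: Behavior[];
  propertySets: string[]; // e.g. the CUSIP property set sketched earlier
}

// A security-like token: fungible, divisible, role-controlled, carrying a CUSIP.
const listedEquityToken: TokenDefinition = {
  name: "Listed Equity Share",
  baseType: "fungible",
  behaviors: ["transferable", "divisible", "roles"],
  propertySets: ["CUSIP"],
};

// A digital-identity token: non-fungible and deliberately left non-transferable.
const digitalIdToken: TokenDefinition = {
  name: "DigitalID Credential",
  baseType: "non-fungible",
  behaviors: ["mintable", "burnable"],
  propertySets: [],
};

// An application can reason about any token purely from its declared behaviors,
// regardless of the platform the token is eventually built on.
function supportsTransfer(def: TokenDefinition): boolean {
  return def.behaviors.includes("transferable");
}

console.log(supportsTransfer(listedEquityToken)); // true
console.log(supportsTransfer(digitalIdToken));    // false
```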