Tokenization has been advertised as a new way to mark and track value. The key drivers of adoption are traceability, provenance and carrier-neutrality. In theory, it's a long sought-after upgrade to our financial system. However, like all networks, it has vulnerabilities that make it susceptible to capture by entities that favor centralization, or to co-optation for nefarious purposes.
The Bitcoin and cryptocurrency sector has been subject to cyclical scandals coinciding with market downturns, and there's no reason to expect tokenization to be any different. The scandals themselves, however, have mostly involved the people building applications and offering services on the network, and less so the core idea. Rather than throwing the baby out with the bathwater, it would be smarter to identify the potential exploits and exploiters, and to regulate them out of existence. The same appears to apply to jurisdictions.
A jurisdiction favoring malpractices may itself be flagged as such.
Sometimes, the simplest possible solution is the right one.
The concept of tokenization is brilliant: give each asset a portable digital identity, program its attributes, define the permissions a user needs to trade it, and we have an enforceable digital contract that can automate almost any process.
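The idea above can be sketched in a few lines of code. This is a minimal illustration, not any real token standard; the class and field names (`Token`, `allowed_actions`, the `"trade"` action) are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Token:
    """A hypothetical tokenized asset: identity, attributes, permissions."""
    asset_id: str                                          # portable digital identity
    attributes: dict = field(default_factory=dict)         # programmable attributes
    allowed_actions: dict = field(default_factory=dict)    # holder -> set of actions

    def can(self, holder: str, action: str) -> bool:
        """Check whether a holder is permitted to perform an action."""
        return action in self.allowed_actions.get(holder, set())

    def transfer(self, sender: str, receiver: str) -> None:
        """Enforce the contract: only permitted holders may trade."""
        if not self.can(sender, "trade"):
            raise PermissionError(f"{sender} may not trade {self.asset_id}")
        # pass the trading right from sender to receiver
        self.allowed_actions.setdefault(receiver, set()).add("trade")
        self.allowed_actions[sender].discard("trade")

bond = Token("bond-001", attributes={"coupon": 0.05},
             allowed_actions={"alice": {"trade"}})
bond.transfer("alice", "bob")   # alice may trade, so this succeeds
```

The point of the sketch is that the permission check is part of the asset itself, so the "contract" is enforced wherever the token travels, rather than by any one intermediary.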
The truth, however, is a little more complex. Tokenization is a new financial language still forming its basic types, classes and grammar. To succeed, the idea has to develop best practices. The fundamental division is between federated and centralized networks, but beyond that the fundamental question is: who is allowed to do what, and why? And who gets to regulate that? Centralized networks impose rules by fiat, while federated ones must negotiate them. This inevitably leads to digital fiefdoms on one side, and participatory digital democracy on the other. What is the role of AI, and who gets to host, program and regulate it? What can and should be outsourced to the AI? The answers to these fundamental questions will likely shape our future for a long time to come.


This reminds me of an anecdote. I took part in Latin America's first Bitcoin conference, LaBitconf 2013. On an evening break, at a restaurant where we had dinner with the speakers and founders, I spoke to Andreas Antonopoulos. His message at the time was that the killer app of crypto is the DAO, and specifically an open source, transparent, decentralized DAO. Something that can't be evil. He meant built-in safeguards and deterministic logic. That infrastructure would then be the basis for user-friendly apps. As the idea of blockchain and tokenization matures, his idea becomes ever more relevant.
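"Deterministic logic" here means that the same inputs always produce the same outcome, with no room for discretionary override. A toy governance rule can make this concrete; the quorum and threshold values below are illustrative assumptions, not parameters from any actual DAO:

```python
def dao_decision(votes_for: int, votes_against: int,
                 total_members: int, quorum: float = 0.5,
                 threshold: float = 0.6) -> bool:
    """Deterministic proposal rule: same inputs always yield the same outcome.

    Hypothetical parameters: `quorum` is the minimum turnout required,
    `threshold` the share of cast votes needed to pass.
    """
    cast = votes_for + votes_against
    turnout = cast / total_members
    if turnout < quorum:
        return False                      # insufficient participation: no action
    return votes_for / cast >= threshold  # supermajority of cast votes required
```

Because the rule is pure arithmetic, anyone can verify the outcome independently: `dao_decision(70, 30, 100)` passes, while a proposal with only 15 of 100 members voting fails on quorum no matter how lopsided the vote. That verifiability is the built-in safeguard.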
Tokenization promises freedom, but the promise is kept by conscious effort
Developers must acknowledge that VC money isn't everything. The fundamentals must be in place for the apps to change anything. Most of the added value in crypto for the average user comes through optionality. Freedom from. Freedom from corporate gatekeeping, vendor lock-in, jurisdiction specificity and, most importantly, freedom from the whims of tyrants. This kind of freedom comes with responsibility: the responsibility to design systems that are ethical by design. The ethics? To design systems that allow constant iteration but cannot be co-opted by an entity incentivised to dominate them, or at least to minimize the chances of that.
Maybe in the future we will have more sustainable projects. There's hope, but it's in our hands.

