Dhiway

Unlocking the Future with Data Tokenization: Getting the Most from Decentralized Ledgers

Back in 2018, our conversations revolved around how India’s UPI was picking up and, in contrast, why document verification was still so time consuming. We mainly discussed use cases like visa processing, higher education across borders, and the dreaded government office visits in India, where every visit surfaces one more missing document between us and what we came for. That is where we focused on solving for ‘trust’ in digital transactions, which led us to dive deep into something called Verifiable Credentials (VC).

Fast forward five years, and across use cases spanning nation scale (ONDC, NPCI, etc.) and small apartment communities (certificates for cultural programs, sports events, etc.), we have gathered some insights we would love to share.


Understanding Verifiable Credentials (VC)

VC is a W3C standard for sharing information/credentials. At its core, a VC aims to do the following (a minimal sketch of a credential’s shape follows the list):

  • Confirm that the credential hasn’t changed since it was issued.
  • Verify the issuer’s identity.
  • Check if the credential is still valid (not expired).
  • Determine if the credential has been revoked.
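To make this concrete, here is a minimal sketch of what a W3C-style credential payload can look like, written as a Python dictionary. The values are placeholders, not from a real issuer; the point is how the structure maps onto the four goals above.

```python
# Hypothetical credential payload, loosely following the W3C VC data model.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:university-registrar",       # whose identity is verified
    "issuanceDate": "2023-06-01T00:00:00Z",
    "expirationDate": "2028-06-01T00:00:00Z",            # validity check
    "credentialSubject": {
        "id": "did:example:student-1234",
        "degree": "B.Tech, Computer Science",
    },
    "credentialStatus": {                                # revocation check
        "id": "https://registry.example/status/42",
        "type": "StatusList2021Entry",
    },
    "proof": {                                           # integrity + issuer checks
        "type": "Ed25519Signature2020",
        "verificationMethod": "did:example:university-registrar#key-1",
        "proofValue": "z3FXQ...",
    },
}
```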

VCs are a powerful way to manage credentials, helping machines quickly verify digital documents and boosting trust. The standard also gives users much-needed ownership of their data. But, alas, widespread adoption hasn’t fully taken off yet. The reasons are many, and they are valid when you look through the lens of the adopters. Let’s list some.

  • It is an ecosystem story: You need all three parties (Issuer, Holder, and Verifier) to use it for it to be meaningful.
  • Interoperability: Even though the standard is flexible and provides options for interoperability, the majority of solutions use custom proof sections and ZK (Zero-Knowledge) systems, which makes the issuer’s data hard for service providers to verify.
  • Government is a major stakeholder: In many scenarios and use cases, the ball stops with the government, because government adoption is key to building ‘trust’ in these solutions.
  • Changes to existing software products: For many private companies we spoke to, the economics of changing existing software stacks did not work out, and VC standards needed that change just to get started.
  • Who pays?
    • As it is an ecosystem play, the question in the mind of for-profit companies is: who should pay? Why should I pay to issue VCs when it is the service providers who benefit from ‘verifiable’ data?
    • For non-profits, it is not sustainable to keep funding the effort repeatedly.
    • For government departments and executives, the ‘perk’ is not very clear. If people no longer have to come to offices, how do they ‘exercise’ their power?
    • A few visionaries wanted it, but bureaucracy is such that the handful of startups in this space can rarely work with the government because of strict tender norms.

But someone has to break this huge blocker, and here is how we approached the problem and designed the solution stack.


Data Tokenization by Dhiway


This is where data tokenization comes into play: think of it as adding a powerful verification capability without changing your existing data. Here’s what makes it valuable (a minimal verification sketch follows the list):

  • Integrity Checks: Easily verify if content remains unchanged by comparing hashes.
  • Issuer Verification: Use digital signatures (private/public keys) to confirm who issued the document.
  • Validity Checks: Include expiry directly within your data to manage validity effortlessly.
  • Revocation Control: The issuer can dynamically manage document status through tokens.
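The sketch below, in Python, illustrates the four checks listed above in their simplest form. The function names, the token structure, and the in-memory revocation set are all hypothetical; a real deployment would anchor the token and revocation status on a ledger such as CORD rather than pass them around as Python objects.

```python
# Illustrative sketch only: anchor_document, verify_document and the token
# dict are hypothetical, not part of any Dhiway/CORD API.
import hashlib
import json
import time

from cryptography.hazmat.primitives.asymmetric import ed25519


def anchor_document(doc_bytes: bytes, issuer_key: ed25519.Ed25519PrivateKey,
                    expires_at: float) -> dict:
    """Produce a 'token' for a document: content hash, expiry, issuer signature."""
    digest = hashlib.sha256(doc_bytes).hexdigest()
    payload = json.dumps({"hash": digest, "expires_at": expires_at}).encode()
    return {"hash": digest, "expires_at": expires_at,
            "signature": issuer_key.sign(payload).hex()}


def verify_document(doc_bytes: bytes, token: dict,
                    issuer_pub: ed25519.Ed25519PublicKey,
                    revoked_hashes: set[str]) -> bool:
    """Run the four checks: integrity, issuer, validity, revocation."""
    digest = hashlib.sha256(doc_bytes).hexdigest()
    if digest != token["hash"]:                        # 1. integrity check
        return False
    payload = json.dumps({"hash": token["hash"],
                          "expires_at": token["expires_at"]}).encode()
    try:                                               # 2. issuer verification
        issuer_pub.verify(bytes.fromhex(token["signature"]), payload)
    except Exception:
        return False
    if time.time() > token["expires_at"]:              # 3. validity check
        return False
    return digest not in revoked_hashes                # 4. revocation check
```

In this picture, an issuer would call anchor_document once and publish the resulting token; any verifier holding the issuer’s public key can then run verify_document against the document bytes they receive, however those bytes reached them.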

But that’s just the beginning. Data tokenization offers even more advantages than traditional VCs:

  • Easy Integration: Seamlessly connect with your current systems without additional changes.
  • Universal Compatibility: Tokenize any document type, not just JSON-based VCs.
  • Enhanced Trust: Transparent, decentralized logging ensures complete trustworthiness.
  • Time-series Verification: See how a document changes over time, not just a single-point check (see the sketch after this list).
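As one way to picture time-series verification, the sketch below anchors each revision of a document and links it to the previous one, so a verifier can replay how the document evolved. The local hash-chain structure here is purely illustrative; on a decentralized ledger such as CORD, each revision would typically be recorded as its own ledger entry.

```python
# Purely illustrative: a local hash chain of document revisions.
import hashlib
import time


def anchor_revision(chain: list[dict], doc_bytes: bytes) -> list[dict]:
    """Append a new revision entry linked to the previous entry's hash."""
    prev_hash = chain[-1]["entry_hash"] if chain else ""
    doc_hash = hashlib.sha256(doc_bytes).hexdigest()
    entry_hash = hashlib.sha256((prev_hash + doc_hash).encode()).hexdigest()
    chain.append({"timestamp": time.time(), "doc_hash": doc_hash,
                  "prev_hash": prev_hash, "entry_hash": entry_hash})
    return chain


def history_is_consistent(chain: list[dict]) -> bool:
    """Recompute every link to confirm no revision was altered or dropped."""
    prev_hash = ""
    for entry in chain:
        expected = hashlib.sha256((prev_hash + entry["doc_hash"]).encode()).hexdigest()
        if entry["prev_hash"] != prev_hash or entry["entry_hash"] != expected:
            return False
        prev_hash = entry["entry_hash"]
    return True
```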


Sounds Ideal, What’s the Catch?

The primary limitation is that unlike VCs, which may be verified offline, data tokenization requires internet connectivity to interact with network nodes for verification. Beyond this, data tokenization is remarkably versatile and simple to implement.


More Benefits You Should Know About

  • Compliance Ready: Fully aligns with data privacy standards like DPDP and GDPR.
  • Flexible Security: Easily rotate keys for added security.
  • Empowerment: Delegate issuance powers, enabling broader participation.
  • Enhanced VC (VC++): Even VC JSON data (without an embedded proof) can be tokenized for convenient use, much like a digital wallet (see the sketch after this list).
  • Enhanced mDoc (mDoc++): Supports modern credential formats like mobile Driver Licenses (mDL).
  • Simplified Storage: Store tokenized data flexibly across various platforms like Google Drive, WhatsApp, local folders, or FTP servers, eliminating the need for dedicated wallets.
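To illustrate the VC++ point, here is a small sketch of tokenizing an existing VC JSON that carries no embedded proof: the document is put into a canonical form, hashed, and that hash is what gets anchored. The canonicalisation choice (sorted keys, compact separators) and the function name are assumptions made for this example, not a prescribed scheme.

```python
# Hypothetical example: tokenize a VC JSON that has no embedded proof section.
import hashlib
import json


def tokenize_vc_json(vc: dict) -> str:
    """Return the digest that would be anchored on the network for this VC."""
    canonical = json.dumps(vc, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(canonical).hexdigest()


plain_vc = {
    "type": ["VerifiableCredential"],
    "issuer": "did:example:sports-club",
    "credentialSubject": {"id": "did:example:player-7", "award": "Winner, 2024"},
}
print(tokenize_vc_json(plain_vc))  # this digest is what the token refers to
```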


One may ask, “‘Tokenization’ is a word loosely used with Web3 and blockchain technologies; isn’t it meant for ‘transactions’ only?” We always say that tokenization has come into the spotlight simply because more people are using the word; the practice itself is not new. The stock market runs on tokenized shares today, and the banker’s cheque (Demand Draft/DD) is another such example. We are talking about ‘data’ here, which is surely a superset of everything we talk about tokenizing.

Does Tokenization solve all the things mentioned above?

  • Common Standard: A simple API to access it.
  • No (major) changes to existing software products: they continue to generate documents as-is, with just a new API call added (see the sketch after this list).
  • The government can integrate without sovereignty concerns.
  • Who pays?: This will be explained in a separate blog; there are many interesting pointers here, and we believe you will come back to read about it 🙂
  • It is an ecosystem story: Decentralized networks are designed for ecosystems. With CORD’s OnChain Governance and the option to run an enterprise network, the whole ecosystem stands to benefit.
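To give a feel for what “a new API added to it” could look like, below is a sketch of an issuing system posting a document hash to an anchoring endpoint right after generating the document as usual. The endpoint URL, field names, and response shape are invented for this illustration and are not the actual CORD or Dhiway API.

```python
# Hypothetical integration sketch: the issuing software keeps generating
# documents as before, and adds one call to anchor each document's hash.
# The URL and payload fields below are placeholders, not a real API.
import hashlib

import requests

ANCHOR_ENDPOINT = "https://tokenization.example.com/api/v1/anchor"  # placeholder


def anchor_existing_document(path: str, issuer_id: str, api_key: str) -> str:
    """Hash a freshly generated document and register it with the network."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    resp = requests.post(
        ANCHOR_ENDPOINT,
        json={"issuer": issuer_id, "document_hash": digest},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["token_id"]  # reference the verifier will later look up
```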


All good? Let’s see how Dhiway can help.

What does Dhiway do? Simply put, we’re amplifying trust to find and fulfill opportunities. 

Now, our vision is that there will never be ‘one chain to rule them all’, so we give document/data creators the option to choose what they need. Public, private, or consortium-led tokenization networks/stacks are the right way to help people consume these features faster. We keep the customer’s needs, their regulations, and their compliance requirements in mind when we propose a solution.

With that, Dhiway’s CORD-based platform lets you get started quickly and begin your journey with ‘Verifiability’.

Join us to be future ready  