Use Cases

Tokenization Requires Verifiable Computation Over Data

Tokenization has moved beyond pilots and single-asset experiments. Institutions are now deploying tokenized financial products in production: private credit portfolios, money market funds, structured notes, and stablecoins with dynamic backing. Tokenized assets today function as financial instruments whose value depends on ongoing computation over evolving datasets.

The data dependency of tokenized assets

Early tokenization pilots focused on simple representations: a token that tracks the price of a stock or treasury bond. The smart contract held a reference to an asset, and the token's value moved with it. Straightforward, but limited.

As tokenization programs scale, asset value increasingly derives from computation: cash flow aggregations, portfolio rebalancing rules, compliance filters, NAV calculations, and historical performance metrics. The asset's behavior is defined by computation over data that changes continuously.

For example, tokenized money market funds require active computation: rebalancing holdings based on yield curves, applying redemption rules based on liquidity thresholds, and enforcing investment criteria based on credit ratings. A private credit portfolio calculates distributions based on incoming payments across dozens of loans, applies waterfall logic to different tranches, and adjusts valuations based on default rates and recovery projections.
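The waterfall logic mentioned above can be sketched concretely. This is an illustrative example, not Space and Time's actual implementation: a sequential waterfall that distributes incoming loan payments to tranches in order of seniority, the kind of rule a tokenized credit portfolio encodes.

```python
def waterfall(cash: float, tranches: list[dict]) -> dict[str, float]:
    """Pay each tranche its owed amount in priority order; any
    remainder after the most junior tranche goes to equity."""
    paid = {}
    for tranche in sorted(tranches, key=lambda t: t["priority"]):
        amount = min(cash, tranche["owed"])
        paid[tranche["name"]] = amount
        cash -= amount
    paid["equity"] = cash  # residual after all tranches are paid
    return paid

# Example: $1.2M collected against $1.5M of total tranche claims.
payments = waterfall(1_200_000, [
    {"name": "senior",    "priority": 0, "owed": 900_000},
    {"name": "mezzanine", "priority": 1, "owed": 400_000},
    {"name": "junior",    "priority": 2, "owed": 200_000},
])
# Senior is paid in full, mezzanine partially, junior nothing.
```

The point is that the distribution each token holder receives is not a static attribute of the token; it is the output of this computation run against live payment data.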

The asset itself is inseparable from the computation that defines it. Smart contracts enforce the rules that determine what the token represents at any given moment.

Why this matters for institutional adoption

Institutions moving assets onchain need infrastructure that can handle this complexity while maintaining verifiability. Traditional finance has sophisticated back-office systems that handle these calculations, but those systems operate in centralized environments where the institution controls the data and the computation. Moving onchain means those calculations need to happen in a way that's transparent, enforceable by smart contracts, and independently verifiable by participants.

This is where most tokenization projects hit a wall. Smart contracts can enforce logic, but they can't perform complex computation over large datasets or access historical information needed for valuations. Offchain systems can do the computation, but then the results have to be trusted rather than verified. Institutions end up in a position where they either simplify the asset to fit what smart contracts can natively handle, or they introduce centralized dependencies that undermine the value proposition of tokenizing in the first place.

What verifiable computation unlocks

Space and Time enables smart contracts to enforce rules based on computation over data they can't natively access. Instead of simplifying the asset to fit the limitations of what a contract can compute onchain, protocols define the computation as SQL queries that execute on Space and Time, generate ZK proofs, and return verified results that contracts can trust and enforce.
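The verify-then-enforce pattern described above can be sketched as follows. The `Prover` and `Contract` classes and their method names are placeholder stubs, not Space and Time's actual SDK, and the hash stands in for a real ZK proof; the shape of the flow is what matters: compute offchain via SQL, attach a proof, and enforce only what verifies.

```python
# Hypothetical NAV query a protocol might define (table and column
# names are illustrative assumptions).
NAV_QUERY = "SELECT SUM(principal_outstanding * mark) AS nav FROM loan_positions"

class Prover:
    """Stub for the offchain engine: runs SQL, returns (result, proof)."""
    def run_verifiable_query(self, sql: str):
        result = {"nav": 1_234_567.89}      # placeholder query output
        proof = hash((sql, result["nav"]))  # stand-in for a ZK proof
        return result, proof

class Contract:
    """Stub for the onchain side: verifies the proof, then enforces."""
    def __init__(self):
        self.nav = None

    def update_nav(self, sql: str, result: dict, proof: int):
        if proof != hash((sql, result["nav"])):  # stand-in verification
            raise ValueError("proof rejected; NAV not updated")
        self.nav = result["nav"]  # enforce only the verified result

result, proof = Prover().run_verifiable_query(NAV_QUERY)
contract = Contract()
contract.update_nav(NAV_QUERY, result, proof)
```

A tampered result or proof fails the check and the contract's state never changes, which is the property that removes the trust assumption from the offchain computation.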

A tokenized credit portfolio can calculate NAV based on historical repayment data, apply waterfall distributions based on proven cash flows, and adjust valuations based on aggregated default rates, all using data that exists offchain but is provably correct when it reaches the contract. The computation happens where the data lives. The verification happens where the enforcement lives.
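The valuation adjustment mentioned above can be made concrete with a minimal sketch. This is an illustrative expected-loss model under assumed inputs, not a real pricing methodology: each loan's carrying value is reduced by its expected loss, derived from an aggregated default rate and a recovery projection, then summed into a portfolio valuation.

```python
def expected_loss_value(outstanding: float, default_rate: float,
                        recovery_rate: float) -> float:
    """Carrying value net of expected loss:
    outstanding - outstanding * default_rate * (1 - recovery_rate)."""
    expected_loss = outstanding * default_rate * (1 - recovery_rate)
    return outstanding - expected_loss

# Hypothetical loans: (outstanding, default rate, recovery rate).
loans = [
    (500_000, 0.02, 0.40),
    (300_000, 0.05, 0.25),
]
portfolio_value = sum(expected_loss_value(*loan) for loan in loans)
```

In the flow described above, an aggregation like this would run as a SQL query over the historical loan data where it lives, with the proven result delivered to the contract for enforcement.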

Verifiable computation changes what kinds of assets can be tokenized without introducing trust assumptions. Institutions don't have to choose between sophisticated financial products and decentralized infrastructure. They can build dynamic, data-driven, enforceable tokenized assets that behave like real financial instruments while keeping the computation verifiable and the execution trustless.

Computation as part of the asset itself

As tokenization matures, the distinction between the asset and the computation that defines it disappears. The token's value is the output of computation over real data, enforced by smart contracts that verify the computation was performed correctly.

Institutional tokenization requires verifiable data infrastructure. Without it, tokenized assets remain simplified versions of what they could be, either limited to what smart contracts can natively compute, or dependent on centralized systems that participants have to trust. With it, tokenized assets become what institutions actually need: sophisticated financial products that operate onchain with the same complexity and verifiability as traditional instruments, but with the transparency, composability, and programmability that make tokenization valuable in the first place.

Rules, rollups, and historical analysis are executed deterministically, proven cryptographically, and verified onchain. That's what makes tokenization work for real financial assets. Space and Time is the infrastructure that makes it possible.

Space and Time Foundation

The Space and Time Foundation is an independent organization dedicated to the advancement and adoption of Space and Time.