How SonarX approaches the trade-off between speed and quality/finality in blockchain data

Unlike traditional centralized data, blockchain introduces variable block times, heterogeneous execution environments, probabilistic finality, and cross-chain risk. As a result, institutional-grade blockchain data requires vendors to confront and overcome the hardest engineering edges of decentralized systems:
  • Finality modeling (reaching durable finality)
  • Consensus and economic risk analysis
  • High-integrity, re-org-aware and re-org-adjusted data (both historical and real-time)
Blockchain breaks the idea of instant truth. TradFi assumes feeds reflect settled facts. On-chain data reflects activity in progress, not final reality. This difference changes everything about how a data vendor must operate. SonarX’s job is to translate the unpredictable surface of decentralized networks into institutional-grade data that never surprises the downstream system.

Blockchains rely on distributed consensus. Nodes must agree. Blocks must propagate. Signatures must be verified. These steps create latency that no optimization can fully erase. As a consequence, block times vary widely:
  • Bitcoin targets 10 minutes.
  • Ethereum aims for 12 seconds.
  • Solana pushes for higher throughput, yet it still relies on distributed consensus, which introduces delays.
  • Hyperliquid Core produces blocks at sub-second intervals to sustain roughly 200k orders per second, leaving effectively no tolerance for delay.
This is not a flaw. It’s a tradeoff. Decentralization buys censorship resistance and structural resilience. But it adds time, and institutions need to understand precisely how much time. Consequently, SonarX must surface blockchain events with low delay, yet it cannot pretend that early observations equal the final truth.
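As a rough illustration of how differently chains behave, the sketch below models per-chain timing as a small configuration object. The chain list, field names, and `minimum_visibility_delay_s` helper are hypothetical, and the block-time figures are approximate public targets rather than SonarX values.

```python
# Illustrative sketch only: approximate public block-time targets, not
# SonarX's configuration or measured values.
from dataclasses import dataclass

@dataclass(frozen=True)
class ChainTiming:
    name: str
    target_block_time_s: float  # protocol's target interval between blocks

CHAINS = [
    ChainTiming("bitcoin", 600.0),    # ~10-minute target
    ChainTiming("ethereum", 12.0),    # ~12-second slots
    ChainTiming("solana", 0.4),       # ~400 ms slots
    ChainTiming("hyperliquid", 0.2),  # sub-second blocks (approximate)
]

def minimum_visibility_delay_s(chain: ChainTiming, confirmations: int) -> float:
    """Lower bound on the time for an event to reach a given confirmation
    depth; propagation and signature verification add further latency."""
    return chain.target_block_time_s * confirmations
```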

Visibility is not finality, and institutions ultimately care about finality, not visibility. A transaction appearing in a block does not guarantee it will survive the blocks that follow. Reorganizations happen. Forks happen. Finality is probabilistic until the network advances far enough.
In this environment, “100% reliability” requires more than fast ingestion. It requires modeling economic security and consensus behavior on a chain-by-chain basis. As such, SonarX:
  • Models finality per chain → determines the confirmation depth required to treat a block as effectively immutable.
  • Understands economic security → measures the economic cost of attacking a chain at any moment.
  • Measures validator risk → tracks staking levels, validator sets, and protocol parameters that affect finality guarantees.
  • Tracks consensus parameters → delivers data only when certainty crosses a defined threshold, based on historical reorganization patterns.
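SonarX’s actual finality models are proprietary, but the overall shape of a chain-by-chain finality check can be sketched. The snippet below is a simplified illustration assuming a hypothetical list of observed reorganization depths and an arbitrary safety margin; it ignores the economic-security and validator-risk inputs described above.

```python
# Simplified illustration of a chain-by-chain finality check. The quantile,
# safety margin, and reorg history below are hypothetical; real models also
# weigh economic security, validator sets, and consensus parameters.
import math
from typing import Sequence

def required_confirmations(reorg_depths: Sequence[int],
                           quantile: float = 0.99,
                           safety_margin: int = 2) -> int:
    """Choose a confirmation depth deeper than almost all historically
    observed reorganizations, plus a safety margin."""
    if not reorg_depths:
        return safety_margin
    depths = sorted(reorg_depths)
    idx = min(len(depths) - 1, math.ceil(quantile * len(depths)) - 1)
    return depths[idx] + safety_margin

def is_effectively_final(block_number: int, chain_head: int,
                         confirmations_needed: int) -> bool:
    """Treat a block as effectively immutable only once the chain has
    advanced far enough past it."""
    return (chain_head - block_number) >= confirmations_needed

# Example: with observed reorg depths of at most 3 blocks, this yields a
# required depth of 5 confirmations (3 + the safety margin of 2).
depth = required_confirmations([1, 1, 2, 1, 3, 2])
print(is_effectively_final(block_number=100, chain_head=106,
                           confirmations_needed=depth))  # True
```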
Integrity forces latency: the controlled kind, not accidental delay. SonarX distills the results of these analyses and models into a specific parameter called the “P90 Lag Period”, tailored individually to each chain. We will see how this parameter applies later on (a simplified sketch of how such a percentile might be derived follows the list below). This turns quality into a cross-disciplinary problem. It is not about ETL pipelines or data cleaning. It demands cryptographic understanding, consensus modeling, economic analysis, and chain-specific intelligence:
  • Every chain has different finality mechanics.
  • Every client has a different tolerance for risk.
  • Every dataset must reflect these realities.
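As an illustration of how a per-chain lag percentile could be derived, the snippet below computes a 90th-percentile lag from a hypothetical set of observed visibility-to-finality times. The chain names, observations, and helper are placeholders; the real P90 Lag Period rests on the proprietary analysis described above.

```python
# Hypothetical illustration: the observations and helper below are
# placeholders, not the inputs or logic behind SonarX's P90 Lag Period.
import statistics

def p90_lag_seconds(observed_lags_s: list[float]) -> float:
    """90th percentile of the observed lag between a block first becoming
    visible and the point at which it is treated as durably final."""
    deciles = statistics.quantiles(observed_lags_s, n=10)
    return deciles[-1]  # the 90th-percentile cut point

# Placeholder per-chain observations (seconds from visibility to finality).
observed = {
    "chain_a": [30, 45, 38, 60, 41, 55, 47, 52, 39, 44],
    "chain_b": [2, 3, 2, 4, 3, 2, 5, 3, 2, 3],
}
lag_period_s = {chain: p90_lag_seconds(xs) for chain, xs in observed.items()}
```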
Guaranteeing integrity requires a two-layered approach. The first layer captures data in real time with minimal delay. The second layer certifies finality once the network has progressed far enough (this is where the P90 Lag Period parameter plays its critical role; a minimal sketch of this dual model follows the freshness summary below). This approach aligns with institutional expectations by providing both early visibility and reliable end-state information. Without this dual model, the data cannot support trading, treasury, settlement, surveillance, or compliance workflows.

Given the highly sensitive and proprietary nature of the data, analysis, and logic behind the P90 Lag Period, SonarX does not make it public. However, we are happy to share its details, including the specific values for each chain, privately with our customers. Here is a quick summary of the freshness of SonarX datasets (across 120+ chains), based on the P90 Lag Period configurations:
  • Average Freshness: 7 minutes
  • Median Freshness: 4 minutes
  • 90th-Percentile Freshness: 13 minutes
  • Fastest Freshness: under 3 seconds
  • Slowest Freshness: 51 minutes
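To make the dual model concrete, here is a minimal sketch of the two layers, assuming a simple in-memory record and a placeholder lag period; it illustrates the pattern, not SonarX’s pipeline internals.

```python
# Minimal sketch of the two-layer model: layer 1 surfaces events as soon as
# they are visible, layer 2 certifies finality once the chain-specific lag
# period has elapsed. Field names and values are illustrative placeholders.
import time
from dataclasses import dataclass, field

@dataclass
class Record:
    chain: str
    block_number: int
    payload: dict
    observed_at: float = field(default_factory=time.time)
    status: str = "provisional"  # visibility, not finality

def ingest_realtime(chain: str, block_number: int, payload: dict) -> Record:
    """Layer 1: surface the event with minimal delay, flagged as provisional
    so downstream systems never mistake visibility for finality."""
    return Record(chain, block_number, payload)

def certify(record: Record, lag_period_s: float, still_canonical: bool) -> Record:
    """Layer 2: after the lag period, promote the record to final, or mark it
    orphaned if a reorganization removed its block from the canonical chain."""
    if time.time() - record.observed_at < lag_period_s:
        return record  # not yet eligible for certification
    record.status = "final" if still_canonical else "orphaned"
    return record
```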
SonarX encodes these rules into its Reliability Layer, the operational paradigm at the core of our differentiated offering for Real-Time and Full-Historical datasets.