The Next 30 Years of Transaction Processing
- 30min
In the last 7 years, the “transactions of everyday life” have increased 100x to 1,000x, even 10,000x, across several sectors (e.g., cloud computing, energy, real-time payments). Yet popular transaction databases are 20-30 years old, and even newer cloud databases date to circa 2012, designed for a different workload and scale.
Research into maximizing durability (and therefore availability) has advanced since 2018 (e.g., storage fault tolerance, high-frequency trading architectures, deterministic simulation testing), but these advances can be hard to retrofit. At the same time, extracting transaction performance from general-purpose database designs is becoming increasingly expensive.
How can we reset transaction infrastructure for the next 30 years? How can we adapt and apply specialization to unlock three orders of magnitude more performance? How can we evolve our engineering methodologies for tighter tolerances and stricter safety standards?
Join us to look at what the world needs, and how these needs invent and predict the future of transaction processing, in the context of microtransactions and Interledger.