Mobiu – Soft Algebra Optimization for Quantum & AI

Stable optimization in a noisy quantum world.

Mobiu turns soft algebra and demeasurement into a drop-in optimizer that stays stable where Adam and RMSProp break — from VQE and QAOA to reinforcement learning under uncertainty.

Problem

The global optimization gap

Classical optimizers such as Adam and RMSProp were tuned for large, smooth deep-learning loss landscapes. In quantum and other non-deterministic systems, the assumptions they rely on simply break.

  • Non-deterministic gradients from hardware noise and sampling.
  • Exploding / vanishing updates in highly entangled parameter spaces (VQE, QAOA).
  • Non-convex, jagged loss surfaces that violate smoothness and stationarity assumptions.
  • Expensive shots and compute wasted on unstable training.
Solution

The Mobiu soft-algebra optimizer

Instead of classical momentum, Mobiu maintains a dual soft state that tracks both measured performance and latent "could-have-been" structure at every step: Sₜ₊₁ = Sₜ·Δₜ + Δₜ. Here Sₜ = a·0̄ + b is a soft number with nilpotent axis 0̄² = 0.
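A minimal sketch of this update, assuming a soft number a·0̄ + b is represented as the pair (a, b); the multiplication rule below follows from 0̄² = 0, and the representation is illustrative, not Mobiu's actual implementation:

```python
# Soft number X = a·0̄ + b represented as the pair (a, b).
# Because 0̄² = 0, the cross terms are the only surviving 0̄ contribution
# in a product.

def soft_mul(x, y):
    a1, b1 = x
    a2, b2 = y
    # (a1·0̄ + b1)(a2·0̄ + b2) = (a1·b2 + b1·a2)·0̄ + b1·b2
    return (a1 * b2 + b1 * a2, b1 * b2)

def soft_add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def step(S, delta):
    # The dual soft-state update: S_{t+1} = S_t·Δ_t + Δ_t
    return soft_add(soft_mul(S, delta), delta)

# Two updates from the zero state with a constant soft increment
S = (0.0, 0.0)
for _ in range(2):
    S = step(S, (0.1, 0.5))
print(S)
```

Note that the 0̄-component accumulates the latent structure without ever multiplying into itself, which is what keeps the trajectory bounded.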

Core

Soft algebra stability

Dual-axis arithmetic absorbs volatility and damps runaway updates, preventing divergence while preserving sensitivity to meaningful signal.

Engine

Demeasurement

A complementary operation to quantum measurement: mapping noisy outcomes back into soft-number space, including variance as a first-class citizen.

Control

Adaptive trust mapping

Self-calibrating learning rates that trust strong, consistent gradients — and automatically cool down when uncertainty dominates.
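One way to picture this behavior, as an illustrative sketch only (the signal-to-noise scaling rule and the window of recent gradients are assumptions, not Mobiu's actual calibration):

```python
import numpy as np

def trusted_lr(base_lr, recent_grads, eps=1e-8):
    """Illustrative trust mapping: scale the step by the signal-to-noise
    ratio of recent gradient samples (not Mobiu's exact rule)."""
    g = np.asarray(recent_grads, dtype=float)
    snr = abs(g.mean()) / (g.std() + eps)
    # Maps to (0, base_lr): near-full step for consistent gradients,
    # cooled-down step when uncertainty dominates.
    return base_lr * snr / (1.0 + snr)

consistent = trusted_lr(0.1, [1.0, 1.01, 0.99])  # close to the full 0.1
noisy = trusted_lr(0.1, [1.0, -1.2, 0.1])        # strongly damped
```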

Math

Soft algebra in one minute

In Mobiu, every scalar becomes a soft number: X = a·0̄ + b, with 0̄² = 0.

  • b captures what has actually been measured.
  • a·0̄ captures nearby alternative outcomes — an infinitesimal "could-have-been" axis that never explodes thanks to nilpotency.
  • The optimizer carries this structured uncertainty forward without destabilizing the main trajectory.
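The nilpotency claim can be checked numerically: under the product rule implied by 0̄² = 0, the n-th power of a soft number satisfies (a·0̄ + b)ⁿ = n·a·bⁿ⁻¹·0̄ + bⁿ, so the 0̄-axis grows only linearly in n rather than exponentially. A small verification sketch (not Mobiu code):

```python
def soft_mul(x, y):
    # (a1·0̄ + b1)(a2·0̄ + b2) = (a1·b2 + b1·a2)·0̄ + b1·b2, since 0̄² = 0
    a1, b1 = x
    a2, b2 = y
    return (a1 * b2 + b1 * a2, b1 * b2)

n, a, b = 10, 0.5, 0.9
P = (0.0, 1.0)              # multiplicative identity: 0·0̄ + 1
for _ in range(n):
    P = soft_mul(P, (a, b))

# Closed form: (a·0̄ + b)^n = n·a·b^(n-1)·0̄ + b^n
assert abs(P[0] - n * a * b ** (n - 1)) < 1e-12
assert abs(P[1] - b ** n) < 1e-12
```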

What is demeasurement?

Quantum measurement collapses a superposition into a single outcome. Demeasurement is our way back up: we take noisy samples and lift them into soft-number space: Sₜ = demeasure(aₜ, bₜ) = aₜ·0̄ + bₜ.

Instead of discarding noise, we encode it as structured, differentiable information that guides more cautious and intelligent updates — especially valuable when gradients come from finite shots or unstable environments.
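A hedged sketch of what such a lift could look like. The choice of sample mean for the measured component and sample spread for the 0̄-component is an assumption for illustration, not Mobiu's published demeasurement rule:

```python
import statistics

def demeasure(samples):
    """Illustrative demeasurement: lift noisy shot outcomes into
    soft-number space, keeping the spread as first-class information."""
    b = statistics.fmean(samples)    # measured component
    a = statistics.pstdev(samples)   # "could-have-been" spread on the 0̄ axis
    return (a, b)                    # soft number a·0̄ + b as a pair

# Synthetic finite-shot energy estimates from a noisy variational run
shots = [-1.12, -1.07, -1.15, -1.09, -1.11]
a, b = demeasure(shots)
```

The pair (a, b) then feeds directly into the soft-state update, so variance steers the step size instead of being averaged away.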

Evidence

Extended validation across 13+ quantum problems

In our whitepaper's benchmarks, run across 800+ random seeds, Mobiu's soft optimizer achieved:

  • +43.88% average improvement over Adam in final optimization gap.
  • +75.01% improvement over baseline methods across all tested problems.
  • Statistically significant gains (p < 0.001) on all 13+ quantum tasks.
  • Consistent gains on VQE molecules (H₂, LiH, H₂O, NH₃, Be₂, He₄) and spin models (Ising, Heisenberg, XY, Transverse Ising).

Full methodology, ablations and detailed results for each quantum problem are available in the Mobiu whitepaper.

Applications

From quantum chemistry to noisy RL

Mobiu focuses first on quantum optimization, with a clear expansion path into classical AI and complex, noisy environments.

VQE · QAOA · Hybrid quantum-classical ML · Noisy RL · Financial modeling

Quantum chemistry

Tighter ground-state energies for molecules like H₂ and LiH under tight shot budgets and hardware noise.

Spin & lattice models

More stable variational optimization on rugged landscapes such as Heisenberg and Ising chains.

Reinforcement learning

Policy optimization where rewards are sparse, delayed and noisy — without giving up stability.

Risk & finance

Optimization where volatility and uncertainty are first-class signals, not noise to be ignored.

Team

Where algebra meets attention

Mobiu sits at the intersection of mathematical logic, quantum computation and attention-based learning systems.

Ido Angel — Founder & Vision. Author of "Attention: The Atom of Consciousness", exploring unified theories of attention across biological and artificial systems and applying them to optimization and learning.

Dr. Moshe Klein — Scientific Founder. Co-author of "Foundations of Soft Logic". His soft algebra framework underpins Mobiu's optimizer and its dual-axis treatment of potential and measurement.