Pyramid Architecture
The geometric foundation of AletheionGuard's uncertainty quantification
The Pyramidal Model
Apex (Height = 1)
Perfect knowledge. Q1 = 0, Q2 = 0. Model is certain and correct.
Middle (0.3 < H < 0.7)
Moderate uncertainty. Nonzero Q1 and/or Q2. Requires human review.
Base (Height = 0)
Maximum uncertainty. High Q1 and/or Q2. The model cannot make a reliable prediction.
Mathematical Foundation
The Height Formula
Height = 1 - √(Q1² + Q2²)
Geometric Interpretation:
- The pyramid represents the uncertainty space.
- Q1 and Q2 form the base dimensions (the x and y axes).
- Height is the vertical distance from the base to the apex.
- Pythagorean theorem: distance = √(Q1² + Q2²).
- Height = 1 - distance (inverted, so height measures proximity to truth).
Example: High Confidence
Example: Low Confidence
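The two example headings above can be filled in with a small worked sketch of the height formula; the specific Q1/Q2 values are illustrative assumptions, not figures from this page.

```python
import math

def pyramid_height(q1, q2):
    """Height = 1 - sqrt(Q1^2 + Q2^2), floored at 0 (the pyramid base)."""
    return max(0.0, 1.0 - math.sqrt(q1 ** 2 + q2 ** 2))

# High confidence: both uncertainties small -> near the apex.
pyramid_height(0.1, 0.1)  # ≈ 0.859

# Low confidence: both uncertainties large -> near the base.
pyramid_height(0.6, 0.6)  # ≈ 0.151
```

Note that the distance can exceed 1 (e.g. Q1 = Q2 = 1), so the sketch floors the height at 0 to match the "Base (Height = 0)" zone.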
Architecture Levels
AletheionGuard implements a progressive architecture with 4 levels of increasing sophistication:
Level 0: Q1 + Q2 (Basic)
Foundation: Q1 and Q2 gates added only at the output layer, giving basic uncertainty separation.
Components:
- Q1 Gate (output layer)
- Q2 Gate (output layer)
- Independent predictions
Metrics:
- Q2 MSE: ~0.057
- ECE: ~0.10-0.15
- 100-200 hyperparameter trials
Level 1: Pyramidal + Q1 + Q2 (Current)
Production: Full pyramidal architecture with a height gate, base forces, and epistemic softmax.
Components:
- Q1 Gate (independent)
- Q2 Gate (conditioned on Q1)
- Height Gate (neural)
- Base Forces (4 forces)
- Epistemic Softmax
- Temperature Modulation
Metrics:
- Q2 MSE: ~0.045 (-23.3%)
- ECE: ~0.07-0.10 (-30 to -50%)
- 20-40 hyperparameter trials
- 80% faster convergence
Level 2: Pyramidal + Attention Gates
Roadmap: Q1 and Q2 integrated into the attention heads for fine-grained uncertainty.
- All Level 1 components
- Q1/Q2 in each attention head
- Layer-wise uncertainty propagation
- Uncertainty-aware attention weights
Level 3: Full Fractal Q1 + Q2
Future: Meta-uncertainty and fractal gates at every layer, with full uncertainty composition.
- All Level 2 components
- Meta-uncertainty (uncertainty about uncertainty)
- Fractal gates at every layer
- Hierarchical uncertainty composition
The Four Base Forces
Level 1 introduces 4 epistemic forces that balance at the pyramid base for stability:
🧠 Memory
Confidence in learned patterns from training data.
High: Overconfidence in memorization
Low: Insufficient learning
⚠️ Pain
Signal of error and need for correction.
High: Model knows it's wrong
Low: Overconfident when wrong
🎯 Choice
Active decision-making and selection between alternatives.
High: Exploring many alternatives
Low: Rigid, single-path thinking
🔍 Exploration
Seeking new knowledge and venturing beyond training distribution.
High: High uncertainty, OOD detection
Low: Conservative, staying in-distribution
Balance Principle
When all 4 forces are balanced (~0.25 each), the pyramid is stable. Imbalance indicates specific epistemic issues that can be addressed through targeted training or retrieval.
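As a concrete sketch, the balance principle can be quantified as the deviation of the four normalized forces from 0.25 each. The L1 distance used below is an illustrative assumption, not AletheionGuard's actual measure:

```python
import numpy as np

def force_imbalance(forces):
    """Total deviation of the four base forces (memory, pain, choice,
    exploration) from the balanced point of 0.25 each.
    Assumes forces are non-negative; normalizes them to sum to 1."""
    f = np.asarray(forces, dtype=float)
    f = f / f.sum()                       # normalize defensively
    return float(np.abs(f - 0.25).sum())  # L1 distance from balance

force_imbalance([0.25, 0.25, 0.25, 0.25])  # 0.0 -> stable pyramid
force_imbalance([0.7, 0.1, 0.1, 0.1])      # large -> memory-dominated
```

A memory-dominated profile like the second one would, per the table above, suggest overconfident memorization that targeted training or retrieval could correct.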
Epistemic Softmax
Level 1 uses adaptive temperature based on uncertainty to prevent overconfidence:
Traditional Softmax
p_i = exp(z_i / τ) / Σ_j exp(z_j / τ), with a fixed temperature τ.
Problem: Temperature is constant regardless of uncertainty.
Epistemic Softmax
The same formula, but τ increases with the model's estimated uncertainty.
Benefit: Higher uncertainty → Higher τ → Flatter distribution (less confident).
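A minimal sketch of this idea, assuming a linear temperature schedule τ = τ0 · (1 + α · u); the exact modulation AletheionGuard uses is not specified on this page:

```python
import numpy as np

def epistemic_softmax(logits, uncertainty, tau0=1.0, alpha=1.0):
    """Softmax with an uncertainty-modulated temperature.
    As uncertainty grows, tau grows and the distribution flattens.
    The linear schedule tau = tau0 * (1 + alpha * u) is an assumption."""
    tau = tau0 * (1.0 + alpha * uncertainty)
    z = np.asarray(logits, dtype=float) / tau
    z -= z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

certain = epistemic_softmax([2.0, 0.0, 0.0], uncertainty=0.0)
uncertain = epistemic_softmax([2.0, 0.0, 0.0], uncertainty=1.0)
# The top-class probability drops as uncertainty rises.
```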
Pyramidal VARO Loss
Level 1 training optimizes multiple objectives simultaneously:
Accuracy Terms
- Q1 MSE
- Q2 MSE
- Height MSE
Structural Terms
- Force balance
- Q2 gating
- Fractal constraint
Calibration Terms
- RCE loss
- ECE minimization
- Brier score
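Of the calibration terms above, ECE has a standard definition that can be sketched directly; the binning implementation below is illustrative and not taken from AletheionGuard's code:

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Standard ECE: bin predictions by confidence, then take the
    bin-weighted average gap between mean confidence and accuracy."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap
    return ece

# Well calibrated: 90% confidence, 90% accurate -> ECE ≈ 0.
expected_calibration_error([0.9] * 10, [1] * 9 + [0])
# Overconfident: 90% confidence, 50% accurate -> ECE ≈ 0.4.
expected_calibration_error([0.9] * 10, [1] * 5 + [0] * 5)
```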
Practical Benefits
1. Faster Convergence
Level 1 converges about 80% faster than Level 0, needing 20-40 hyperparameter trials instead of 100-200.
2. Better Calibration
Q2 MSE improves by 23.3%, ECE by 30-50% compared to Level 0.
3. Interpretable Structure
The pyramid provides a clear geometric interpretation of uncertainty.
4. Compositional Reasoning
Fractal structure enables hierarchical composition (Level 2+).
Code Example
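The original code sample is not reproduced here, so the sketch below reconstructs the pieces defined earlier on this page: the height formula and the zone thresholds (0.3 and 0.7). The function names are hypothetical, not AletheionGuard's actual API.

```python
import math

def pyramid_height(q1, q2):
    """Height above the pyramid base: 1 - sqrt(Q1^2 + Q2^2), floored at 0."""
    return max(0.0, 1.0 - math.sqrt(q1 ** 2 + q2 ** 2))

def route_prediction(q1, q2):
    """Map a (Q1, Q2) estimate to an action using the zone boundaries
    from 'The Pyramidal Model' section (middle zone: 0.3 < H < 0.7)."""
    h = pyramid_height(q1, q2)
    if h >= 0.7:
        return "accept"        # near the apex: certain and correct
    if h > 0.3:
        return "human-review"  # middle zone: moderate uncertainty
    return "abstain"           # near the base: no reliable prediction

route_prediction(0.1, 0.1)  # "accept"       (H ≈ 0.859)
route_prediction(0.4, 0.4)  # "human-review" (H ≈ 0.434)
route_prediction(0.7, 0.7)  # "abstain"      (H ≈ 0.010)
```

The thresholds mirror the zone table at the top of this page; a deployment would tune them to its own risk tolerance.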
Next Steps
Explore the technical architecture documentation and research paper.