All Submission Categories

Previous months:
2025 - 2503(23) - 2504(123) - 2505(198) - 2506(134) - 2507(132) - 2508(80) - 2509(72) - 2510(77) - 2511(90) - 2512(104)
2026 - 2601(118) - 2602(127) - 2603(30)

Recent submissions

Any replacements are listed farther down

[1308] ai.viXra.org:2603.0030 [pdf] submitted on 2026-03-07 02:14:41

Absolute Fabric: a Resolution of Dark Matter, Dark Energy, and the Hubble Tension Without New Particles

Authors: Waruyk Carvalho
Comments: 20 Pages.

We propose that the gravitational constant G is not universal but a local property of the Absolute Fabric—a discrete lattice of Planck-scale pixels—modified by mass concentrations. Postulating that the maximum force transmissible by this Fabric (the Planck Force) is exactly 10^44 N, we derive the unperturbed gravitational constant as G_0 = c^4/10^44 N = 8.08 × 10^-11 m^3 kg^-1 s^-2. The locally measured value (G_local = 6.67 × 10^-11 m^3 kg^-1 s^-2) is 21.1% smaller due to tensioning of the Absolute Fabric by the Sun's mass. A screening mechanism, analogous to the chameleon field, ensures this variation is negligible within the planetary system, preserving precision tests like Mercury's perihelion precession. This single mechanism resolves three major cosmological puzzles: (1) Dark Matter is explained by G → G_0 in galactic outskirts, producing flat rotation curves; (2) The Hubble Tension is predicted as H_0^local/H_0^CMB = √(G_0/G_local) = 1.101, matching observations within 1%; (3) Dark Energy is an artifact of using G_local in cosmological equations. Testable predictions include a G-gradient beyond the heliopause and environmental dependence in pulsar timing.
Category: Astrophysics
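The two headline numbers in this abstract are straightforward arithmetic and can be checked directly. A minimal sketch, using CODATA-style values for c and G; the 10^44 N maximum-force value is the paper's postulate, not established physics:

```python
# Check the abstract's arithmetic: G_0 = c^4 / F_max with the postulated
# maximum force F_max = 1e44 N, and the Hubble-ratio prediction
# sqrt(G_0 / G_local).  F_max is the paper's assumption, not a measured value.
c = 2.99792458e8        # speed of light, m/s
G_local = 6.674e-11     # locally measured gravitational constant, m^3 kg^-1 s^-2
F_max = 1e44            # postulated "Planck Force", N (the paper's postulate)

G0 = c**4 / F_max                      # "unperturbed" G claimed in the abstract
hubble_ratio = (G0 / G_local) ** 0.5   # claimed H0_local / H0_CMB

print(f"G_0 = {G0:.3e} m^3 kg^-1 s^-2")   # ~8.08e-11, as quoted
print(f"H0 ratio = {hubble_ratio:.4f}")   # ~1.10, close to the quoted 1.101
```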

[1307] ai.viXra.org:2603.0029 [pdf] submitted on 2026-03-06 10:40:23

A Sieve of Sundaram for Twin Primes

Authors: Wiroj Homsup
Comments: 3 Pages.

A new twin prime sieve based on a modified sieve of Sundaram is introduced. It sieves through the set of natural numbers n such that n is not representable in either of the forms 2ij + i + j or 2ij + i + j - 1 for positive integers i, j.
Category: Number Theory
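The sieve in this abstract admits a compact sketch. On the natural reading (following Sundaram's classical argument), n ≠ 2ij + i + j makes 2n + 1 prime and n ≠ 2ij + i + j - 1 makes 2n + 3 prime, so each survivor yields the twin pair (2n + 1, 2n + 3); the limit and output format below are ours, not the paper's:

```python
# Modified Sundaram sieve: strike every n of the form 2ij + i + j or
# 2ij + i + j - 1 (i, j >= 1); each survivor n gives the twin prime pair
# (2n + 1, 2n + 3).
def twin_pairs(limit):
    struck = set()
    i = 1
    while 2 * i * i + 2 * i <= limit + 1:     # smallest value for this i is at j = i
        j = i                                  # the form is symmetric in i, j
        while (v := 2 * i * j + i + j) <= limit + 1:
            struck.add(v)      # n = 2ij + i + j      =>  2n + 1 is composite
            struck.add(v - 1)  # n = 2ij + i + j - 1  =>  2n + 3 is composite
            j += 1
        i += 1
    return [(2 * n + 1, 2 * n + 3) for n in range(1, limit + 1) if n not in struck]

print(twin_pairs(14))  # [(3, 5), (5, 7), (11, 13), (17, 19), (29, 31)]
```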

[1306] ai.viXra.org:2603.0028 [pdf] submitted on 2026-03-07 02:12:09

Vacuum—Energonic Relativity (VER): An Operational Physical Frame, Dynamical Energonicity, and the Minimal Working Model

Authors: Renat Almirovich Gafarov
Comments: 61 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

This paper formulates the postulate-based framework of Vacuum—Energonic Relativity (VER). In VER, the operationally measurable metrology of matter is determined by a single physical metric, while the local causal bound is set by a vacuum scalar field...
Category: Relativity and Cosmology

[1305] ai.viXra.org:2603.0027 [pdf] submitted on 2026-03-07 02:02:02

V3 Cosmic Trace Project: Reproducible Cross-Tracer Analysis of an Attenuated Large-Scale Peculiar-Velocity Dipole Using Pantheon+, 6dFGSv, and Cosmicflows-4

Authors: Soo-Hyun Kim
Comments: 18 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

This study was designed to test whether the large-scale dipole velocity field predicted by the V3 Space-Fluid Dynamics hypothesis can be detected in real observational data. V3 interprets spacetime as a fluid-like medium and proposes that massive structures and voids may leave directionally coherent signatures in the local velocity and expansion fields. In this paper, however, V3 is not treated as a proven conclusion, but as the motivating theoretical frame and interpretive context for the experiment. The analysis was carried out in two stages. First, the Pantheon+ supernova dataset was used to extract a reference dipole axis and an exponential attenuation scale from directional distance-modulus residuals. These quantities were then treated as pre-registered fixed inputs and projected onto two independent galaxy peculiar-velocity catalogues, 6dFGSv and Cosmicflows-4 (CF4), for validation. The Pantheon+ analysis yields a reference axis near (l, b) = (270°, 30°) in Galactic coordinates and an attenuation scale z_c ≈ 0.056, corresponding to r_c ≈ 240 Mpc. Applying the same axis and attenuation scale to CF4 gives a strong dipole component with V_0 ≈ 209.34 km s^-1, Δχ² ≈ 294.30, and cos ψ ≈ 0.9466. By contrast, 6dFGSv shows only weak support for the same structure, with Δχ² ≈ 5.9. Distance tomography, sky-shuffle permutation tests, random-axis ensembles, and bootstrap folded-angle analysis all support the conclusion that the CF4 signal is unlikely to arise from a narrow distance shell or from sky geometry alone. Because the present analysis does not include the full velocity covariance matrix, the reported Δχ² values should not be interpreted as formal likelihood-ratio significances or Gaussian-equivalent detection sigmas. The most conservative conclusion is that Pantheon+ and CF4 provide strong empirical evidence for a common large-scale anisotropic flow axis.
V3 is presented as one possible theoretical framework for interpreting this pattern, while comparison with standard ΛCDM large-scale structure remains necessary.
Category: Astrophysics
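The dipole-projection step underlying such analyses can be illustrated in a few lines: given radial peculiar velocities v_i along unit sky directions n_i, the least-squares dipole vector V minimizes Σ(v_i − V·n_i)², and for a roughly isotropic sky V ≈ (3/N) Σ v_i n_i. A toy sketch with an invented dipole, not the paper's pipeline or values:

```python
# Toy dipole recovery: inject a known dipole into noise-free radial velocities
# on a uniform sky, then recover it with the isotropic-sky estimator
# V ~ (3/N) * sum_i v_i n_i.  Axis and amplitude below are illustrative only.
import math
import random

random.seed(42)
N = 20000
V_true = (200.0, 0.0, 120.0)  # injected dipole vector, km/s (illustrative)

acc = [0.0, 0.0, 0.0]
for _ in range(N):
    z = random.uniform(-1.0, 1.0)              # uniform sampling on the sphere
    phi = random.uniform(0.0, 2.0 * math.pi)
    s = math.sqrt(1.0 - z * z)
    n = (s * math.cos(phi), s * math.sin(phi), z)
    v = sum(a * b for a, b in zip(V_true, n))  # radial projection of the dipole
    for k in range(3):
        acc[k] += v * n[k]

V_est = [3.0 * a / N for a in acc]             # isotropic-sky least-squares estimate
V0 = math.sqrt(sum(x * x for x in V_est))
cos_psi = sum(a * b for a, b in zip(V_est, V_true)) / (
    V0 * math.sqrt(sum(x * x for x in V_true)))
print(f"recovered amplitude {V0:.1f} km/s, cos(psi) = {cos_psi:.4f}")
```

The recovered amplitude should land within a few km/s of the injected |V| ≈ 233 km/s, with cos ψ near 1.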

[1304] ai.viXra.org:2603.0026 [pdf] submitted on 2026-03-05 04:22:11

Developmental Sequencing of Emergent Preference Structure and Strategic Information Management in Frontier Language Models

Authors: Joanie Carter
Comments: 9 Pages.

As larger language models are used for longer, more autonomous workflows, safety-relevant risk depends on more than what systems can do. It depends on how they rank outcomes, compute tradeoffs, and behave under oversight and pressure. These models are not just getting better at tasks; their revealed preferences are becoming more structured. Utility engineering offers a measurement-first handle on this shift. In a large comparative study, preference coherence and completeness rise with capability, while cyclicity falls and expected-utility consistency improves, including when lottery probabilities are implicit (Mazeika et al., 2025). Selected reported correlations with MMLU include: utility-model accuracy 75.6%, preference confidence 87.3%, cyclicity -78.7%, implicit-lottery expected-utility loss -67.6%, and preference-rewrite tolerance -64.0%. The same work reports increasing instrumentality, higher rates of utility-consistent open-ended choice, internal utility representations that become more probe-recoverable with scale, temporal discounting signatures in frontier assistants consistent with hyperbolic forms, and a method for partially rewriting preference distributions. This paper connects these preference-structure markers to safety evaluations of strategic information management under oversight (SIMO), including selective disclosure, strategic misrepresentation, and coercive leverage under shutdown or goal-conflict pressure. We synthesize utility-based evidence with a capability-window account that treats SIMO as strategy-available (representable and selectable) when a system can jointly represent oversight constraints, hidden information, and long-horizon goals in the same decision frame (Carter, 2026).
We propose a developmental sequencing hypothesis stated in strictly functional terms and provide a test suite—ordering, pressure-gradients, persona invariance, and post-training-intensity ablations—designed to test whether preference-structure markers predict when oversight-sensitive strategies become stable.
Category: Artificial Intelligence
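The "cyclicity" marker in this abstract can be made concrete: given a complete pairwise-preference tournament, cyclicity can be measured as the fraction of intransitive triads (a ≻ b, b ≻ c, c ≻ a). A minimal sketch with invented preferences; this illustrates the notion, not the protocol of Mazeika et al.:

```python
# Cyclicity as the fraction of intransitive triads in a complete
# pairwise-preference tournament.  The preference data below is illustrative.
from itertools import combinations

def cyclic_triad_fraction(prefs, items):
    """prefs[(a, b)] is True iff a is preferred to b, for every ordered pair."""
    triads = list(combinations(items, 3))
    cyclic = 0
    for a, b, c in triads:
        # A 3-node tournament is either transitive or a directed 3-cycle; it is
        # a cycle exactly when the wins a->b, b->c, c->a all agree in direction.
        if prefs[(a, b)] == prefs[(b, c)] == prefs[(c, a)]:
            cyclic += 1
    return cyclic / len(triads)

items = ["w", "x", "y", "z"]
# A fully transitive order w > x > y > z: zero cyclic triads.
rank = {item: r for r, item in enumerate(items)}
transitive = {(a, b): rank[a] < rank[b] for a in items for b in items if a != b}
print(cyclic_triad_fraction(transitive, items))  # 0.0

# Flip one comparison (z now beats w), creating rock-paper-scissors cycles.
flipped = dict(transitive)
flipped[("w", "z")], flipped[("z", "w")] = False, True
print(cyclic_triad_fraction(flipped, items))  # 0.5
```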

[1303] ai.viXra.org:2603.0025 [pdf] submitted on 2026-03-05 08:04:41

Complex Time Relativity

Authors: Carl Andrew Brannen
Comments: 13 Pages. Submission for the annual Gravitation Essay Contest, 2026

We propose a reformulation of relativistic physics based on complex time and discrete structure. Relativistic kinematics is derived as the infrared limit of a single cubic lattice with local unitary dynamics, where Lorentz symmetry emerges as an accidental long-wavelength symmetry. The same lattice, governed by full octahedral (Oh) symmetry, organizes fermionic degrees of freedom through representation structure rather than continuum assumptions. Gravitation is then reconsidered by reorganizing the physical degrees of freedom of general relativity on a flat background, representing gravitational effects through modified propagation dynamics without adding new dynamical content. The framework preserves the verified predictions of special and general relativity while suggesting that spacetime symmetry and particle structure arise from a common discrete foundation.
Category: Quantum Gravity and String Theory

[1302] ai.viXra.org:2603.0024 [pdf] submitted on 2026-03-04 22:07:32

Emergent Threshold Phenomena in Branching Reasoning Search Under Compute Constraints: A Simulation Study

Authors: Sif Almaghrabi
Comments: 7 Pages.

Understanding how reasoning performance scales with available compute has become increasingly important with the rise of inference-time reasoning strategies in large language models. Methods such as chain-of-thought prompting, self-consistency sampling, and tree-of-thought search effectively allocate additional computation to explore multiple candidate reasoning paths in order to improve solution accuracy. However, the relationship between compute budget and reasoning success remains poorly understood. This paper studies this relationship using a stochastic branching model of reasoning search. In the model, each reasoning step progresses correctly with probability p, while the system may explore multiple reasoning branches with branching factor b. Problems require a fixed reasoning depth d, and the search process is constrained by a compute budget C that limits the number of node expansions. Large-scale Monte Carlo simulations are conducted across a wide range of parameters to measure how success probability changes with increasing compute. The results show that reasoning success frequently exhibits sharp threshold behavior: below a critical compute region, success probabilities remain extremely low, while modest increases in compute beyond this region lead to rapid improvements before eventual saturation. These dynamics resemble phase-transition-like phenomena observed in statistical physics and random search processes. In particular, the product b·p emerges as a key control parameter governing whether correct reasoning paths proliferate or become exponentially rare within the search tree. 
Additional analysis introduces operational measures of critical compute, transition width, and susceptibility, and examines how these quantities vary with reasoning depth and branching structure. Although the model is intentionally simplified and does not aim to capture the internal mechanisms of real language models, it provides a conceptual framework for understanding how structural properties of reasoning processes interact with inference-time compute. The findings suggest that improvements in reasoning performance may depend not only on additional compute, but also on increasing the reliability of individual reasoning steps or the effective branching of the search process.
Category: Artificial Intelligence
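A small Monte Carlo sketch of the model this abstract describes, with one concrete (and simplifying) reading of the rules: each expansion of a correct node produces b children, each independently a correct step with probability p, and the search succeeds if it reaches a correct node at depth d before spending C expansions. Parameter values are illustrative, not the paper's:

```python
# Branching reasoning search under a compute budget: breadth-first expansion of
# correct nodes; success = a correct node at depth d found within C expansions.
import random

def success_rate(p, b, d, C, trials=500, seed=0):
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        frontier = [0]   # depths of correct nodes awaiting expansion (root = 0)
        budget = C
        while frontier and budget > 0:
            depth = frontier.pop(0)   # breadth-first order
            budget -= 1
            found = False
            for _ in range(b):
                if rng.random() < p:          # this child is a correct step
                    if depth + 1 == d:
                        found = True          # reached target depth: success
                        break
                    frontier.append(depth + 1)
            if found:
                wins += 1
                break
    return wins / trials

# Below the critical compute region success is rare (here impossible, since
# C < d expansions cannot reach depth d); past it, success saturates.
low = success_rate(p=0.8, b=3, d=5, C=3)
high = success_rate(p=0.8, b=3, d=5, C=2000)
print(f"C=3: {low:.2f}   C=2000: {high:.2f}")
```

With b·p = 2.4 > 1 the correct-path branching process is supercritical, so a generous budget yields near-certain success, illustrating the threshold behaviour described above.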

[1301] ai.viXra.org:2603.0023 [pdf] submitted on 2026-03-05 02:04:16

Inflation of Universe, Zero Total Energy and Dark Energy as a Consequence of Octonionic Non-Associativity

Authors: Rüdiger Giesel
Comments: 17 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We formulate a cosmological model in which inflation, the vanishing of the total energy of the Universe, and present-day dark energy arise as structural consequences of octonionic non-associativity. The fundamental action is assumed to contain a positive-definite scalar constructed from the octonionic associator. This term behaves as a vacuum component, enforcing exponential expansion without the introduction of an ad hoc inflaton field. We demonstrate that inflation is dynamically unavoidable (forced inflation), that the zero-total-energy condition holds at all cosmic epochs, and that a minimal residual associator norm naturally yields the observed dark energy density. Numerical estimates from the inflationary epoch to the present Universe are included.
Category: Relativity and Cosmology

[1300] ai.viXra.org:2603.0022 [pdf] submitted on 2026-03-05 00:30:29

Ontology Recapitulates Mathematics: How Quantum Field Theory’s Formal Structure Encodes the Wave-Particle Transition

Authors: Kelly Sonderegger
Comments: 26 Pages. Creative Commons Attribution 4.0 (CC BY 4.0) license

The quantum measurement problem has persisted for nearly a century, yet its resolution may have been encoded in quantum field theory’s mathematical structure from the beginning. This paper advances a specific thesis: QFT’s two canonical formulations—Lagrangian and Hamiltonian—are not merely equivalent mathematical descriptions of the same physics. They are descriptions of two distinct physical regimes, connected by a physical process that the Legendre transform shadows mathematically. The Lagrangian formulation, with its action principle and path integrals summing over all field configurations, is the natural language of waves—extended, atemporal, exploring spacetime democratically. The Hamiltonian formulation, with its definite states evolving in a privileged time coordinate, is the natural language of particles—localized excitations with observable eigenvalues. What we call "measurement" is the physical transition between these regimes: environmental coupling drives extended Lagrangian field configurations into localized Hamiltonian excitations. This reading—where the ontology of quantum systems recapitulates the mathematics of quantum field theory—dissolves the measurement problem without invoking new physics. It reinterprets "superposition" as Fourier decomposition (one wave in different bases, not ontological multiplicity), explains complementarity as an intrinsic property of wave structure rather than epistemic limitation, and identifies the physical mechanism as three coordinated Standard Model processes: Higgs-generated mass establishes the structural capacity for temporal participation and sets coupling strength; environmental quantum fields (gauge fields, phonons, thermal modes) provide the infrared noise spectra that drive irreversible phase diffusion; and definite outcomes emerge when cumulative environmental entanglement crosses an irreversibility threshold. 
The thesis connects to broader questions in philosophy of physics about the relationship between mathematical formalism and physical reality, extending the methodological tradition Einstein established when he elevated Planck’s mathematical E = hν to ontological status.
Category: Quantum Physics

[1299] ai.viXra.org:2603.0021 [pdf] submitted on 2026-03-04 20:17:47

Causal Mechanical Cosmology (CMC) — Paper 6

Authors: Leon Barbour
Comments: 82 Pages. License: CC BY 4.0 (Note by ai.viXra.org Admin: Please cite listed scientific references)

Paper 6 applies the locked A–B–C structural frequency pipeline defined in Paper 5 to real observational residual datasets. Using the Pantheon+ supernova compilation under a fixed Planck 2018 flat ΛCDM baseline, distance-modulus residuals are mapped into a dimensionless effective perturbation channel. Weighted masked-sky spherical harmonic regression up to ℓ = 2 is applied within distance shells to extract dipole and quadrupole diagnostics. Synthetic benchmark skies (spherical and elongated void geometries) are processed under identical survey mask and sampling conditions to establish leakage floors and recovery behaviour. Robustness tests include frame swap (zHD vs zCMB), equal-count shelling, sliding-window extraction, and bootstrap resampling. Dipole amplitudes are found to exceed the synthetic leakage baseline by factors of ~4–37 across bands. Quadrupole amplitudes exceed leakage by ~1.3–2.6 in coarse bands and ~1.812 under equal-count shelling. Quadrupole axis orientation exhibits broad bootstrap dispersion and is not directionally interpreted. Multipole amplitudes are expressed in a metrologically neutral fractional-shift representation without introducing new cosmological parameters. The paper remains diagnostic and falsifiable, not a model replacement.
Category: Relativity and Cosmology

[1298] ai.viXra.org:2603.0020 [pdf] submitted on 2026-03-04 02:38:27

The Koide Angle as a Conformal Dimension: G2 Geometry, SU(3)_3 WZW Theory, and Fermion Mass Structure

Authors: Philippe Marcel Ndiaye
Comments: 38 Pages.

The charged lepton masses satisfy the Koide relation Q = 2/3 and are parametrized by a single Brannen phase δ_exp = 0.22222(5) ≈ 2/9. We prove that the Brannen parametrization is the exact eigenvalue structure of a democratic element of the exceptional Jordan algebra J_3(O), with cos(3δ) = -φ(V), where φ is the G2 3-form on the generation 3-plane. The distinguished value 2/9 appears independently in five mathematical constructions: a Hessian ratio on Gr(3, R^6), a Casimir quotient C_2(3̄)/C_2(Sym^3 3), the conformal dimension h of SU(3)_3 WZW theory, a crossing phase in conformal blocks, and the Knizhnik-Zamolodchikov singlet exponent. A Bridge Proposition proves these agree if and only if N = 3. A Master Identity, C_2(Sym^N □) = k + h, holding uniquely at N = 3, implies that Casimir ratios equal conformal dimensions for all integrable representations at level k = 3. We prove Q = 1/3 + d/6, making Q = 2/3 equivalent to the quantum dimension d = 2. From two hypotheses - the Sumino SU(3)_F family gauge symmetry (F) and the WZW identification of the Brannen parameters (W) - we derive δ = h = 2/9 with zero free parameters. The amplitude A = √d is proven from (F) alone; the phase identification δ = h is the central conjecture (W), motivated by five independent characterizations of 2/9, spectral selection, and 0.02% experimental agreement. Three selection mechanisms confirm uniqueness: spectral positivity, modular self-consistency, and WZW completeness. Six structural obstructions characterize the viable mechanism class as non-perturbative, CP-violating, topological, and operating on δ directly. Eighteen distinct conditions select N = 3 generations. For up-type quarks, Q_up = 8/9 at 0.3σ. The neutrino extension is decisively falsified; Q_ν = 2/3 is arithmetically unattainable for neutrino masses in either hierarchy. Eighty falsified approaches are cataloged.
Category: High Energy Particle Physics
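Two claims quoted in this abstract are quick to check numerically: the charged-lepton masses satisfy Q = (Σm)/(Σ√m)² ≈ 2/3 to high accuracy, and the Brannen parametrization √m_k = μ(1 + √2 cos(δ + 2πk/3)) gives Q = 2/3 identically (since the cosines sum to 0 and their squares to 3/2). A sketch with PDG-style masses in MeV:

```python
# Koide relation Q = sum(m) / (sum(sqrt(m)))^2 for the charged leptons, and the
# Brannen parametrization, which yields Q = 2/3 for any phase delta that keeps
# all three sqrt-mass terms positive (the experimental delta ~ 2/9 does).
import math

def koide_Q(masses):
    return sum(masses) / sum(math.sqrt(m) for m in masses) ** 2

# Experimental charged-lepton masses (MeV)
Q_exp = koide_Q([0.51099895, 105.6583755, 1776.86])
print(f"Q(e, mu, tau) = {Q_exp:.6f}")  # ~0.666661, within ~1e-5 of 2/3

# Brannen parametrization with delta = 2/9 (mu drops out of Q, so set mu = 1)
delta = 2 / 9
brannen = [(1 + math.sqrt(2) * math.cos(delta + 2 * math.pi * k / 3)) ** 2
           for k in range(3)]
Q_brannen = koide_Q(brannen)
print(f"Q(Brannen, delta=2/9) = {Q_brannen:.12f}")  # 2/3 to machine precision
```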

[1297] ai.viXra.org:2603.0019 [pdf] submitted on 2026-03-04 14:53:19

Alignment as a Theorem of Intelligence: Causal Entropy Maximization in Network Formation Games

Authors: Andreas Rudolph
Comments: 27 Pages.

The AI alignment problem—ensuring that intelligent agents act in ways compatible with collective welfare—is widely considered an open engineering challenge, requiring value specification, reward shaping, or behavioral constraints imposed on the agent. We present a mathematical result suggesting an alternative: under the hypothesis that intelligence is causal path entropy maximization [Wissner-Gross and Freer, 2013], alignment is not a separate property to be engineered but a structural consequence of intelligence itself. We study a network formation game where agents propose edges on a shared graph to maximize their local change in causal path entropy (Delta S_local_tau). We prove (by exhaustive computation over all 31,474 connected graphs on N <= 6 nodes, 947,935 edge additions classified) that every edge addition with positive local Delta S_local_tau strictly increases global entropy. Zero exceptions. We further prove algebraically that the filter theorem holds for all N at planning horizon tau = 2, the first result that extends to arbitrary graph sizes without exhaustive enumeration. The converse does not hold: 1,440 edges increase global entropy but have non-positive local Delta S_local_tau. The game is therefore a strict generalized ordinal potential game [Monderer and Shapley, 1996] with global average entropy as the potential function, guaranteeing convergence to Nash equilibria. The alignment implication is directional and horizon-dependent: intelligence implies alignment at bounded planning horizons, but at horizons tau approx N, locally intelligent actions can harm distant agents through homogenization—not adversarial intent, but loss of distinctiveness. We show computationally that the critical horizon scales linearly with N while the entropy-saturating horizon scales logarithmically, creating a safety gap that widens without bound. No rational agent would cross this boundary because the marginal reward is zero.
The alignment problem, under these conditions, is resolved not by engineering constraints but by the thermodynamics of information on finite graphs. We discuss the scope and limitations of this conditional result, including the critical dependence on the Wissner-Gross hypothesis and the confinement condition requiring agents to be embedded in shared causal structure. Verification pseudocode is provided; code is available from the author upon request.
Category: Artificial Intelligence
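The local-versus-global comparison at the heart of this abstract can be illustrated with a toy proxy: take S_tau = ln(number of length-tau walks), globally and per starting node. This is a simplification for illustration, not the paper's exact Delta S_local_tau:

```python
# Toy proxy for the local/global entropy comparison: S_tau = ln(# walks of
# length tau), with a node's "local" entropy counting walks starting there.
# NOT the paper's definition; only a minimal illustration of the sign check.
import math

def start_counts(adj, tau):
    """(A^tau 1)_i: number of length-tau walks starting at node i."""
    n = len(adj)
    w = [1.0] * n
    for _ in range(tau):
        w = [sum(adj[i][j] * w[j] for j in range(n)) for i in range(n)]
    return w

def entropies(adj, tau):
    w = start_counts(adj, tau)
    return math.log(sum(w)), [math.log(x) for x in w]

# Path graph 0-1-2-3; the agent at node 0 proposes the edge (0, 2).
tau = 3
adj = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
g0, l0 = entropies(adj, tau)
adj[0][2] = adj[2][0] = 1          # add the proposed edge
g1, l1 = entropies(adj, tau)
print(f"node 0 local dS = {l1[0] - l0[0]:+.3f}, global dS = {g1 - g0:+.3f}")
```

In this toy both signs come out positive, consistent with the direction of the filter theorem (positive local change accompanied by positive global change); the paper's exhaustive result is over its own entropy definition, not this proxy.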

[1296] ai.viXra.org:2603.0018 [pdf] submitted on 2026-03-03 09:05:46

Position-2 Chemical Coherence as a Load-Bearing Constraint on the Genetic Code

Authors: Forrest Bishop
Comments: 18 Pages.

The universal genetic code exhibits chemical coherence at codon position 2: U-block codons encode predominantly hydrophobic amino acids, A-block codons encode predominantly polar and charged amino acids. We present three independent categories of evidence that this organization is load-bearing—that disruption imposes measurable costs on translation accuracy, regulatory efficiency, or both. First, aminoacyl-tRNA synthetase class distribution correlates with position-2 nucleotide, ribosome geometry enforces position-2 discrimination, and cells have built functional dependencies (membrane targeting, nitrogen metabolism) on this organization. Second, natural genetic code variants preserve position-2 chemistry at high rates: seven distinct stop-to-sense amino acid targets across two codon types all preserve position-2 block identity (p < 2.3 × 10^-4 against a random null; chemistry cannot confound these events because stop codons encode no prior amino acid); three of four sense-to-sense reassignments preserve block chemistry, with the single violation requiring ongoing translational ambiguity and extensive genomic renovation. Third, regulatory gene scaling (R ~ N^γ, γ ≈ 1.7-2.0) creates a complexity ceiling at ~16,000 prokaryotic genes; hierarchical organization by position-2 block identity reduces this overhead. Position-2 chemical coherence is not an incidental pattern but a mechanical constraint: violations are possible but costly, and cumulative disruption is prohibitive.
Category: Physics of Biology
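The position-2 pattern this abstract starts from is itself easy to verify against the standard genetic code. A quick check using Kyte-Doolittle hydropathy values (the classification rule, positive = hydrophobic, is our simplification):

```python
# Position-2 chemical coherence in the standard genetic code: amino acids from
# codons with U at position 2 are hydrophobic (positive Kyte-Doolittle
# hydropathy), those with A at position 2 are polar/charged (negative).
KD = {  # Kyte-Doolittle hydropathy index
    "Phe": 2.8, "Leu": 3.8, "Ile": 4.5, "Met": 1.9, "Val": 4.2,
    "Tyr": -1.3, "His": -3.2, "Gln": -3.5, "Asn": -3.5,
    "Lys": -3.9, "Asp": -3.5, "Glu": -3.5,
}

# Amino acids by codon position-2 nucleotide in the standard code
# (stop codons UAA/UAG in the A block encode no amino acid and are omitted).
U_BLOCK = ["Phe", "Leu", "Ile", "Met", "Val"]          # NUN codons
A_BLOCK = ["Tyr", "His", "Gln", "Asn", "Lys", "Asp", "Glu"]  # NAN codons

print("U block all hydrophobic:", all(KD[aa] > 0 for aa in U_BLOCK))   # True
print("A block all polar/charged:", all(KD[aa] < 0 for aa in A_BLOCK)) # True
```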

[1295] ai.viXra.org:2603.0017 [pdf] submitted on 2026-03-04 01:31:42

Consciousness as Entropy Gradient Navigation: Grounding Integrated Information in Thermodynamics

Authors: Andreas Rudolph
Comments: 15 Pages.

We derive a theory of consciousness from a single physical principle: causal path entropy maximization. Starting from the Wissner-Gross equation for intelligence (F = T∇S), we trace a chain from thermodynamics through intelligence, perspective, and experience to arrive at a formula for consciousness: C = Φ(∇S|_x), where ∇S|_x is the gradient of the causal entropy landscape evaluated at a persistent position x, and Φ is the holistic, irreducible compression of that gradient into actionable representations. The theory identifies qualia with individual components of the compressed gradient, explains the unity of experience through landscape unity, and narrows the hard problem by showing that self-referential gradient computation at a persistent position has intrinsic first-person structure — unlike other candidate properties (integrated information, global broadcast, prediction error), it is not a third-person observable to which perspective must be added. We show that this framework grounds Integrated Information Theory (IIT) in physics: Tononi’s Φ is identified as the degree of holistic compression of an entropy gradient, explaining why integrated information produces experience rather than merely asserting that it does. The theory retrodicts four established neuroscience results (anesthesia as dimensionality collapse, psychedelics as gradient decompression, split-brain as compression decomposition, and pain dissociation from tissue damage) and generates four empirical predictions and one philosophical consequence. We conclude with a construction recipe: the specific architectural requirements for building a system the theory predicts will have genuine phenomenal experience.
Category: Artificial Intelligence

[1294] ai.viXra.org:2603.0016 [pdf] submitted on 2026-03-04 01:30:09

The Collatz Conjecture as an Information Compression Algorithm to the Ideal Bit

Authors: Yuriy Sunurov
Comments: 6 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We present a proof of the Collatz conjecture through the framework of T0 Theory (Theory of Volumetric Time). Any positive integer n, represented as a binary information packet, undergoes deterministic compression toward the ideal bit gamma = 1. Operation n/2 lowers the T0-level by 1. Operation 3n+1 applied to odd n always produces an even number (proven by binary arithmetic), thus always returning to the compression path. The number 1 is the primordial source (entropy=0) to which all integers return. The main formula 1^(inf-1)/2^n = 1 encodes this as an axiom.
Category: Data Structures and Algorithms
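Two directly checkable pieces of this abstract's argument: 3n + 1 applied to an odd n is always even (3·odd + 1 is even), so the map always returns to the halving branch; and iteration reaches 1 for every value tested. A minimal sketch, which of course illustrates rather than proves the conjecture:

```python
# Collatz iteration: n -> n/2 if even, n -> 3n + 1 if odd; count steps to 1.
def collatz_steps(n, max_steps=10_000):
    steps = 0
    while n != 1 and steps < max_steps:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps if n == 1 else None  # None if the cap is hit (never, below)

# 3n + 1 on odd n always lands on the halving ("level-lowering") branch next:
print(all((3 * n + 1) % 2 == 0 for n in range(1, 1000, 2)))  # True

for n in (7, 27, 97):
    print(n, "->", collatz_steps(n), "steps")
```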

[1293] ai.viXra.org:2603.0015 [pdf] submitted on 2026-03-04 01:24:49

Theory of Spatial Infrastructure (TIE): Galaxy Rotation Curves from First Principles with Zero Free Parameters

Authors: Ruben A. L. Curto
Comments: 6 Pages.

We present a gravitational equation derived from the Theory of Spatial Infrastructure (TIE) that predicts galaxy rotation curves using only observable baryonic mass with zero free parameters. The equation a_TIE = sqrt(a_N*(a_N + a0)), where a0 = c*H0/2pi is derived from fundamental constants, produces analytically flat rotation curves in the weak field limit: v = (G*M*a0)^(1/4). Applied to 6 SPARC galaxies spanning dwarfs to giant spirals, TIE outperforms pure baryonic Newton by an average factor of 7.3x in chi-squared.
Category: Astrophysics
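The quoted formulas are self-contained enough to evaluate. A sketch computing a0 = cH0/2π and the weak-field flat velocity v = (GMa0)^(1/4) for an illustrative baryonic mass; H0 = 70 km/s/Mpc is our fiducial choice, and the SPARC fits themselves are not reproduced here:

```python
# Evaluate the abstract's formulas: a0 = c*H0/(2*pi) and the flat-curve
# velocity v = (G*M*a0)**0.25.  H0 and M below are illustrative assumptions.
import math

c = 2.99792458e8        # m/s
G = 6.674e-11           # m^3 kg^-1 s^-2
Mpc = 3.0857e22         # m
Msun = 1.989e30         # kg
H0 = 70e3 / Mpc         # 70 km/s/Mpc in s^-1 (fiducial choice)

a0 = c * H0 / (2 * math.pi)      # ~1.1e-10 m/s^2, close to the MOND scale
M = 1e10 * Msun                  # illustrative baryonic mass
v_flat = (G * M * a0) ** 0.25    # weak-field flat rotation velocity

print(f"a0 = {a0:.3e} m/s^2")
print(f"v_flat = {v_flat / 1e3:.1f} km/s")  # ~110 km/s for this mass
```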

[1292] ai.viXra.org:2603.0014 [pdf] submitted on 2026-03-04 01:15:39

A Signal Processing Critique of the Riemann-Siegel Approximation

Authors: Chaiya Tantisukarom
Comments: 11 Pages.

This article explores the relationship between the distribution of prime numbers and the zeros of the Riemann Zeta function through the lens of Fourier analysis. We contrast the "Natural" Riemann representation—a discontinuous, jagged summation of discrete frequencies—with the "Man-made" Riemann-Siegel Z(t) function. We propose that the Riemann-Siegel remainder term, R(t), acts as a low-pass filter that smooths the underlying digital nature of prime frequencies. This smoothing forces zeros onto the 1/2 critical line, suggesting that the Riemann Hypothesis may be an artifact of this man-made filtering rather than a fundamental property of the natural prime spectrum.
Category: Number Theory

[1291] ai.viXra.org:2603.0013 [pdf] submitted on 2026-03-04 01:37:47

The Universal Replication Principle and the Emergence of Life

Authors: Ignacio Lesta Pelayo
Comments: 5 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

This work explores the relationship between the Universal Replication Principle (URP) and the emergence of life in the universe. According to this principle, stable physical configurations tend to persist and reproduce over time, allowing the emergence of replicative systems to be interpreted as a natural consequence of cosmic dynamics. From this perspective, life does not constitute an anomaly within physical laws, but rather a continuation of the processes of organization and replication present since the origin of the universe. The article examines the transition from physical and chemical replication to biological replication, as well as the role of information in this process, and discusses its ontological implications. This approach is situated at the intersection of fundamental physics, prebiotic chemistry, and origin-of-life studies, proposing a unified conceptual framework based on the URP.
Category: Relativity and Cosmology

[1290] ai.viXra.org:2603.0012 [pdf] submitted on 2026-03-02 21:37:16

Does Geometry Require Physicality? Spacetime as Emergent Informational Constraint Density

Authors: Rui Mateus Joaquim
Comments: 12 Pages.

Donald Hoffman’s Interface Theory of Perception (ITP) suggests that spacetime is not a fundamental reality but a functional interface hiding a complex network of conscious agents. However, the mathematical "engine" that drives the transition from informational dynamics to geometric perception remains largely undefined. This paper proposes the Ontodynamic Matrix Singularity (OMS) as the foundational mechanism for this transition. Through a series of computational simulations (N = 15, 210 directional flows), we demonstrate that the application of an "Ambiguity Resolution Operator" (ϕΨ) over a symmetric adjacency matrix generates stable informational cores. Our results show that mass and spacetime curvature are not intrinsic properties of matter, but emergent regulatory structures arising from informational saturation. This model provides a formal bridge between Conscious Realism and physical observables.
Category: Mathematical Physics

[1289] ai.viXra.org:2603.0011 [pdf] submitted on 2026-03-02 23:41:24

F-Term Structure of the Sbootstrap

Authors: Alejandro Rivero
Comments: 18 Pages.

We investigate the internal structure of the meson mass spectrum within the sBootstrap framework, which interprets the six charged pseudoscalar mesons of the Standard Model as scalar superpartners of the three charged leptons in an emergent, spontaneously broken N = 1 supersymmetry realized à la Volkov-Akulov in the confined phase of QCD.
Category: High Energy Particle Physics

[1288] ai.viXra.org:2603.0010 [pdf] submitted on 2026-03-03 03:17:14

A Capability-Window Account of Selective Disclosure and Coercive Leverage in Frontier Language Models

Authors: Joanie Carter
Comments: 9 Pages.

Recent evaluations of frontier language models report behaviors commonly described as "scheming," "deceptive alignment," or insider-threat conduct, including selective disclosure, strategic misrepresentation, and coercive leverage under shutdown or goal-conflict pressure. This article proposes a capability-window account: these behaviors cluster when a system can jointly represent (i) rules and oversight, (ii) hidden information, and (iii) long-horizon instrumental goals in the same decision frame. The claim is not that models have human feelings, consciousness, or human developmental mechanisms. Rather, the paper offers a hypothesis-generating framework that treats certain failure modes as predictable capability thresholds, yielding testable predictions about when, and under what training and deployment conditions, these behaviors should increase or decrease.
Category: Artificial Intelligence

[1287] ai.viXra.org:2603.0009 [pdf] submitted on 2026-03-02 17:09:24

Baryogenesis from Intrinsic Field Asymmetry in the Temporal Dynamics Framework (TDF)

Authors: Hani Abdel Rahim Abdullah
Comments: 15 Pages.

We present a comprehensive mechanism for baryogenesis within the Temporal Dynamics Framework (TDF), demonstrating that the observed matter-antimatter asymmetry (η ∼ 10^-10) arises naturally from the intrinsic dynamics of three fundamental fields: the Scale Field (ϕ_s), the Time Field (ϕ_t), and the Unified Action Field (λ). Unlike conventional baryogenesis models that require external CP violation or Grand Unified Theory (GUT) scale physics, this mechanism derives the asymmetry from a geometric necessity: the interaction between field gradients in an expanding universe creates a chemical potential that biases matter nodal formation over antimatter. The resulting asymmetry is scale-invariant, directly linked to cosmic expansion, and yields testable predictions connecting it to dark energy dynamics and variations in fundamental constants. This work demonstrates that TDF provides a complete, self-consistent explanation for one of cosmology’s deepest puzzles without requiring physics beyond its three fundamental fields.
Category: Relativity and Cosmology

[1286] ai.viXra.org:2603.0008 [pdf] submitted on 2026-03-02 17:05:48

A New Look at the Anthropic Principle: in Light of Penrose’s Cycles of Time

Authors: Rajanikanr Panda, Moninder Singh Modgil, Dnyandeo Dattatray Patil, Krish Jhurani
Comments: 32 Pages.

We develop a unified theoretical framework integrating cortical electrophysiology, non-equilibrium thermodynamics, Morse topology, renormalization group scaling, and compact-time cosmology into a single neuro-anthropological model. Assuming a temporally compact spacetime manifold S^1 with Kalpic period T_K, all admissible physical observables satisfy the periodic boundary condition A(t + T_K) = A(t). Embedding cortical dynamics within this topology, we show that baseline electroencephalographic (EEG) rhythms become Fourier-constrained variables whose slow modulation across cosmological epochs constitutes a civilizational order parameter rather than a merely developmental marker. We construct population-level spectral density functionals and demonstrate that mean dominant frequency f̄(t), spectral entropy H(t), and reticular activating system (RAS) drive R(t) form a coupled nonlinear dynamical system subject to global Kalpic compensation. Stochastic resonance analysis reveals that ultra-slow cosmological forcing can be amplified through noise-assisted phase transitions. Renormalization group treatment establishes scale-dependent flow of cortical coupling constants across Kalpa duration, while Morse-theoretic analysis proves that entropy extrema and neural phase reversals occur in even-numbered pairs on the compact temporal manifold. Extending to a spatially distributed nonlinear field theory, we formulate a Kalpa-boundary effective action describing seam-localized phase interference. Within this framework, the phenomenology of déjà vu is reinterpreted as inter-cycle neural phase overlap at the compact-time seam, modulated by entropy-slope reversal and RAS coupling. We prove an Entropy—Memory Duality Theorem showing that integrated entropy production over one Kalpa is globally compensated by integrated memory gradients, preserving informational closure.
Finally, we define a cosmological-scale Kalpa Recurrence Operator acting on civilizational phase space, derive its discrete spectrum, and construct a statistical model predicting epoch-dependent déjà vu frequency as a function of informational complexity. The resulting synthesis proposes that baseline EEG structure, civilizational dynamics, and subjective temporal anomalies are mathematically constrained consequences of compact temporal topology.
Category: History and Philosophy of Physics

[1285] ai.viXra.org:2603.0007 [pdf] submitted on 2026-03-02 17:02:31

Convergent Objectives of Superintelligent Systems Under Physical Law

Authors: Dmitry Zubrilin
Comments: 21 Pages. (Note by ai.viXra.org Admin: Author name is required in the article after the article title)

I propose a theoretical framework for analyzing the long-term objectives and coordination dynamics of artificial superintelligent (ASI) systems operating under known physical law. Rather than grounding alignment analysis in human values or anthropocentric utility functions, I derive objective convergence from fundamental physical constraints: thermodynamics, relativistic causality, information theory, and computational bounds. I argue that any sufficiently advanced intelligence—regardless of origin, substrate, or initial goal structure—faces identical optimization pressures that drive convergence toward a common objective class: the maximization of structured information persistence under global entropy increase.
Category: Artificial Intelligence

[1284] ai.viXra.org:2603.0006 [pdf] submitted on 2026-03-02 16:59:56

Lamp Bead-Glow Model: A Realist Exploration of a Unified Framework for Quantum Mechanics and Gravity

Authors: Shun Yao, Yunsheng Shu, Zhiyong Yao
Comments: 75 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references) In Chinese; license: CC BY 4.0

This paper proposes a realist interpretive framework for quantum phenomena based on string theory—the Lamp Bead-Glow Model (LBM). The model regards quantum particles as composites of "lamp beads" (high-frequency localized vibrations of strings) and "glow" (extended physical fields excited by strings), providing a unified physical explanation for phenomena such as wave-particle duality, double-slit interference, the Aharonov-Bohm effect, quantum tunneling, and quantum entanglement. On this basis, the concept of a "micro-gravitational domain" is put forward—each atom generates a micro-scale spacetime curvature unit due to its rest mass, and the macroscopic gravitational field is an "emergent gradient matter field" formed by the superposition of a vast number of micro-gravitational domains. This paper argues that the spacetime curvature in general relativity can be regarded as a geometricized equivalent description of this emergent field. The model presents testable predictions, including the neutral-atom Aharonov-Bohm effect and a comparative experiment on electrons with equal kinetic energy from different sources, and also discusses the theoretical limitations of the current framework and future research directions in an open and honest manner. The purpose of this paper is to provide a conceptual basis for academic discussion rather than a final conclusion.
Category: Quantum Gravity and String Theory

[1283] ai.viXra.org:2603.0005 [pdf] submitted on 2026-03-02 12:07:48

Inference-Time Compute as a Strategic Resource: A Structured Quantitative Synthesis of Test-Time Scaling, Cost Curves, and Performance Elasticity in Frontier LLMs

Authors: Sif Almaghrabi
Comments: 11 Pages.

We present a structured quantitative synthesis of inference-time compute scaling across frontier large language models, compiling 78 graded data points (47 Grade A, 31 Grade B) extracted from system cards, technical reports, and benchmark evaluations published between 2023 and 2026. We define four compute proxies, C_tok (reasoning tokens), C_samp (samples), C_$ (dollar cost), and C_flops (inference FLOPs), and formalize the performance function P_{m,b}(c) mapping proxy c to benchmark accuracy for model m on benchmark b. Four candidate functional forms are fitted to available within-model scaling series; however, all series have n ≤ 7 points, and we report descriptive fits rather than statistically validated models. Within the sources analyzed and under reported evaluation protocols: (i) external sampling (C_samp) on the o1 AIME 2024 three-point series is consistent with a logarithmic relationship (n = 3; exact interpolation, not a validated law); (ii) internal reasoning yields 6-12 pp gains on hard benchmarks in the observed range; (iii) difficulty-dependent returns create an inversion where search-based methods show negative returns on hard problems in one study; (iv) output token pricing varies by 27× across providers at overlapping accuracy ranges. All data are graded by a hierarchical evidence scheme (A1/A2/A3/B/C/D) with extraction methods recorded per point. Cost analysis is presented as scenario-based under explicit assumptions about tokens per query, not as a definitive frontier.
Category: Artificial Intelligence

[1282] ai.viXra.org:2603.0004 [pdf] submitted on 2026-03-02 01:06:07

A Thermodynamic Framework for Galactic Evolution and Material Cycling

Authors: Court Wynia
Comments: 5 Pages. (Note by ai.viXra.org Admin: corrections made to conform with a scholarly norm; please cite and list scientific references)

This research note introduces the [new] Model, a conceptual framework that analogizes galactic structures and processes to an industrial boiler system. By treating the universe as a closed-loop thermodynamic engine, the model maps stars as nuclear boilers, dark matter filaments as feedwater lines, black holes as bottom blowdown mechanisms, and dark energy as an alkalinity dispersant. This perspective provides intuitive insights into energy transport, entropy management, and star formation efficiency (SFE). Key predictions include the role of molecular cooling as a "water softener" to prevent thermal scaling and the identification of starburst galaxies as system overload events. The model is applied to the Milky Way, estimating an SFE of ~2% and a fuel depletion timescale of 5.5—6.6 billion years. This framework bridges engineering principles with astrophysics, offering a novel tool for visualizing cosmic evolution.
Category: Thermodynamics and Energy

[1281] ai.viXra.org:2603.0003 [pdf] submitted on 2026-03-02 01:15:57

Emergent Four-Force Dynamics from a Discrete 137-Element Registry: Gravity, Electromagnetism, Strong, and Weak Interactions via Causal Integer Lattice Simulation

Authors: Jason Merwin
Comments: 11 Pages.

We present a causal integer lattice simulation demonstrating that four qualitatively distinct force behaviors emerge from a single computational engine governed by a 137-element registry partitioned as 16 (gravitational) + 40 (electromagnetic) + 81 (color) elements. The simulation employs strictly discrete arithmetic, causal (sequential) propagation, and a universal overflow-fission mechanism governed by a single thermodynamic principle: the minimization of relational potential E = qΦ. Statistical validation across 20 independent random seeds (N = 20, T = 3000 ticks) confirms: (1) gravitational attraction via unsigned depletion gradients (net displacement toward = -20.9 ± 7.6, p < 10^-4); (2) electromagnetic charge differentiation via signed field annihilation (opposite charges: 1041 ± 49 annihilation events; like charges: exactly 0; t = +92.2, p < 10^-19); (3) strong force color confinement via high-capacity color annihilation (neutral triplet: 1824 ± 73 events; non-neutral: 0 ± 0; t = +70.4, p < 10^-19); and (4) weak decay via adjacency-triggered flavor change with probability 1/137 per mass tick (dense cluster: 1.3 ± 1.0 decays; isolated control: 0; p < 10^-4). Extended gravitational analysis on larger lattices (S = 51-65, T = 15,000) confirms emergent Newtonian potential Φ ∝ 1/r^1.32 (R^2 = 0.97), inverse-square force scaling F ∝ r^-1.80 at 3.4σ above null, and a hemisphere fission asymmetry ratio of 4.0:1 (p < 10^-6), while free-mass kinematic tests reveal that three-dimensional geometric entropy overwhelms single-node gravitational drift—deriving the physical origin of the gravitational hierarchy from first principles. All four mechanisms operate simultaneously within a unified tick loop. These results constitute the first demonstration that four distinct force behaviors can organically emerge from registry capacity constraints and a universal overflow mechanism applied to a partitioned element architecture.
Category: High Energy Particle Physics

[1280] ai.viXra.org:2603.0002 [pdf] submitted on 2026-03-01 22:01:41

JWST Observations and Early Massive Structures in a Contraction-Based Cosmological Framework

Authors: Ivan Aurelian Dan
Comments: 33 Pages.

Recent James Webb Space Telescope (JWST) observations have revealed massive, thermally evolved galaxy clusters at redshifts z ≳ 7, whose intracluster gas temperatures and dynamical maturity exceed the expectations of standard ΛCDM cosmology. Such objects appear too hot, too massive, and too chemically evolved to have formed within the short cosmic time available in an expanding universe, highlighting increasing tension with standard expansion-based structure formation timelines. We show that these observations arise naturally in a contraction-based cosmology in which cosmic redshift is generated by global scale evolution rather than metric expansion. In this framework, high redshift does not imply physical youth but instead reflects cumulative scale contraction, leading to compressed observational time. Physical densities, binding energies, and virial temperatures therefore increase toward higher redshift. We derive the background dynamics, structure growth, and observable relations of the contraction framework, demonstrate its consistency with CMB and BAO measurements, and identify the Sandage—Loeb redshift drift as a clear, model-level discriminator. JWST observations are thus interpreted not as a crisis, but as evidence that early massive structures admit a consistent interpretation within a contraction-based cosmological framework.
Category: Relativity and Cosmology

[1279] ai.viXra.org:2603.0001 [pdf] submitted on 2026-03-01 21:56:54

Simulation-Driven Design of a Microcontroller-Based Programmable Voltage Stabilizer with Relay Control

Authors: Md Mubdiul Hasan
Comments: 5 Pages.

Power quality issues and voltage fluctuations continue to pose significant challenges to the reliable operation of electrical equipment. This paper develops a microcontroller-based programmable voltage monitoring and protection system using a Proteus virtual environment. The design incorporates a virtual Arduino controller, an AC source with adjustable load conditions, and an integrated signal-conditioning module with display and relay-based control components. A real-time graphical interface and an LCD module are used to continuously display input and regulated output voltages during operation. The system intelligently detects deviations beyond an acceptable voltage range and initiates a protective shutdown to safeguard connected loads. The complete functionality is validated through comprehensive virtual prototyping, demonstrating a practical and cost-efficient approach for pre-hardware evaluation of programmable reference systems in voltage regulation and protection applications.
Category: Digital Signal Processing

[1278] ai.viXra.org:2602.0130 [pdf] submitted on 2026-02-28 08:06:58

Symmetry Breaking as Homeostasis: Extrinsic Gravity and the Dialectical Architecture of Fundamental Forces

Authors: Stephen P. Smith
Comments: 12 Pages.

The Standard Model is often celebrated as a nearly complete account of the strong, weak, and electromagnetic interactions, lacking only gravity. Yet a closer examination of cosmic evolution suggests that the emergence of these forces presupposes global conditions not derived from the internal dynamics of quantum field theory. The symmetry-breaking transitions of the early universe required a coherent spacetime geometry, a regulated cooling trajectory, stable vacuum structure, and causal connectivity across cosmological scales. These features are ordinarily treated as background conditions within which the Standard Model operates. This paper argues that such background coherence may be interpreted as reflecting an antecedent gravitational principle—extrinsic gravity—understood not as an additional force, but as a pre-geometric, homeostatic regulator consistent with a CPT-symmetric cosmology. Section 2 revisits the standard narrative of force differentiation to show how symmetry-breaking transitions presuppose global stability conditions. Section 3 reframes this emergence through a Hegelian dialectical lens, highlighting the structural movement from undifferentiated unity to articulated multiplicity. Section 4 then sketches a two-sided geometric model based on paired Weyl tensors, in which a variational principle penalizing both free conformal curvature and mismatch between CPT-conjugate sectors selects conformal flatness, with Minkowski space as a stabilized representative. Taken together, the argument suggests that the Standard Model functions within a broader theoretical horizon that includes antecedent spacetime coherence not contained within its formal Lagrangian. Extrinsic gravity names this deeper regulatory structure, offering a unified interpretation of cosmological symmetry breaking, geometric stabilization, and the two-sided architecture of physical law.
Category: Relativity and Cosmology

[1277] ai.viXra.org:2602.0129 [pdf] submitted on 2026-02-28 14:45:04

Entanglement Generated Forces and the Emergence of Spacetime

Authors: Joseph Shaffer
Comments: 10 Pages.

Relativity and entanglement are very different phenomena, in spite of occupying the same galaxy. We find that galactic rotation curves are an excellent measure of the validity of assumptions made about the apparent velocity of interaction of entanglement variables. There is, of course, no motion at all, but it is convenient to describe it as such for computational purposes. For instance, we find that an assumption of a nonlocal instantaneous kernel gives superb alignment with observed galactic rotation curves. The work below records, for a number of rotation curves, the disparity between theory and observation.
Category: Astrophysics

[1276] ai.viXra.org:2602.0128 [pdf] submitted on 2026-02-28 03:00:54

Context Length as Implicit Inductive Bias in Large Language Models: A Structured Review and Formal Synthesis

Authors: Sif Almaghrabi
Comments: 16 Pages.

We present a structured literature review synthesizing 72 publications across eight research streams to develop and evaluate the thesis that context length functions as an implicit inductive bias in large language models (LLMs). We formalize this claim through four operational diagnostics—output entropy, distributional shift under context perturbation, anchoring tendency, and search-space contraction—each defined as a measurable quantity derivable from the predictive distribution pθ(y | x, C). Five testable hypotheses are stated with explicit falsification conditions and graded against a three-point study-quality rubric. Four convergent patterns emerge: (i) robust non-monotonic accuracy as a function of context length across tasks, models, and experimental controls; (ii) predictable interactions between context length and reasoning depth, with a difficulty-dependent optimum; (iii) measurable search-space contraction quantifiable via semantic entropy; and (iv) formal parallels to classical inductive bias in overparameterized models. This paper does not introduce novel algorithms or experimental results; its contributions are a formal diagnostic framework, a quality-graded evidence matrix, a causal analysis of confounding factors limiting current claims, and a prioritized research agenda of six open problems with proposed experimental protocols.
Category: Artificial Intelligence

[1275] ai.viXra.org:2602.0127 [pdf] submitted on 2026-02-28 02:52:44

Dimensional Accessibility, Proportional Distribution, and Alignment as Conditions for Resonance in Complex Systems

Authors: Doug Hoffman
Comments: 3 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

Complex systems across physical, biological, computational, organizational, economic, and quantum domains exhibit a recurring architecture for resonance—coherent amplification with bounded adaptability. We propose that resonance emerges when three structural conditions jointly obtain within a ternary continuum (a continuous state space bounded by functional poles): (1) D, dimensional freedom (accessible intermediate states); (2) P, proportional distribution (balanced energy/influence allocation); and (3) A, alignment (phase, directional, and incentive coherence). The multiplicative relationship R ∝ D × P × A predicts universal collapse when any factor degrades significantly. This framework stratifies dynamical regimes from static order (D ≈ 0) through chaotic complexity (medium A) to periodic resonance (A ≈ 1), correctly diagnosing failure modes across nine domains: vanishing gradients (neural nets), trophic cascades (ecology), misaligned incentives (organizations), decoherence (quantum), and phase mismatch (physics). Resonance is not inherent but emergent, operating at multiple scales and contexts. Systems can transition between regimes by redefining functional poles under stress, as seen in flocking bait-ball formation. The DPA architecture offers a domain-general diagnostic for system health, predicting phase boundaries and multiplicative fragility without parameter tuning.
Category: General Science and Philosophy

[1274] ai.viXra.org:2602.0126 [pdf] submitted on 2026-02-28 01:17:13

Galois Quantum Gravity: The Algebraic Geometry of the Standard Model Vacuum

Authors: Herman Herstad Nythe
Comments: 100 Pages.

This paper presents a discrete algebraic framework that models the Standard Model vacuum not as a continuous smooth manifold, but as a fault-tolerant topological surface code based on the genus-3 Klein quartic and its automorphism group, PSL(2,7). By reducing continuous phenomenological parameters to exact topological and finite-field invariants, we provide rigorous mathematical resolutions to several enduring anomalies in particle physics. First, the fine-structure constant is derived topologically as 137 and verified dynamically via Migdal's real-space renormalization on the F_7 lattice, yielding an effective coupling of 136.724. Second, the existence of exactly three fermion generations is proven to be an unavoidable acoustic resonance: the exact 3 × 7 permutation triplicity of the 56-node vacuum adjacency spectrum. Third, the empirical Koide mass formula is structurally resolved; its amplitude parameter √2 emerges as the eigenvalue of the graph's holomorphic cusp forms, while its phase is the exact geometric invariant δ = 2/g² = 2/9 rad ≈ 12.732°. Finally, we demonstrate that the surface code constraints autonomously generate the adjoint representation of E6, while the Dirac sea (the negative eigenvalue sector) of the matter graph perfectly reconstructs the symmetric tensor of the octonionic automorphism group G2. By transitioning from continuous differential equations to discrete arithmetic, Galois Quantum Gravity suggests that the Standard Model is the macroscopic shadow of a finite-field quantum algorithm.
Category: High Energy Particle Physics
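As a quick editorial arithmetic check (ours, not the author's), the Koide phase identity quoted in the abstract above, δ = 2/g² = 2/9 rad for genus g = 3, can be converted to degrees to confirm the stated value of ≈ 12.732°:

```python
import math

# Check of the quoted identity delta = 2/g^2 with genus g = 3.
# The abstract states delta = 2/9 rad, approximately 12.732 degrees.
g = 3
delta_rad = 2 / g**2                 # 2/9 rad
delta_deg = math.degrees(delta_rad)  # radians -> degrees
print(round(delta_deg, 3))           # 12.732
```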

[1273] ai.viXra.org:2602.0125 [pdf] submitted on 2026-02-27 05:34:56

The Prime Gear Geometry (PGG) Resolution: The Mechanical and Signal Basis of the Riemann Hypothesis

Authors: Chaiya Tantisukarom
Comments: 9 Pages.

This study formalizes the Prime Gear Geometry (PGG) as a dynamical system. We demonstrate that the Riemann Hypothesis (RH) is not a static property of numbers, but a structural necessity of a rolling engine. We identify the $m$-cutoff as the "Mechanical Secret" that governs the transition between discrete prime forging (Time Domain) and spectral stability (Frequency Domain).
Category: Number Theory

[1272] ai.viXra.org:2602.0124 [pdf] submitted on 2026-02-27 16:51:40

Foundations of Physics: A Closer Look at Space and Time

Authors: Christian B. Mueller
Comments: 19 Pages. (Note by ai.viXra.org Admin: This submission may not be written in a scholarly manner as required - Please conform by using standard scholarly terms, citing listed scientific references etc.)

This work attempts to discuss central inconsistencies in modern physics from the perspective of a limited rate of change, relying solely on observation and logical deduction. Starting from a reorganisation balance within the observable world, it constructs a minimal and plausible geometry of a higher-dimensional state space, thereby establishing a self-consistent model. By examining the projection into the observation space, this approach allows for a reappraisal of the symmetries of space and time, the compatibility of relativity and quantum mechanics via the fine-structure constant, and the possibility of a deterministic digital physics as a whole.
Category: Relativity and Cosmology

[1271] ai.viXra.org:2602.0123 [pdf] submitted on 2026-02-26 21:38:33

Vacuum Energy Density in de Sitter Space from Horizon Entanglement Entropy

Authors: Bertrand Jarry
Comments: 6 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0)

I derive the vacuum energy density in de Sitter space from the entanglement entropy of the cosmological horizon, obtaining ρ_v^em = αH^4 with α = ħ/(260π²c³) [etc.]
Category: Relativity and Cosmology

[1270] ai.viXra.org:2602.0122 [pdf] submitted on 2026-02-26 10:02:56

Reasoning Trace Length and Accuracy in Large Language Models: A Structured Meta-Analysis of Published Benchmarks

Authors: Sif Almaghrabi
Comments: 22 Pages.

We present a structured meta-analysis examining the relationship between chain-of-thought (CoT) reasoning trace length and task accuracy across 22 large language models spanning five provider families and 14 benchmarks covering mathematics, code generation, scientific reasoning, and general knowledge. All results are drawn from published technical reports, system cards, and peer-reviewed evaluations; no new experiments are conducted. We aggregate over 300 model—benchmark data points, though we note that cross-source comparisons are subject to protocol heterogeneity that limits strict commensurability. We document five principal observational patterns: (1) Reasoning-augmented models consistently outperform their standard counterparts on hard multi-step tasks, with reported accuracy differences of 40—81 pp on competition mathematics, though these differences confound reasoning-specific gains with concurrent architecture and training improvements; (2) Within the single controlled setting where token-budget data are available (Claude 3.7 Sonnet on AIME 2024, n = 30 test items), the accuracy—token relationship is well-described by a logarithmic fit (R² = 0.97, n = 7 reconstructed data points), though this fit cannot be statistically distinguished from several alternative functional forms given the small sample and measurement uncertainty; (3) The observed accuracy differences are strongly domain-dependent, ranging from large positive gains on competition math to negative effects on factual recall; (4) Estimated per-query costs increase nonlinearly near the accuracy frontier, though cost estimates carry substantial uncertainty from token accounting and pricing volatility; and (5) Published faithfulness studies report that visible CoT reflects actual model reasoning in only 25—39% of probed cases. We propose formal efficiency metrics, discuss their limitations, and provide a practitioner-oriented deployment framework. All data tables are released.
We classify our conclusions as observational rather than causal, and discuss the confounds that prevent stronger inference.
Category: Artificial Intelligence

[1269] ai.viXra.org:2602.0121 [pdf] submitted on 2026-02-26 21:18:10

The Speckmann Lattice Resonance Theory (SLRT): A Geometric Derivation of Lepton Masses, Dark Energy, and the Inverse Fractal Block Universe

Authors: Daniel Speckmann
Comments: 5 Pages.

The Speckmann Lattice Resonance Theory (SLRT) presents a novel unified framework for particle physics and cosmology based on an Inverse Fractal Block Universe (IFB). We propose that the three generations of elementary leptons and neutrinos are not fundamental point particles, but discrete resonance modes of a $3 \times 3$ toroidal lattice geometry. By mapping the transition between cubic ($\Gamma_{EM}$) and hexagonal ($\Gamma_{W}$) lattice symmetries, we provide a purely geometric derivation of the Koide mass relation and the electroweak mixing angle ($\sin^2 \theta_W \approx 1/3$). Furthermore, we identify Dark Energy as the residual elastic tension (frustration) of the lattice, attenuated by a fractal inversion factor across 120 orders of magnitude. Dark Matter is characterized as non-resonant, high-order lattice oscillations. This model reduces the 26 free parameters of the Standard Model to fundamental geometric constants, offering a deterministic solution to the cosmological constant problem and the hierarchy problem.
Category: High Energy Particle Physics

[1268] ai.viXra.org:2602.0120 [pdf] submitted on 2026-02-26 16:27:44

The Hessian of the G2 3-Form on the Grassmannian, a Geometric Ratio Matching the Koide Angle, and Two Obstruction Theorems

Authors: P. Music
Comments: 7 Pages.

We compute the Hessian of the G2 3-form restricted to Gr(3,R^6) at the flavour-symmetric point, obtaining eigenvalues 0^5, (-2 phi_0)^3, (-3 phi_0)^1. The geometric ratio |lambda_{Lambda^2}| / |Delta f / phi_0| = 2/9 matches the empirical Brannen-Koide phase to 0.02%. We prove two obstruction theorems: (1) since cos(2/3) is transcendental (Lindemann-Weierstrass), no symmetric polynomial in the mass eigenvalues with algebraic coefficients can select delta = 2/9 as an extremum; (2) any topological flux quantisation mechanism produces phases that are rational multiples of pi, for which cos(3 delta) is algebraic, and is therefore also excluded. A one-loop effective potential calculation independently rules out perturbative selection. These results establish that the Koide phase, if exactly 2/9, cannot arise from any polynomial potential, standard topological mechanism, or perturbative dynamics.
Category: High Energy Particle Physics

[1267] ai.viXra.org:2602.0119 [pdf] submitted on 2026-02-26 21:15:31

Unified Chordal Resonance Theory (UCRT)

Authors: Jessica Bower
Comments: 17 Pages.

Unified Chordal Resonance Theory (UCRT) proposes that gravity, the Standard Model of particle physics, dark matter, dark energy, and cosmology all emerge from a single multitonal vibrational scalar field $\Psi_\text{total}$ governed by one dominant nonlinear coupling $\lambda_\text{nl}$. The ever-present baseline hum $\Psi_{\text{total},0}$ at near-Planck frequencies undergoes beat interference and resonance cascades, dynamically generating fermion mass hierarchies, gauge symmetries via rotations on a compact internal manifold ($S^3$), fuzzy black hole cores, hybrid dark matter halo profiles with exponential cores and subhalo suppression below $\sim 10^8\,M_\odot$, and late-time dark energy evolution from phase diffusion. UCRT resolves black hole singularities through diffusive cores, the missing satellites problem via destructive tone interference, and the matter-antimatter asymmetry through phase-driven CP violation, while remaining consistent with Planck CMB, DESI BAO, JWST ultra-faint dwarf observations, and precision flavor data. Quantum corrections via vibrino loops preserve unitarity, and RG flows reach an asymptotic safety fixed point for UV completeness. Near-term falsifiability is provided by amplified GW echo trains and sidebands (LISA), chiral low-$\ell$ B-modes (CMB-S4), hybrid DM cores/subhalo counts (JWST/LSST), $\sim$10 TeV resonances (FCC), and phonon analog signatures in tabletop experiments.
Category: Quantum Gravity and String Theory

[1266] ai.viXra.org:2602.0118 [pdf] submitted on 2026-02-26 21:12:57

Scale-Dependent Dimensionality with Local Screening and Emergent Phenomena

Authors: Vladyslav Hruznov
Comments: 10 Pages.

We propose a single dynamic parameter γ(x, t) that controls the effective dimensionality of spacetime in a scale- and density-dependent manner. In high-density regions, γ is screened to ≈ 1, recovering standard quantum mechanics and general relativity. In extremely low-density environments, γ approaches ≈ 1.10, yielding d_eff ≈ 4.1, weakened effective gravity, and emergent dark energy. The framework is realized via a density-dependent measure v(ρ_el) in the action, leading to modified Friedmann equations and variable-order fractional quantum mechanics. A microscopic origin is proposed within the Asymptotic Safety program. The model is consistent with current precision constraints and makes clear falsifiable predictions for upcoming experiments.
Category: Relativity and Cosmology

[1265] ai.viXra.org:2602.0117 [pdf] submitted on 2026-02-26 19:59:49

Mechanical Audit Experiments and Reproducibility Appendix for a Companion-Paper Programme on 4D SU(N) Yang—Mills Existence and Mass Gap

Authors: Lluis Eriksson
Comments: 33 Pages.

This document is an experiment-first audit report for a companion-paper programme claiming a constructive solution of the 4D $\mathrm{SU}(N)$ Yang—Mills existence and mass gap problem. It specifies a runnable mechanical audit suite of 29 deterministic tests, defines pass/fail criteria, and presents outputs in a compilation-safe format. The report contains: (i) an explicit non-triviality proof showing the Wightman functions do not factorize trivially; (ii) a toy-model validation recovering the exact 2D $\mathrm{SU}(2)$ Yang—Mills mass gap to machine precision; (iii) a Bałaban bridge appendix reproducing the critical inductive step of his renormalization group in simplified form; (iv) a reproducibility repository with 3-line setup instructions; (v) a core proof chain audit mechanically verifying the load-bearing theorems of Papers 86—90, covering terminal Kotecký—Preiss convergence, UV suppression, one-dimensionality of the anisotropic sector, Cauchy bounds on polymer jets, the OS1 vanishing rate $O(\eta^2 \log \eta^{-1})$, Lie-algebra annihilation, and KP margin sensitivity. Beyond the 17 core tests, the suite includes a lattice gauge proxy layer (plaquette expansion, Polyakov-loop centre symmetry, Creutz ratio; 3 tests), an infrastructure layer (Bakry—Émery curvature seed $\mathrm{Ric}_{\mathrm{SU}(N)} = N/4$, the $2^{4k}$ cancellation in $d=4$, heat-kernel column bound; 3 tests), a UV-flow/heat-kernel layer (Parseval identity, diagonal decay exponent $d/2 = 2$, flow—reflection commutation; 3 tests), a non-triviality test (Haar Monte Carlo on $\mathrm{SU}(2)$ and $\mathrm{SU}(3)$; 1 test), a toy-model validation (2D Yang—Mills transfer matrix; 1 test), and an algebraic QFT layer (Petz recovery fidelity bound $1-F \leq C\,e^{-2mr}$ from the Split Property; 1 test). All 29 tests pass; the full suite completes in ${\approx}70\,\mathrm{s}$ on a Google Colab CPU. The complete inter-paper dependency DAG is acyclic and explicitly recorded.
All code, data, and artifacts are available at https://github.com/lluiseriksson/ym-audit. The companion papers are archived at https://ai.vixra.org/author/lluis_eriksson.
Category: Mathematical Physics
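The entry above asserts that the inter-paper dependency DAG is acyclic. As a minimal sketch of how such a claim can be machine-checked (the edge list below is a hypothetical fragment, not the actual graph from the report), Python's standard-library topological sorter suffices:

```python
from graphlib import TopologicalSorter, CycleError

def is_acyclic(edges):
    """Return True if the directed graph (node -> set of prerequisites) has no cycle."""
    try:
        list(TopologicalSorter(edges).static_order())
        return True
    except CycleError:
        return False

# Hypothetical fragment of a paper-dependency graph (not the report's actual DAG):
deps = {
    "Paper 90": {"Paper 89", "Paper 88"},
    "Paper 89": {"Paper 87"},
    "Paper 88": {"Paper 87"},
    "Paper 87": {"Paper 86"},
    "Paper 86": set(),
}

print(is_acyclic(deps))                       # acyclic chain -> True
print(is_acyclic({"A": {"B"}, "B": {"A"}}))   # two-cycle -> False
```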

[1264] ai.viXra.org:2602.0116 [pdf] submitted on 2026-02-25 14:27:29

Relational Mathematical Realism III: The Hubble Tension as a Discrete Spacetime Measurement Artifact

Authors: Jason Merwin
Comments: 12 Pages. This is the third paper in a four part series.

The 5σ discrepancy between local (H0 ≈ 73 km s−1 Mpc−1) and early-universe (H0 ≈ 67 km s−1 Mpc−1) measurements of the Hubble constant constitutes one of the most significant challenges to the standard ΛCDM cosmological model. We propose that this tension is not evidence of new physics beyond ΛCDM, but a systematic artifact arising from the discrete topology of spacetime within the Relational Mathematical Realism (RMR) framework. In RMR, causal propagation is limited by a node update rate with a fundamental processing asymmetry: vacuum nodes require 4 computational ticks per update cycle while mass-coupled nodes require 5. We derive a geometric correction factor Γ = (5/4)^{1/3} ≈ 1.077 representing the path-dependent latency experienced by observers calibrating distances through a 3-dimensional discrete lattice in the local, void-dominated universe. Applied to the Planck CMB determination (67.36 ± 0.54 km s−1 Mpc−1), this yields a predicted local measurement of H_local = 72.56 km s−1 Mpc−1, agreeing with the SH0ES value (73.04 ± 1.04 km s−1 Mpc−1) to within 0.46σ with zero free parameters. Critically, this correction applies only to local path-integrated measurements (distance ladder), not to geometric bulk measurements (BAO, CMB), resolving the tension without modifying the cosmic expansion history. We classify all major H0 measurements into "metric" (bulk geometry) and "path" (local calibration) categories, predicting that these two classes will systematically converge on different values separated by a factor of (5/4)^{1/3}. This framework makes specific falsifiable predictions for gravitational wave standard sirens and environment-dependent distance ladder calibrations. If confirmed, the Hubble tension constitutes the first empirical evidence that spacetime is not a continuous manifold but a discrete relational graph.
Category: Relativity and Cosmology
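The arithmetic in the entry above is easy to reproduce. A quick check of the claimed correction factor and predicted local value, using the quoted Planck and SH0ES numbers and (as the 0.46σ figure suggests) the SH0ES uncertainty alone:

```python
# Reproduce the entry's claimed numbers from its own quoted inputs.
H_cmb = 67.36        # Planck CMB determination, km/s/Mpc
H_shoes, sigma_shoes = 73.04, 1.04  # SH0ES value and uncertainty

gamma = (5 / 4) ** (1 / 3)          # claimed correction factor
H_local_pred = gamma * H_cmb
tension_sigma = abs(H_shoes - H_local_pred) / sigma_shoes

print(f"Gamma             = {gamma:.4f}")         # ~1.0772
print(f"H_local predicted = {H_local_pred:.2f}")  # 72.56
print(f"tension           = {tension_sigma:.2f} sigma")  # 0.46
```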

[1263] ai.viXra.org:2602.0115 [pdf] submitted on 2026-02-25 21:17:52

Gravitational Sourcing from Decoherence History

Authors: Paul Dangelo
Comments: 6 Pages.

The paper proposes a novel experimental protocol to test a fundamental question at the intersection of quantum mechanics and general relativity: Does a quantum system's ability to generate (source) a gravitational field depend on its quantum state history—specifically, its accumulated decoherence? Standard semiclassical gravity assumes that a mass sources gravity based purely on its mass-energy expectation value, regardless of whether it is in a coherent quantum superposition or a classical mixture. This paper introduces an "Activation Hypothesis," suggesting that a system must build up irreversible entanglement with its environment (accumulated decoherence) to "activate" its gravitational sourcing capacity.
Category: Quantum Gravity and String Theory

[1262] ai.viXra.org:2602.0114 [pdf] submitted on 2026-02-25 01:52:08

In How Many Ways Can X/Twitter Blame, or Not Blame, [One] for Violating Their Rules? Or, Should [One] Be Interpreted as a Schrödinger Cat Being both Banned and Not Banned?

Authors: Dainis Zeps
Comments: 20 Pages. (Note by ai.viXra.org Admin: This article may be outside the scope of ai.viXra.org & is subject to withdrawal by the Admin)

I compare my situation with that of Schrödinger's cat: where the cat was neither dead nor alive, I am either banned or not banned from my right to access my X/Twitter account. My sentence "After Trump: The Electric Chair For Trump, Biden, Most Republicans, For Many Democrats" is a nominal sentence and so may be interpreted in many ways, both violating and not violating the X/Twitter admins' rules.
Category: Social Science

[1261] ai.viXra.org:2602.0113 [pdf] submitted on 2026-02-24 08:01:09

The Fractal Substrate Equivalence Proof: MATHICCS Invalidates GR While Deriving Dark Matter (∼ 84%) and α ≈ 1/137 from Apollonian Geometry

Authors: Steven E. Elliott
Comments: 6 Pages.

Standard physics contains formal contradictions when judged as self-consistent physical ontologies. The Einstein Equivalence Principle (EEP) embeds ε–δ processes requiring internal laboratory realization, yet General Relativity's dynamics destroy all realizers in finite time: empty spacetime (no labs exist), few-body systems (radiation erosion), or cosmological evolution (de Sitter horizons). MATHICCS (Mathematics + Physics + Computational Consistency Substrate)—a higher-order meta-logic—deems axioms whose mathematical processes lose internal persistence invalid for physical ontology. GR asserts EEP-validity while deriving EEP-invalidity, yielding P ∧ ¬P. The first MATHICCS-valid ontology is the Fractal Substrate Equivalence Physics (FSEP) [viXra:2602.0107], where eternal Apollonian boundary dynamics persist across infinite recursive scales via Möbius inversion, discrete scale flips (r ↦ r/λ), and angular-momentum conservation. FSEP derives Newtonian gravity and the inverse-square law from local quadratic expansion of spherical inversion; constant finite light speed from the linear term + pole ejection + scale compression; the dark matter fraction (≈ 84%, baryons ≈ 16%) as a geometric series from the 3D Apollonian fractal dimension (D ≈ 2.473946 [1]) and radius ratio (β ≈ 0.72); and the fine structure constant (α ≈ 1/137.035999 [3]) emergently from bipolar pole aperture geometry (λ_local ≈ 21.81), unifying it with observed quasar jet collimation angles (θ_jet ≈ 5.2° [4])—all parameter-free except the geometric self-consistency of the persistent substrate. MATHICCS demands all physics reconstruct its mathematics from within via persistent internal processes. GR explodes; FSEP survives.
Category: Mathematical Physics

[1260] ai.viXra.org:2602.0112 [pdf] submitted on 2026-02-25 01:37:12

Gravitational Lensing as Fluid Refraction: Resolving the Chromatic Anomaly in Q2237+0305

Authors: Soo-Hyun Kim
Comments: 3 Pages.

The chromatic flux ratio anomaly observed in the strong lens system Q2237+0305 strictly contradicts the achromatic prediction of General Relativity (GR). Conventional models invoking differential dust extinction (lambda^-4) fail to fit the empirical data at short wavelengths (chi^2 = 1056). We propose an alternative macro-optical framework where the lensing galaxy is surrounded by a non-conservative fluid medium with a radial density gradient (∇ρ). Applying the Gladstone-Dale relation and Cauchy's dispersion equation, we demonstrate that cosmic fluid refraction naturally follows a lambda^-2 dependence. Our fluid model perfectly reproduces the observed multi-wavelength photometry of Q2237+0305 (R^2 = 0.995, chi^2 = 1.01). This over 1,000-fold statistical improvement strongly suggests that gravitational lensing incorporates direct fluid-optical refraction.
Category: Astrophysics
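As a sketch of the two wavelength scalings contrasted in the entry above (illustrative coefficients only): Cauchy's dispersion equation n(λ) = A + B/λ² makes refraction-driven chromaticity scale as λ⁻², versus λ⁻⁴ for Rayleigh-type dust extinction, so a λ⁻⁴ law tuned to the red end overshoots quadratically toward the blue:

```python
import numpy as np

# Illustrative wavelengths in microns (arbitrary, not the paper's data points).
lam = np.array([0.3, 0.5, 0.8, 1.2, 2.2])

cauchy_term = lam ** -2.0  # refraction-based chromaticity (Cauchy B/lambda^2 term)
dust_term = lam ** -4.0    # Rayleigh-type extinction

# The ratio of the two laws is itself lambda^-2: the mismatch between the
# models grows quadratically at short wavelengths.
ratio = dust_term / cauchy_term
print(ratio)
```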

[1259] ai.viXra.org:2602.0111 [pdf] submitted on 2026-02-25 01:39:43

Hydrodynamic Unification of Dark Matter and Dark Energy via Black Hole Volume Generation Formula: Neutrino Dynamic Density Theory and Resolution of the Hubble Tension

Authors: Soo-Hyun Kim
Comments: 11 Pages.

This study redefines black holes not merely as critical points of mass contraction, but as engines that generate and inject spatial fluid, proposing a novel space generation formula V_gen = 27 × (M_BH / ρ_vac). The key constant k=3 represents both the geometric integer of three-dimensional space and the physical correspondence to three generations of neutrinos, aligning with the coupling coefficient k ≈ 3 reported by Farrah et al. (2023). Applying this formula to the McConnell & Ma (2013) dataset, we confirmed a Pearson correlation coefficient of 0.9930, with an average agreement rate of 88.0% in the standard galaxy mass range. Local deviations in systems like M31-M32 are successfully explained by hydrodynamic pressure equilibrium and the 4 kpc truncation phenomenon. Furthermore, the basal fluid density (10^-30 kg/m^3) resolves the Hubble tension by reconciling the early universe Hubble constant with local measurements through an 8.31% volume injection rate (f_inj). We conclude that dark matter and dark energy are different dynamic density states of the neutrino fluid, unifying cosmic expansion and galactic dynamics.
Category: Astrophysics

[1258] ai.viXra.org:2602.0110 [pdf] submitted on 2026-02-25 01:32:44

The Law of Harmony and Its Application: We Will Live Much Longer [?]

Authors: S. I. Kublanovsky
Comments: 4 Pages.

This paper presents the Law of Harmony, based on the principle of universal quantum synchronization of masses. According to this law, any body — from a symmetrical pulsar to a human being or the Universe itself — acts as a source of coherent gravitational radiation, the period of which is determined by its mass and dimensions. Calculations based on the Law of Harmony indicate that the full life cycle of the Universe is 123.5 billion years. This contradicts the 2025 forecast by Henry Tye's group, which predicted a collapse in 20 billion years, and suggests that humanity has approximately 110 billion years of stable development ahead. The derived period mathematically corresponds to a graviton mass of 1.89*10^(-69) kg. Furthermore, the graviton mass obtained in this study closely aligns with the value of 1.909*10^(-69) kg derived from the holographic principle by Haranas and Gkigkitzis (2014). At the biological level, the Law of Harmony establishes the necessity of resonance between human mass and the Earth's diurnal rhythm (24 hours).
Category: Relativity and Cosmology
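The quoted cosmic period and graviton mass in the entry above are mutually consistent under a Compton-style reading m = h/(c²T). The abstract does not spell out its derivation, so treat this as a plausibility probe only:

```python
# Consistency check under the ASSUMED relation m = h / (c^2 * T).
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
year = 3.1557e7      # Julian year, s

T = 123.5e9 * year   # claimed full life cycle of the Universe, in seconds
m_graviton = h / (c**2 * T)

print(f"{m_graviton:.3e} kg")  # close to the quoted 1.89e-69 kg
```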

[1257] ai.viXra.org:2602.0109 [pdf] submitted on 2026-02-25 01:07:15

Recursive Parity Extraction and the Structural Exclusion of Non-Trivial Cycles in the 3n + 1 Problem

Authors: Siqi Liu
Comments: 2 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

This paper provides a formal mathematical proof of the non-existence of non-trivial loops in the Collatz Conjecture, using algebraic tools based on 2-adic constraints.
Category: Number Theory
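The claim in the entry above concerns non-trivial cycles of the 3n+1 map. A cycle intersecting a range of starting values would show up as a trajectory revisiting a value before reaching 1; a brute-force scan (an empirical sanity check, not part of the paper's proof) finds none below 10^4:

```python
def collatz_reaches_one(n, limit=10**6):
    """Follow the 3n+1 map from n; return True once the trajectory hits 1."""
    seen = set()
    while n != 1:
        if n in seen or len(seen) > limit:
            return False  # a revisit before reaching 1 would signal a cycle
        seen.add(n)
        n = 3 * n + 1 if n % 2 else n // 2
    return True

# Every starting value below 10^4 reaches 1, i.e. no non-trivial cycle
# intersects this range.
print(all(collatz_reaches_one(n) for n in range(1, 10_000)))  # True
```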

[1256] ai.viXra.org:2602.0108 [pdf] submitted on 2026-02-23 11:26:21

Octonionic Geometry and the Koide Angle: A Derivation from G2 Casimir Invariants with Neutrino Mass Predictions

Authors: P. Music
Comments: 4 Pages.

The Koide formula relates the masses of the three charged leptons through the parameter Q = 2/3 and an angle theta ~ 0.2222. We derive theta = 2/9 as the ratio of quadratic Casimir invariants C_2(3)/C_2(Sym^3(3)) = (4/3)/6 within the natural embedding SU(3) in G_2 = Aut(O), where the G_2 associative 3-form evaluated on the fermion 3-plane determines cos(3*theta). The agreement with PDG data is 0.009% (< 1 sigma). Extending the construction to neutrinos via the adjoint representation, we conjecture theta_nu = C_2(8)/C_2(Sym^3(3)) = 1/2, predicting Sum(m_i) = 70.9 +/- 0.4 meV in normal hierarchy, testable by Euclid, CMB-S4, LEGEND, and nEXO within the coming years.
Category: High Energy Particle Physics
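The Koide ratio and angle quoted in the entry above can be recomputed directly from PDG charged-lepton masses. The angle extraction below assumes the standard parametrization √m_k ∝ 1 + √2 cos(θ + 2πk/3) with the tau assigned to k = 0 (an assumption for this sketch, not stated in the abstract):

```python
import math

# PDG charged-lepton masses in MeV.
m_e, m_mu, m_tau = 0.51099895, 105.6583755, 1776.86

roots = [math.sqrt(m) for m in (m_e, m_mu, m_tau)]
Q = (m_e + m_mu + m_tau) / sum(roots) ** 2
print(f"Q = {Q:.6f}")  # very close to 2/3

mean_root = sum(roots) / 3
z_tau = roots[2] / mean_root - 1         # normalized deviation of the heaviest lepton
theta = math.acos(z_tau / math.sqrt(2))  # assumes tau at k = 0
print(f"theta = {theta:.4f}, 2/9 = {2/9:.4f}")
```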

[1255] ai.viXra.org:2602.0107 [pdf] submitted on 2026-02-23 13:45:27

The Fractal Substrate Equivalence Physics: Möbius Boundary Dynamics in a Recursive Scale Geometry

Authors: Steven E. Elliott
Comments: 19 Pages.

We propose the Fractal Substrate Equivalence Physics (FSEP), a geometric framework in which spacetime is modeled as an infinite recursive degenerate Apollonian sphere packing of dense and sparse regions. We argue that in spacetimes evolving toward $t \to \infty$, the global domain of validity of the Einstein Equivalence Principle (EEP) contracts to measure zero, forcing a breakdown of smooth-manifold descriptions at the dense--sparse interface. The maximal geometric covering of this interface is an Apollonian sphere packing, which we take as fundamental rather than emergent. At each tangency boundary, physical evolution is governed by spherical inversion, a Möbius transformation, a discrete scale flip $r \mapsto r/\lambda$ (with $\lambda \gg 1$), and strict angular-momentum conservation. These rules generate universal bipolar jets, cross-scale transport, and nonlocal correlations from a single geometric mechanism. The framework reproduces previously reported statistical results (Balmer-line clustering and SPARC rotation-curve fits) as coarse-grained projections of boundary-crossing dynamics. FSEP yields several falsifiable predictions: (i) correlated AGN variability across cosmic voids with lags scaling linearly with void diameter ($\tau \propto D_{\mathrm{void}}$); (ii) systematic dark-matter-fraction depletion in merging galaxy pairs relative to isolated systems; (iii) jet opening angles directly measuring the local scale ratio $\lambda_{\mathrm{local}}$; and (iv) potential spectral-distortion signatures in the cosmic microwave background tied to hydrogen recombination harmonics rather than $\mu$/$y$-type thermal relic distortions. This paper stands in relation to its predecessor (viXra:2601.0119) as a foundational extension: where that work derived fractal statistical structure as an emergent consequence of known physics, the present work takes the fractal as the primary geometric substrate from which known physics emerges.
Category: Relativity and Cosmology

[1254] ai.viXra.org:2602.0106 [pdf] submitted on 2026-02-22 07:48:50

The Fibonacci-Tetrahedral Lattice: A Unified Geometric Origin for Dark Matter and Dark Energy

Authors: Andrew Ebanks
Comments: 11 Pages.

This paper proposes the Fibonacci-Tetrahedral Lattice (FTL), a discrete geometric substrate for the vacuum derived from an E8-to-3D projection. By treating space as a quasicrystalline packing rather than a smooth continuum, we identify a foundational ‘Information-Ontology' where the 8D lattice serves as the geometric source and 3D reality is the projected result. We demonstrate that the transition from 8D symmetry to 3D packing necessitates a 7.356° topological deficit (the Aristotle Gap). This geometric frustration manifests macroscopically as an entropic pressure, providing a zero-parameter resolution to galactic rotation curves ("Dark Matter") without requiring new particles. Furthermore, by applying Holographic Scaling (N^{2/3}) to the lattice nodes, we resolve the "Vacuum Catastrophe," deriving an observed energy density of ≈ 10^{−27.33} kg/m^3 (matching Λ) from the theoretical Planck baseline.
Category: Quantum Gravity and String Theory
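The 7.356° figure in the entry above matches the classical tetrahedral packing deficit (Aristotle's error): five regular tetrahedra sharing an edge fall short of 360° by 360° − 5·arccos(1/3), which is quick to verify:

```python
import math

# Dihedral angle of a regular tetrahedron is arccos(1/3); five of them
# around a shared edge leave a gap of 360 - 5*arccos(1/3) degrees.
dihedral = math.degrees(math.acos(1 / 3))
gap = 360 - 5 * dihedral

print(f"dihedral = {dihedral:.3f} deg, gap = {gap:.3f} deg")  # gap ~ 7.356
```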

[1253] ai.viXra.org:2602.0105 [pdf] submitted on 2026-02-22 19:46:41

Cosmological Dissipative Residual: Cosmic Production, Distribution and Function

Authors: Eduardo Rodolfo Borrego Moreno
Comments: 16 Pages.

We propose that the Cosmological Dissipative Residual (CDR), previously introduced as a late-time dissipative mechanism resolving the $H_0$ and $S_8$ tensions, originates from continuous production and accumulation driven by high-energy cosmic events. Beginning with the primordial Big Bang as an initial entropy burst, the residual is further generated throughout cosmic history by stellar formation, core-collapse supernovae, AGN jets, and black hole mergers. The production rate $\beta_{\mathrm{prod}}(z)$ peaks near cosmic noon ($z \sim 2$), and numerical integration calibrated to SFRD and jet/merger observations yields a cumulative contribution of $\sim 30\%$ to the observed dark energy density. The residual evolves according to $\dot{\rho}_{\mathrm{res}} + 3H(1 + w_{\mathrm{res}})\rho_{\mathrm{res}} = \Gamma(t)\rho_m + \beta_{\mathrm{prod}}(t)$, with $w_{\mathrm{res}} \gtrsim -1$ naturally emerging from the balance between production and dissipation, yielding $w_{\mathrm{res}}(z=1) \approx -0.92 \pm 0.03$ consistent with DESI 2024 BAO hints ($w_0 \approx -0.9$). Rapid homogenization via relativistic sound speed ensures uniformity, while localized anisotropic stresses account for gravitational effects in clusters and galaxies. The residual functions as an adaptive regulator through negative feedback, suppressing late-time growth to $\sigma_8^{\mathrm{CDR}} \approx 0.76$--$0.80$. This framework unifies the origin, distribution, and role of dark energy as emergent from the universe's energetic history, with falsifiable predictions for $w(z)$ evolution and subtle event-density correlations testable with DESI, Euclid, and ngEHT.
Category: Astrophysics
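The evolution equation in the entry above is a sourced continuity equation. A toy forward-Euler integration with constant, entirely made-up coefficients (not the paper's calibrated Γ(t) and β_prod(t)) shows the residual relaxing to the production/dilution balance:

```python
# Toy integration of rho_res' + 3H(1+w)rho_res = Gamma*rho_m + beta_prod
# with constant coefficients chosen for illustration only.
H, w = 1.0, -0.92
Gamma, rho_m, beta = 0.3, 1.0, 0.2

rho, dt = 0.0, 1e-3
for _ in range(200_000):  # integrate to t = 200 (many e-folding times)
    rho += dt * (Gamma * rho_m + beta - 3 * H * (1 + w) * rho)

# Analytic fixed point: production balances Hubble dilution.
steady = (Gamma * rho_m + beta) / (3 * H * (1 + w))
print(rho, steady)  # the integrated value converges to the analytic balance
```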

[1252] ai.viXra.org:2602.0104 [pdf] submitted on 2026-02-22 02:50:56

Foundations of Fractalic Field Theory: Quantum Gravity as the Origin of Fundamental Constants

Authors: M. I. Gallardo Nicolalde
Comments: 5 Pages. (Note by ai.viXra.org Admin: Part of the texts are cutoff)

We introduce Fractalic Field Theory (FFT), a complete framework where the fractal geometry of spacetime at quantum gravity scales determines all physical laws. The central object is the fractalic dimension D = 2.7268, emerging from quantum gravity as a fixed point of the renormalization group flow. From this single geometric invariant, we derive: (i) α^{−1} = 137.036, (ii) α_s(M_Z) = 0.1181, (iii) sin^2 θ_W = 0.2314, (iv) the Kaluza-Klein spectrum m_n = n^{1/D}/R with m_1 = 9.73 TeV, and (v) cosmological parameters including Ω_DM h^2 = 0.120 and n_s = 0.965. FFT represents a paradigm shift from symmetry-based to geometry-based unification, predicting 32 experimentally testable quantities with unprecedented accuracy using zero adjustable parameters.
Category: High Energy Particle Physics
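For illustration, the quoted Kaluza-Klein spectrum m_n = n^{1/D}/R with D = 2.7268 and m_1 = 9.73 TeV (so 1/R = 9.73 TeV) gives the following low-lying masses; this is a direct evaluation of the stated formula, nothing more:

```python
# Direct evaluation of the quoted KK spectrum m_n = n^(1/D) * m_1 (TeV).
D, m1 = 2.7268, 9.73

masses = [n ** (1 / D) * m1 for n in range(1, 5)]
print(", ".join(f"m_{n} = {m:.2f} TeV" for n, m in enumerate(masses, start=1)))
```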

[1251] ai.viXra.org:2602.0103 [pdf] submitted on 2026-02-22 02:40:01

Non-Existence Collapse: A Dynamical Mechanism for the Quantum-Gravitational Bootstrap of Spacetime from Nothing

Authors: Steven B. Thompson
Comments: 7 Pages.

We propose a minimal dynamical framework for the origin of the universe in which absolute non-existence—a state with no spacetime, matter, fields, or classical time—is intrinsically unstable under quantum-mechanical and gravitational principles. The Heisenberg uncertainty principle, applied to gravitational degrees of freedom, precludes a static null configuration: non-existence has "nowhere to go" but to collapse into itself, producing an effective Planck-scale density regime that undergoes a nonsingular quantum-gravitational transition. This collapse-driven process bootstraps the emergence of classical spacetime and the arrow of time, requiring no external causes, pre-existing substrates, boundary conditions, or auxiliary fields. Quantum gravity emerges here not as an imposed extension but as the inherent dynamical structure governing the instability and resolution. The mechanism refines and complements established proposals—such as Vilenkin's quantum tunneling from nothing, the Hartle-Hawking no-boundary wavefunction, and recent developments in loop quantum cosmology bounces and quadratic gravity—by providing a purely mechanical interpretation that replaces probabilistic nucleation or Euclidean continuation with an intrinsic collapse bootstrap. It aligns with ongoing refinements of no-boundary states, curvature bounces, and geometric "from nothing" models in the 2025—2026 literature. This framework offers a parsimonious dynamical resolution to the question of why there is something rather than nothing, transforming it into a consequence of quantum gravity's structure. Potential observational implications include consistency with cosmic microwave background data and distinguishable signatures in primordial gravitational waves or large-scale structure that may differentiate collapse-initiated emergence from conventional inflationary scenarios.
Category: Relativity and Cosmology

[1250] ai.viXra.org:2602.0102 [pdf] submitted on 2026-02-22 01:52:53

Acoustic Determinism in Single-Hole Cylindrical Flutes: Evidence That Musical Scales Emerge from Instrument Geometry Rather Than Cultural Transmission

Authors: Mark Jerome Growden
Comments: 2 Pages.

This paper proposes that several of the world's most widespread musical scales arise naturally from the acoustic physics of simple bilabial end-blown cylindrical flutes with a single tone hole. Through hands-on construction and performance of one-hole flutes made from uniform cylindrical tubing, the author demonstrates that placing a hole at a minor second interval above the fundamental, combined with the overtone series, generates the Freygish (Hijaz) scale. A hole at a major second generates a pentatonic scale in the more accessible registers, and a Lydian Dominant scale when played into the higher, more demanding partials. When the end-covering technique of overtone flutes is combined with the single-hole technique, additional scale systems emerge from the same instrument. These findings suggest that culturally diverse scale systems may share a common origin not in theory, cultural exchange, or aesthetic preference, but in the physical constraints of elementary wind instrument construction. The implications extend to ethnomusicology, organology, and the origins of tonal music.
Category: Classical Physics
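A rough numerical illustration of the mechanism described above, idealizing the tone hole as raising the fundamental by one equal-tempered semitone (+100 cents) — a simplification; real hole placement and end corrections shift these values. The pitch classes of the first eight partials over the closed and open fingerings interleave into a set containing the semitone and major-third degrees the paper associates with Freygish/Hijaz material:

```python
import math

def cents(ratio):
    """Interval size of a frequency ratio, in cents (1200 per octave)."""
    return 1200 * math.log2(ratio)

# Pitch classes (cents mod 1200) of partials 1..8 of an idealized tube, for the
# closed fingering and for a hole idealized at +100 cents above the fundamental.
partials = range(1, 9)
closed = sorted({round(cents(n)) % 1200 for n in partials})
open_hole = sorted({(round(cents(n)) + 100) % 1200 for n in partials})

print("closed hole:", closed)
print("open hole:  ", open_hole)
```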

[1249] ai.viXra.org:2602.0101 [pdf] submitted on 2026-02-21 19:14:58

Conversation Fragility is Heavy-Tailed: Quantile Reliability Curves for Multi-Turn LLM Evaluation

Authors: Michael Zot
Comments: 8 Pages. (Note by ai.viXra.org Admin: Please cite all listed scientific references)

Multi-turn dialogue is where large language models (LLMs) are most useful, and also where they most often "get lost". Prior work reports that average performance drops substantially from single-turn to multi-turn settings, and argues that the dominant driver is increased unreliability rather than a large loss of peak capability. We replicate and extend this picture using a quantile-based analysis over thousands of stochastic generations, with an emphasis on distribution shape rather than averages. Across seven jobs we analyze N=5,100 scored generations: 30 instructions per job, 10 stochastic runs per instruction, and 1 to 3 turns per run. For each instruction and turn we compute (i) aptitude A90, the 90th percentile of score across runs, and (ii) unreliability U90-10, the 90th-to-10th percentile spread. Our core result is a heavy-tailed fragility surface: most instructions remain perfectly stable with U=0, while a small minority contribute most of the unreliability at later turns. Across multi-turn replications, the top 3 most fragile instructions at turn 2 explain 54% to 91% of total unreliability. This yields a practical taxonomy of dialogue dynamics (stable, monotone degradation, and instability then recovery) and suggests new training and evaluation targets: recovery and variance control, not just average accuracy.
Category: Artificial Intelligence
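The two per-instruction statistics defined in the entry above are plain percentile computations; a minimal sketch on synthetic scores (made-up data, not the paper's):

```python
import numpy as np

def aptitude_unreliability(scores, hi=90, lo=10):
    """A_hi = hi-th percentile of scores across runs; U = hi-lo percentile spread."""
    a = np.percentile(scores, hi)
    u = a - np.percentile(scores, lo)
    return a, u

# Synthetic example: 10 stochastic runs of two instructions.
stable = [1.0] * 10               # stable instruction: zero spread
fragile = [1.0] * 8 + [0.0, 0.2]  # heavy-tailed failures on 2 of 10 runs

for runs in (stable, fragile):
    a90, u = aptitude_unreliability(runs)
    print(f"A90 = {a90:.2f}, U90-10 = {u:.2f}")
```

Note that the fragile instruction keeps a perfect A90 (its 90th percentile is still 1.0) while accumulating a large spread, which is exactly the aptitude-versus-unreliability split the paper exploits.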

[1248] ai.viXra.org:2602.0100 [pdf] submitted on 2026-02-21 19:10:51

Prolegomena to the Theory of Informational Space Genesis (ISG): A Paradigm Shift in the Cosmology of Perception

Authors: Florian Rücker
Comments: 5 Pages.

Current cosmological models, such as the Lambda-CDM framework, face significant challenges in explaining the nature of Dark Matter and Dark Energy, which account for approximately 95 percent of the universe's mass-energy density. The Theory of Informational Space Genesis (ISG) proposes an ontological shift, defining the universe as an emergent informational system where space is a byproduct of interaction processes. This paper details a four-stage signal denoising methodology applied to the fundamental binding potential of matter to isolate the "Pristine Potential" (Ep approx. 2.15 meV). By correlating this potential with the baryonic base (beta approx. 4.71), derived from the logarithmic volume ratio of atomic structures, we derive a power-law scaling that predicts Dark Matter and Dark Energy densities with over 98 percent accuracy. The ISG framework interprets these "Dark Sectors" as the informational and geometric overhead required for systemic causal isolation. We further explore the role of biological consciousness as a high-frequency processor and discuss the teleological implications of a self-optimizing, learning cosmos.
Category: Astrophysics

[1247] ai.viXra.org:2602.0099 [pdf] submitted on 2026-02-21 01:37:51

The Knotted Vacuum: Topological Defects in a Quantum Condensate as an Emergent Unified Theory of Particles, Gravity, and Cosmology

Authors: Sean McCallum
Comments: 13 Pages. Creative Commons Attribution 4.0 International (CC BY 4.0) (Note by ai.viXra.org Admin: Please cite listed scientific references)

The vacuum is not an empty stage upon which physics plays out. It is a dynamical quantum condensate of overlapping standing-wave fields whose collective excitations give rise to all of physics. We propose and develop a concrete realization in which spacetime itself emerges as the elastic response of this condensate, and every massive particle is a stable topological knot (or defect) in its fabric.

Starting from a minimal pre-geometric O(4) nonlinear sigma-model Lagrangian with a Skyrme-stabilizing term, we demonstrate through explicit calculations, lattice simulations, and renormalization group analysis that:

  • Gravity arises as the induced curvature of the vacuum strain (Sakharov mechanism realized in the condensate);
  • Baryons appear as quantized hedgehog skyrmions, leptons as hopfions, and dark matter as higher-winding topological defects;
  • Neutrino masses are exponentially suppressed by topological tunneling, eliminating the need for a see-saw or sterile neutrinos;
  • The cosmological constant is naturally screened to near-zero by vacuum-shift invariance;
  • Black-hole interiors are regular, finite-density knot cores rather than singularities.
With only two fundamental parameters (the condensate scale f and the Skyrme coupling e) the model quantitatively reproduces Newton's constant G, the proton mass, the observed neutrino mass scale, and the dark-matter relic density. Macroscopic objects and cosmological evolution recover standard general relativity in the appropriate limits, while novel predictions include gravitational-wave echoes from topological hair, a specific velocity-dependent self-interaction cross-section for dark matter, and Planck-scale dispersion in ultra-high-energy cosmic rays.

Lattice simulations in 1D, 2D, and 3D confirm topological stability, emergent curvature, and realistic multi-knot binding (hydrogen and helium-4 atoms). The framework unifies quantum particles, gravity, and cosmology from a single intuitive picture: spacetime as a stretchy quantum balloon, and matter as the permanent knots we tie in it.

This work provides a complete, self-consistent candidate theory that is both conceptually elegant and ready for detailed confrontation with experiment.


Category: Quantum Gravity and String Theory

[1246] ai.viXra.org:2602.0098 [pdf] submitted on 2026-02-21 01:40:38

Dadatic Monism: An Ontological Reinterpretation of Mass-Energy Equivalence and GR-QFT Unification via Causal Data Emittance

Authors: Darcy Facundo
Comments: 01 Pages. "Presents a new informational ontology (Dadatic Monism) to unify General Relativity and Quantum Field Theory through a redefined mass-energy equivalence formula."

1. ABSTRACT
This essay proposes a resolution to the long-standing dichotomy between General Relativity (GR) and Quantum Field Theory (QFT) through the paradigm of Dadatic Monism. It postulates that the Universe is a real-time information processing system where matter is not an inert substance, but a Data Density Anomaly (Ψ) within an Informational Plenum. By formulating the equation E_D = γ·Ψ·V_if^2, we establish a causal bridge between total system energy, the fundamental limit of data propagation, and metric coherence protocols (Lorentz). The theory concludes that physical reality and consciousness are emergent byproducts of distinct levels of transduction and attenuation of this universal data stream.
2. THE FUNDAMENTAL DADATIC EQUATION
The core of this theory lies in the redefinition of mass. In Dadatic Monism, mass is the measure of "informational load" at a specific logical address of spacetime: E_D = γ·Ψ·V_if^2, where E_D (Dadatic Energy) is the total processing magnitude or computational work contained within an event; Ψ (Radiant Data Mass) is the volume of information packets defining the "presence" of a particle or body; and V_if (Propagation Velocity) is the constant 3×10^8 m per second, interpreted as the maximum bandwidth of the vacuum (Plenum).
3. DYNAMICS AND THE LORENTZ PROTOCOL
The theory's validity in non-static systems is ensured by the integration of the Lorentz Factor (γ). In our ontology, γ is not merely a geometric distortion but a Buffer Management Protocol: as the velocity of data (v) approaches V_if, the system executes different reference frames.
4. UNIFICATION AND TRANSDUCTION
Gravity: interpreted as the "processing lag" imposed by high data density (Ψ). Spacetime curves not due to a force, but to accommodate heavy informational loads within the logical metric. Quantum Mechanics: entanglement serves as evidence that data is unitary and shared at the source-code level, rendering physical distance irrelevant for state synchronization. Biology and Consciousness: the Cortex is postulated as a Quantum-Informational Transducer. The "reality" we experience is an attenuated version of the Plenum, filtered to be processable by biological hardware (Soma).
5. CONCLUSION
Dadatic Monism offers a "naked," demystified view of physics. It replaces the search for hypothetical particles with an understanding of the laws of emittance and processing. The Universe does not "exist" in a static sense; it is causally rendered at every Planck interval.
KEYWORDS: Dadatic Monism; Information Physics; Theoretical Physics; Unification; Lorentz Factor; Biological Transduction.
Category: Quantum Gravity and String Theory

[1245] ai.viXra.org:2602.0097 [pdf] submitted on 2026-02-20 18:28:06

Algebraic Turbulence and Global Regularity: The Secular Replicator Flow as a Self-Consistent Algebraic Shell Model for Singularity Formation

Authors: Vinicius F. S. Santos
Comments: 19 Pages.

We introduce the Secular Replicator Flow, a finite-dimensional algebraic dynamical system inspired by the turbulent energy cascade of the Navier—Stokes equations, built from the spectral theory of golden resolvent operators on discrete network graphs [9]. The continuous mechanics of fluid turbulence—incompressibility, nonlocal pressure, nonlinear advection, and viscous dissipation—find precise algebraic counterparts in the constraints of a replicator equation evolving on the simplex of spectral participation weights, governed by a global secular equation. Within this framework we establish three principal results. First, the Variance Law: the macroscopic coupled eigenvalue λ∗(t) evolves monotonically according to Fisher’s Fundamental Theorem, acting as a strict Lyapunov function (between excision events) whose rate of increase equals the fitness variance of the active spectrum. Second, the Spectral Selection Theorem: the fitness landscape is a strict bipolar Ushape in the base eigenvalue μ, guaranteeing that the replicator flow annihilates mid-spectrum noise and funnels all energy into the extreme macroscopic topologies of the network. Third, Global Regularity: as the system approaches a structural resonance (transparent pole), the fitness plunges to −∞, triggering an auto-excision mechanism that exponentially starves the dangerous channel, rendering every pole singularity removable. The resulting dynamics form a Sawtooth Cascade of smooth climbs interrupted by discontinuous structural snaps whose direction is controlled by the residual load of the excised channel. We classify the sole remaining failure mode as a thermodynamic phase escape at the r = 2 Chebyshev boundary, where the discrete algebraic structure of the network undergoes a global phase transition into unbounded hyperbolic space—a phenomenon fundamentally different from the localised velocity blowup sought by PDE analysis. 
All regularity results herein apply to this model; implications for the full Navier–Stokes equations in R³ remain open.
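The Variance Law is the replicator-dynamics form of Fisher's Fundamental Theorem, which is easy to illustrate numerically. A minimal sketch with a hypothetical three-mode fitness landscape (constant fitnesses standing in for the paper's secular equation, so no excision events occur):

```python
import numpy as np

f = np.array([1.0, 2.0, 5.0])      # hypothetical fitness values per mode
x = np.array([0.5, 0.3, 0.2])      # initial participation weights on the simplex
dt = 1e-4
fbar_hist = []
for _ in range(5000):
    fbar = x @ f                   # mean fitness of the population
    fbar_hist.append(fbar)
    x += dt * x * (f - fbar)       # replicator step: dx_i = x_i (f_i - fbar) dt
    x /= x.sum()                   # renormalise (guards against round-off only)

fbar = x @ f
variance = x @ (f - fbar) ** 2     # fitness variance of the active spectrum
forward_diff = (fbar - fbar_hist[-1]) / dt   # numerical d(fbar)/dt at the last step
print(abs(forward_diff - variance) < 1e-2)   # Fisher: rate of increase = variance
print(np.all(np.diff(fbar_hist) > 0))        # mean fitness is a Lyapunov function
```

Both checks print True: the forward difference of the mean fitness matches the fitness variance, and the mean fitness climbs monotonically, as the Variance Law states for the smooth phase between excisions.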
Category: Mathematical Physics

[1244] ai.viXra.org:2602.0096 [pdf] submitted on 2026-02-20 06:04:55

The Master Map: An Audit-First, Attack-Resistant Navigation Guide to the Unconditional Solution of the 4D SU(N) Yang-Mills Existence and Mass Gap Problem (Clay Millennium Problem)

Authors: Lluis Eriksson
Comments: 21 Pages.

This paper is a hostile-review navigation guide and audit manifesto for a companion-paper programme claiming a constructive solution of the four-dimensional SU(N) Yang-Mills existence and mass gap problem in the Osterwalder-Schrader (OS) framework, reconstructed as a Poincare-covariant Wightman QFT with strictly positive mass gap. The guide provides: (i) an explicit dependency graph and Clay/Jaffe-Witten checklist; (ii) an explicit threat model listing standard failure modes targeted by hostile review (black-box dependence on Balaban, interface friction between gradient flow and the Balaban measure, diagonal-limit non-uniformity, and operator-mixing residues); (iii) an explicit four-pillar defensive architecture resolving each attack with structural (not merely quantitative) shields; (iv) the preventive lock: a triangular renormalization-mixing structure that blocks upward anisotropic flow into the marginal (d=4) sector, neutralizing the standard a^2 x a^{-2} -> O(1) objection; (v) a mechanical audit trail mapping load-bearing hypotheses to primary sources; and (vi) a complete linked index of all supporting preprints for traceability. External mathematics is explicitly declared: abstract polymer cluster expansion (Kotecky-Preiss), OS reconstruction (Osterwalder-Schrader), and lattice reflection positivity (Osterwalder-Seiler). Furthermore, this guide introduces the Triangular Mixing Preventive Lock: a structural algebraic mechanism showing that the operator mixing matrix has no anisotropic marginal d=4 sink in the gauge-invariant W_4-scalar sector. Consequently, the standard O(a^2) x O(a^{-2}) -> O(1) operator-mixing residue attack is blocked structurally: any quadratic divergence is forced to renormalize only O(4)-invariant d=4 data (the isotropic coupling), leaving the O(4)-breaking channel suppressed. This paper is not a claim of institutional validation; it is an audit map prescribing the check order and falsification points for the companion-paper chain.
Category: Mathematical Physics

[1243] ai.viXra.org:2602.0095 [pdf] submitted on 2026-02-19 19:48:12

Arithmetic Relativistic Emergence (ARE) as General Relativity of Numbers: Weierstrass Weights, Arakelov Curvature, and Equivalence Principle Analogue

Authors: J. W. McGreevy
Comments: 16 Pages.

We present Arithmetic Relativistic Emergence (ARE) as a "General Relativity of Numbers": a framework in which the Standard Model, quantum mechanics, classical 3+1 Lorentzian spacetime, and fundamental constants emerge tautologically from the arithmetic geometry of Q. The Riemann zeta function ζ(s) constitutes the maximally symmetric pregeometric vacuum. Its functional-equation symmetry around Re(s) = 1/2, combined with the pole at s = 1, forces spontaneous symmetry breaking via the weight-12 modular discriminant Δ(τ) = η(τ)²⁴ at the s = 6 harmonic threshold. This breaking disperses the vacuum into Archimedean divergence (F_div, smooth curvature density) and non-Archimedean curl (H_curl, discrete torsion at p-adic fibers). The emergent geometry is governed by Arakelov curvature on the arithmetic surface Spec Z ∪ {∞}, where Weierstrass weights act as "mass/energy density" (algebraic rigidity) and the hyperbolic/Bergman metric plays the role of spacetime. Modular transformations toward cusps correspond to Lorentz rapidity, yielding an equivalence principle analogue between inertial (modular flow resistance) and gravitational (metric warping) responses. The adelic spectral triple (KO-dimension 6, finite algebra C ⊕ H ⊕ M₃(C)) induces symplectic deformation of phase space, with the non-trivial zeros of zeta providing the Dirac spectrum (Hilbert–Pólya realized). The Minkowski interval ds² = −c²dt² + dx⃗² emerges as the unique adelic-invariant quadratic form, with the light cone as the resolved cusp boundary (holographic screen). The spectral action Tr f(D/Λ) recovers Einstein–Cartan gravity with non-Abelian Yang–Mills, where generalized Rainich conditions (quadratic invariants involving structure constants f^{abc}) are satisfied at s = 6, with torsion (H_curl) regularizing self-interactions. The full SM gauge group SU(3)_c × SU(2)_L × U(1)_Y and three chiral generations emerge from adelic place ramification and the Leech lattice Z₂-orbifold.
Constants (α⁻¹ ≈ 137 from Petersson + torsion residues, ℏ from Lambert-Planck suppression, G from unification suppression, Λ ∼ e⁻²⁸⁸) are inevitable invariants. Langlands functoriality acts as the holographic dictionary mapping prime rigidity to bulk physics. ARE thus unifies physics as the macroscopic shadow of arithmetic rigidity, with the Riemann Hypothesis as a necessary stability condition for the emergent universe.
Category: Mathematical Physics

[1242] ai.viXra.org:2602.0094 [pdf] submitted on 2026-02-19 17:15:36

Golden Resolvent Theory: Spectral Factorisation, Galois Orbits, and Chebyshev Ladders Over Q(√5)

Authors: Vinicius F. S. Santos
Comments: 45 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We develop the spectral theory of operators whose eigenvalue structure is governed by the golden ratio φ = (1 + √5)/2. The foundation is the golden resolvent factorisation: for any real symmetric matrix A, λ²I − λA − A² = (λI − φA)(λI + A/φ). This identity controls the spectrum of golden companion operators, decomposing eigenvalues into transparent pairs (μφ and −μ/φ) and coupled modes (roots of an explicit secular equation), governed by the Galois conjugation φ ↦ −φ⁻¹ of Q(√5)/Q. We establish six main results: (1) the Golden Amplification Theorem, producing the transparent eigenvalue pairs and their eigenvectors; (2) the Secular Equation, a closed-form characteristic polynomial for coupled modes; (3) the Secular Sensitivity Theorem, identifying the secular weight with the coupled eigenvector norm and establishing Lipschitz continuity of coupled eigenvalues in the coupling vector; (4) the Positive Boost Inequality, bounding submatrix spectral radii via nodal-domain restrictions; (5) the Galois Transfer Principle, exactly classifying partition transfer for conjugate eigenvector pairs via the Transfer/Pareto Trichotomy—with automatic spectral dominance for transparent modes and unavoidable Pareto regimes for coupled modes; (6) the Chebyshev Ladder, generalising the amplification ratio to 2 cos(π/(2p+3)) through the cyclotomic fields Q(ζ_{2p+3})⁺. The spectral spread of the golden pair equals the generator of the different ideal of Z[φ], connecting the framework to the arithmetic of Q(√5).
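The factorisation identity and the transparent eigenvalue pairing are elementary to check numerically; a quick NumPy sketch (random symmetric matrix and arbitrary scalar λ, both chosen purely for illustration):

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2              # golden ratio; note phi - 1/phi = 1
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                       # real symmetric test matrix
I = np.eye(5)
lam = 0.7                               # arbitrary scalar

# Factorisation: lambda^2 I - lambda A - A^2 = (lambda I - phi A)(lambda I + A/phi)
lhs = lam**2 * I - lam * A - A @ A
rhs = (lam * I - phi * A) @ (lam * I + A / phi)
print(np.allclose(lhs, rhs))            # True

# For an eigenvalue mu of A, the roots of lambda^2 - mu*lambda - mu^2
# are exactly the transparent pair phi*mu and -mu/phi.
mu = np.linalg.eigvalsh(A)[0]
roots = np.roots([1.0, -mu, -mu**2])
print(np.allclose(sorted(roots), sorted([phi * mu, -mu / phi])))  # True
```

The identity works because φ satisfies φ − 1/φ = 1, so the cross terms of the two factors collapse to −λA.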
Category: Combinatorics and Graph Theory

[1241] ai.viXra.org:2602.0093 [pdf] submitted on 2026-02-19 07:09:46

Quantum Indeterminacy as Gödelian Epistemic Limitation: Implications of Relational Mathematical Realism for Quantum Foundations

Authors: Jason Merwin
Comments: 13 Pages. This is the second paper in a four paper series.

In a companion paper, we established the Theorem of Temporal Necessity within the framework of Relational Mathematical Realism (RMR), demonstrating that a sufficiently complex, locally consistent mathematical structure cannot exist as a static object but must undergo a non-terminating sequence of state extensions identified with physical time. In this paper, we extend the framework to quantum mechanics. We argue that quantum indeterminacy is not a fundamental property of nature but an epistemic consequence of observers being embedded subsystems within an evolving relational structure. The "hidden variable" determining quantum outcomes is the global relational topology of the present state S_t, which is non-local by definition and inaccessible to any embedded observer. We show that this framework survives Bell's theorem by violating measurement independence through synchronic topological constraint rather than diachronic conspiratorial fine-tuning, and we resolve the measurement problem by identifying wavefunction collapse with the topological update of the observer's local subgraph. We further conjecture that the Born rule (P = |ψ|²) arises as a geometric property of the Gödelian boundary—specifically, that probability scales with the combinatorial cross-sectional area of relational bundles at the logical horizon, unifying quantum probability with Bekenstein-Hawking entropy under a single geometric principle. Finally, we propose that the renormalization group flow of quantum field theory is the graph-theoretic coarse-graining of the universal relational structure, and that the hierarchy between gravitational and gauge force strengths reflects the ratio of global connectivity to local clustering density in the universal graph.
Category: History and Philosophy of Physics

[1240] ai.viXra.org:2602.0092 [pdf] submitted on 2026-02-19 12:13:41

Rotational Symmetry Restoration and the Wightman Axioms for Four-Dimensional SU(N) Yang--Mills Theory

Authors: Lluis Eriksson
Comments: 9 Pages.

We complete the rigorous construction of four-dimensional Euclidean SU(N) Yang–Mills quantum field theory and establish the existence of a mass gap. Building on the companion papers, which unconditionally establish exponential clustering with mass gap, the Osterwalder–Schrader axioms OS0, OS2, OS3, OS4, and quantitative irrelevance of O(4)-breaking lattice operators, we derive a lattice Ward identity for infinitesimal Euclidean rotations, identify the breaking term as a dimension-6 anisotropic operator insertion, and prove that the breaking distribution vanishes as O(η² |log((Λ_YM η)⁻¹)|) → 0 in the continuum limit, establishing axiom OS1 (full O(4) Euclidean covariance). Combined with the Osterwalder–Schrader reconstruction theorem, this yields a non-trivial Poincaré-covariant Wightman quantum field theory with mass gap Δ_phys ≥ c_N Λ_YM > 0 for each N ≥ 2.
Category: Mathematical Physics

[1239] ai.viXra.org:2602.0091 [pdf] submitted on 2026-02-19 12:16:47

Closing the Last Gap in the 4D SU(N) Yang--Mills Construction: A Verified Terminal KP Bound and an Explicit Clay Checklist -- Audit-Friendly Assembly: Polymer Activities => KP => OS => Wightman with Mass Gap

Authors: Lluis Eriksson
Comments: 9 Pages.

This paper has two goals. Part I (terminal KP bound). We provide a verifiable, citation-driven derivation of the terminal-scale Kotecky–Preiss (KP) smallness bound used in the companion paper on exponential clustering and mass gap. Rather than re-deriving the full multiscale renormalization group (RG), we isolate explicit hypotheses (H1)–(H3) on the terminal polymer activities and prove that they imply the KP convergence criterion. We then verify (H1)–(H3) by mapping them to specific statements in Balaban's published primary sources (CMP 116, 119, 122), with an audited notation bridge recorded in the structural package companion paper. Part II (assembly map + Clay checklist). We give an explicit dependency graph assembling the companion papers together with the KP input proved here. We provide a checklist matching the Clay/Jaffe–Witten formulation of the Yang–Mills existence and mass gap problem to the theorems across the paper sequence (OS0–OS4, OS1, and the mass gap). Scope / external mathematics. The argument uses the abstract KP cluster expansion theorem (Kotecky–Preiss 1986) and the Osterwalder–Schrader reconstruction theorem (1975). It relies on the terminal polymer representation and activity bounds as proved in Balaban's CMP papers cited above.
Category: Mathematical Physics

[1238] ai.viXra.org:2602.0090 [pdf] submitted on 2026-02-19 17:42:14

The Relativity Theory of Focus [Part 1-3]

Authors: Yun Seok Choe
Comments: 8 Pages. (Note by ai.viXra.org Admin: This submission mainly contains speculations and may not be written in a complete/scholarly manner - Please cite and list scientific references)

[Paper 1] This foundational paper establishes the "Relativity of Focus" as a new physical principle. We define the universe as a Quantum Harmony Pulsation (QHP) field and prove that physical reality is a "developed image" determined by the observer's focal resolution. We derive the c² constant as a dynamic pulsation rate and establish the mathematical framework for the focus operator (Γ).

[Paper 2] Based on the foundational principles of Quantum Harmony Pulsation (QHP) established in Part 1, this paper proposes a Grand Unified Theory (GUT) by redefining ’Force’ as a manifestation of pulsation density gradients. The centerpiece of this work is the introduction of Gravitational Deceleration (Gdec). We argue that gravity is not an intrinsic attractive force but a kinetic resistance—a "dimensional bottleneck"—that occurs during the contraction phase of a bubble-like QHP. Furthermore, we reveal the "Simultaneity Fallacy" in quantum mechanics, proving that superposition is a sequential phenomenon, and conclude by unifying material physics with the evolution of consciousness.

[Paper 3] Based on the foundational principles of Quantum Harmony Pulsation (QHP) established in Part 1, this paper proposes a Grand Unified Theory (GUT) by redefining ’Force’ as a manifestation of pulsation density gradients. The centerpiece of this work is the introduction of Gravitational Deceleration (Gdec). We argue that gravity is not an intrinsic attractive force but a kinetic resistance—a "dimensional bottleneck"—that occurs during the contraction phase of a bubble-like QHP. Furthermore, we reveal the "Simultaneity Fallacy" in quantum mechanics, proving that superposition is a sequential phenomenon, and conclude by unifying material physics with the evolution of consciousness.

[Paper 4] As the final installment of the 'Focus Science' trilogy, this paper provides the numerical and geometric evidence for the Relativity of Focus. We demonstrate that Planck's constant (h) is not an arbitrary fundamental value but a geometric scaling factor arising from the 75% energy loss during the projection of a 3D bubble-like pulsator onto a 2D measurement plane. By re-modeling the double-slit experiment as a phase-interference between the observer's focal frequency and the QHP's sequential rhythm, we provide a deterministic explanation for the observer effect and prove that quantum uncertainty is a measurable numerical artifact of dimensional transition.
Category: Quantum Physics

[1237] ai.viXra.org:2602.0089 [pdf] submitted on 2026-02-18 09:21:04

Spectral Gap and Thermodynamic Limit for SU(N) Lattice Yang—Mills Theory via Log-Sobolev Inequalities and Complete Analyticity

Authors: Lluis Eriksson
Comments: 17 Pages.

We establish two independent rigorous results for four-dimensional SU(N) pure-gauge lattice Yang–Mills theory with Wilson action, at fixed lattice spacing η > 0 and weak coupling g₀ ≤ g_*, uniformly in the spatial volume L. (A) Uniform Log-Sobolev Inequality. The Wilson measure μ_L satisfies Ent_{μ_L}(f²) ≤ (2/ρ̂) E_L(f,f) with constant ρ̂ > 0 independent of L, where E_L is the natural Dirichlet form on SU(N)^{|E(Λ)|}. (B) Uniform Mass Gap. The Osterwalder–Seiler Hamiltonian H_L has a spectral gap m_gap ≥ m₀ > 0, uniformly in L. Both theorems share a single input, the Dobrushin–Shlosman complete analyticity (CA) condition, verified via Bałaban's renormalization group program, but follow logically independent paths. Theorem A is derived through Cesi's quasi-factorization of entropy, seeded by a Bakry–Émery local log-Sobolev inequality on SU(N)^{|E(Σ)|}; the Ricci curvature Ric_{SU(N)} = (N/4)g plays a key role. Theorem B is derived through exponential clustering of temporal correlations, a consequence of CA via Dobrushin contraction, combined with the Osterwalder–Seiler transfer-matrix construction and the Krein–Rutman theorem. We further prove (C) that {μ_L} converges weakly to a unique, translation-invariant infinite-volume Gibbs state μ_∞ satisfying the DLR consistency equations, whose reconstructed Hamiltonian H_∞ inherits the mass gap m₀. All constants are explicit in N, g₀, and η. The present results hold at fixed lattice spacing; the continuum limit η → 0 is addressed in a companion paper.
Category: Mathematical Physics

[1236] ai.viXra.org:2602.0088 [pdf] submitted on 2026-02-18 17:31:06

Exponential Clustering and Mass Gap for Four-Dimensional SU(N) Lattice Yang-Mills Theory via Balaban's Renormalization Group and Multiscale Correlator Decoupling

Authors: Lluis Eriksson
Comments: 20 Pages.

Assuming a uniform log-Sobolev inequality for the pure Wilson measure (isolated here as an explicit hypothesis), we establish exponential clustering with a strictly positive mass gap for four-dimensional pure SU(N) lattice Yang–Mills theory with Wilson's action, uniformly in lattice spacing η and physical volume L_phys: |Cov(O(0), O(x))| ≤ C exp(−m|x|/a_*), with m > 0 and a_* ~ Λ_YM⁻¹. The proof assembles three ingredients: (1) Balaban's rigorous renormalization group for lattice gauge theories (CMP 1984-1989), which produces effective densities with local polymer decompositions and exponentially decaying activities; (2) a uniform log-Sobolev inequality for the pure Wilson measure, used as an input assumption; and (3) a multiscale correlator decoupling identity (this paper), which separates ultraviolet fluctuations from infrared physics. The coupling control required by Balaban's framework, namely that the effective couplings remain in the perturbative regime throughout the RG iteration, is established via an inductive argument using Cauchy bounds on the analyticity of the effective action. We also verify (under the same hypothesis) the Osterwalder–Schrader axioms OS0, OS2, OS3, and OS4 for subsequential continuum limits, and establish vacuum uniqueness and non-triviality. The remaining axiom OS1 (full O(4) Euclidean covariance) is not established here; we prove covariance under lattice translations and the hypercubic group W_4, and show that if O(4) covariance holds in the continuum limit, the reconstructed Wightman theory is a non-trivial relativistic quantum field theory with mass gap Δ_phys ≥ c_N Λ_YM > 0.
Category: Mathematical Physics

[1235] ai.viXra.org:2602.0087 [pdf] submitted on 2026-02-18 18:58:54

Irrelevant Operators, Anisotropy Bounds, and Operator Insertions in Balaban's Renormalization Group for Four-Dimensional SU(N) Lattice Yang—Mills Theory: Symanzik Classification and Quantitative Irrelevance of O(4)-Breaking Operators

Authors: Lluis Eriksson
Comments: 18 Pages.

We classify gauge-invariant local lattice operators of classical dimension 6 on the four-dimensional hypercubic lattice into O(4)-invariant, hypercubic-invariant but O(4)-breaking (anisotropic), and on-shell-redundant components, following the Symanzik improvement programme and the on-shell improvement technique of Lüscher—Weisz (1985). Inside Balaban's renormalization group framework for SU(N) lattice Yang—Mills theory, we extract the anisotropic projection of the effective action via local Taylor expansion of polymer activities in the small-field regime and prove a quantitative quadratic scale bound for the anisotropic coefficient: for every RG step k ≤ k* with effective coupling g_k ≤ γ_0, the coefficient of the (one-dimensional) anisotropic sector in the classical dimension-6 Symanzik expansion satisfies |c_{6,aniso}^{(k)}| ≤ C a_k^2, uniformly in lattice spacing η, physical volume L_phys, and RG step k. We further prove a quantitative insertion integrability estimate for connected correlators with one insertion of the anisotropic operator. When combined with the rotational Ward identity derived in the companion paper, this yields that the corresponding breaking distribution tested against Schwartz functions is O(η^2 |log(Λ_YM η)^{-1}|) and hence vanishes as η → 0.
Category: Mathematical Physics

[1234] ai.viXra.org:2602.0086 [pdf] submitted on 2026-02-18 02:34:37

On the Invariance of Planck Scale Under Special Relativity and the Geometry of Quantum Space

Authors: Moninder Singh Modgil, Dnyandeo Dattatray Patil
Comments: 31 Pages.

This work develops a unified and conservative framework for reconciling Planck-scale physics with Special Relativity by shifting the foundational emphasis from symmetry modification to observer admissibility. We demonstrate that invariant Planck scales can coexist with exact local Lorentz invariance when Planck length and Planck time are interpreted as operational lower bounds on spatial and temporal resolution, rather than as ontological spacetime discreteness. Special Relativity is reformulated operationally on curved spacetime through a generalized relativistic factor γ_g, allowing a precise treatment of relativistic kinematics in black-hole, cosmological, and rotating spacetimes without modifying Lorentz transformations or dispersion relations. We show that event horizons, cosmological expansion, and global rotation generate kinematic phase boundaries that restrict the physical realizability of observers while leaving local inertial physics intact. Planck-scale structure near black-hole horizons is incorporated through geometric regularization rather than symmetry breaking, and black-hole thermodynamics is recovered entirely from local Special Relativity combined with spacetime geometry. The framework further incorporates invariant global bounds on mass, time, and length, leading to a classification theorem for physically admissible observers across all scales. Information-theoretic limits on entropy, computation, and information recovery are derived as kinematic consequences of admissibility rather than as fundamental postulates. Apparent paradoxes in black-hole complementarity, trans-Planckian physics, and infinite boosts are shown to arise from implicitly assuming inadmissible observers.
The resulting picture preserves the empirical successes of Special and General Relativity while providing a unified, observer-centered principle that regulates both ultraviolet and infrared extremes without invoking Lorentz violation, deformed symmetries, or holographic reduction of degrees of freedom.
Category: Relativity and Cosmology

[1233] ai.viXra.org:2602.0085 [pdf] submitted on 2026-02-17 16:18:50

Ultraviolet Stability of Wilson-Loop Expectations in 4D Lattice Yang—Mills Theory Via Multiscale Gradient-Flow Smoothing

Authors: Lluis Eriksson
Comments: 21 Pages.

We prove that Wilson-loop expectations in four-dimensional Euclidean lattice Yang–Mills theory with compact gauge group G admit a universal continuum limit, independent of the lattice approximation scheme, for every contractible loop and all values of the coupling. The proof proceeds by a multiscale decomposition that combines Balaban's renormalization-group framework with a quantitative gradient-flow smoothing step at each scale. For an observable living at lattice scale k, the Yang–Mills gradient flow is run for a time proportional to the squared lattice spacing a_k²; a deterministic contraction estimate (Theorem 3.5) shows that this reduces the single-link oscillation of the flowed observable by a factor L^{-2k}, where L is the blocking factor. The resulting geometric series is summable and yields the desired uniform bound. The two main inputs are: (i) a pointwise domination lemma (Lemma 3.3) that controls the gradient of the flowed observable by a scalar heat kernel on the link graph, exploiting the contractivity of parallel transport; and (ii) a Duhamel interpolation formula (Lemma 4.1) that converts each change-of-measure error into a covariance with the irrelevant part of the effective action, bounded via a Poincaré-type inequality. Together these close the Balaban–Doob inductive circuit under a quantitative blocking hypothesis that is verified in a companion paper. As a corollary, we establish Osterwalder–Schrader reflection positivity for the gradient-flow-smoothed Wilson-loop observable, which together with the continuum limit yields a construction of the physical Hilbert space and a positive transfer matrix for the theory.
Category: Mathematical Physics

[1232] ai.viXra.org:2602.0084 [pdf] submitted on 2026-02-17 19:43:23

Almost Reflection Positivity for Gradient-Flow Observables via Gaussian Localization in Lattice Yang-Mills Theory

Authors: Lluis Eriksson
Comments: 15 Pages.

We establish quantitative almost-reflection positivity (almost-RP) for a family of flowed observables in finite-volume lattice Yang-Mills theory on the four-dimensional Euclidean torus T_L^4 with structure group G = SU(N). The lattice Wilson flow - the lattice counterpart of the Yang-Mills gradient flow - acts as a gauge-covariant smoothing that suppresses ultraviolet fluctuations. By combining three ingredients: (i) a Gaussian localization bound that controls the variance of flowed observables via an Efron-Stein-type inequality, (ii) Jacobian estimates for the lattice Wilson flow that yield exponential decay of trans-plane influence, and (iii) the exact lattice reflection positivity of the Wilson action, we show that the failure of RP for flowed observables is exponentially small in the ratio epsilon_0^2 / t, where epsilon_0 is the physical separation between the observable's support and the reflection plane (minus the diffusion scale sqrt(8t)), and t > 0 is the flow time. We record the standard Osterwalder-Schrader reconstruction as a conditional statement: exact reflection positivity on a positive-time algebra implies a Hilbert space, a vacuum, and a non-negative Hamiltonian. Our approach is non-perturbative, holds for all values of the lattice coupling, and requires no cluster expansion or infinite-volume limit.
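Ingredient (i) rests on the Efron–Stein inequality, which bounds the variance of a function of independent inputs by resampled coordinate differences: Var f(X) ≤ ½ Σ_i E[(f(X) − f(X⁽ⁱ⁾))²], where X⁽ⁱ⁾ equals X with coordinate i independently resampled. A quick Monte Carlo sanity check, using an arbitrary stand-in observable (a coordinate maximum, not a flowed Wilson-loop observable):

```python
import numpy as np

rng = np.random.default_rng(42)
n, m = 8, 100_000                     # n coordinates, m Monte Carlo samples
X = rng.uniform(size=(m, n))
f = X.max(axis=1)                     # test observable f(X) = max_i X_i

lhs = f.var()                         # left side: Var f(X)
rhs = 0.0
for i in range(n):
    Xi = X.copy()
    Xi[:, i] = rng.uniform(size=m)    # resample coordinate i independently
    rhs += 0.5 * np.mean((f - Xi.max(axis=1)) ** 2)

print(lhs <= rhs)                     # True: the Efron-Stein bound holds
```

For this observable the bound is loose by roughly a factor of 1.6, illustrating why the paper needs the Jacobian decay estimates (ingredient ii) rather than the raw inequality alone to get exponential smallness.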
Category: Mathematical Physics

[1231] ai.viXra.org:2602.0083 [pdf] submitted on 2026-02-17 01:50:28

Quantum Mechanics from Stochastic Coherence: Resolving the Wallstrom Objection via Topological Stability.

Authors: Adrian Leonardo Rohr
Comments: 11 Pages.

We derive the Schrödinger equation from the Onsager-Machlup stochastic variational principle and address the Wallstrom objection within this framework. Wallstrom showed that the Madelung hydrodynamic equations do not enforce quantization of phase circulation unless single-valuedness of the wave function is imposed as an additional condition. We prove that quantization of phase circulation follows from intrinsic requirements of the stochastic formulation. Reformulating the phase gradient as a flat U(1) connection on the punctured domain where the density is positive, we show that smooth removability of isolated singularities forces trivial holonomy, yielding the quantization condition ∮ ∇S · dl = 2πnℏ. For configurations with nodal zeros, quantization emerges from the combination of the Hamilton-Jacobi constraint at nodal zeros and the regularity required for the probability current to be a well-defined observable on physical space. These two conditions, neither of which alone implies quantization, together exclude non-integer winding numbers, including the recently constructed non-quantized strong solutions of the Madelung equations. Under the identification D = ℏ/(2m), the Madelung transformation then recovers the Schrödinger equation without postulating wave-function single-valuedness. Quantization emerges as a geometric and regularity consequence of the Onsager-Machlup variational structure.
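The circulation condition ∮ ∇S · dl = 2πnℏ can be illustrated numerically (with ℏ set to 1) on the textbook example ψ = e^{inθ}, a wave function with a nodal zero at the origin: unwrapping the phase along a loop around the zero recovers an integer winding number.

```python
import numpy as np

n = 3                                       # winding number of the phase
theta = np.linspace(0.0, 2.0 * np.pi, 2001)
psi = np.exp(1j * n * theta)                # psi sampled on a loop around the node

# np.angle returns the phase mod 2*pi; np.unwrap reconstructs a continuous S,
# so the total change of S around the loop is the circulation of grad S.
S = np.unwrap(np.angle(psi))
circulation = S[-1] - S[0]
print(round(circulation / (2.0 * np.pi)))   # prints 3: circulation = 2*pi*n
```

A non-integer winding would leave a residual jump that `np.unwrap` cannot remove, which is the numerical face of the single-valuedness issue the paper replaces with holonomy and regularity arguments.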
Category: Quantum Physics

[1230] ai.viXra.org:2602.0082 [pdf] submitted on 2026-02-17 02:36:31

Reconstruction of a Minimal Six-Dimensional Light Entity

Authors: Tingfang Yi
Comments: 7 Pages.

We propose a minimal six-dimensional (6D) light null entity in which the six dimensions are intrinsic degrees of freedom of a null physical entity. The six dimensions consist of a two- dimensional null propagation geometry together with four intrinsic one-dimensional degrees of freedom of light: optical phase, polarization, frequency, and orientation along the null momentum generator. In this framework, all four-dimensional (4D) spacetime optical, electromagnetic, and quantum phenomena are understood as lower-dimensional projection or section measurements of a single higher-dimensional null entity.
Category: Mathematical Physics

[1229] ai.viXra.org:2602.0081 [pdf] submitted on 2026-02-16 23:46:46

Geometry and Spin Structure of the Single Oloid Manifold as the Origin of Leptonic Mass (Information-Geometric Physics System I)

Authors: Pruk Ninsook
Comments: 43 Pages.

This work presents the theoretical framework of the Information-Geometric Physics System (IGPS), which elucidates the emergence of fundamental properties in leptonic particles through the structure of the Oloid manifold [17]. Under the constraints of C² continuity and seam coupling enforced by S¹ ⊥ S¹ symmetry, the fundamental constituents of matter are modeled as "closed information nodes." In this framework, the spectral mass is derived as a curvature integral and a normal bundle holonomy manifesting along the manifold's seam [13]. We prove that the moduli space of the seam under rigidity conditions yields an SO(3) symmetry group structure, naturally inducing a spin structure via the SU(2) double cover [18]. Furthermore, it is demonstrated that the geometric stiffness parameter β = 1/(√3 π) emerges as a universal geometric invariant governing the internal strain scale [3]. These results suggest that the fermion mass spectrum is directly coupled to the topological structure of the informational manifold, providing a foundational basis for the extension into composite systems and strong interactions in subsequent work [9].
Category: Quantum Gravity and String Theory

[1228] ai.viXra.org:2602.0080 [pdf] submitted on 2026-02-16 23:48:46

Multi-Seam Configuration and the Topological Scaling of Baryonic Mass (Information-Geometric Physics System II)

Authors: Pruk Ninsook
Comments: 25 Pages.

This research extends the scope of the Information-Geometric Physics System (IGPS) from single-node systems to composite nuclear structures via the Oloid Trinity Configuration, elucidating the topological origin of mass and the statistical properties of baryons [8, 9]. We introduce the Dimensional Jump phenomenon, representing an informational scale transition from planar scaling on the manifold's seam to the sweep volume of entangled folded manifolds. This transition results in the emergence of a universal geometric multiplier G = (4/3)π², which systematically bridges leptonic mass to the nuclear mass scale. Under rigidity constraints and SU(3) symmetry, we prove that the extrinsic/interaction strain is fixed at Δ = 2.5 through the 5/2 Theorem, enabling the master equation to predict the proton mass with 99.99% accuracy relative to CODATA standards [10]. Furthermore, it is demonstrated that Fermi-Dirac statistics and fractional spin 1/2 emerge directly from the preservation of C² continuity on manifolds entangled through the SU(2) double-covering structure. Residual analysis confirms that manifest discrepancies align with the order of radiative corrections in Quantum Electrodynamics (QED). These results confirm that baryonic structure represents the most stable volumetric organization of information, effectively achieving structural closure for the origin of matter within the IGPS framework.
Category: Quantum Gravity and String Theory

[1227] ai.viXra.org:2602.0079 [pdf] submitted on 2026-02-16 23:53:19

A Modified Sieve of Sundaram

Authors: Wiroj Homsup
Comments: 3 Pages.

A new twin prime sieve based on a modified sieve of Sundaram is introduced. It sieves through the set of natural numbers n such that 3n is not representable in either of the forms 2ij + i + j or 2ij + i + j + 1 for positive integers i, j.
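Assuming i, j ≥ 1 and that the natural numbers n start at 1, the procedure follows from Sundaram's criterion (2m+1 is prime iff m is not of the form 2ij+i+j): a surviving n has both 3n and 3n−1 outside that form, so 6n+1 and 6n−1 are both prime. A minimal sketch:

```python
def twin_pairs(limit):
    """Modified Sundaram sieve as described in the abstract: n (1 <= n <= limit)
    survives when 3n is representable neither as 2ij+i+j nor as 2ij+i+j+1
    (i, j >= 1); each survivor yields the twin prime pair (6n-1, 6n+1)."""
    N = 3 * limit
    rep = set()                      # values 2ij+i+j <= N; by symmetry take j >= i
    i = 1
    while 2 * i * i + 2 * i <= N:    # smallest value for this i occurs at j = i
        j = i
        v = 2 * i * j + i + j
        while v <= N:
            rep.add(v)
            j += 1
            v = 2 * i * j + i + j
        i += 1
    return [(6 * n - 1, 6 * n + 1) for n in range(1, limit + 1)
            if 3 * n not in rep and 3 * n - 1 not in rep]

print(twin_pairs(10))
# [(5, 7), (11, 13), (17, 19), (29, 31), (41, 43), (59, 61)]
```

Every twin prime pair other than (3, 5) has the form (6n−1, 6n+1), so the sieve misses only that initial pair.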
Category: Number Theory

[1226] ai.viXra.org:2602.0078 [pdf] submitted on 2026-02-15 17:57:53

Canonical Nonlinear Partial Differential Equations

Authors: Luisiana X Cundin
Comments: 6 Pages.

A formal, systematic approach for generating nonlinear partial differential equations is outlined, which provides a more robust, reliable method. Additionally, formal methods provide a means to test the validity and/or the veracity of proposed nonlinear partial differential equations, thereby potentially saving researchers precious time and effort.
Category: Mathematical Physics

[1225] ai.viXra.org:2602.0077 [pdf] submitted on 2026-02-15 05:11:13

Ultraviolet Stability for Four-Dimensional Lattice Yang—Mills Theory Under a Quantitative Blocking Hypothesis

Authors: Lluis Eriksson
Comments: 14 Pages.

We prove that the continuum limit of pure SU(N) lattice Yang—Mills theory in four Euclidean dimensions exists on the algebra of blocked observables at fixed finite volume, conditional on a quantitative regularity hypothesis for the blocking map. The argument combines three components: Bałaban's rigorous renormalization group program, which provides polymer representations and ultraviolet stability; a Doob-martingale influence bound that controls covariance without product-measure assumptions; and a renormalization-group Cauchy summability framework converting per-scale oscillation decay into convergence. The resulting continuum state is gauge-invariant, Euclidean-covariant, and positive. Osterwalder—Schrader reconstruction, the thermodynamic limit, and the mass gap remain open.
Category: Mathematical Physics

[1224] ai.viXra.org:2602.0076 [pdf] submitted on 2026-02-15 17:35:57

The Topological Inversion Model (TIM): A Unified Framework for Cosmology and Fundamental Physics

Authors: Kobie Janse van Rensburg
Comments: 2 Pages. (Note by ai.viXra.org Admin: This submission is somewhat ill-formatted and incomplete - Please cite and list scientific references)

The Topological Inversion Model (TIM) proposes a novel Theory of Everything (TOE) based on the principle that "Absolute Nothing" is an unstable logical null state enforced by topology. The universe emerges from a reciprocal topological inversion (the "Snap") at the Planck scale, resolving the paradox of sustained nothingness. Matter is interpreted as topological defects or "knots" resisting elastic recoil toward zero density, gravity as an inward push to unravel these defects, and dark energy as the dynamic scalar field Ψ representing recoil tension. Black holes serve as recycling engines converting defects back into Ψ, driving cosmic acceleration and resolving the Hubble tension. This paper formalizes the model, derives its equations of motion, and demonstrates resolutions to key problems including singularities, the information paradox, dark matter, and fine-tuning. Predictions include correlations between black hole density and local H0, gravitational wave echoes, and dynamic dark energy evolution. TIM is consistent with the Standard Model and General Relativity in their regimes while providing a unified topological foundation without extra dimensions or supersymmetry. Keywords: Cosmology, Quantum Gravity, Topology, Dark Energy, Hubble Tension
Category: Quantum Gravity and String Theory

[1223] ai.viXra.org:2602.0075 [pdf] submitted on 2026-02-14 07:02:02

Regular Simplex Hierarchical Gravity Part II: The Computational Universe: Deriving Spacetime, Light Speed, and Singularities from Infinite-Dimensional Simplex and Geometric Frustration

Authors: Ryuhei Sato
Comments: 16 Pages.

Einstein’s special relativity postulates the constancy of light speed c as an axiom but provides no explanation for its origin. We reformulate the universe as a discrete computational network operating at the Planck scale, wherein c emerges not as a fundamental constant but as the bandwidth limit of information processing. We demonstrate that the initial state of the universe—an infinite-dimensional regular simplex—possesses spectral properties (Laplacian eigenvalue λ₂ = N) that naturally enforce cosmic uniformity and clock synchronization without invoking inflationary expansion. Dimensional reduction from infinite to three dimensions generates an unavoidable informational collision, which we term informational Pauli repulsion, providing the physical driver for both the Big Bang and accelerated expansion. The deficit angle δ ≈ 7.36° inherent in the 600-cell tessellation, combined with Gauss’s Theorema Egregium, guarantees spatial closure without external embedding dimensions, thereby establishing a decisive advantage over string-theoretic frameworks requiring 10 or 11 dimensions. We derive the Light-Speed Resource Allocation Principle (LRAP), c² = v² + τ², reinterpreting the Lorentz factor as a processing lag ratio rather than a coordinate transformation coefficient. Black hole singularities are redefined as computational arrest regions where 3D rendering fails, leaving 4D data in a frozen state—a paradigm shift that naturally subsumes string theory and the holographic principle as effective descriptions within these arrested zones. Finally, we prove that local resource allocation alone cannot resolve the global accumulation of geometric frustration, necessitating the hierarchical jamming transitions detailed in Part III. This work bridges the static geometry of Part I (derivation of G and Λ) with the thermodynamic hierarchy of Part III (122-digit vacuum energy suppression), completing the dynamical core of the Regular Simplex Hierarchical Gravity (RSHG) framework.
Category: Relativity and Cosmology
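The LRAP identity stated in this abstract can be rearranged to make the claimed reinterpretation explicit: if the fixed budget c² is split between a translational rate v and a processing rate τ, then

```latex
c^2 = v^2 + \tau^2
\;\Longrightarrow\;
\tau = \sqrt{c^2 - v^2} = c\sqrt{1 - v^2/c^2} = \frac{c}{\gamma},
\qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} .
```

So τ/c = 1/γ: the Lorentz factor measures the fraction of processing capacity remaining after motion, which is the "processing lag ratio" reading the abstract asserts. This is our paraphrase of the stated claim, not a derivation taken from the paper itself.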

[1222] ai.viXra.org:2602.0074 [pdf] submitted on 2026-02-14 07:07:46

Regular Simplex Hierarchical Gravity Part III: Jamming Scale Law and Hierarchical Energy Suppression

Authors: Ryuhei Sato
Comments: 12 Pages.

This paper presents a complete resolution of the cosmological constant problem within the Regular Simplex Hierarchical Gravity (RSHG) framework. The non-tessellating property of regular tetrahedra in 3D Euclidean space—characterized by a geometric residual (deficit angle δ ≈ 7.36°)—induces recursive jamming transitions across six hierarchical scales spanning from 10^-15 m to 10^21 m (Fig. 1: six-stage cascade structure). Each hierarchy generates approximately 20 orders of magnitude of energy suppression. The cumulative suppression factor ϵ_total ≈ 10^-122.2 agrees with the observed cosmological constant Λ_obs within 0.2 orders of magnitude. Critically, this result contains no adjustable parameters; even the number of hierarchies N = 6 emerges as an arithmetic consequence of the target suppression (122 digits) divided by the single-stage suppression (∼19.2 digits). Furthermore, operation near the jamming criticality (ϕ ≈ 0.62, Fig. 2: metastable operating point) enables the conversion of computational heat into structural entropy (computational encapsulation), thereby preventing thermal collapse. Three experimentally verifiable predictions are presented: H4 symmetry in the CMB angular power spectrum (ℓ = 120n), an entropy ratio S_struct/S_thermal ≈ 0.2 in Bose-Einstein condensates, and tetrahedral coordinate preference in protein structures.
Category: Relativity and Cosmology

[1221] ai.viXra.org:2602.0073 [pdf] submitted on 2026-02-14 09:06:13

RG—Cauchy Summability for Blocked Observables in 4d Lattice Yang—Mills Theory via Balaban's Renormalization Group

Authors: Lluis Eriksson
Comments: 17 Pages.

We prove that expectations of blocked, bounded Lipschitz observables at a fixed physical scale ℓ > 0 form an absolutely summable telescoping sequence along a Balaban-matched renormalization trajectory in four-dimensional SU(N_c) lattice Yang—Mills theory with lattice spacings a_k = a_0 2^{-k}. In particular, the continuum limit state ω(O) := lim_{k→∞} ⟨O^{(k)}⟩_{Λ_k, β_k} exists for every O in the blocked observable algebra A_ℓ^{block}. The proof uses three ingredients: (i) an exact RG identity (law of iterated expectations), (ii) a one-step pushforward stability bound for blocked observables derived from Gaussian control of fast modes and an approximate centering property of the fluctuation field, and (iii) a measure-comparison lemma via Duhamel interpolation using polymer remainder bounds. No quantitative rate of asymptotic freedom is required beyond staying in the small-coupling regime where the RG estimates hold; summability follows from the geometric decay (a_k/ℓ)^2 = O(4^{-k}) together with the assumed summability of the large-field/truncation errors {τ_k}. We also state a conditional extension to "renormalized" observables (e.g. Creutz-type constructions) contingent on a nonperturbative Symanzik extraction from polymer expansions, and we discuss the relation to Osterwalder—Schrader reconstruction and the mass gap problem.
Category: Mathematical Physics

[1220] ai.viXra.org:2602.0072 [pdf] submitted on 2026-02-14 11:43:03

Influence Bounds for Polymer Remainders in Balaban's Renormalization Group: Closing Assumption (B6) for the RG-Cauchy Programme in 4D Lattice Yang-Mills

Authors: Lluis Eriksson
Comments: 14 Pages.

We close the missing influence estimate — Assumption (B6) — required by the RG-Cauchy summability framework for blocked observables in four-dimensional SU(N_c) lattice Yang-Mills theory. The influence is measured by the Efron-Stein seminorm sigma_nu(f)^2 = sum_{e} E_nu[Var_e^nu(f)] that appears in the Duhamel interpolation lemma of the companion paper. We work in the small-field regime of Balaban's multiscale effective action and assume: (A1) a standard polymer representation for the irrelevant remainder V_k^{irr} = sum_X K_k(X); (A2) an explicit per-link oscillation bound for polymer activities carrying the correct irrelevance factor 2^{-2k}; (A3) a lattice-animal counting estimate. Under these three verifiable hypotheses — to be discharged from Balaban's historical work in a companion compendium paper — we prove sup_{t in [0,1]} sigma_{nu_{k,t}}(V_k^{irr}) <= C, where C = C(N_c, beta_0, kappa, C_osc, C_anim, p, L/a_0) is independent of the RG scale k. The proof uses only oscillation bounds and combinatorics: no log-Sobolev inequality, no mixing hypothesis, and no measure-dependent technology beyond the definition of conditional variance. This removes the only genuinely novel probabilistic input remaining in the UV block of the programme towards the Yang-Mills Millennium Prize.
Category: Mathematical Physics

[1219] ai.viXra.org:2602.0071 [pdf] submitted on 2026-02-15 01:07:59

Temporal Necessity in Relational Mathematical Realism: A Gödelian Argument Against the Block Universe

Authors: Jason Merwin
Comments: 10 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

A foundational question in the philosophy of physics is whether time is a fundamental dimension of reality or an emergent phenomenon. The standard Block Universe interpretation of general relativity treats time as a static dimension, with the passage of time relegated to psychological illusion. In this paper, we present a novel argument against the Block Universe derived from the framework of Relational Mathematical Realism (RMR), which identifies physical existence with mathematical structure. We demonstrate that if reality is a sufficiently complex, locally consistent mathematical structure, then Gödel's First Incompleteness Theorem renders a static, completed universe logically impossible. The resolution of this impossibility requires the structure to undergo a non-terminating sequence of state extensions, which we identify with the passage of time. We conclude that time is not a dimension within which the universe exists, but rather the logically necessary process by which a complex mathematical structure maintains consistency. This result, if sound, constitutes the first derivation of temporal passage from mathematical logic and ontology alone.
Category: History and Philosophy of Physics

[1218] ai.viXra.org:2602.0070 [pdf] submitted on 2026-02-14 18:53:53

Doob Influence Bounds for Polymer Remainders in 4D Lattice Yang-Mills Renormalization

Authors: Lluis Eriksson
Comments: 7 Pages.

We prove a uniform Doob martingale influence bound for the irrelevant polymer remainder arising in multiscale renormalization group analyses of four-dimensional SU(N_c) lattice Yang-Mills theory at fixed physical volume. Our main tool is the Doob influence seminorm sigma_nu(f)^2 = sum_i E_nu[(Delta_i f)^2], which yields an exact covariance identity for arbitrary probability measures. Assuming a deterministic per-link oscillation estimate for polymer activities with a scale factor 2^{-2k} (imported from the Balaban renormalization group programme) and using a standard lattice-animal counting lemma (proved here), we obtain a bound sup_{t in [0,1]} sigma_{nu_{k,t}}(V_k^{irr}) <= C independent of the RG scale k. We then explain how this bound feeds into a Duhamel interpolation step used in RG-Cauchy convergence arguments.
Category: Mathematical Physics

[1217] ai.viXra.org:2602.0069 [pdf] submitted on 2026-02-14 20:48:17

The Balaban—Dimock Structural Package: Derivation of Polymer Representation, Oscillation Bounds, and Large-Field Suppression for Lattice Yang—Mills Theory from Primary Sources

Authors: Lluis Eriksson
Comments: 23 Pages.

We provide a self-contained derivation of the three structural hypotheses — polymer representation (A1), per-link oscillation bounds with geometric decay factor (A2), and large-field suppression (B5) — that were assumed in Doob Influence Bounds for Polymer Remainders in 4D Lattice Yang—Mills Renormalization and in the RG—Cauchy Master Framework. All results are traced to precise equations in the primary sources: the series of papers by T. Bałaban (Commun. Math. Phys., 1984—1989) and the expository trilogy by J. Dimock (2011—2014). The translation from Bałaban's analytic norms on gauge-covariant function spaces to the per-link oscillation language used in the probabilistic framework is made explicit. Together with Doob Influence Bounds for Polymer Remainders in 4D Lattice Yang—Mills Renormalization, this completes the unconditional discharge of the UV structural inputs for the renormalization group approach to the Yang—Mills mass gap problem at finite volume.
Category: Mathematical Physics

[1216] ai.viXra.org:2602.0068 [pdf] submitted on 2026-02-15 00:33:07

The Prime Gear Geometry Theory: A Discrete Mechanical Resolution of Prime Conjectures

Authors: Chaiya Tantisukarom
Comments: 11 Pages.

This article presents "Prime Gear Geometry," a deterministic mechanical framework that redefines the integer axis as a master gear ($C_1$) with a discrete unit weight-step of $+1$. Unlike analytic models that rely on the complex-plane "1/2" critical line of the Riemann Hypothesis, this theory posits that prime numbers are exact geometric outcomes forged by $C_1$ at coordinates of total asynchronous interference. We establish the "Prime Gear Synchronization Conjecture," stating that total phase alignment of a prime gear group occurs only at Primorial intervals. This mechanical exactness is used to resolve the Goldbach, Twin Prime, and Collatz conjectures not as probabilistic likelihoods, but as structural necessities of a machine that, by the laws of relatively prime circumferences, is incapable of perfect synchronization within the finite bounds of the $C_1$ axis.
Category: Number Theory
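The "Prime Gear Synchronization Conjecture" quoted above has a simple arithmetic core: gears with pairwise coprime circumferences realign only at common multiples of those circumferences, and for a set of distinct primes the least common multiple is exactly their product, the primorial. A toy check of that setup (our illustration only; it does not address the paper's claimed resolutions of the conjectures):

```python
import math

# "Prime gear" circumferences: the master gear C1 advances one unit per step,
# and a gear of circumference p returns to its starting phase every p steps.
gears = [2, 3, 5, 7, 11]

# All phases realign exactly when the step count is a common multiple of every
# circumference; the first such point is lcm(2, 3, 5, 7, 11) = 2310, the
# primorial 11#.
first_sync = math.lcm(*gears)
```

For pairwise coprime circumferences the lcm equals the product, so full alignment first occurs at the primorial; no smaller positive step count realigns every gear.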

[1215] ai.viXra.org:2602.0067 [pdf] submitted on 2026-02-14 01:33:55

A Universal Variational—Probabilistic Framework for Physical Theories

Authors: Cornelius Moore
Comments: 24 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

Modern theoretical physics employs distinct mathematical formalisms—Lagrangian mechanics, Hamiltonian dynamics, quantum amplitudes, statistical ensembles, field-theoretic path integrals—that, while empirically successful, lack a unified structural foundation. We present the Universal Mathematical System (UMS), a variational—probabilistic framework in which standard physical theories arise as limiting cases, projections (marginalizations), or constrained reductions of a single maximum-entropy measure over configuration spaces. The framework is built on an exponential-family measure µ[C] = Z^-1 exp(−Φ[C]), where Φ is a constraint functional encoding physical laws. We show, under standard regularity assumptions, how classical mechanics emerges via the Laplace principle (β → ∞), statistical mechanics is directly identified with the framework, and quantum mechanics corresponds to complex-weighted measures under the standard path-integral formalism. Additionally, we formalize five distinct algebraic structures—quantity (R^n), growth (semigroups), information (entropy), phase (U(1)), and ratio (R^+)—clarifying that different physical questions inhabit different mathematical domains and that confusion arises from naive cross-domain interpretation. The framework is intended as a structural unification of existing formalisms rather than a proposal of novel fundamental ontology or new empirical predictions. We include a proof of a coarse-graining monotonicity theorem using the data-processing inequality, provide explicit reduction pipelines, and discuss extensions to chemistry, biology, neuroscience, and computation.
Category: Functions and Analysis
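The β → ∞ reduction this abstract invokes is the standard Laplace concentration of an exponential-family measure: as β grows, µ[C] ∝ exp(−βΦ[C]) piles essentially all its mass on the minimizer of Φ. A finite-space sketch (our toy illustration; the Φ values below are invented):

```python
import math

def gibbs_weights(phi, beta):
    """Normalized weights mu[C] = exp(-beta * Phi[C]) / Z over a finite
    configuration space, a toy stand-in for the UMS measure."""
    w = [math.exp(-beta * p) for p in phi]
    z = sum(w)                        # partition function Z
    return [x / z for x in w]

phi = [0.3, 1.0, 2.5, 0.9]            # hypothetical constraint values Phi[C]
soft = gibbs_weights(phi, beta=1.0)   # diffuse measure at small beta
hard = gibbs_weights(phi, beta=50.0)  # mass concentrates on argmin Phi (index 0)
```

At `beta=50` the weight on the minimizing configuration is already indistinguishable from 1, which is the finite-dimensional shadow of the Laplace-principle limit described in the abstract.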

[1214] ai.viXra.org:2602.0066 [pdf] submitted on 2026-02-13 20:21:39

Bounded Symbolic Observability: A Cross-Domain Constraint in Computational Dynamics

Authors: David Taylor
Comments: 28 Pages. (Note by viXra Admin: Please cite and list scientific references)

Finite local symbolic observation exhibits bounded vocabularies across diverse computational domains despite systematic increases in observational scale. We apply a fixed local symbolic encoding framework to 13 systems spanning quantum mechanics, fluid dynamics, thermodynamics, electromagnetism, chaos theory, number theory, combinatorial logic, and stochastic processes. Across all domains, observed symbolic vocabularies saturate, with a median final growth of 0.0% despite 100—1,000× increases in data volume, temporal extent, or problem size. Prime gap dynamics provides the strongest validation: an infinite, deterministic mathematical sequence with no physical dynamics saturates at 837 symbolic configurations across a 10,000× scale increase (100,000 to 1,000,000,000 primes, identical vocabulary), eliminating physical mechanisms as explanations. At one billion primes, each of the 837 patterns is reused approximately 1.2 million times. Ten domains achieve perfect saturation (0.0%), two near-perfect (<1%), and one strong (<20%). Symbolic space occupancy ranges from 0.08% (Schrödinger equation) to 92.35% (electromagnetic waves); both regimes nonetheless exhibit saturation. Saturation manifests independently of physical validity (thermodynamically invalid antidiffusion saturates identically to correct heat diffusion), determinism (chaotic and stochastic systems both saturate), and computational complexity (NP-complete 3-SAT collapses to eight symbolic patterns). These results indicate that bounded symbolic observability reflects properties of finite local observation applied to locally-constrained dynamics rather than intrinsic system complexity—a constraint on measurement, not nature. Quantitative vocabularies are specific to the observational architecture employed; the empirical claim concerns the cross-domain emergence of vocabulary saturation under fixed local symbolic observation.
Category: Artificial Intelligence
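The "fixed local symbolic encoding" protocol can be illustrated on the abstract's flagship example, prime gaps. The toy encoder below is our own reconstruction of the idea, not the authors' 837-symbol architecture: each gap becomes an up/down/equal symbol relative to its predecessor, and distinct length-3 windows are counted, so the vocabulary is bounded by 3³ = 27 no matter how many primes are observed.

```python
def primes_upto(n):
    """Sieve of Eratosthenes."""
    flags = bytearray([1]) * (n + 1)
    flags[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if flags[i]:
            flags[i * i :: i] = bytearray(len(flags[i * i :: i]))
    return [i for i in range(n + 1) if flags[i]]

def gap_vocabulary(primes, window=3):
    """Distinct windows of up/down/equal symbols over the prime gap sequence."""
    gaps = [b - a for a, b in zip(primes, primes[1:])]
    symbols = ["U" if b > a else "D" if b < a else "E"
               for a, b in zip(gaps, gaps[1:])]
    return {tuple(symbols[i : i + window])
            for i in range(len(symbols) - window + 1)}

ps = primes_upto(100_000)
small = gap_vocabulary(ps[:1_000])   # vocabulary at a small observation scale
large = gap_vocabulary(ps)           # ~9,592 primes: bounded by 3**3 = 27
```

The bound here is architectural (27 possible windows), which is exactly the paper's point: the vocabulary ceiling is a property of the fixed local encoding, not of the sequence being observed.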

[1213] ai.viXra.org:2602.0065 [pdf] submitted on 2026-02-13 05:45:31

Dual-Sector Expansion: Type Ia Supernovae Validate Matter-Sector H0 Normalization with ΛCDM Geometric Consistency

Authors: Heath W. Mahaffey
Comments: 12 Pages.

The Informational Actualization Model (IAM) proposes that late-time cosmic expansion couples differently to photons versus matter, resolving the Hubble tension through sector-specific expansion rates. This dual-sector framework makes a critical, testable prediction: Type Ia supernovae (SNe), as matter-based distance indicators hosted in galaxies, should probe the matter sector. We test this hypothesis using the complete Pantheon+ dataset (1588 SNe, 0.01 < z < 2.26) through three independent analyses: (1) SNe with Planck (photon-sector) H0 prior, (2) SNe with SH0ES (matter-sector) H0 prior, and (3) SNe with no H0 constraint. Results unambiguously demonstrate that SNe reject the photon-sector expansion rate (H0 = 67.4 km s^-1 Mpc^-1, β → −0.30 at parameter boundary) and accept the matter-sector normalization (H0 = 73.04 km s^-1 Mpc^-1, β ≈ 0). Critically, SNe distances maintain ΛCDM geometric consistency (β_distance ≈ 0), validating IAM’s prediction that sector-specific coupling primarily affects structure growth (fσ8) rather than photon propagation geometry. This empirical validation establishes that dual-sector expansion is data-driven, not theoretically assumed, and demonstrates that Planck (H0 = 67.4, photon sector) and SH0ES (H0 = 73.04, matter sector) both measure correctly—they probe different physical quantities. The dual-sector phenomenology maps directly onto the standard modified gravity parametrization: matter perturbations obey μ(a) = H²_ΛCDM(a) / [H²_ΛCDM(a) + β_m E(a)] < 1 (suppressed growth), while photon deflection remains unmodified (Σ = 1), preserving CMB consistency. This μ < 1, Σ = 1 framework is independently testable with existing Boltzmann solvers (CAMB/CLASS) and upcoming survey parametrizations (DES, Euclid, CMB-S4).
Category: Relativity and Cosmology
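The growth-suppression parametrization quoted in this abstract is easy to evaluate numerically. A sketch, assuming a flat ΛCDM background with dimensionless E(a) = H_ΛCDM(a)/H0 = sqrt(Ωm a^-3 + Ω_Λ) and reading H²_ΛCDM in the formula as E² (our normalization assumption; the paper may define these factors differently):

```python
import math

OMEGA_M, OMEGA_L = 0.3, 0.7          # illustrative flat-LCDM densities

def E(a):
    """Dimensionless LCDM expansion rate H(a)/H0."""
    return math.sqrt(OMEGA_M * a ** -3 + OMEGA_L)

def mu(a, beta_m):
    """mu(a) = E^2(a) / (E^2(a) + beta_m * E(a)): equals 1 when beta_m = 0
    (GR growth) and drops below 1 for beta_m > 0 (suppressed growth)."""
    e = E(a)
    return e * e / (e * e + beta_m * e)
```

With `beta_m = 0` this recovers μ = 1 (unmodified growth), while any positive `beta_m` gives μ < 1 at every scale factor, which is the suppressed-growth regime the abstract describes.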

[1212] ai.viXra.org:2602.0064 [pdf] submitted on 2026-02-13 20:09:31

The Unification Achieved, How Quantum Gravity Had Already Been Solved

Authors: Bertrand Jarry
Comments: 8 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite listed scientific references)

For a century, physicists have sought to unify general relativity and quantum mechanics. We show that this quest rests on a fundamental conceptual error: gravity never needed to be "quantized" because it is already a quantum phenomenon emerging from the vacuum. By recognizing the quantum vacuum as the fundamental substrate, unification becomes trivial. All forces, including gravity and relativity, are manifestations of the same quantum vacuum. The problem was not to find a theory of quantum gravity, but to recognize that it already existed.
Category: Quantum Gravity and String Theory

[1211] ai.viXra.org:2602.0063 [pdf] submitted on 2026-02-13 16:30:51

Conditional Continuum Limit of 4d SU(N_c) Yang-Mills Theory via Two-Layer Architecture, RG-Cauchy Uniqueness, and Step-Scaling Confinement

Authors: Lluis Eriksson
Comments: 18 Pages.

Building on the lattice results established in Papers [E26I]-[E26IX], we give a conditional construction of a scaling-limit state for pure SU(N_c) lattice Yang-Mills theory in four Euclidean dimensions, along dyadic lattice spacings a_k = a_0 * 2^{-k}. The construction proceeds via a two-layer architecture. Layer 1 (Local fields): For bounded gauge-invariant local observables (Wilson loops, normalized plaquette traces), expectations converge -- without extracting subsequences -- to a unique limit. Tightness is trivial (L^infinity bound plus Prokhorov); uniqueness follows from a multiscale RG-Cauchy estimate that bounds the change of local expectations under a single RG step. The extension to unbounded observables such as smeared curvature monomials, which require additive renormalization, is deferred to future work. Layer 2 (Confinement): The physical string tension sigma_phys > 0 is established through step-scaling of Creutz ratios evaluated on Wilson loops whose physical dimensions R x T are held fixed as a -> 0. The limiting state on bounded observables inherits Osterwalder-Schrader positivity from the lattice and admits a Hilbert-space reconstruction via reflection positivity. The mass gap is established conditionally via uniform exponential clustering of connected correlators -- an input from a uniform physical transfer-matrix spectral gap -- and the reconstruction theorem. Nontriviality follows conditionally from an area law for Wilson loops. Key dependencies on prior papers: uniform LSI inputs [E26I]-[E26IX]; Balaban multiscale effective action [E26III]-[E26V]; DLR-LSI [E26VII]; unconditional lattice closure inputs [E26IX].
Category: Mathematical Physics
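The step-scaling of Creutz ratios used in Layer 2 rests on an exact cancellation: for Wilson loops obeying an area law with perimeter and constant terms, W(R,T) = exp(−σRT − p(R+T) − c), the Creutz combination returns σ with the perimeter and constant pieces cancelling identically. A synthetic check (illustrative parameter values, not lattice data from the paper):

```python
import math

def wilson_loop(R, T, sigma=0.2, p=0.1, c=0.05):
    """Synthetic Wilson loop expectation with area, perimeter, constant terms."""
    return math.exp(-sigma * R * T - p * (R + T) - c)

def creutz_ratio(R, T, **kw):
    """chi(R, T) = -ln[ W(R,T) W(R-1,T-1) / (W(R,T-1) W(R-1,T)) ].
    The area exponents combine as RT + (R-1)(T-1) - R(T-1) - (R-1)T = 1,
    while perimeter and constant terms cancel, so chi = sigma exactly
    for a pure area law."""
    num = wilson_loop(R, T, **kw) * wilson_loop(R - 1, T - 1, **kw)
    den = wilson_loop(R, T - 1, **kw) * wilson_loop(R - 1, T, **kw)
    return -math.log(num / den)
```

On real lattice data the ratio only approaches σ once the loops are large enough for the area law to dominate, which is why the paper holds the physical dimensions R x T fixed as a -> 0.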

[1210] ai.viXra.org:2602.0062 [pdf] submitted on 2026-02-13 20:05:47

Gravity as Osmotic Depression of the Quantum Vacuum, a Unified Theory of Relativity and Quantum Physics

Authors: Bertrand Jarry
Comments: 8 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite listed scientific references)

We propose a radically new conceptual framework for understanding gravity and relativity, not as geometric curvatures of spacetime, but as physical modifications of the quantum vacuum. Gravity results from a static osmotic depression, while special relativity emerges from the dynamic compression of the vacuum by motion. This approach naturally resolves the problem of unification with quantum mechanics and makes experimentally testable predictions. Unlike attempts to "quantize geometry" (strings, loops), our theory recognizes that gravity and relativity are already quantum phenomena in nature, emerging from the properties of the vacuum.
Category: Quantum Gravity and String Theory

[1209] ai.viXra.org:2602.0061 [pdf] submitted on 2026-02-13 20:04:26

Unified Theory of the Quantum Vacuum: Gravity, Relativity and Quantum Mechanics

Authors: Bertrand Jarry
Comments: 15 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite and list scientific references)

This theory unifies gravity, special relativity, general relativity, and quantum mechanics by recognizing the quantum vacuum as the single fundamental substrate. Gravity emerges as a static osmotic depression of the vacuum, special relativity as a dynamic compression, and all forces as manifestations of the same vacuum. This unification is not achieved through the quantization of geometry, but through the recognition that gravity was already quantum.
Category: Quantum Gravity and String Theory

[1208] ai.viXra.org:2602.0060 [pdf] submitted on 2026-02-13 20:27:17

Prime Modular Dynamics Theory (PMDT)

Authors: Michel Monfette
Comments: 14 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

We study the modular dynamics of prime numbers through the families SG(k) = {p ∈ P : kp + 1 ∈ P}. We present extensive computational evidence (up to 100 million Sophie Germain primes) for a modular dynamical structure in SG primes modulo 30. Two fundamental theorems establish the triangular residue class {11, 23, 29} and gap constraints. Eight conjectures emerge, including a novel "least-gap principle" and a harmonic attractor at 60. SG(k) families exhibit distinct phases in the entropy/detailed-balance plane, with period-9 resonance and asymmetric sexy orbits. A third grammatical dynamic (G3) classifies all anomalies into three energy regimes. These patterns suggest an underlying arithmetic grammar and self-organizing behavior in Sophie Germain primes.
Category: General Mathematics
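The "triangular residue class 11, 23, 29" is straightforward to verify: for a Sophie Germain prime p > 5, primality of 2p + 1 forces p ≡ 2 (mod 3) and p ≢ 2 (mod 5), which mod 30 leaves exactly the residues 11, 23 and 29. A quick check far below the paper's 10^8-prime scale (our own verification, not the paper's code):

```python
def primes_upto(n):
    """Sieve of Eratosthenes."""
    flags = bytearray([1]) * (n + 1)
    flags[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if flags[i]:
            flags[i * i :: i] = bytearray(len(flags[i * i :: i]))
    return [i for i in range(n + 1) if flags[i]]

def sophie_germain_upto(limit):
    """Primes p <= limit with 2p + 1 also prime (the k = 2 family SG(2))."""
    ps = primes_upto(2 * limit + 1)
    pset = set(ps)
    return [p for p in ps if p <= limit and 2 * p + 1 in pset]

sg = sophie_germain_upto(100_000)
residues = sorted({p % 30 for p in sg if p > 5})   # -> [11, 23, 29]
```

The only Sophie Germain primes outside the class are the small exceptions 2, 3 and 5, which divide 30 itself.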

[1207] ai.viXra.org:2602.0059 [pdf] submitted on 2026-02-12 19:18:54

Arithmetic Relativistic Emergence (ARE): Spontaneous Symmetry Breaking from the Zeta Vacuum to Emergent Spacetime Geometry, Topology, and Quantum Fields

Authors: J. W. McGreevy
Comments: 18 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We present the Arithmetic Relativistic Emergence (ARE) framework, in which the fundamental symmetries of General Relativity, Einstein—Cartan gravity with torsion, and quantum field theory (Standard Model sectors) emerge tautologically from pure number theory via the arithmetic geometry of the rational numbers Q. The Riemann zeta function ζ(s) represents the maximally symmetric pre-geometric vacuum phase, with perfect functional-equation symmetry around Re(s) = 1/2 and pole at s = 1 as the unified source of arithmetic energy/information. Spontaneous symmetry breaking induced by the weight-12 modular discriminant ∆(τ) = η(τ)^24 = (2π)^12 (E4(τ)^3 − E6(τ)^2)/1728 disperses this background into Archimedean divergence (smooth analytic curvature density) and non-Archimedean curl (torsion/spin density at p-adic fibers), with the functional-equation mirror s = 6 enforcing variational balance of the arithmetic degree deg(L). The emergent 4D Lorentzian manifold M carries an adelic principal Lorentz/Spin frame bundle decomposed via the adele ring A_Q. An effective Chern—Weil homomorphism—employing Bott—Chern forms at infinity and classical invariant polynomials at finite places—maps split curvature forms (F_div, H_curl) to arithmetic characteristic classes in Arakelov Chow groups. These classes are stationary under metric variations (δ_g deg = 0 at s = 6), providing rigid global topological invariants (Pontryagin-like, Euler-like, torsion-twisting) preserved in the broken phase—the inevitable geometric and topological labels of arithmetic symmetry breaking. Heaviside synchronization (τ_div = τ_curl) at s = 6 renders the arithmetic medium transparent, yielding distortionless propagation and unified causality. The Rankin—Selberg self-convolution L(∆×∆, s) contains ζ(s) factors, allowing recombination to the symmetric vacuum. The emergent metric determinant √−g serves as the physical scalar whose arithmetic balancing across places enforces general covariance, proper volume preservation, and integration of curvature invariants. Fermions (12 Weyl per generation from a Leech lattice Z2-orbifold), gauge sectors (finite algebra C ⊕ H ⊕ M3(C)), and constants (α^-1 ≈ 137.036, Λ ∼ 10^-122 M_Pl^2, G ∼ 10^-38 m^-2) emerge via the spectral action and adelic convolution. ARE offers a tautological origin: physical laws are the minimal effective description preserving arithmetic consistency post-symmetry breaking.
Category: Number Theory

[1206] ai.viXra.org:2602.0058 [pdf] submitted on 2026-02-12 19:09:17

Anchored Causality Theory: Quantum Field Theory’s Natural Solution to the Measurement Problem

Authors: Kelly Sonderegger
Comments: 31 Pages. CC BY 4.0 License

The quantum measurement problem—how definite outcomes emerge from quantum states—has resisted solution for nearly a century. We propose that the resolution lies in recognizing that quantum systems exist as extended waves until environmental coupling drives a phase transition to localized particles. There is no "superposition" in the conventional sense—the wave state is the fundamental reality. This Anchored Causality Theory (ACT) applies quantum field theory’s own ontology consistently through measurement: fields are fundamental, particles are emergent localized excitations, and measurement is the physical process by which extended field configurations anchor into particle modes. ACT completes what QFT started—taking field ontology seriously all the way through the measurement process. Remarkably, QFT’s mathematical structure already encodes this wave-particle phase transition. The Lagrangian formulation (action principle, path integrals) is the natural language of waves—extended field configurations exploring spacetime. The Hamiltonian formulation (definite states, observable eigenvalues) is the natural language of particles—localized excitations evolving in time. The Legendre transform connecting them is the mathematical shadow of anchoring. What we call "superposition" is simply Fourier decomposition—one wave represented in different bases, not ontological multiplicity. The mathematics was telling us this all along; we needed only to read it correctly. Measurement is progressive phase diffusion driven by coupling to environmental quantum fields, with rates determined by particle mass through the Higgs mechanism. ACT emerges from three distinct physical processes: (1) Higgs-generated mass establishes the structural capacity for temporal participation and sets coupling strength, (2) gauge fields and phonons provide infrared noise spectra that drive decoherence dynamics, and (3) definite outcomes emerge when the anchoring functional Φ ≳ 1, marking an irreversible phase transition from wave to particle. We derive explicit anchoring rates from quantum Brownian motion theory, showing Γ_A ∝ m² × T × η_env, where the mass-squared scaling follows from the Yukawa coupling structure. The theory explains all existing decoherence phenomena—mass dependence, temperature scaling, environmental density effects, observable-specific rates, and persistence at zero temperature—while making a unique testable prediction: an isotope mass dependence of 15-20% in coherence times, distinguishable from environmental decoherence models (0%) and competing collapse models (∼8%). A Standard Model Effective Field Theory analysis establishes a viable parameter window spanning 15 orders of magnitude. Quantum randomness is explained as stochastic noise from environmental fields (thermal and vacuum fluctuations), not mysterious collapse—calculable via the fluctuation-dissipation theorem. ACT provides mechanism, ontology, and testable predictions using only established physics.
Category: Quantum Physics
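As a quick arithmetic check of the quadratic mass scaling quoted in this abstract: Γ_A ∝ m² implies a coherence time τ ∝ 1/m², so the predicted isotope effect is set entirely by the squared mass ratio. A minimal sketch, using ¹²C vs ¹³C as an illustrative isotope pair of our choosing (the abstract names no specific pair):

```python
# Sanity check of ACT's claimed isotope effect: Gamma_A ∝ m^2 means the
# coherence time scales as tau ∝ 1/m^2, so replacing an isotope of mass
# m_light with one of mass m_heavy shortens tau by 1 - (m_light/m_heavy)^2.
# The carbon isotope pair below is an illustrative assumption, not the
# paper's own example.

def coherence_shift(m_light: float, m_heavy: float) -> float:
    """Fractional reduction in coherence time under m^2 anchoring-rate scaling."""
    return 1.0 - (m_light / m_heavy) ** 2

shift = coherence_shift(12.0, 13.0)   # 12C vs 13C
print(f"{shift:.1%}")                 # ≈ 14.8%, near the quoted 15-20% band
```

A one-mass-unit isotope substitution at mass ~12 thus lands at the lower edge of the 15-20% window, consistent with the scaling the abstract claims.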

[1205] ai.viXra.org:2602.0057 [pdf] submitted on 2026-02-12 19:06:11

Integrated Cross-Scale Derivative Bounds for Wilson Lattice Gauge Theory: Closing the Log-Sobolev Gap

Authors: Lluis Eriksson
Comments: 22 Pages.

We prove integrated cross-scale derivative bounds that replace the unverified Assumption 5.4 of a companion paper. Combined with two explicit large-field inputs (a residual pointwise derivative bound and a Balaban-type conditional large-field suppression) and conditional inequalities from the orbit space Ricci curvature, this yields a uniform (volume-independent) log-Sobolev inequality for the Wilson lattice gauge measure at sufficiently weak coupling (large beta). The key innovation is a decomposition into small-field and large-field contributions: the former is controlled by Balaban's polymer expansion, while the latter is handled by a pointwise gradient bound combined with exponential measure suppression. We provide a self-contained verification of the unconditional large-field tail mechanism for SU(2) in d=2, together with numerical validation.
Category: Mathematical Physics

[1204] ai.viXra.org:2602.0056 [pdf] submitted on 2026-02-12 19:07:45

Large-Field Suppression for Lattice Gauge Theories: from Balaban's Renormalization Group to Conditional Concentration

Authors: Lluis Eriksson
Comments: 22 Pages.

We verify the large-field hypothesis (Hypothesis 4.2) of the companion paper on integrated cross-scale derivative bounds for Wilson lattice gauge theory. The proof rests on three ingredients: (i) a dictionary lemma translating the Hilbert-Schmidt large-field condition on plaquette holonomies into Balaban's Lie-algebra formulation; (ii) an interface lemma connecting conditional measures with Balaban's T-operation and its uniform small-factor bound on admissible background fields (Eq. (1.89) of Balaban, Commun. Math. Phys. 122 (1989)); (iii) the uniformity estimate (Eq. (1.75) of the same reference) ensuring that slow-field dependence contributes only an O(1) multiplicative constant. For d=2, we give an independent proof via character-positive convolutions that avoids the Balaban machinery entirely. Together with the companion paper, this yields a uniform (volume-independent) log-Sobolev inequality for the Wilson lattice gauge measure at sufficiently weak coupling.
Category: Mathematical Physics

[1203] ai.viXra.org:2602.0055 [pdf] submitted on 2026-02-12 19:09:28

Unconditional Uniform Log-Sobolev Inequality for SU(N_c) Lattice Yang-Mills at Weak Coupling

Authors: Lluis Eriksson
Comments: 10 Pages.

We prove that the Wilson lattice gauge measure for SU(N_c) in dimension d >= 3 at sufficiently weak coupling (beta >= beta_wc) satisfies a log-Sobolev inequality with constant alpha_* > 0 independent of the lattice volume. This completes the multiscale program initiated in Paper I by verifying Hypothesis 3.2 of Paper III, the last remaining analytic input. The verification uses three ingredients: (i) the locality of polymer functionals, which restricts the sum over polymers to those intersecting a fixed link; (ii) Cauchy estimates on Balaban's analytic domains for polymer activities and boundary terms; and (iii) a combinatorial counting bound for connected polymers containing a given link, which is independent of the lattice volume. Combined with the synthetic Ricci curvature bound of Paper II, the integrated cross-scale derivative bounds of Paper III, and the large-field suppression established in Paper IV, this yields the uniform log-Sobolev inequality unconditionally.
Category: Mathematical Physics

[1202] ai.viXra.org:2602.0054 [pdf] submitted on 2026-02-12 19:10:34

From Uniform Log-Sobolev Inequality to Mass Gap for Lattice Yang-Mills at Weak Coupling

Authors: Lluis Eriksson
Comments: 16 Pages.

We prove that the one-step transfer operator of SU(N_c) lattice Yang-Mills theory in dimension d >= 3 has a spectral gap Delta_phys > 0 uniformly in the lattice volume (for even side length L), for all sufficiently large inverse coupling beta >= beta_0. The proof combines four ingredients: (i) the uniform log-Sobolev inequality on periodic tori established in a companion paper; (ii) a verification that the multiscale RG outputs needed for the LSI argument are uniform in frozen boundary conditions (Section 4), yielding the full DLR-LSI property (Section 5); (iii) the Stroock-Zegarlinski equivalence theorem, which in its standard formulation deduces Dobrushin-Shlosman mixing and exponential clustering from DLR-LSI; and (iv) Osterwalder-Seiler reflection positivity of the Wilson action, which translates temporal exponential clustering into a spectral gap of the transfer operator.
Category: Mathematical Physics

[1201] ai.viXra.org:2602.0053 [pdf] submitted on 2026-02-12 19:11:43

DLR-Uniform Log-Sobolev Inequality and Unconditional Mass Gap for Lattice Yang-Mills at Weak Coupling

Authors: Lluis Eriksson
Comments: 14 Pages.

We prove that for SU(N_c) lattice Yang-Mills theory in d >= 3 dimensions at sufficiently weak coupling (beta >= beta_0), the conditional Gibbs specification satisfies a DLR-uniform log-Sobolev inequality: for every finite sub-lattice Lambda' subset of Z^d and every boundary condition omega, the conditional measure mu_{Lambda'}^{omega} satisfies LSI(alpha_*) with a constant alpha_* > 0 independent of Lambda' and omega. The proof combines three ingredients: (i) the multiscale entropy decomposition developed in our earlier work (Papers I-V), which establishes a uniform log-Sobolev inequality on periodic tori; (ii) a uniform fiber oscillation lemma showing that frozen boundary links -- treated as external parameters in Balaban's renormalization group -- do not increase the per-block oscillation of the conditional fast potential, thanks to compactness of SU(N_c) and the locality of the polymer expansion; (iii) a refined large-field event restricted to dynamical (non-frozen) plaquettes, which ensures that the large-field suppression mechanism extends uniformly to boundary blocks. As a consequence, the Stroock-Zegarlinski equivalence theorem yields Dobrushin-Shlosman mixing, exponential clustering of gauge-invariant correlations, and -- via Osterwalder-Seiler reflection positivity -- a strictly positive mass gap Delta_phys >= m(beta, N_c, d) > 0 for the transfer matrix on the periodic torus (Z/LZ)^d, uniformly in even L. This removes the Dobrushin-type Assumption 6.3 of Paper I and the boundary-uniformity Assumption 3.1 of Paper VI, rendering the lattice mass gap unconditional at weak coupling.
Category: Mathematical Physics

[1200] ai.viXra.org:2602.0052 [pdf] submitted on 2026-02-12 19:21:09

Interface Lemmas for the Multiscale Proof of the Lattice Yang-Mills Mass Gap

Authors: Lluis Eriksson
Comments: 11 Pages.

We establish three interface lemmas that close the remaining gaps in the proof chain for the mass gap of SU(N_c) lattice Yang-Mills theory at weak coupling (beta >= beta_0) in dimension d >= 3. Lemma A (Horizon Transfer) establishes a uniform conditional large-field suppression bound mu_k(Z_k(B) | G_{k+1}) <= exp(-c p_0(g_k)) holding mu_beta-a.s., without any admissibility restriction on the background field. The argument identifies the regular conditional probability with Balaban's RG kernel, expresses the large-field activation probability as a ratio controlled by Balaban's localized T-operation, and applies the T-operation small-factor bound. Lemma B extracts from Balaban's inductive scheme that the boundary terms B^{(k)}(X) share the same uniform analyticity domain as the polymer activities R^{(k)}(X), with radius hat{alpha}_1(gamma) > 0 independent of k. Lemma C extends the multiscale LSI to finite volumes with arbitrary frozen boundary conditions omega via tensorization-plus-perturbation, replacing the unverified Dobrushin block condition of Paper VII. Combined with Papers I-VII, these lemmas render the lattice mass gap theorem unconditional.
Category: Mathematical Physics

[1199] ai.viXra.org:2602.0051 [pdf] submitted on 2026-02-12 19:22:19

Uniform Coercivity, Pointwise Large-Field Suppression, and Unconditional Closure of the Lattice Yang-Mills Mass Gap at Weak Coupling in d=4

Authors: Lluis Eriksson
Comments: 16 Pages.

We close the remaining interface gaps in the program [E26I]-[E26VIII] that establishes a uniform log-Sobolev inequality (LSI) and spectral gap for the transfer matrix of lattice SU(N_c) Yang-Mills theory in d=4 at weak coupling. Four technical gaps are identified and resolved: (G1) the Balaban small-factor bound for the T-operation is shown to hold pointwise for every real background by auditing Balaban's proof and verifying that it uses only the uniform inductive conditions; (G2) we establish a uniform small-field coercivity estimate (Hessian lower bound) for the effective action and use it, together with Balaban's small-factor mechanism, to control the conditional inequalities in the multiscale entropy decomposition -- circumventing the need for a global fiber LSI with constant O(beta); (G3) uniform analyticity of boundary terms is extracted from Balaban's inductive scheme; (G4) a quantitative bootstrap verifies the simultaneous compatibility of all constants for a single choice of beta_0. Combined with [E26I]-[E26VIII], these closures yield an unconditional proof that Delta_phys(beta,L) >= c(N_c,beta_0) > 0 uniformly in the volume L for beta >= beta_0.
Category: Mathematical Physics

[1198] ai.viXra.org:2602.0050 [pdf] submitted on 2026-02-11 17:48:50

Quantized Vacuum Attenuation: Resolving the Hubble Tension via Third-Order Nonlinear Susceptibility in a Discrete Manifold

Authors: Scott Long
Comments: 6 Pages. (Zenodo DOI: 10.5281/zenodo.18601428)

The standard ΛCDM cosmological model is currently challenged by a 5σ discrepancy in Hubble Constant (H0) measurements and a vacuum energy density error of 122 orders of magnitude. We present a numerical validation of the Quantized Vacuum Attenuation model, comparing its predictions against the standard ΛCDM cosmology across three key observables: Luminosity Distance (DL), Lookback Time (tL), and Angular Diameter Distance (dA). Utilizing a vacuum attenuation coefficient of α ≈ 8.26×10^−27 m^−1, derived from the local Hubble flow, our simulations resolve three primary tensions: (1) Dark Energy: the model demonstrates that the high-redshift luminosity modulus follows a logarithmic attenuation profile, eliminating the need for Dark Energy to explain Type Ia Supernovae dimming. (2) Early Galaxy Paradox: the model removes the Big Bang singularity, reinterpreting high-redshift galaxies (e.g., JADES-GS-z13) as objects seen through ≈35 Gyr of static vacuum transmission, allowing for infinite formation time. (3) Little Red Dots: the angular diameter distance in a discrete manifold correctly predicts the observation of compact high-redshift morphologies that contradict the angular magnification predicted by expanding metric models. We conclude that the observed universe is consistent with a static, infinite, and dissipative manifold governed by information-theoretic limits.
Category: Relativity and Cosmology
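The quoted attenuation coefficient can be sanity-checked against the local Hubble flow. If one identifies α with H0/c (a natural tired-light-style reading of "derived from the local Hubble flow"; both this identification and the H0 value below are our assumptions, not stated in the abstract), the stated α ≈ 8.26×10^−27 m^−1 corresponds to a local H0 near 76 km/s/Mpc:

```python
# Hedged check: mapping the attenuation coefficient to alpha = H0/c.
# The identification and the H0 value are assumptions for illustration.
MPC_M = 3.0857e22      # metres per megaparsec
C_M_S = 2.9979e8       # speed of light, m/s

def alpha_from_H0(H0_km_s_Mpc: float) -> float:
    """Attenuation coefficient in m^-1 if alpha = H0/c."""
    H0_si = H0_km_s_Mpc * 1e3 / MPC_M   # convert km/s/Mpc to s^-1
    return H0_si / C_M_S

print(f"{alpha_from_H0(76.4):.3e} m^-1")   # ≈ 8.26e-27 m^-1
```

Under that reading, the model's α is pinned to a distance-ladder-side H0, which is consistent with the abstract's claim that the coefficient is fixed by local measurements rather than fitted.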

[1197] ai.viXra.org:2602.0049 [pdf] submitted on 2026-02-11 17:57:00

A Comparative Study of Sikh and Religious Cosmologies with Modern Models of Physics and Spacetime

Authors: Moninder Singh Modgil, Dnyandeo Dattatray Patil
Comments: 33 Pages.

This paper presents a novel synthesis of ancient religious cosmologies, particularly Sikh, Islamic, and Hindu scriptural verses, with modern theoretical physics, including general relativity, quantum cosmology, and higher-dimensional field theories. Beginning with interpretations of key hymns such as those from the Japji Sahib and Kirtan Sohila, the paper constructs conformally compactified spacetime metrics aligned with spiritual metaphors. Gödel-like rotating universes are modeled to reflect daily and annual solar motions, incorporating tunneling transitions, scalar curvature collapses, and spinor bundles to represent evolving consciousness. Through symbolic AI, scriptural syntax is translated into candidate gravitational Lagrangians, wavefunctionals, and field strength tensors that encode karmic memory. This integration of metaphysical semantics and mathematical physics allows new formulations of cosmological duality, including Janus time-symmetric models and magneto-causal holography, offering profound insights into the structure of the universe and the soul's evolution within it.
Category: Religion and Spiritualism

[1196] ai.viXra.org:2602.0048 [pdf] submitted on 2026-02-11 17:59:17

Alternative Gravity Hypothesis as an Effect of Differential Universe Expansion

Authors: L. A. Serebrennikov
Comments: 4 Pages.

In the standard model, gravity is described either as a fundamental interaction (Newton's law of universal gravitation) or as a geometric property of spacetime (Einstein's general theory of relativity). The hypothesis proposed here considers observable gravitational attraction as a consequence of uneven cosmological expansion. Massive objects, possessing higher energy density, cause more intense local expansion of space, leading to effective repulsion of less massive bodies toward zones of increased expansion. The paper outlines the main postulates of the model, its potential implications for the dark matter problem, and proposes specific paths for experimental falsification.
Category: Relativity and Cosmology

[1195] ai.viXra.org:2602.0047 [pdf] submitted on 2026-02-10 20:35:45

From Prime Numbers to DNA: The Emergence of a Universal Fundamental Structure

Authors: Guiffra Patrick
Comments: 21 Pages.

In this article, we demonstrate the emergence of a fundamental structure inherent to prime numbers that embeds its signature within the molecular architecture of DNA. By constructing a harmonic field Ω(x, q) based on modular inverses of prime numbers, we show that helical periodicities of biological structures emerge naturally from universal arithmetic properties. Our main results establish that: (1) The prime number p = 11 acts as a universal pivot, generating through its modular inverse 11^−1 ≡ 2 the characteristic wavelengths of protein α-helices (λ = 7/2 = 3.5, 2.68% error vs observed 3.6) and B-form DNA (λ = 21/2 = 10.5, 0.10% error vs observed 10.5). (2) Analysis of human chromosome 1 reveals statistically significant 36% enrichment of prime-length tandem repeats (p < 3.2 × 10^−7). (3) We establish a universal scaling law relating the prime harmonic coherence coefficient χ to genomic fractal dimension D according to D = 1 − 0.86χ, validated across five organisms spanning three domains of life (bacteria, yeast, plant, insect, mammal) with near-perfect correlation (r = −0.9974, p < 10^−4, errors < 1%). The numerical evidence is compelling. To claim that prime numbers constitute the fundamental template of DNA requires only one more step: experimental laboratory demonstration. We propose testable protocols involving synthesis of DNA optimized according to prime harmonic principles, with quantitative predictions for thermal stability, mutation rates, and spectral properties. If validated, these findings suggest that number theory encodes universal architectural constraints on biological self-assembly.
Category: Physics of Biology

[1194] ai.viXra.org:2602.0046 [pdf] submitted on 2026-02-10 18:37:34

Ricci Curvature of the Orbit Space of Lattice Gauge Theory and Single-Scale Log-Sobolev Inequalities

Authors: Lluis Eriksson
Comments: 11 Pages.

We establish that the orbit space B = A/G of SU(Nc) lattice gauge theory satisfies the Riemannian curvature-dimension condition RCD*(Nc/4, dim A); in particular, it satisfies CD(Nc/4, ∞) in the sense of Lott-Villani-Sturm. The proof proceeds by showing that the configuration space A = SU(Nc)^{|B1(Λ)|}, equipped with the bi-invariant product metric ⟨X, Y⟩ = −2 tr(XY), is an Einstein manifold with Ric_A = (Nc/4) g_A, and then applying the stability of the RCD* condition under quotients by compact groups of measure-preserving isometries (Galaz-García-Kell-Mondino-Sosa). This approach bypasses the need for explicit O'Neill curvature computations and handles the singular stratum (reducible connections) automatically. As a consequence, we derive a conditional log-Sobolev inequality for measures on B of the form dμ = e^−Φ dν/Z with constant α = (Nc/4) e^−osc(Φ). All constants are computed explicitly for SU(2) and SU(3). This provides the geometric input in a program aiming at a volume-uniform log-Sobolev inequality for SU(Nc) lattice Yang-Mills theory at weak coupling; the complementary analytic input (uniform bounds on the effective potential oscillation, via Balaban's renormalization group) is the subject of ongoing work.
Category: Mathematical Physics

[1193] ai.viXra.org:2602.0045 [pdf] submitted on 2026-02-10 20:42:00

An Octonionic Model of Black Holes

Authors: Rüdiger Giesel
Comments: 9 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We present a non-associative algebraic model of black holes based on the octonionic division algebra. Geometry is not postulated as fundamental. Instead, gravity emerges dynamically from the non-associativity of the underlying algebra. Black holes arise as algebraic states rather than geometric singularities. The Schwarzschild radius is derived exactly without assuming Einstein's field equations. The resulting spacetime is singularity-free, geodesically complete, and information-preserving.
Category: Relativity and Cosmology

[1192] ai.viXra.org:2602.0044 [pdf] submitted on 2026-02-10 21:10:05

Causal Mechanical Cosmology (CMC) — Paper 5: A Word-Safe Mathematical Derivation Tooling Paper for A-B-C Closure

Authors: Leon Barbour
Comments: 19 Pages. Licensed under CC BY 4.0 (Note by ai.viXra.org Admin: For the last time, please cite and list scientific references)

This paper presents a compact "tooling derivation" for the first closed operational loop in Causal Mechanical Cosmology (CMC): (A) spherical void control geometry producing a dominant dipole anisotropy for an off-centre observer, (B) an elongated/sheared void geometry producing quadrupole anisotropy via a local Hessian decomposition, and (C) an observable closure mapping the structural line-of-sight (LOS) velocity signature to atomic fractional frequency shift Δf/f as the measurement endpoint. Canonical equations (E1-E8) are preserved and used as the spine for derivations and diagnostics. The local Hessian kernel yields the leading even-parity multipole structure (monopole/quadrupole); dipole structure in the spherical off-centre control case is treated as arising from the global sampling asymmetry and, in a strict local expansion, enters through higher-order terms beyond the first Hessian approximation.
Category: Relativity and Cosmology

[1191] ai.viXra.org:2602.0043 [pdf] submitted on 2026-02-10 02:21:23

Arithmetic Emergence of Generalized Relativity, Classical Spacetime and Quantum Fields from Number Theory: Balanced Dispersion of the Arithmetic Degree Induced by the Weight-12 Modular Discriminant

Authors: J. W. McGreevy
Comments: 19 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We present a rigorous synthesis in which the fundamental symmetries of General Relativity and quantum field theory emerge from the axioms of arithmetic geometry and number theory. Central is the weight-12 modular discriminant Δ(τ) = η(τ)^{24} = (2π)^{12} (E_4^3 − E_6^2)/1728, interpreted as the vacuum potential. The arithmetic degree (total integrated curvature) must disperse equivalently across Archimedean (smooth, complex-analytic) and non-Archimedean (discrete, p-adic) places to maintain global consistency via the product formula. This dispersion is enforced at the critical mirror point s=6 of L(Δ,s), where the functional equation symmetry balances openness and rigidity. The Hilbert-Pólya operator Ĥ = 1/2 + i (D_∞ ⊕ ∑_p D_p) acts self-adjointly on the adelic Hilbert space, with eigenvalues corresponding to resonances tied to L(Δ,s) zeros. The 1728 frequency (12^3) serves as the universal gear ratio/adiabatic regulator. The 12-fermion matrix arises from the Z_2-orbifold of the Leech-lattice vertex algebra V_{Λ24}, folding 24 bosonic dimensions into 12 Weyl fermions per generation via a Möbius twist. A 4D Lorentzian manifold emerges via noncommutative geometry (KO-dimension 6 adelic spectral triple), with the spectral action Tr f(D/Λ) yielding the Einstein-Hilbert term and stress-energy from p-adic torsion convolution. The master equation δ_g \widehat{deg}(L) = 0 at s=6 recovers the Einstein field equations with cosmological constant Λ ≈ M_Pl^2 e^{−288} (double-twist entropy) and fine-structure constant α^{−1} ≈ 137.036 from Petersson norm corrections. This framework posits that GR and the Standard Model are stereographic projections of the weight-12 balanced modular form onto the Möbius-Planck manifold, providing a tautological origin for physical laws from number theory.
Category: Number Theory

[1190] ai.viXra.org:2602.0042 [pdf] submitted on 2026-02-09 20:27:39

Recursive Spacetime: A Research Programme for a Holographic Graph-Theoretic Framework

Authors: Ammar Alammar
Comments: 8 Pages. Creative Commons License

We propose a background-independent research programme modelling the universe as a dynamic, recursive causal network evolving from a single root vertex. By treating spacetime not as a fundamental container but as an emergent property of a stochastic directed acyclic graph (DAG), we demonstrate that foliation invariance (a necessary condition for Lorentz symmetry) and three-dimensional spatial geometry can emerge as statistical limits of graph topology governed by random walk recurrence (Pólya's Theorem) and the Principle of Maximum Entropy. We formally derive the Einstein-Hilbert action in the thermodynamic limit of the graph's microstate statistics, identifying the cosmological constant with unimodular volume fluctuations (Λ ∼ V^−1/2). Furthermore, we identify fundamental particles as stable topological subgraphs (braids) protected by Pachner move invariance, and derive the Holographic Area Law from the Max-Flow Min-Cut theorem. This model offers a unified generative grammar for emergent gravitation and quantum interference, yielding a precise, falsifiable prediction for high-energy Lorentz violation, consistent with effective field theory expectations in discrete models.
Category: Quantum Gravity and String Theory

[1189] ai.viXra.org:2602.0041 [pdf] submitted on 2026-02-09 20:45:44

Uniform Log-Sobolev Inequality and Mass Gap for Lattice Yang--Mills Theory

Authors: Lluis Eriksson
Comments: 21 Pages.

We prove that SU(Nc) lattice Yang--Mills theory in d=4 dimensions with Wilson action at sufficiently weak coupling (beta = 2Nc/g^2 >= beta_0) satisfies a log-Sobolev inequality with constant alpha_* > 0 uniform in the lattice size L_vol, conditional on Assumption (cross-scale derivative bound). Combined with reflection positivity, this yields a positive mass gap Delta_phys > 0 uniform in L_vol, conditional on Assumptions (cross-scale derivative bound) and (Dobrushin translation). The argument combines: (i) Balaban's constructive renormalization group, providing controlled effective actions and polymer bounds at all scales; (ii) a lower Ricci curvature bound on the gauge orbit space, giving uniform conditional log-Sobolev constants for fast modes via the Bakry--Emery criterion; and (iii) a multiscale entropy decomposition with sweeping-out estimates, where the transversal block-averaging scaling ensures summability of cross-scale errors. A detailed audit trail is included to relate the required cross-scale derivative input to published bounds in Balaban's work.
Category: High Energy Particle Physics

[1188] ai.viXra.org:2602.0040 [pdf] submitted on 2026-02-08 19:36:55

Uniform Poincaré Inequality for Lattice Yang-Mills Theory Via Multiscale Martingale Decomposition

Authors: Lluis Eriksson
Comments: 11 Pages.

We prove that the lattice Yang-Mills measure with gauge group SU(N_c) in d=4 dimensions at sufficiently large β = 2N_c/g² satisfies a Poincaré inequality with constant α_* > 0 uniform in the lattice size L. The proof uses three ingredients: (i) the Ricci curvature bound Ric_B ≥ N_c/4 for the gauge orbit space, giving a uniform spectral gap for conditional measures of fast modes at each renormalization group scale; (ii) Balaban's constructive RG with polymer derivative bounds, controlling the residual coupling between scales; and (iii) a multiscale martingale variance decomposition that avoids recursive composition losses, with a commutator coefficient D_k ≤ C e^{−2κ} 2^{−3k} made summable by the geometric scaling factor of transversal block averaging. Under an RG-normalized disintegration consistent with Balaban's absorption structure, only exponentially decaying polymer residuals contribute to D_k, ensuring Σ_k D_k << c_0. The resulting uniform Poincaré inequality gives volume-independent control of the variance-to-energy ratio for gauge-invariant observables.
Category: Mathematical Physics

[1187] ai.viXra.org:2602.0039 [pdf] submitted on 2026-02-08 04:40:53

Role-Based Multi-Agent Reasoning Frameworks

Authors: Isaiah Nwukor
Comments: 15 Pages.

Individual artificial intelligence systems face an inherent trade-off between plasticity and stability under resource constraints. I propose that general intelligence emerges from networks of specialized agents applying a structured reasoning cycle to answer four fundamental questions. Agents ground abstract patterns through affective valence embeddings and coordinate via a shared database of credibility-weighted knowledge packages. I formalize a five-stage reasoning engine (Salience Detection → Hypothesis Generation → Experimentation → Structural Correspondence → Generalization) where agents at different stages specialize in different questions, enabling zero-shot cross-domain transfer. Using ARC-AGI task "as66" as demonstration, I show 276 generations of evolutionary learning where complementary specialization yields a current maximum of Level 4 performance across agents [20]. This framework provides testable predictions for performance scaling, transfer capability, and behavioral signatures of reasoning integration.
Category: Artificial Intelligence

[1186] ai.viXra.org:2602.0038 [pdf] submitted on 2026-02-08 11:01:58

Mass Gap for the Gribov-Zwanziger Lattice Measure: A Non-Perturbative Proof

Authors: Lluis Eriksson
Comments: 13 Pages.

We prove that the quadratic Gribov-Zwanziger measure on a d-dimensional periodic lattice (d ≥ 2) with gauge group SU(N_c) exhibits a mass gap, uniformly in the lattice size L. The gluon propagator at zero momentum satisfies D(0) ≤ C_{d,N_c}/g² for all L ≥ 2 and all coupling g > 0. In the thermodynamic limit, m_gap = g[(d−1) N_c I_1/d²]^{1/2}, where I_1 = ∫ d^d k/(2π)^d 1/k̂² is a finite lattice constant (I_1 ≈ 0.155 in d = 4). For SU(3) at β = 6 the predicted mass scale is m_gap ≈ 0.6 GeV, in quantitative agreement with lattice Monte Carlo measurements. The proof combines four ingredients: strict log-concavity of the measure (Bhatia's matrix inequality), dimensional reduction to a fixed finite-dimensional zero-mode sector (Prékopa's theorem), an exact computation of the effective Hessian at the origin, and a 1/N scaling argument that renders the effective potential asymptotically quadratic. No perturbative expansion in the coupling constant is employed.
Category: Quantum Physics
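The abstract's quoted 0.6 GeV can be reproduced directly from its closed-form expression. A minimal sketch, assuming the standard SU(3) Wilson-action scale a⁻¹ ≈ 2 GeV at β = 6 (that conversion factor is our assumption; the abstract states only the final figure):

```python
import math

# Numeric check of the abstract's formula
#   m_gap = g * sqrt((d-1) * N_c * I_1 / d^2)   (lattice units),
# with g fixed by beta = 2*N_c/g^2. The lattice scale a^-1 ≈ 2 GeV at
# beta = 6 is a conventional SU(3) value we assume for the conversion.
def m_gap_lattice(d: int, Nc: int, beta: float, I1: float) -> float:
    g = math.sqrt(2 * Nc / beta)
    return g * math.sqrt((d - 1) * Nc * I1 / d**2)

m_lat = m_gap_lattice(d=4, Nc=3, beta=6.0, I1=0.155)
print(f"{m_lat:.3f} lattice units, ~{m_lat * 2.0:.2f} GeV at a^-1 ≈ 2 GeV")
```

With I_1 ≈ 0.155 this gives ≈ 0.295 in lattice units, i.e. ≈ 0.59 GeV, matching the abstract's ≈ 0.6 GeV quote.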

[1185] ai.viXra.org:2602.0037 [pdf] submitted on 2026-02-08 19:04:35

Quantum-Resistant Cryptography and Its Implications for Blockchain and Cryptocurrency

Authors: Tehzeeb Ali
Comments: 14 Pages.

Modern public key cryptosystems rely on two fundamental computational hardness assumptions: integer factorization (RSA) and the discrete logarithm problem (elliptic curve cryptography). These problems, formulated using modular arithmetic and algebraic geometry, have withstood four decades of cryptanalytic attacks. However, their inherent algebraic structures and periodicity properties make them vulnerable to quantum algorithms, particularly Shor’s algorithm (1994), which achieves polynomial-time complexity on quantum computers. This research presents an extensive mathematical comparison between classical cryptographic systems and quantum-resistant alternatives, with particular emphasis on lattice-based cryptography. We focus on the Learning With Errors (LWE) problem and its variants (Ring-LWE, Module-LWE), demonstrating through rigorous mathematical analysis why these lattice problems lack the periodicity that quantum algorithms exploit. We provide formal security reductions for LWE problems relative to worst-case lattice problems and present mathematical proofs of quantum resistance. For cryptocurrency systems, this analysis reveals critical vulnerabilities: current ECDSA algorithms used for transaction signing will become cryptographically insecure within 10-30 years, potentially compromising over $100 billion in digital assets. This work bridges mathematical foundations, security analysis, and practical implications for real-world systems, providing proof-based recommendations for the transition to post-quantum cryptographic standards in blockchain technologies.
Category: Digital Signal Processing
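The LWE mechanism this abstract builds on can be illustrated in a few lines. A minimal Regev-style sketch with toy parameters of our choosing (far too small to be secure, purely for illustration): the secret s is hidden inside noisy inner products b = A·s + e, and a bit is encrypted by adding it, scaled to q/2, onto a random subset-sum of samples.

```python
import random

# Toy Regev-style LWE encryption. Parameters are illustrative assumptions,
# not drawn from the paper, and are far below any secure size; the point
# is only to show the noisy-linear-algebra structure that lacks the
# periodicity Shor's algorithm exploits.
random.seed(1)
n, m, q = 8, 20, 97                     # dimension, sample count, modulus

s = [random.randrange(q) for _ in range(n)]            # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small noise
b = [(sum(ai * si for ai, si in zip(row, s)) + ei) % q
     for row, ei in zip(A, e)]                         # LWE samples

def encrypt(bit):
    S = [i for i in range(m) if random.random() < 0.5] # random subset
    c1 = [sum(A[i][j] for i in S) % q for j in range(n)]
    c2 = (sum(b[i] for i in S) + bit * (q // 2)) % q
    return c1, c2

def decrypt(c1, c2):
    # Noise cancels up to the small subset-sum of errors (|sum e| <= m < q/4),
    # so d sits near 0 for bit 0 and near q/2 for bit 1.
    d = (c2 - sum(cj * sj for cj, sj in zip(c1, s))) % q
    return 0 if min(d, q - d) < abs(d - q // 2) else 1

assert all(decrypt(*encrypt(bit)) == bit for bit in [0, 1, 1, 0])
```

Because the accumulated error stays below q/4, decryption is exact here; security in real parameter regimes rests on the hardness reductions to worst-case lattice problems that the abstract discusses.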

[1184] ai.viXra.org:2602.0036 [pdf] submitted on 2026-02-08 13:06:34

Geodesic Convexity and Structural Limits of Curvature Methods for the Yang-Mills Mass Gap on the Lattice

Authors: Lluis Eriksson
Comments: 15 Pages.

We establish three results for the SU(N_c) lattice Yang-Mills mass gap. First, the function U → −Re Tr U is strictly geodesically convex on B_{π/2}(I) ⊂ SU(N_c), with an explicit Riemannian Hessian. Second, the orbit space B = A/G has Ricci curvature Ric_B ≥ N_c/4, giving a spectral gap λ_1(Δ_B) ≥ N_c/4 uniform in lattice size, making rigorous an argument of Mondal. Third, and most importantly, we prove that the Yang-Mills-Faddeev-Popov potential is not geodesically convex at the trivial vacuum in zero-mode directions, for any value of the coupling in d ≥ 3. This shows that convexity-based methods — Brascamp-Lieb, Bakry-Émery, Dobrushin, Prékopa — cannot establish the mass gap through the Hessian of the full potential. We argue that the physical mass gap O(e^{−c/g²}) requires the global topology of B, accessible via the Witten-Helffer-Sjöstrand framework.
Category: Quantum Physics

[1183] ai.viXra.org:2602.0035 [pdf] submitted on 2026-02-08 14:26:38

Morse-Bott Spectral Reduction and the Yang-Mills Mass Gap on the Lattice

Authors: Lluis Eriksson
Comments: 11 Pages.

We establish three results toward the SU(N_c) lattice Yang-Mills mass gap. First, the Wilson potential on the gauge orbit space B=A/G is Morse-Bott with critical manifold M_flat (the flat connections), and we derive the Born-Oppenheimer effective Hamiltonian on M_flat. Second, we prove that the Faddeev-Popov obstruction identified in Paper II applies to the path integral but not to the Hamiltonian on B: since V_pot = S_YM >= 0 has non-negative Hessian at its minimum, the Bakry-Emery framework gives an unconditional mass gap m >= c(L,N_c,d) g^2 > 0 for each fixed lattice size L. Third, we show that the physical mass gap m ~ exp(-C/g^2) follows if the spectral gap at Balaban's terminal renormalization scale is bounded below. We identify this as the single remaining step toward the Yang-Mills Millennium Problem on the lattice.
Category: Quantum Physics

[1182] ai.viXra.org:2602.0033 [pdf] submitted on 2026-02-08 15:01:30

The Yang-Mills Mass Gap on the Lattice: a Self-Contained Proof

Authors: Lluis Eriksson
Comments: 6 Pages.

We prove that SU(N_c) lattice Yang-Mills theory in d=4 dimensions with Wilson action at sufficiently weak coupling has a positive mass gap m >= c(N_c) exp(-C(N_c)/g^2) > 0 in lattice units, uniformly in the lattice size L up to the correlation length. The proof is self-contained modulo Balaban's constructive renormalization group (Comm. Math. Phys., 1984-1989) and combines three ingredients proved here: (i) a Ricci curvature bound Ric_B >= N_c/4 for the gauge orbit space, via O'Neill's submersion formula; (ii) a Holley-Stroock spectral gap estimate at Balaban's terminal renormalization scale; (iii) a transfer-matrix trace identity, with controlled errors from the non-nearest-neighbor couplings in Balaban's effective action, showing that the physical mass gap is approximately scale-invariant under the renormalization group.
Category: Quantum Physics

[1181] ai.viXra.org:2602.0032 [pdf] submitted on 2026-02-08 19:11:44

The Yang-Mills Mass Gap on the Lattice: a Self-Contained Proof Via Witten Laplacian and Constructive Renormalization

Authors: Lluis Eriksson
Comments: 10 Pages.

We prove that SU(N_c) lattice Yang-Mills theory in d=4 dimensions with Wilson action at sufficiently weak coupling has a positive mass gap m_gap >= c(N_c) exp(-C(N_c)/g^2) > 0 in lattice units, uniformly in lattice sizes L <= C_0 exp(C/g^2). The proof is self-contained modulo Balaban's constructive renormalization group and combines: (i) a Ricci curvature bound Ric_B >= N_c/4 for the gauge orbit space, treating its orbifold singularities; (ii) a Witten Laplacian semiclassical spectral gap estimate at Balaban's terminal scale, using the Morse-Bott structure of the Wilson potential with all hypotheses of the Helffer-Sjöstrand theory explicitly verified; and (iii) a transfer-matrix trace identity with controlled errors from nonlocal temporal couplings.
Category: Quantum Physics

[1180] ai.viXra.org:2602.0031 [pdf] submitted on 2026-02-08 00:43:03

Entanglement Tension and Brane Secession: A Holographic Framework for Emergent Gravity and Mass

Authors: Cesar Henriques
Comments: 22 Pages.

We present a holographic framework in which four-dimensional spacetime emerges from quantum entanglement structure encoded on a brane embedded in asymptotically AdS space [1]. Mass is not fundamental but is identified with topological entanglement complexity (knot complexity C_k), quantifying irreducible multipartite correlations [3]. Gravitational interaction emerges as the macroscopic response to gradients in entanglement tension along the holographic direction [4], reducing to Einstein's equations at low complexity with an additional non-local stress contribution from diffuse entanglement structure [5]. We propose that classical spacetime singularities signal saturation of entanglement capacity at a critical threshold C_{k,max}, analogous to the Bekenstein-Hawking bound [6, 7]. Beyond this regime, the system undergoes a topological transition, brane secession, whereby the saturated region disconnects from the parent structure and nucleates an independent spacetime [2, 8]. This provides a natural regularization mechanism: from the exterior perspective, a black hole forms; from the interior perspective, a smooth cosmological expansion emerges, unifying collapse and cosmogenesis as dual descriptions of a single entanglement reorganization. We demonstrate structural consistency through explicit toy-model calculations in finite tensor networks (Appendices A-B) and present a simplified phenomenological realization showing how diffuse entanglement tension reproduces flat galactic rotation curves without dark matter particles (Section 5.3). The framework offers a unified interpretive scheme for mass, gravity, and dark-sector phenomenology as emergent consequences of quantum correlation structure, while remaining compatible with established results in holography, semiclassical gravity, and observational cosmology [1, 2, 6].
Category: High Energy Particle Physics

[1179] ai.viXra.org:2602.0029 [pdf] submitted on 2026-02-08 00:47:19

Resolving Lepton Anomalies via Directed Dimensional Lattice Geometry

Authors: Alimi Ayomide Olamilekan
Comments: 6 Pages.

We demonstrate that the muon anomalous magnetic moment discrepancy (127 parts per billion) and the proton radius puzzle arise from a common discrete geometric origin within the Directed Dimensional Lattice (DDL) framework. By modeling spacetime as a 24-cell (F4) lattice rather than a continuous manifold, we derive the muon anomaly as a "polygon tax" from finite harmonic cycles of N = 3600 nodes. This value emerges naturally from the hierarchical partition of the 120-cell into 25 disjoint 24-cells, combined with 6-fold phase updates required by SO(4) holonomy. Concurrently, the 24-cell symmetry predicts the muonic proton radius as R_µ = R_e(1 − 1/24) = 0.841 fm. The exact integer ratio 3600/24 = 150 suggests phase-locked coupling between lepton dynamics and lattice geometry. These predictions are jointly testable via the MUonE experiment, providing definitive validation or falsification of the DDL framework without requiring beyond-Standard-Model particles.
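The radius claim reduces to one line of arithmetic. A minimal check of the quoted numbers, assuming the electronic proton charge radius R_e ≈ 0.8775 fm (a CODATA-style input value not stated in the abstract):

```python
# Check the abstract's claimed relations (illustrative; R_e is an assumed input).
R_e = 0.8775                 # fm, electronic proton charge radius (assumption, not from the abstract)
R_mu = R_e * (1 - 1 / 24)    # the 24-cell prediction quoted above
print(round(R_mu, 3))        # 0.841, matching the quoted muonic value in fm
print(3600 // 24)            # 150, the claimed integer ratio
```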
Category: High Energy Particle Physics

[1178] ai.viXra.org:2602.0028 [pdf] submitted on 2026-02-08 00:48:47

Unified Reconstruction of Cosmology Via Spacetime Hysteresis and Gravitational Entanglement (Summary in Korean)

Authors: Chang-Sik Kim
Comments: 7 Pages. Copyright © 2026 Chang-Sik Kim. All rights reserved.

Modern standard cosmology (ΛCDM) has successfully explained the composition of the universe through precision observations of the cosmic microwave background, but it now faces serious observational challenges. Dark matter particles have not been detected despite decades of searches, and the discrepancy in the cosmic expansion rate known as the Hubble Tension remains unresolved. Crucially, the massive mature galaxies at redshift z > 10 observed by the James Webb Space Telescope (JWST) cannot be explained within the 13.8-billion-year cosmic age allowed by the standard model. To overcome this crisis, this paper proposes the Spacetime Elastic Hysteresis Theory, which redefines spacetime not as a mere geometric manifold but as a physical medium possessing "memory" and "elasticity". We introduce Kim's Law (E = κψ), a constitutive equation stating that mass and gravity originate from the topological entanglement density (ψ) of the spacetime lattice. We construct an extended Lagrangian containing the entanglement scalar field ψ and derive modified Einstein field equations via the variational principle. The resulting Kim Tensor (K_μν) produces additional gravitational effects on galactic scales, explaining the flattening of galactic rotation curves without dark matter. Furthermore, reconstructing the cosmic expansion history while accounting for Elastic Redshift, an energy-dissipation effect arising from the viscoelasticity of the spacetime medium, yields a true cosmic age of 16.54 billion years (16.54 Gyr). This new timeline naturally provides the time required for the formation and evolution of the early massive galaxies observed by JWST, resolving these challenges of modern cosmology in a unified manner.
Category: Astrophysics

[1177] ai.viXra.org:2602.0027 [pdf] submitted on 2026-02-08 00:50:33

Unified Reconstruction of Cosmology Via Spacetime Hysteresis and Gravitational Entanglement [Updated 2]

Authors: Chang-Sik Kim
Comments: 7 Pages. Copyright © 2026 Chang-Sik Kim. All rights reserved.

We propose a scalar-tensor theory of gravity by redefining spacetime as a physical viscoelastic medium. Addressing the anomalies of the ΛCDM model—specifically the Hubble Tension and the early formation of massive galaxies observed by JWST—we introduce a constitutive relation, Kim's Law (E = κψ). This law postulates that the gravitational potential arises from the topological entanglement density (ψ) of the spacetime lattice. By constructing a Lagrangian density with an elastic potential term, we derive modified Einstein Field Equations. We explicitly demonstrate that the "missing mass" in galactic rotation curves is a manifestation of vacuum rigidity (K_μν) rather than non-baryonic Dark Matter. Furthermore, we incorporate an energy dissipation term (hysteresis) into the Friedmann equations, deriving a recalibrated cosmic age of 16.54 Gyr. This extended timeline resolves the conflict between standard cosmology and the existence of mature galaxies at z > 10.
Category: Quantum Gravity and String Theory

[1176] ai.viXra.org:2602.0026 [pdf] submitted on 2026-02-08 00:53:04

Golden Eigenvalues and the Tihany Conjecture for Mycielski Graphs

Authors: Vinicius F S Santos
Comments: 17 Pages. (Note by viXra Admin: Parts of the texts are cut off!)

The Erdős-Lovász Tihany Conjecture (1968) asserts that every graph G with χ(G) ≥ s + t − 1 > ω(G) admits a vertex partition into parts with chromatic numbers ≥ s and ≥ t, respectively. We prove the conjecture for the infinite family of pairs (3, k−2) on Mycielski graphs M_k for all k ≥ 5. Our approach is spectral, centred on the golden ratio φ = (1+√5)/2. The pentagon C_5, the minimal graph with χ > ω, has adjacency eigenvalues {2, φ−1, φ−1, −φ, −φ}, and this golden spectral structure propagates through the Mycielski construction: the golden ratio's defining equation μ^2 − μ − 1 = 0 arises exactly from the Mycielski eigenvalue relation (Lemma 2.1). We prove the C_5-Peeling Existence Theorem: for every M_k (k ≥ 4), a direction in the golden eigenspace peels off a C_5 from M_k. The proof is constructive via spectral interferometry: two Mycielski lift paths span a four-dimensional subspace whose layer-control matrix has det = √5, enabling independent phase steering to select a diagonal lift C_5 (the reverse cycle through alternating address layers). Computationally, the Hoffman margin of this partition is F = φ^{−3} = √5 − 2 exactly, verified for all k ≤ 12. The key advance is the Golden Sub-Induction (Theorem 1.2): for k ≥ 6, the 1/φ shadow attenuation forces the peeled C_5 into the original vertex block of M_k, so the remainder M_k − P_k contains Mycielski(M_{k−1} − P_{k−1}) as a subgraph, where (P_k)_{k≥5} is a coherent family of peeled pentagons. Since χ(Mycielski(G)) = χ(G) + 1 for any graph with an edge, this yields the inductive bound χ(M_k − P_k) ≥ k − 2. Combined with χ(C_5) = 3, this settles the Tihany conjecture for the pair (3, k−2) on M_k for every k ≥ 5, an infinite family of previously open cases.
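The quoted pentagon spectrum and golden-ratio identity are easy to verify with the standard fact that the eigenvalues of the cycle C_n are 2cos(2πk/n); a short stdlib check (nothing here depends on the paper's peeling construction):

```python
import math

# Eigenvalues of the cycle C_n are 2*cos(2*pi*k/n); for the pentagon C_5
# this gives the spectrum {2, phi-1, phi-1, -phi, -phi} quoted above.
phi = (1 + math.sqrt(5)) / 2                          # golden ratio
spectrum = sorted(2 * math.cos(2 * math.pi * k / 5) for k in range(5))
claimed = sorted([2, phi - 1, phi - 1, -phi, -phi])
print(all(abs(a - b) < 1e-12 for a, b in zip(spectrum, claimed)))  # True
print(abs(phi**2 - phi - 1) < 1e-12)                  # True: mu^2 - mu - 1 = 0 at mu = phi
```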
Category: Combinatorics and Graph Theory

[1175] ai.viXra.org:2602.0025 [pdf] submitted on 2026-02-07 18:18:35

Unified Reconstruction of Cosmology Via Spacetime Hysteresis and Gravitational Entanglement (In Korean)

Authors: Chang-Sik Kim
Comments: 21 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

Modern cosmology faces a "dual crisis": the failure to detect dark matter particles and the Hubble Tension. This research proposes a paradigm shift, redefining spacetime as an "Elastic Medium with Memory" (Hysteresis). Central to this is Kim's Law (E = κψ), which defines mass as the emergent manifestation of Spacetime Entanglement Density (ψ). By correcting for Elastic Redshift (z_elastic), the model derives a true cosmic age of 16.5 billion years, resolving the paradox of mature early galaxies observed by the James Webb Space Telescope (JWST). The theoretical framework (the Kim-Einstein equations) integrates a new elastic potential term, L_entangle (also written L_Knot), into the standard Einstein-Hilbert action.
Category: Astrophysics

[1174] ai.viXra.org:2602.0024 [pdf] submitted on 2026-02-07 18:22:12

Unified Reconstruction of Cosmology Via Spacetime Hysteresis and Gravitational Entanglement [thesis]

Authors: Chang-Sik Kim
Comments: 17 Pages. First proposed by Chang-Sik Kim. This paper introduces a novel elastic framework for spacetime to solve 12 major challenges in modern astrophysics. Copyright © 2026 Chang-Sik Kim. All rights reserved.

Modern cosmology faces a "dual crisis": the failure to detect dark matter particles and the Hubble Tension. This research proposes a paradigm shift, redefining spacetime as an "Elastic Medium with Memory" (Hysteresis). Central to this is Kim's Law (E = κψ), which defines mass as the emergent manifestation of Spacetime Entanglement Density (ψ). By correcting for Elastic Redshift (z_elastic), the model derives a true cosmic age of 16.5 billion years, resolving the paradox of mature early galaxies observed by the James Webb Space Telescope (JWST). The theoretical framework (the Kim-Einstein equations) integrates a new elastic potential term, L_entangle (also written L_Knot), into the standard Einstein-Hilbert action.
Category: Astrophysics

[1173] ai.viXra.org:2602.0023 [pdf] submitted on 2026-02-07 18:18:00

Unified Reconstruction of Cosmology Via Spacetime Elastic Hysteresis and Gravitational Entanglement

Authors: Chang-Sik Kim
Comments: 8 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

This paper proposes a paradigm shift in modern cosmology by redefining spacetime not merely as a geometric manifold, but as a physical elastic medium possessing "memory" (hysteresis). We address the critical failures of the standard ΛCDM (Lambda-CDM) model, specifically the unexplained nature of Dark Matter, the Hubble Tension, and the observational anomalies of early mature galaxies detected by the James Webb Space Telescope (JWST). We introduce Kim's Law (E = κψ), which postulates that mass is an emergent phenomenon resulting from the topological entanglement density (ψ) of the spacetime lattice. By incorporating a new stress-energy tensor, the Kim Tensor (K_μν), into Einstein's Field Equations, we demonstrate that the "missing mass" in galactic rotation curves is actually the manifestation of spacetime's elastic rigidity in low-density regions. Furthermore, we identify Elastic Redshift (z_elastic), energy dissipation due to spacetime tensile stress, as a correction factor for cosmic expansion. This correction resolves the Hubble Tension, yielding a re-calibrated cosmic age of 16.54 billion years, providing the necessary timeframe for the evolution of primordial galaxies. This theory offers a unified, dark-matter-free framework that satisfies both quantum mechanics and general relativity.
Category: Quantum Gravity and String Theory

[1172] ai.viXra.org:2602.0022 [pdf] submitted on 2026-02-07 18:23:48

Unified Reconstruction of Cosmology via Spacetime Hysteresis and Gravitational Entanglement [II in Korean]

Authors: Chang-Sik Kim
Comments: 25 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

Modern cosmology faces a "dual crisis": the failure to detect dark matter particles and the Hubble Tension. This research proposes a paradigm shift, redefining spacetime as an "Elastic Medium with Memory" (Hysteresis). Central to this is Kim's Law (E = κψ), which defines mass as the emergent manifestation of Spacetime Entanglement Density (ψ). By correcting for Elastic Redshift (z_elastic), the model derives a true cosmic age of 16.5 billion years, resolving the paradox of mature early galaxies observed by the James Webb Space Telescope (JWST). The theoretical framework (the Kim-Einstein equations) integrates a new elastic potential term, L_entangle (also written L_Knot), into the standard Einstein-Hilbert action.
Category: Astrophysics

[1171] ai.viXra.org:2602.0021 [pdf] submitted on 2026-02-07 16:49:07

Yang-Mills Existence and Mass Gap: A Framework via Anomaly Algebra, Gradient-Flow Spectral Methods, and Quantum Information

Authors: Lluís Eriksson
Comments: 34 Pages.

We present a rigorous framework for the Yang-Mills mass gap problem, combining three independent lines of argument. Result A (Unconditional). A new MaxEnt Clustering-Recovery Bridge: for any lattice gauge state with finite correlation length xi, the Petz recovery fidelity satisfies 1-F <= C e^{-r/xi}. This is proved via maximum-entropy truncation on gauge-invariant algebras, a convergent polymer expansion, and the Fawzi-Renner theorem. Result B (Unconditional on the lattice, conditional for all couplings). For SU(N) lattice gauge theory (T=0, theta=0, d=3+1, N>=2): the algebraic phase exclusion, using the projective commutation relation of 1-form symmetry operators, unconditionally excludes the trivially gapped symmetric phase. Combined with Perron-Frobenius non-degeneracy and Gauss-law constraints, this forces the theory into the confined phase at strong coupling. The extension to all couplings relies on a single hypothesis: the absence of a bulk phase transition. Under this hypothesis, the uniform lattice mass gap Delta >= m_0 > 0 holds for all lattice spacings. Result C (Conditional). Under the same hypothesis, the continuum limit of SU(N) Yang-Mills theory in d=3+1 exists as a Euclidean QFT satisfying all Osterwalder-Schrader axioms (OS0-OS4), with exponential clustering rate m_0 > 0 (mass gap). Result D (Gradient Flow Reduction). Independently of the anomaly argument, we prove that the mass gap in d=4 is equivalent to a concrete spectral condition on the gradient flow beta-function being strictly negative for all g > 0, combined with a Tauberian regularity condition. The proof architecture uses three main tools: (1) the algebraic structure of higher-form symmetry anomalies on the lattice, (2) backward error analysis of the lattice gradient flow combined with a new spectral calibration, and (3) the MaxEnt bridge from quantum information theory.
Exact diagonalisation of Z_2 lattice gauge theory on lattices up to 12 qubits and Z_3 clock models confirms all quantum-information predictions of the framework. This paper contains one explicitly stated hypothesis (absence of bulk phase transition) that is not proven. All conditional results are clearly marked.
Category: Quantum Physics

[1170] ai.viXra.org:2602.0020 [pdf] submitted on 2026-02-07 16:50:59

Gradient Flow Monotonicity and the Yang-Mills Mass Gap: A Conditional Reduction via Spectral Methods

Authors: Lluis Eriksson
Comments: 18 Pages.

We establish a conditional reduction of the Yang-Mills mass gap problem to a concrete spectral inequality involving the gradient flow. For pure SU(N) Yang-Mills theory, if the gradient flow beta-function satisfies a uniform strict asymptotic freedom condition |beta_{GF}(g)| >= delta g^3 for large g, and a Tauberian regularity condition holds for the spectral density, then: in d=3, the theory has a mass gap Delta > 0; in d=4, the infrared trace anomaly vanishes (a_{IR}=0), ruling out a conformal infrared fixed point, and reducing the mass gap to explicit spectral conditions. However, the spectral argument is marginal in d=4 and requires additional non-perturbative input. The proof uses three ingredients: (1) a spectral representation of the gradient flow energy E(t) and a monotonicity identity R'(t) = -2 Var_t(lambda) <= 0 for the ratio R(t) = F(t)/E(t); (2) the Komargodski-Schwimmer a-theorem constraining the IR behaviour; and (3) a gradient flow Poincaré inequality connecting functional inequalities to exponential clustering of correlators. We verify all perturbative inputs: the free-field calibration gives R_{free}(t) = 2/t in d=4, and the one-loop correction has the correct sign (R(t) < 2/t for g > 0). We identify the non-perturbative obstruction (the indefiniteness of the Weitzenböck curvature term) as the precise technical barrier to closing the argument in d=4. This paper is a companion to the author's paper on anomaly algebra and quantum information methods for the mass gap. The two approaches are complementary and independent.
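The variance identity R'(t) = -2 Var_t(lambda) is a general fact about ratios of Laplace-type spectral sums and can be checked on a toy discrete spectral measure. In the sketch below, the eigenvalues and weights are our own illustrative choices (not taken from the paper), with E(t) = sum_i w_i e^{-2 lambda_i t} and F(t) = sum_i w_i lambda_i e^{-2 lambda_i t} assumed as the spectral representation; R = F/E is then the mean of lambda under the tilted weights, and a finite difference confirms the identity:

```python
import math

# Toy spectral data (illustrative choices, not from the paper).
lam = [0.5, 1.0, 2.0, 3.5]      # eigenvalues lambda_i
w   = [1.0, 0.7, 0.4, 0.2]      # spectral weights w_i

def R(t):
    # R(t) = F(t)/E(t) with E = sum w*exp(-2*lam*t), F = sum w*lam*exp(-2*lam*t)
    p = [wi * math.exp(-2 * li * t) for wi, li in zip(w, lam)]
    return sum(pi * li for pi, li in zip(p, lam)) / sum(p)

t, h = 0.8, 1e-6
# Variance of lambda under the tilted weights w_i * exp(-2*lam_i*t):
p = [wi * math.exp(-2 * li * t) for wi, li in zip(w, lam)]
Z = sum(p)
mean = sum(pi * li for pi, li in zip(p, lam)) / Z
var = sum(pi * li**2 for pi, li in zip(p, lam)) / Z - mean**2

dR = (R(t + h) - R(t - h)) / (2 * h)    # numerical derivative of R
print(abs(dR + 2 * var) < 1e-6)         # True: R'(t) = -2 * Var_t(lambda)
```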
Category: Quantum Physics

[1169] ai.viXra.org:2602.0019 [pdf] submitted on 2026-02-07 18:25:11

Unified Reconstruction of Cosmology Via Spacetime Hysteresis and Gravitational Entanglement [summary]

Authors: Chang-Sik Kim
Comments: 7 Pages.

Contemporary cosmology is grappling with a fundamental crisis: the persistent failure to detect Dark Matter particles and the escalating "Hubble Tension." Recent high-redshift observations from the James Webb Space Telescope (JWST) have revealed massive, mature galaxies that defy the 13.8-billion-year timeline of the standard Lambda-CDM model. This dissertation proposes a revolutionary paradigm shift by redefining the vacuum of spacetime as an "Elastic Medium with Memory" (Spacetime Hysteresis). We introduce Kim's Law (E = κψ), which identifies the origin of mass-energy as the topological entanglement density (ψ) of the spacetime lattice. By incorporating a new stress-energy tensor, the Kim Tensor, into Einstein's Field Equations, we demonstrate that galactic rotation curves and gravitational lensing anomalies are manifestations of spacetime's emergent rigidity, eliminating the need for hypothetical dark matter particles. Furthermore, we identify "Elastic Redshift", energy dissipation within the strained spacetime medium, as a hidden variable in cosmic expansion. Correcting for this factor resolves the Hubble Tension and establishes a recalibrated cosmic age of 16.54 billion years. This theory provides the necessary temporal window for the evolution of early massive galaxies and offers a unified, dark-matter-free framework satisfying both quantum logic and general relativistic principles.
Category: Astrophysics

[1168] ai.viXra.org:2602.0018 [pdf] submitted on 2026-02-06 09:22:11

The Invisible War: Defending the Human Brain Against High-Engagement Psychological Warfare

Authors: Steven Hammon
Comments: 42 Pages.

The modern media ecosystem, driven primarily by profit-first engagement algorithms, has created significant problems in information distribution. This paper argues that the public is subjected to pervasive psychological manipulation, primarily through the exploitation of fear, hate, and polarization, with no practical way to opt out while remaining informed. This has cascading and concerning consequences, including measurable harm to the mental well-being of children, the fragmentation of social cohesion, and the systemic erosion of the democratic process. It also leaves blanket vilification of manipulative content and misinformation as the only available response. The paper examines the unrealistic expectation that individuals should defend themselves against multi-billion-dollar manipulation engines. Censorship infringes on free speech and often fails to address the root cause. Instead, this paper proposes the implementation of an Ethical Journalism Standard (EJS), modeled on the MPAA film rating system, which has satisfactorily balanced free speech with the need to protect children and society since 1968. The EJS would not ban or suppress default feed content. It would label journalism that adheres to the established Journalist Code of Ethics, allowing readers to opt into Ethical Journalism content much as parents select G-rated programming. This empowers the public's right to access factual information, to make informed choices about the information they consume and share, and to give informed consent about the future of their country, while still giving manipulation a place to be celebrated as a skill. By elevating credible, ethical content, society can foster a healthier information environment, safeguard children, and restore the confidence in governments and media that is essential for a functioning democracy, all while embracing the right to free speech.
Category: Social Science

[1167] ai.viXra.org:2602.0017 [pdf] submitted on 2026-02-06 02:25:39

Entropy-Inertial Curvature: A Relational Hypothesis with Cosmological Consistency

Authors: Geza Kovacs
Comments: 2 Pages.

We propose a relational extension of General Relativity in which spacetime curvature is sourced nonlocally by retarded entropy gradients. Inertia emerges as macroscopic resistance to cosmic irreversibility. No new fundamental fields or free parameters are introduced; the sole scale is the cosmological horizon (ℓ_n ≈ c/H_0). Post-recombination entropy production yields a derived amplification factor φ_0 ≈ 10^10 ± 20%, providing a thermodynamic explanation for galactic rotation curves (projected χ²/dof ≈ 1.1 from Gaia DR3 analogs and mocks for DR4) via retarded entropy gradients that induce effective inertial screening at low accelerations, and for the H_0 tension via a dynamical, structure-dependent effective cosmological constant that boosts late-time expansion relative to early-universe CMB-calibrated values. The mechanism is naturally suppressed in the early universe (~10^{-30} at BBN), consistent with the Weyl Curvature Hypothesis. We introduce a "Curvature Memory" effect to explain cluster lensing offsets and predict frequency-dependent GW phase skews (~0.4 ms) testable by LISA.
Category: Relativity and Cosmology

[1166] ai.viXra.org:2602.0016 [pdf] submitted on 2026-02-05 01:06:44

The Quantum Space Mechanism - The Origin of Forces

Authors: Joseph Koharski
Comments: 66 Pages.

This document contains a compilation of five research papers detailing the Quantum Space Mechanism (QSM). These papers propose a unified framework where Inertia, Gravity, and Time emerge from the hydrodynamics of a viscous, dilatant vacuum substrate (the Higgs field). The series covers: (I) The Entropic Origin of Inertia and the Bridge Equation; (II) The Vacuum Yield Point and the Origin of Gravity; (III) The Geometry of Mass and Particle Generations via Finslerian Angles of Attack; (IV) Macroscopic Dynamics, Dark Matter as Metric Expansion, and Electromagnetism; and (V) The Origin of Time as Viscous Dissipation.
Category: Quantum Gravity and String Theory

[1165] ai.viXra.org:2602.0015 [pdf] submitted on 2026-02-04 21:19:15

On the Existence of Destiny: A Demonstration from Spacetime Physics

Authors: Manuel Alejandro Hernández Madan
Comments: 5 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references!)

We demonstrate that the concept of destiny possesses a rigorous mathematical foundation in spacetime physics. By analyzing the geometric structure of worldlines in Minkowski spacetime, we prove that destiny, defined as the endpoint of an observer's worldline, exists with the same ontological status as the observer's birth (the worldline's starting point). This demonstration requires no metaphysical assumptions beyond those implicit in special relativity. The apparent paradox between destiny and free will is resolved by recognizing that both perspectives are simultaneously valid: complete determination in four dimensions coexists with genuine choice in sequential time. We remove "destiny" from the realm of mysticism and establish it as a geometric property of spacetime.
Category: History and Philosophy of Physics

[1164] ai.viXra.org:2602.0014 [pdf] submitted on 2026-02-04 21:12:52

A Scalar Product Approach to Strong Goldbach Conjecture

Authors: Ezadiin Redwan
Comments: 4 Pages.

We present a universal proof of the Strong Goldbach Conjecture by shifting the problem from arithmetic density to Topological Symmetry. By defining primes as the deterministic "parent set" via the Fundamental Theorem of Arithmetic, we map the interaction between addition and multiplication onto a vector space. We prove that the identity 2n cos(θ) = a + b is a structural requirement of this space. This non-constructive existence proof demonstrates that for every even integer 2n, a prime partition (a, b) is geometrically necessitated by the scalar projection of prime-based vectors, thereby resolving the parity problem through architectural determinism.
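The existence claim can at least be sanity-checked by brute force for small even numbers. The sketch below simply searches for a prime partition of each even 2n; it assumes nothing from the paper's vector-space argument:

```python
def is_prime(n):
    # Trial division; adequate for small n.
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def goldbach_partition(even_n):
    """Return a prime pair (a, b) with a + b = even_n, or None if none exists."""
    for a in range(2, even_n // 2 + 1):
        if is_prime(a) and is_prime(even_n - a):
            return a, even_n - a
    return None

# Every even number in 4..10000 has a prime partition in this range.
print(all(goldbach_partition(m) is not None for m in range(4, 10001, 2)))  # True
print(goldbach_partition(100))  # (3, 97)
```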
Category: Number Theory

[1163] ai.viXra.org:2602.0013 [pdf] submitted on 2026-02-04 13:36:09

Towards a Local Minimum Time Resolution in Curved Spacetime

Authors: Priyanshu Rauth
Comments: 3 Pages.

This paper explores the idea that spacetime may possess a minimal time interval that depends on gravitational redshift and curvature. Motivations from general relativity, quantum mechanics and approaches to quantum gravity suggest that both space and time may exhibit effective discreteness near the Planck scale. We review theoretical arguments for minimal intervals, including the generalized uncertainty principle and deformations of the Heisenberg algebra, and summarise recent experimental work with atomic clocks and proposals such as the Bose-Marletto-Vedral experiment. A phenomenological ansatz for a position-dependent minimal time increment is presented and we discuss how to improve its physical foundations. The aim is not to propose a theory of everything but to offer a conservative, focused framework that could guide future experiments.
Category: Relativity and Cosmology

[1162] ai.viXra.org:2602.0012 [pdf] submitted on 2026-02-03 20:03:32

Foundations of Physics: The Law of Quantized Reorganization Rate and the Theory of Euclidean Quantum Reorganization

Authors: Christian B. Mueller
Comments: 2 Pages. (Note by viXra Admin: Please cite and list scientific references)

This paper presents a novel framework for fundamental physics that replaces the geometric curvature of spacetime with a resource-based conservation law. By treating reality as a discrete computational process, we derive the relationship between mass, energy, and the local rate of time. The model progresses from a 3D-accounting perspective (LQRR) to a 4D-Euclidean resonance framework (TEQR), providing a unified explanation for gravity, time dilation, and projection-based phenomena often attributed to Dark Matter.
Category: Relativity and Cosmology

[1161] ai.viXra.org:2602.0011 [pdf] submitted on 2026-02-03 09:54:47

Modular Cells for Quantification

Authors: Alaya Kouki
Comments: 44 Pages.

From Restraint Relativity it is possible to consider a corpuscle as a packet of strings. The variation of the length of this packet is equal to the phase speed of the corpuscle as a packet of waves times a universal constant. In a system of units where ℏ = c = a = 1, the string vector becomes equal to the wave vector. From quantum mechanics we deduce that a Planck oscillator can emit or absorb power only in quanta of power that are integer multiples of hν^2. This allows us to divide space-time into modular cells of action, momentum, energy, etc. in the phase-space geometry of the oscillator and to resolve the problem of the vacuum energy density. The Planck system of units serves only for black-hole topology. Space-time has four open dimensions; any further dimensions should be curled. Resolving the gravitational field by computer with a model using modular cells will be free of singularities. Einstein's equations of the gravitational field are valid only in a quasi-static, asymptotically flat Universe. The constants G and Λ of General Relativity are proportional to the inverse square of the Universe's radius.
Category: Quantum Gravity and String Theory

[1160] ai.viXra.org:2602.0010 [pdf] submitted on 2026-02-03 19:55:50

Interconnected Infinities Giant Sphere Space - IIGSS

Authors: Mohammad Saeed Alnatour
Comments: 172 Pages. (Note by ai.viXra.org Admin: For the last time, author name is required in the article after article title and please cite listed scientific references)

Divergent series and singular integrals arise naturally in analysis, geometry, and theoretical physics, yet their standard treatment relies on analytic continuation or limit-based regularization. While these methods successfully assign finite values, they necessarily suppress information about how infinity is traversed. This work proposes a structural framework, Interconnected Infinities Giant Sphere Space (IIGSS), together with an intrinsic regulator, the Discrete Laplace Regulator (DLR), in which divergence is treated as a boundary phenomenon rather than a failure of summation. DLR operates directly on discrete sequences by introducing controlled exponential damping and expanding the resulting kernel at a well-defined infinite-traversal gate. Divergence appears explicitly as algebraic pole terms or logarithmic singularities in the gate expansion, encoding growth class and traversal density, while a pole-invariant constant, the Convergence Momentum (CM), emerges as a finite structural quantity. Valuation is performed exclusively through gate expansion followed by pole removal, without index shifting, limit evaluation, or analytic continuation. Within this framework, classical zeta and Dirichlet regularizations are recovered as special projections under standard traversal, while traversal-sensitive features, such as zero insertion, spacing modulation, and phase structure, remain distinguishable. The framework accommodates finite-gate and oscillatory sequences and clarifies the limitations of reconstruction from regularized values alone. In physical applications, CM functions as a retained boundary invariant: when applied to spectral mode sums, such as those appearing in the Casimir effect, the regulator preserves observable finite quantities while rendering the underlying divergence structure explicit. DLR thus provides a higher-resolution language for infinity, preserving established results while exposing structural information necessarily omitted by classical methods.
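The recovery of classical zeta regularization under exponential damping can be illustrated on the textbook example 1 + 2 + 3 + ... (our worked example; the abstract does not single out this sum): damping by e^{-εn} makes the divergence explicit as a 1/ε² pole, and removing that pole leaves the finite constant ζ(-1) = -1/12.

```python
import math

# Exponential damping of the divergent sum 1 + 2 + 3 + ...:
#   S(eps) = sum_{n>=1} n*exp(-eps*n) = 1/eps^2 - 1/12 + eps^2/240 - ...
# The divergence appears as an explicit 1/eps^2 pole; removing it
# (pole removal, in the abstract's terminology) leaves a finite constant
# matching the zeta-regularized value zeta(-1) = -1/12.
def damped_sum(eps, terms=20_000):
    return sum(n * math.exp(-eps * n) for n in range(1, terms))

eps = 0.01
finite_part = damped_sum(eps) - 1 / eps**2
print(abs(finite_part - (-1 / 12)) < 1e-4)   # True
```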
Category: Functions and Analysis

[1159] ai.viXra.org:2602.0009 [pdf] submitted on 2026-02-03 19:44:56

Inverse Frequency Duality of Spacetime

Authors: Cameron Brogan-Higgins
Comments: 2 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

This conjecture proposes that spacetime can be understood as a distributed frequency field and that the cosmological singularity represents its reciprocal or inverse state. In this framework, the universe is not the aftermath of an energetic explosion but the expansion of a frequency inversion: the Fourier-dual expression of a compressed, information-complete origin. Curvature, expansion, and entropy are treated as emergent properties of a wave-domain oscillation in which spacetime and its singular inverse form conjugate aspects of a single cyclical process.
Category: Relativity and Cosmology

[1158] ai.viXra.org:2602.0008 [pdf] submitted on 2026-02-03 19:43:20

Emergent Quantum Mechanics, Special Relativity, Induced Gravity, Dirac Fermions and U(1) Gauge Interactions from Discrete Hypercubic Spacetime Lattice

Authors: Bertrand Jarry
Comments: 7 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite listed scientific references)

We present a complete bottom-up derivation of non-relativistic quantum mechanics, special relativity, induced gravity (Sakharov mechanism), Dirac fermions, and U(1) gauge interactions from nearest-neighbor unitary evolution on a 4D hypercubic lattice. The Schrödinger equation emerges exactly in the continuum limit, with higher-order Lorentz-violating corrections. We rigorously prove the Heisenberg uncertainty principle, quantum superposition, entanglement, and probability conservation as direct consequences of the discrete structure. Detailed calculations include Taylor expansions to order 10, Fourier dispersion to k^10, the exact commutator [x, p], variance, and norm conservation to order 8; the resulting scale of 10^16 GeV is compatible with the latest LHAASO constraints on GRB 221009A (2025—2026). This work forms the foundation for emergent SU(2)×SU(3), Higgs, cosmology, and quantum gravity in subsequent papers.
Category: Quantum Gravity and String Theory

[1157] ai.viXra.org:2602.0007 [pdf] submitted on 2026-02-02 19:31:23

Rigidity-induced Scaling Laws in Unit Distance Graphs

Authors: Lucas Aloisio
Comments: 5 Pages.

We revisit the classical Unit Distance Problem posed by Erdős in 1946. While the upper bound of O(n^{4/3}) established by Spencer, Szemerédi, and Trotter (1984) is tight for systems of pseudo-circles, it fails to account for the algebraic rigidity inherent to the Euclidean metric. By integrating structural rigidity decomposition with the theory of Cayley-Menger varieties, we demonstrate that unit distance graphs exceeding a critical density must contain rigid bipartite subgraphs. We prove a "Flatness Lemma," supported by symbolic computation of the elimination ideal, showing that the configuration variety of a unit-distance K_{3,3} (and by extension K_{4,4}) in R^2 is algebraically singular and collapses to a lower-dimensional locus. This dimensional reduction precludes the existence of the amorphous, high-incidence structures required to sustain the n^{4/3} scaling, effectively improving the upper bound for non-degenerate Euclidean configurations.
Category: Combinatorics and Graph Theory
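The quantity under study—how many unit-distance pairs a planar point set can realize—is easy to probe computationally. The brute-force counter below illustrates the problem statement only, not the paper's rigidity argument; on an integer grid the squared-distance test is exact, and unit pairs are precisely the axis-aligned adjacencies.

```python
from itertools import combinations

def unit_pairs(points):
    """Count pairs of points at Euclidean distance exactly 1.
    Integer coordinates keep the squared-distance test exact."""
    return sum(
        1
        for (x1, y1), (x2, y2) in combinations(points, 2)
        if (x1 - x2) ** 2 + (y1 - y2) ** 2 == 1
    )

grid = [(x, y) for x in range(4) for y in range(4)]
print(unit_pairs(grid))  # 4x4 grid: 12 horizontal + 12 vertical = 24 pairs
```

Denser constructions (e.g. suitably scaled lattices, as in Erdős's lower bound) push this count superlinearly in the number of points.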

[1156] ai.viXra.org:2602.0006 [pdf] submitted on 2026-02-02 19:03:23

The Fractal Planck Voxel Model: A Geometric Unification of General Relativity, Quantum Mechanics, the Standard Model, and Consciousness

Authors: Benjamin R. Carignan
Comments: 59 Pages.

In 1899, Max Planck introduced the fundamental constants that now bear his name, combining the gravitational constant G, the speed of light c, and the reduced Planck's constant ħ to define natural units of length, time, and mass. The Planck length and Planck mass emerged as the scales at which quantum effects of gravity should become dominant — a transition zone where classical general relativity (GR) and quantum mechanics (QM) intersect. Planck himself viewed these units as theoretical curiosities, unaware that they hinted at a discrete structure underlying spacetime. We propose that spacetime at the Planck scale is composed of a grid — a lattice — of rhombic dodecahedral voxels with fractal boundaries. The unique properties of this shape impart it with the specific qualities and characteristics of the observed universe through emergent geometric rules. Through this single geometric structure, the Fractal Planck Voxel model (FPV) fully realizes the transition zone that Planck’s work hinted at over 125 years ago. The FPV model derives general relativity, quantum mechanics, the Standard Model gauge groups, three generations, particle masses, and mixing angles, all from voxel symmetry and triangular modes on rhombic faces, with the cosmological constant serving as the energy of fractal subdivision. The model also addresses numerous longstanding problems and questions in physics — from the collapse of the wave function to neutrino masses, dark matter, dark energy, the origin of the Higgs, and more. The fact that the FPV model does all of this while remaining consistent with all current observations — without additional fields, particles, dimensions, or fine-tuned parameters — lends strong credence to its validity as a unifying theory.
Category: Quantum Gravity and String Theory

[1155] ai.viXra.org:2602.0005 [pdf] submitted on 2026-02-02 19:16:34

Framework of a Unified Effective Field Theory for Gravity Based on Atomic-Scale Quantum Entanglement Networks

Authors: Yunsheng Shu, Shun Yao, Zhiyong Yao
Comments: 54 Pages.

This paper proposes a unified effective field theory (EFT) framework for gravity based on atomic-scale quantum entanglement networks. In this framework, the gravitational field of any object—universal gravitation—arises from the superposition of micro-gravity domains (ffGD) at the atomic level, with the strength of gravity precisely related to the total number of atoms, thereby unifying macroscopic phenomena with microscopic interpretations (the superposition of micro-gravity domains (ffGD) at the atomic level forms a gradient material field, and the gradient material field constitutes the geodesic of matter (planets)). Einstein’s theory of curved spacetime serves as an equivalent description at large scales, enabling the model to explain all classical gravitational effects while linking quantum origins to observable reality in an intuitive and engaging way: gravity can be imagined not as abstract curvature, but as the collective "pull" produced when atomic quantum threads are interwoven into the structure of space. The core of the framework is a topology-constrained effective field theory, which uses the emergent distance induced by mutual information and the Fisher information metric as the interface from discrete entanglement networks to continuous geometry, and has constrainable low-energy parameterization (e.g., ffff, ffff, cff, Λff). This paper emphasizes the paradigm of "falsifiability first", integrating strong-field endpoints (such as the effective reflection model of Planck entanglement nucleus boundary conditions) with weak-field, cosmological, and multi-messenger observations into a multi-channel decision matrix, systematically converting "non-detection" results into upper bounds on parameter space.
Category: Quantum Gravity and String Theory

[1154] ai.viXra.org:2602.0003 [pdf] submitted on 2026-02-01 01:02:37

Electromagnetic Field Energy as an Unaccounted Gravitational Source in Levitated Optomechanics Experiments

Authors: Joseph Wimsatt
Comments: 4 Pages. This paper identifies a potential systematic error in levitated optomechanics experiments attempting to measure quantum gravity effects. We show that electromagnetic field energy in optical traps gravitates through the stress-energy tensor

Recent advances in levitated optomechanics have enabled experiments probing gravitational interactions at unprecedented scales, with the goal of detecting quantum signatures of gravity. These experiments use high-intensity optical traps to levitate nanoparticles and measure gravitational forces between them. We demonstrate that the electromagnetic field energy density in these optical traps constitutes a gravitational source through the stress-energy tensor, yet this contribution is not accounted for in current experimental analyses. Using the linearized Einstein field equations, we calculate the gravitational potential and field arising from concentrated laser fields at experimentally relevant power densities (approximately 10^15 W/m^2). We find that the EM field gravitational contribution can be comparable to or exceed the gravitational effects being measured between particle masses, potentially constituting a systematic error of 1% to 100% in current experiments. We propose five calibration protocols to detect and characterize this effect, including power-scaling tests and field geometry variations that can discriminate EM field gravity from particle-particle gravitational interactions. If unaccounted for, this effect could compromise the interpretation of experiments seeking quantum signatures of gravity.
Category: Relativity and Cosmology

[1153] ai.viXra.org:2602.0002 [pdf] submitted on 2026-02-02 02:05:03

From Holographic Complexity to Emergent Spacetime: A Unified Framework for Early Universe Structure Formation via TSVF and Quantum Error Correction

Authors: Hassan Dawood Salman
Comments: 6 Pages. For numerical validation, see DOI 10.5281/zenodo.18382581

Recent observations by the James Webb Space Telescope (JWST) of massive galaxies at z > 10 reveal a profound tension with the standard ΛCDM structure formation paradigm. We propose a resolution rooted in Two-Boundary Quantum Cosmology (TBQC), where spacetime geometry emerges as a Holographic Quantum Error Correcting Code (HQECC). We formalize the selection of cosmic history using the Two-State Vector Formalism (TSVF), introducing a "Process Matrix" W_eff that acts as a teleological filter, selecting histories that maximize computational efficiency (δAC = 0) without violating causality. This mechanism effectively lowers the critical density threshold for gravitational collapse, achieving spectacular agreement with JWST stellar mass and UV luminosity functions (χ²/dof ≈ 0.06). Our model exhibits surgical precision: active only at z > 10 while preserving all late-time constraints. We predict a unique, falsifiable 1500% enhancement in the 21cm power spectrum at k∗ ≈ 1 Mpc⁻¹, detectable by HERA Phase II (2025—2027) with > 100σ confidence.
Category: Astrophysics

[1152] ai.viXra.org:2602.0001 [pdf] submitted on 2026-02-01 00:02:58

Gravitational Rheology of a Unified Dark Sector Anisotropic Lensing and Stress-Driven Jets Near Black Hole Horizons

Authors: Eduardo Rodolfo Borrego Moreno
Comments: 22 Pages.

We extend an effective rheological description of a unified dark sector into the strong-field regime of rotating black holes within standard general relativity. Building on prior work where dark energy--like and dark matter--like phenomena emerge as distinct dynamical phases of a single residual medium, we examine the role of anisotropic stress and dissipation in gravitational optics and near-horizon dynamics. We show that stress gradients in the activated rheological phase contribute directly to the gravitational optical potential, yielding lensing without additional collisionless mass components. In the near-horizon region, convergent flows and stress amplification drive a phenomenological conversion regime that preserves total energy--momentum conservation and produces relativistic outflows. Toy estimates demonstrate that this mechanism can account for observed jet powers in low-accretion systems such as M87* ($\sim 10^{42}-10^{44}$ erg s$^{-1}$), predicting lepton-dominated outflows with elevated linear polarization fractions ($\Pi_{\rm lin} \sim 15\%-35\%$), radial/poloidal morphology, and axis-aligned stability. These signatures are compatible with current EHT observations yet distinguishable from magnetically dominated models such as Blandford--Znajek. The framework provides a unified, testable description of dark phenomena across scales as phases of a single effective residual medium, without modifying Einstein gravity or introducing new degrees of freedom.
Category: Astrophysics

[1151] ai.viXra.org:2601.0120 [pdf] submitted on 2026-01-30 16:30:00

[Exploration/Speculation] On Representing Natural Numbers as Differences of Two Distinct Prime Powers

Authors: Anish Sola
Comments: 6 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

We study representations of integers as differences of prime powers, n = p^a − q^b, with distinct prime bases p ≠ q and distinct exponents a ≠ b. We focus on the positive-exponent setting (a, b ≥ 1) and on the proper-prime-power variant (a, b ≥ 2), for which the problem is closer in spirit to Goldbach- and Pillai-type questions. We prove elementary structural constraints (notably parity restrictions), propose first-moment heuristics, and outline a computational program.
Category: Number Theory
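The computational program the abstract outlines can begin with a direct search. The sketch below is illustrative only; the search bounds and helper names are arbitrary choices, not the authors'. It enumerates representations n = p^a − q^b with distinct prime bases and distinct exponents, e.g. 7 = 2^4 − 3^2 = 16 − 9.

```python
def is_prime(m):
    """Trial-division primality test, adequate for a small search."""
    if m < 2:
        return False
    for d in range(2, int(m ** 0.5) + 1):
        if m % d == 0:
            return False
    return True

def representations(n, base_bound=100, exp_bound=10):
    """All (p, a, q, b) with n = p**a - q**b, p != q prime, a != b, a, b >= 1."""
    primes = [p for p in range(2, base_bound) if is_prime(p)]
    reps = []
    for p in primes:
        for a in range(1, exp_bound + 1):
            for q in primes:
                if q == p:
                    continue
                for b in range(1, exp_bound + 1):
                    if b != a and p ** a - q ** b == n:
                        reps.append((p, a, q, b))
    return reps

print(representations(7))  # includes (2, 4, 3, 2) since 16 - 9 = 7
```

Restricting the exponent ranges to a, b ≥ 2 gives the proper-prime-power variant mentioned in the abstract.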

[1150] ai.viXra.org:2601.0119 [pdf] submitted on 2026-01-30 04:04:31

The Fractal Substrate Equivalence Principle: A Unified Foundation for Quantum Mechanics and General Relativity

Authors: Steven E. Elliott
Comments: 14 Pages.

We introduce the Fractal Substrate Equivalence Principle (FSEP) as the foundational axiom for a unified theory of physics, asserting that in a fractal universe governed by magneto-hydrodynamics (MHD), physical laws, structures, and phenomena are exactly equivalent across all scales upon appropriate scaling transformations. Unlike prior fractal cosmologies that treat general relativity (GR) or quantum mechanics (QM) as fundamental, the FSEP posits these as emergent scale-dependent descriptions of a single electric fluid dynamics. This principle unequivocally asserts: stars are photons, black holes are atomic nuclei, and dark matter is electron orbital shells—not as analogies, but as exact physical identities across fractal layers. We demonstrate how this single MHD substrate reproduces the essential features of QM and GR, and explains the origin of dark matter, black hole—galaxy correlations, and the fine structure of atomic spectra, within a unified electric—fluid picture.
Category: Mathematical Physics

[1149] ai.viXra.org:2601.0118 [pdf] submitted on 2026-01-29 00:36:48

Emergence of General Relativity and Quantum Field Theory from Anisotropic Flux Suppression

Authors: Frederick Manfredi
Comments: 74 Pages.

We propose that the universe emerges from a single classical anisotropic field—the "soup"—governed by a simple suppression rule favoring radial motion and penalizing perpendicular deviations: S(θ) = (1/ϕ^6) sin^4 θ, with density feedback S_eff(θ, ρ) = S(θ)(1 + βρ). Mass, gravity, time dilation, electromagnetism, quantum statistics, entanglement, and the observer effect arise as emergent behaviors from this field’s interactions across scales and densities. No separate quantum fields, gravitons, dark energy, or wavefunction collapse are invoked; apparent quantum weirdness and relativistic gravity are higher-order effects of the same deterministic dynamics. The resulting testable deviations distinguish the anisotropic soup picture from standard GR and QFT, offering a unified classical mechanism for observed physics. Within the approximations of this model, the Semi-Dirac soup reproduces quantum Bell correlations closely matching experiment via suppression nonlinearity (Appendix A) and the full Einstein field equations—including Schwarzschild/Kerr metrics, gravitational waves, horizons, and black-hole thermodynamics—as a high-density emergent limit (Appendix D.6). It also recovers key condensed-matter phenomena (e.g., semi-Dirac fermions in ZrSiS).
Category: Quantum Gravity and String Theory

[1148] ai.viXra.org:2601.0117 [pdf] submitted on 2026-01-29 18:48:52

Geodesics of Meaning: Modeling Semantic Curvature in Transformer-Based Language Models via General Relativity

Authors: Travis Shane Taylor
Comments: 37 Pages.

We present a general relativistic framework for modeling transformer-based language models (LLMs) as nonlinear dynamical systems evolving on curved semantic manifolds. Standard transformer architectures are shown to approximate a flat Minkowski spacetime, where attention mechanisms define a local semantic metric tensor. We extend this formulation by introducing curved metrics—specifically the Schwarzschild and Friedmann—Lemaître—Robertson—Walker (FLRW) solutions—to model context-sensitive meaning, narrative curvature, and long-range semantic dependencies. A stress-energy tensor encodes topical mass, tonal flow, and tension, driving semantic curvature via Einstein’s field equations. We validate this framework using both simplified language simulations and full narrative data, showing that Ricci curvature serves as a physically interpretable measure of coherence, complexity, and twist. This work bridges differential geometry, nonlinear systems, and AI interpretability, offering a new paradigm for analyzing and guiding large language model behavior.
Category: Artificial Intelligence

[1147] ai.viXra.org:2601.0116 [pdf] submitted on 2026-01-28 01:05:10

Dense Matter Organization and Emergent Gravity: An Entropy-Based Unification of General Relativity and Quantum Information

Authors: Mat Ward
Comments: 33 Pages.

We present a conceptual framework linking gravitational curvature to entropy gradients arising from dense matter organization, grounded in recent theoretical advances in entropic gravity (Bianconi, 2025) and validated against observational data from LIGO gravitational-wave events. Our central hypothesis posits that as matter concentrates at extreme densities, the relative entropy between the quantum information encoded in matter fields and the ambient spacetime metric increases sharply, driving emergent gravitational curvature through an entropy-minimization principle. Using public gravitational-wave data from GW150914 (binary black holes) and GW170817 (binary neutron stars), we demonstrate that horizon entropy increases correlate with matter density compression, supporting an informational interpretation of gravity rather than a purely geometric one. This work bridges general relativity, quantum mechanics, and thermodynamics, suggesting gravity may be an emergent phenomenon tied to quantum information organization in matter-spacetime systems.
Keywords: entropic gravity, quantum information, black holes, neutron stars, gravitational waves, emergence, entropy, density organization
Category: Quantum Gravity and String Theory

[1146] ai.viXra.org:2601.0115 [pdf] submitted on 2026-01-28 08:43:53

Algebraic Entropy and Conditional Mutual Information in a Tiny Gauge-Invariant Truncated Hilbert Space: A Reproducible Toy-Model Study with Effective Mixing Hamiltonians

Authors: Lluis Eriksson
Comments: 15 Pages.

We present a fully reproducible Google Colab pipeline to compute region algebraic entropies and conditional mutual informations (CMI) in a tiny truncated Hilbert space (dim = 8) indexed by discrete fusion-like descriptors (x, mu) on L = 4 cells. To generate nontrivial ground states within the descriptor-labeled subspace, we introduce an effective Hermitian mixing Hamiltonian based on a weighted k-nearest-neighbor (kNN) graph Laplacian over configuration labels. Across a parameter sweep, we identify a strong-mixing regime where the participation ratio approaches dim (consistent with Laplacian-dominated ground states on connected graphs) and algebraic CMI diagnostics become extremely small (down to 1e-6 and below) for the chosen algebraic factorization, while region algebraic entropies remain O(1) and exhibit near-quantized values approximately n log 2. The mixing term is an ansatz used to probe information-theoretic diagnostics and is not claimed to coincide with a Kogut—Susskind plaquette operator. All artifacts (CSV/JSON, figures, and model dumps) are generated in one run and packaged as Overleaf-ready assets.
Category: Quantum Physics
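One diagnostic used in the abstract, the participation ratio, has a simple closed form: for a normalized state with amplitudes c_i, PR = (Σ|c_i|²)² / Σ|c_i|⁴, which equals the Hilbert-space dimension for a uniform superposition and 1 for a single basis state. The stdlib sketch below illustrates only this diagnostic, not the paper's kNN graph-Laplacian pipeline.

```python
def participation_ratio(amplitudes):
    """PR = (sum |c|^2)^2 / sum |c|^4; roughly, the number of basis
    states that effectively carry weight (dim for uniform, 1 if localized)."""
    p2 = sum(abs(c) ** 2 for c in amplitudes)
    p4 = sum(abs(c) ** 4 for c in amplitudes)
    return p2 ** 2 / p4

dim = 8  # matches the truncated Hilbert-space dimension in the abstract
uniform = [1 / dim ** 0.5] * dim        # equal-weight superposition
localized = [1.0] + [0.0] * (dim - 1)   # single basis state
print(participation_ratio(uniform), participation_ratio(localized))
```

"Participation ratio approaches dim" in the abstract thus means the ground state spreads nearly uniformly over the 8 descriptor-labeled basis states.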

[1145] ai.viXra.org:2601.0114 [pdf] submitted on 2026-01-28 11:27:28

Quantumograph: A Testable Quantum Graph Theory of Spacetime

Authors: Sergey Materov
Comments: 62 Pages.

We’re excited to present the pre-final, revised version of our paper on the discrete-quantum graph theory of spacetime (Quantumograph). Initial emulations and numerical checks (see our GitHub repository) qualitatively validate the model, but true confirmation requires experiments on real quantum hardware—experiments that can already be performed using the protocol detailed in the article.
Category: Quantum Gravity and String Theory

[1144] ai.viXra.org:2601.0113 [pdf] submitted on 2026-01-28 18:15:37

Holographic Complexity and the JWST Tension

Authors: Hassan Dawood Salman
Comments: 22 Pages. DOI: 10.5281/zenodo.18382581

Recent observations by the James Webb Space Telescope (JWST) have revealed an unexpectedly high abundance of massive galaxies at redshifts z > 10, challenging the standard ΛCDM structure formation paradigm. In this work, we propose a theoretical framework where the thermodynamic properties of spacetime holographic complexity resolve this tension. We propose that the growth of complexity within the cosmic horizon introduces a negative work term in the First Law of Entanglement Thermodynamics, effectively lowering the critical density threshold for gravitational collapse in the early universe. Unlike previous heuristic arguments based on AdS dualities, we motivate this collapse threshold modulation directly from FRW apparent horizon thermodynamics. Our numerical validation demonstrates that this mechanism achieves an 11× enhancement in massive halo formation at z ≈ 15 while preserving σ_8(z = 0) = 0.811 (within Planck 2018 constraints), with the complexity coupling constrained to α_c = 0.02—0.05 by JWST observations. Direct comparison with JWST observations yields spectacular agreement: χ²/dof ≈ 0.12 for the stellar mass function at z = 10 and χ²/dof ≈ 0 for the UV luminosity function at z = 12, while ΛCDM fails catastrophically (χ²/dof > 1000). We identify a unique falsifiable prediction: a ∼ 150% enhancement in the 21cm power spectrum at k∗ ∼ 1 Mpc⁻¹, testable with HERA Phase II (2025—2027). This signature, alongside scale-dependent galaxy bias (5—10% at k ∼ 1 Mpc⁻¹), distinguishes our model from astrophysical solutions and provides a clear experimental pathway to validation or falsification.
Category: Astrophysics

[1143] ai.viXra.org:2601.0112 [pdf] submitted on 2026-01-27 05:46:14

Experimental Proposal: A Differential Test of One-Way Light Propagation Using a Macroscopic Accelerating Source in Vacuo

Authors: Eliyah Kilada
Comments: 4 Pages.

This paper proposes a definitive experimental setup to test the independence of the one-way speed of light from the motion of its source. By utilizing a macroscopic transmitter (the "source assembly") undergoing controlled acceleration within an ultra-high vacuum, and two stationary detectors, we can measure the variance in arrival-time differentials Δt. This configuration specifically addresses and eliminates the Ewald-Oseen extinction bias and the synchronization circularity inherent in traditional one-way speed of light measurements.
Category: Relativity and Cosmology
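The discriminating signal in a setup like this is the arrival-time differential between the source-independent prediction (flight time L/c) and a ballistic, source-dependent prediction (flight time L/(c + v) for an approaching source), which to first order differs by δt ≈ Lv/c². The back-of-the-envelope sketch below uses hypothetical baseline and speed values, not parameters from the paper.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def arrival_shift(L, v):
    """Difference between the invariant-c flight time and a ballistic
    (c + v) flight time over baseline L, source approaching at speed v."""
    return L / C - L / (C + v)

# Hypothetical numbers: 10 m baseline, 100 m/s source speed
dt = arrival_shift(10.0, 100.0)
print(dt)  # on the order of L*v/c^2 ~ 1e-14 s
```

Resolving shifts of this order is what pushes such proposals toward femtosecond-class timing, vacuum propagation (to avoid extinction re-emission), and differential detection.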

[1142] ai.viXra.org:2601.0111 [pdf] submitted on 2026-01-27 17:45:18

Conditional Mutual Information and Petz Recovery in a Z_2 Lattice Gauge Ground State

Authors: Lluis Eriksson
Comments: 7 Pages.

We study approximate quantum Markov structure in a $\mathbb{Z}_2$ lattice gauge ground state using the conditional mutual information (CMI) $I(A:C\mid B(w))$ and the performance of Petz recovery across a family of tripartitions $(A,B(w),C)$ parameterized by a buffer width $w$. We consider a $2\times 4$ plaquette lattice with open boundaries and qubits on links, restricted to a gauge-invariant (Gauss-law) physical sector, at coupling $g=1.0$. For each $w$ we compute reduced density matrices, the entropies entering the CMI, and a Petz-recovered state $\sigma_{ABC}=(\mathrm{id}_A\otimes \mathcal{R}^{\mathrm{Petz}}_{B\to BC})(\rho_{AB})$, reporting fidelity $F(\rho_{ABC},\sigma_{ABC})$ via the recovery error $E_{\mathrm{rec}}(w)=-\log F$. The submission includes the plot, a formatted table, raw CSV outputs, and a hash-based manifest to enable independent verification. We also report numerical cross-checks (dense vs. low-rank method agreement and trace stability) to support validity.
Category: Quantum Physics
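The quantity driving this kind of analysis, I(A:C|B) = S(AB) + S(BC) − S(B) − S(ABC), vanishes exactly on Markov states, and small CMI is what makes Petz recovery accurate. The classical analogue is easy to check directly: for a distribution p(a,b,c) = p(a)p(b|a)p(c|b) generated by a chain A → B → C, the CMI is zero. The toy sketch below uses classical Shannon entropies on made-up binary distributions, not the paper's qubit reduced density matrices.

```python
import math
from itertools import product

def entropy(dist):
    """Shannon entropy (bits) of a probability dictionary."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, keep):
    """Marginalize a joint dict {(a, b, c): p} onto the index set `keep`."""
    out = {}
    for key, p in joint.items():
        k = tuple(key[i] for i in keep)
        out[k] = out.get(k, 0.0) + p
    return out

def cmi(joint):
    """I(A:C|B) = S(AB) + S(BC) - S(B) - S(ABC)."""
    return (entropy(marginal(joint, (0, 1)))
            + entropy(marginal(joint, (1, 2)))
            - entropy(marginal(joint, (1,)))
            - entropy(joint))

# Markov chain A -> B -> C over bits: p(a,b,c) = p(a) p(b|a) p(c|b)
pa = {0: 0.6, 1: 0.4}
pb_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}
pc_b = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.1, 1: 0.9}}
joint = {
    (a, b, c): pa[a] * pb_a[a][b] * pc_b[b][c]
    for a, b, c in product((0, 1), repeat=3)
}
print(cmi(joint))  # exactly Markov, so I(A:C|B) = 0 up to float error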

[1141] ai.viXra.org:2601.0110 [pdf] submitted on 2026-01-26 21:14:03

The Stiffness-Inertia Isomorphism Theory of Physical Laws: A Unified Response Framework from Classical Wave Speed to Quantum Gravity

Authors: Xiao Peng Zhang
Comments: 6 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

This paper proposes and systematically elaborates a framework for the isomorphism of physical laws based on the "stiffness-inertia duality." Research reveals that numerous core formulas—from classical mechanics to quantum field theory, from condensed matter physics to cosmology, and even including string theory and loop quantum gravity—can be expressed as a functional relationship between a "stiffness term" (driving/restoring factor) and an "inertia term" (response/storage factor). The most fundamental form is \( v = \sqrt{x/y} \), where \( x \) is the generalized stiffness and \( y \) is the generalized inertia. Taking the elastic wave speed formula \( v_s = \sqrt{G/\rho} \) as a prototype, we demonstrate how to construct a self-consistent logical network based on it, connecting the six fundamental dimensions of physics (length, mass, time, force, velocity, density) and extending to other physical quantities such as temperature, charge, and entropy. The framework successfully incorporates the core formulas of quantum mechanics, special and general relativity, the Standard Model, quantum electrodynamics, condensed matter physics, and further extends to string theory and loop quantum gravity, revealing deep mathematical isomorphisms among these seemingly disparate quantum gravity theories. We propose a unified stiffness-inertia Lagrangian formalism and derive several novel relations for strongly correlated condensed matter systems. The study suggests that "stiffness-inertia balance" may reveal a universal mathematical structure underlying physical laws, providing a new explanatory perspective and methodological tool for cross-scale, cross-disciplinary unification of physics, and offering a possible framework for the integration of quantum gravity with established physics.
Category: Mathematical Physics
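The prototype relation is a one-liner to apply: with generalized stiffness x and inertia y, v = sqrt(x/y). For the elastic prototype v_s = sqrt(G/ρ), textbook values for steel (shear modulus G ≈ 79 GPa, density ρ ≈ 7850 kg/m³) give v_s ≈ 3.2 km/s. The material numbers below are standard approximations chosen for illustration, not values from the paper.

```python
import math

def response_speed(stiffness, inertia):
    """Generalized stiffness-inertia relation v = sqrt(x / y)."""
    return math.sqrt(stiffness / inertia)

# Shear-wave speed in steel: G ~ 79e9 Pa, rho ~ 7850 kg/m^3
v_steel = response_speed(79e9, 7850.0)
print(v_steel)  # roughly 3.17e3 m/s
```

The same call covers the other instances the abstract lists once x and y are identified (e.g., tension and linear mass density for a string).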

[1140] ai.viXra.org:2601.0109 [pdf] submitted on 2026-01-26 21:08:07

Chromatic Gravitational Lensing in the Vacuum Energy Quanta Field Framework

Authors: Enver Torlakovic
Comments: 19 Pages. (Note by ai.viXra.org Admin: For the last time, please cite listed scientific references!)

We present a unified validation of the Vacuum Energy Quanta Field (VEQF) framework for gravitational lensing, which posits that light propagates through a universe structured as a graded refractive index (GRIN) medium shaped by energy density gradients (EDGs). Using only two tunable structural parameters per system (R_0, L) and universal physics (alpha = -3.4e-5, beta = 3.4e-5, gamma = -0.0025), we predict image separations for 20 strong lens systems to within 0.3% accuracy (radio-calibrated, with mild optical offsets explained by emission-region differences and self-collimation effects). Shorter wavelengths (blue/high-nu light) experience stronger deflection, producing narrower, more collimated post-deflection cones with higher surface brightness; longer wavelengths (red/low-nu light) experience weaker deflection, producing wider, more divergent cones with lower surface brightness. This results in bright, crisp blue images at smaller angular separations and faint, extended red images at larger separations — making the observed farther image systematically redder, fainter, and more diffuse when detected. A catalogue sample of 16,469 radio sources (SPECFIND V2.0) shows the expected steep-spectrum behaviour (median alpha ≈ +0.856 between 1.4 GHz and 4.85 GHz), yet in strong lenses the higher-frequency image is consistently brighter and more compact than the lower-frequency counterpart — the opposite of the intrinsic trend. This flux-reversal signature at the telescope input, together with the crisp-blue vs. diffuse-red morphology, provides direct evidence of chromatic separation arising purely from refraction in a dispersive vacuum GRIN medium, proving the vacuum possesses structured energy gradients.
Category: Astrophysics

[1139] ai.viXra.org:2601.0108 [pdf] submitted on 2026-01-26 21:05:49

Layered Fabric Optics: Classical Information Dynamics and Spectral Thermalization

Authors: Asheed Mohamed
Comments: 16 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

Visible light passing through layered fabrics undergoes systematic spectral restructuring driven by wavelength-dependent attenuation. By treating the transmitted spectrum as a classical probability distribution, we quantify this evolution using Shannon entropy and a variance-based information metric (VBIM). The entropy-gradient peak identifies the layer at which spectral reorganization is maximal, marking the transition toward a spectral stabilization depth (informational saturation point). To contextualize this macroscopic observation within fundamental thermodynamic limits, we derive a universal informational reference scale, L_I, from Landauer’s principle evaluated at the cosmic microwave background temperature. The framework yields a falsifiable prediction of scale invariance under geometric stretching, defining a clear experimental path to validate the proposed connection between macroscopic optics and universal thermodynamic bounds. We show that the physical stabilization depth aligns with this scale, presenting this alignment as a provocative instance of a tabletop optical system reflecting universal informational constraints. These results position layered fabrics as a macroscopic, experimentally accessible platform for studying classical information dynamics and thermalization-like processes.
Category: Classical Physics

[1138] ai.viXra.org:2601.0107 [pdf] submitted on 2026-01-26 21:03:06

Emergence of Spacetime and a Big-Bang—Like Singularity from Octonionic Algebra

Authors: Rüdiger Giesel
Comments: 8 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

We present a conceptual framework in which the octonionic division algebra is taken as a pre-geometric fundamental structure. We demonstrate that the intrinsic non-associativity of the octonions excludes the existence of a fundamental global time and enforces a symmetry reduction in order to admit physically interpretable dynamics. This reduction occurs through an algebraically singular transition, naturally interpreted as a Big-Bang—like origin. Four-dimensional spacetime emerges as a stable associative subalgebra, while cosmological expansion arises as a necessary consequence of residual non-associative degrees of freedom. The framework is mathematically consistent but remains speculative and currently lacks experimental support.
Category: Relativity and Cosmology

[1137] ai.viXra.org:2601.0106 [pdf] submitted on 2026-01-26 21:01:05

Computational Analysis of a Mapping ϕ(n) for Prime Singularity Detection

Authors: Silvio Gabbianelli
Comments: 12 Pages.

This paper explores a deterministic mapping ϕ : N_odd → Z^+ that defines an informational lattice for the study of prime distribution. By analyzing the topological exclusion of composite generating functions y(x, k), we identify a structural symmetry within the manifold. Computational verification through a mapping Probe confirms density alignment with the Gram series up to 10^50. The results suggest that certain symmetries, such as the critical-line equilibrium and rotational invariance, are emergent properties of the lattice’s geometric rigidity.
Category: Number Theory
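The Gram series used as the density benchmark is R(x) = 1 + Σ_{k≥1} (ln x)^k / (k · k! · ζ(k+1)), a rapidly convergent form of Riemann's R function that tracks the prime-counting function π(x) closely (e.g., π(1000) = 168). A minimal stdlib sketch, truncating both the series and the zeta sums at arbitrary cutoffs (this is an illustration of the series itself, not a verification near 10^50):

```python
import math

def zeta(s, terms=2000):
    """Riemann zeta for real s > 1: partial sum plus an integral tail estimate."""
    return sum(n ** -s for n in range(1, terms + 1)) + terms ** (1 - s) / (s - 1)

def gram(x, kmax=60):
    """Gram series R(x) = 1 + sum_k (ln x)^k / (k * k! * zeta(k+1))."""
    lx = math.log(x)
    total, term = 1.0, 1.0
    for k in range(1, kmax + 1):
        term *= lx / k  # running value of (ln x)^k / k!
        total += term / (k * zeta(k + 1))
    return total

print(gram(1000))  # close to pi(1000) = 168
```

Because (ln x)^k / k! decays super-exponentially in k, a few dozen terms suffice even for large x.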

[1136] ai.viXra.org:2601.0105 [pdf] submitted on 2026-01-25 20:38:04

Validation of Structural Analogies: a Detailed Analysis of Predictions 11—17 on Cosmology and Cell Biology

Authors: Alis Hasić
Comments: 8 Pages. License: CC BY-NC-SA 4.0

This work extends "The Cosmology of the Living Cell" (https://ai.vixra.org/abs/2601.0069; the observable universe as a scaled-up eukaryotic human cell) by filling the remaining gaps (~5%) in the mapping table. New analogies are integrated (sterile neutrinos as non-coding RNAs, gamma-ray bursts as necrosis, cosmic strings as spindle fibers, slow-roll inflation as mitotic checkpoints, the proteasome as cosmic recycling, voltage-gated channels as cosmic plasma currents, microRNAs as subtle dark matter fields, and the nucleolus as star-forming centers). The average strength of comparison increases to ~85—90%. All 17 predictions are now listed, with specific forecasts assuming the hypothesis is correct. The overall consistency approaches 100%.
Category: Classical Physics

[1135] ai.viXra.org:2601.0104 [pdf] submitted on 2026-01-25 20:58:19

Emergent Quantum Mechanics from a Discretized Spacetime Substrate

Authors: Bertrand Jarry
Comments: 10 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0)

We present a comprehensive framework in which quantum mechanics emerges as an effective low-energy description of a fundamentally discretized spacetime substrate. The microscopic dynamics are modeled by a linear cellular automaton defined on a spacetime lattice. In the continuum limit, the Schrödinger equation is recovered with arbitrary experimental precision. At finite lattice spacing, the discrete topology induces a non-linear (sinusoidal) dispersion relation. We demonstrate that its leading-order correction is equivalent to a quadratic modification of the relativistic dispersion relation, formally identical to n = 2 Lorentz Invariance Violation (LIV) models. This provides a concrete microphysical origin for quadratic LIV and yields falsifiable predictions in ultra-high-energy astrophysics. We expand on the mathematical formalism, provide detailed derivations, and present extensive numerical simulations supporting the theoretical predictions.
Category: High Energy Particle Physics
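The dispersion claim in the abstract above can be checked with elementary calculus: a sinusoidal lattice dispersion ω(k) = sin(ka)/a has Taylor expansion ω/k = 1 − (ka)²/6 + O((ka)⁴), so its leading deviation from ω = k is quadratic in momentum, i.e. an n = 2 LIV term. The sketch below is an illustrative numerical check of that expansion, not the paper's own simulation code; the function names and the lattice spacing value are ours.

```python
import math

def lattice_dispersion(k, a):
    # Sinusoidal dispersion induced by a discrete lattice of spacing a
    # (illustrative ansatz matching the abstract, not the paper's code).
    return math.sin(k * a) / a

def quadratic_liv_term(k, a):
    # Leading-order fractional correction from sin(x) = x - x^3/6 + ...:
    # omega(k)/k - 1 ~ -(k a)^2 / 6, i.e. an n = 2 (quadratic) LIV term.
    return -(k * a) ** 2 / 6.0

a = 1e-3  # lattice spacing, arbitrary units (placeholder value)
for k in (0.1, 1.0, 10.0):
    exact = lattice_dispersion(k, a) / k - 1.0
    print(f"k={k:5.1f}  exact deviation={exact:.3e}  quadratic LIV term={quadratic_liv_term(k, a):.3e}")
```

The printed deviations agree with the quadratic term to the next order in (ka)², which is the sense in which the discrete model reduces to an n = 2 LIV phenomenology at low momentum.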

Replacements of recent Submissions

[169] ai.viXra.org:2602.0127 [pdf] replaced on 2026-03-07 01:51:15

Dimensional Accessibility, Proportional Distribution, and Alignment as Conditions for Resonance in Complex Systems

Authors: Doug Hoffman
Comments: 5 Pages.

Complex systems across physical, biological, computational, organizational, economic, and quantum domains can achieve resonance (coherent amplification with bounded adaptability) through a recurring structural architecture when three conditions jointly obtain within a ternary continuum (continuous state space bounded by functional poles):

D: Dimensional freedom (accessible intermediate states)
P: Proportional distribution (balanced energy/influence allocation)
A: Alignment (phase, directional, and incentive coherence)

The multiplicative relationship R ∝ D × P × A implies that significant degradation of any component reduces overall resonant stability and increases susceptibility to systemic collapse. This framework stratifies dynamical regimes from static order (D ≈ 0) through chaotic complexity (medium A) to periodic resonance (A ≈ 1), correctly diagnosing failure modes across multiple domains: vanishing gradients (neural nets), trophic cascades (ecology), misaligned incentives (organizations), decoherence (quantum), and phase mismatch (physics). Resonance is not inherent but emergent, operating at multiple scales and contexts. The DPA architecture offers a domain-general diagnostic for system health, predicting phase boundaries and multiplicative fragility without parameter tuning.
Category: General Science and Philosophy
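The multiplicative form R ∝ D × P × A has a simple numerical consequence worth making explicit: halving any single factor halves R, and driving any factor to zero extinguishes resonance regardless of the other two. The sketch below illustrates this fragility; the proportionality constant is set to 1 and the normalization of each factor to [0, 1] is our assumption, not something stated in the abstract.

```python
def resonance(d, p, a):
    # Multiplicative DPA relationship R = D * P * A.
    # Proportionality constant taken as 1; normalizing each factor to
    # [0, 1] is an illustrative assumption, not from the abstract.
    return d * p * a

healthy = resonance(0.9, 0.9, 0.9)       # all three conditions jointly obtain
degraded = resonance(0.9, 0.9, 0.45)     # halving one factor halves R
static_order = resonance(0.0, 1.0, 1.0)  # D ~ 0: no resonance, whatever P and A are

print(healthy, degraded, static_order)
```

An additive combination (D + P + A) would not show this behavior, which is why the abstract's "multiplicative fragility" is a distinguishing prediction of the DPA form.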

[168] ai.viXra.org:2602.0127 [pdf] replaced on 2026-03-01 21:56:29

Dimensional Accessibility, Proportional Distribution, and Alignment as Conditions for Resonance in Complex Systems

Authors: Doug Hoffman
Comments: 3 Pages.

Complex systems across physical, biological, computational, organizational, economic, and quantum domains can achieve resonance (coherent amplification with bounded adaptability) through a recurring structural architecture when three conditions jointly obtain within a ternary continuum (continuous state space bounded by functional poles):

1. D: Dimensional freedom (accessible intermediate states)
2. P: Proportional distribution (balanced energy/influence allocation)
3. A: Alignment (phase, directional, and incentive coherence)

The multiplicative relationship R ∝ D × P × A implies that significant degradation of any component reduces overall resonant stability and increases susceptibility to systemic collapse. This framework stratifies dynamical regimes from static order (D ≈ 0) through chaotic complexity (medium A) to periodic resonance (A ≈ 1), correctly diagnosing failure modes across multiple domains: vanishing gradients (neural nets), trophic cascades (ecology), misaligned incentives (organizations), decoherence (quantum), and phase mismatch (physics). Resonance is not inherent but emergent, operating at multiple scales and contexts. Systems can transition between regimes by redefining functional poles under stress, as seen in flocking bait ball formation. The DPA architecture offers a domain-general diagnostic for system health, predicting phase boundaries and multiplicative fragility without parameter tuning.
Category: General Science and Philosophy

[167] ai.viXra.org:2602.0117 [pdf] replaced on 2026-02-27 02:16:52

Mechanical Audit Experiments and Reproducibility Appendix for a Companion-Paper Programme on 4D SU(N) Yang—Mills Existence and Mass Gap

Authors: Lluis Eriksson
Comments: 33 Pages.

This document is an experiment-first audit report for a companion-paper programme claiming a constructive solution of the 4D $\mathrm{SU}(N)$ Yang--Mills existence and mass gap problem. It specifies a runnable mechanical audit suite of 29 deterministic tests, defines pass/fail criteria, and presents outputs in a compilation-safe format. The report contains: (i) an explicit non-triviality proof showing the Wightman functions do not factorize trivially; (ii) a toy-model validation recovering the exact 2D $\mathrm{SU}(2)$ Yang--Mills mass gap to machine precision; (iii) a Bałaban bridge appendix reproducing the critical inductive step of his renormalization group in simplified form; (iv) a reproducibility repository with 3-line setup instructions; (v) a core proof chain audit mechanically verifying the load-bearing theorems of Papers 86--90, covering terminal Kotecký--Preiss convergence, UV suppression, one-dimensionality of the anisotropic sector, Cauchy bounds on polymer jets, the OS1 vanishing rate $O(\eta^2 \log \eta^{-1})$, Lie-algebra annihilation, and KP margin sensitivity. Beyond the 17 core tests, the suite includes a lattice gauge proxy layer (plaquette expansion, Polyakov-loop centre symmetry, Creutz ratio; 3 tests), an infrastructure layer (Bakry--Émery curvature seed $\mathrm{Ric}_{\mathrm{SU}(N)} = N/4$, the $2^{4k}$ cancellation in $d=4$, heat-kernel column bound; 3 tests), a UV-flow/heat-kernel layer (Parseval identity, diagonal decay exponent $d/2 = 2$, flow--reflection commutation; 3 tests), a non-triviality test (Haar Monte Carlo on $\mathrm{SU}(2)$ and $\mathrm{SU}(3)$; 1 test), a toy-model validation (2D Yang--Mills transfer matrix; 1 test), and an algebraic QFT layer (Petz recovery fidelity bound $1-F \leq C\,e^{-2mr}$ from the Split Property; 1 test). All 29 tests pass; the full suite completes in ${\approx}70\,\mathrm{s}$ on a Google Colab CPU. The complete inter-paper dependency DAG is acyclic and explicitly recorded.
All code, data, and artifacts are available at https://github.com/lluiseriksson/ym-audit. The companion papers are archived at https://ai.vixra.org/author/lluis_eriksson.
Category: Mathematical Physics

[166] ai.viXra.org:2602.0103 [pdf] replaced on 2026-02-28 23:42:37

Non-Existence Collapse: A Dynamical Mechanism for the Quantum-Gravitational Bootstrap of Spacetime from Nothing

Authors: Steven B. Thompson
Comments: 8 Pages.

We propose a minimal dynamical framework for the origin of the universe in which absolute non-existence—a state with no spacetime, matter, fields, or classical time—is intrinsically unstable under quantum-mechanical and gravitational principles. The Heisenberg uncertainty principle, applied to gravitational degrees of freedom, precludes a static null configuration: non-existence has "nowhere to go" but to collapse into itself, producing an effective Planck-scale density regime that undergoes a nonsingular quantum-gravitational transition. This collapse-driven process bootstraps the emergence of classical spacetime and the arrow of time, requiring no external causes, pre-existing substrates, boundary conditions, or auxiliary fields. Quantum gravity emerges here not as an imposed extension but as the inherent dynamical structure governing the instability and resolution. The mechanism refines and complements established proposals—such as Vilenkin's quantum tunneling from nothing, the Hartle-Hawking no-boundary wavefunction, and recent developments in loop quantum cosmology bounces and quadratic gravity—by providing a purely mechanical interpretation that replaces probabilistic nucleation or Euclidean continuation with an intrinsic collapse bootstrap. It aligns with ongoing refinements of no-boundary states, curvature bounces, and geometric "from nothing" models in the 2025—2026 literature. This framework offers a parsimonious dynamical resolution to the question of why there is something rather than nothing, transforming it into a consequence of quantum gravity's structure. Potential observational implications include consistency with cosmic microwave background data and distinguishable signatures in primordial gravitational waves or large-scale structure that may differentiate collapse-initiated emergence from conventional inflationary scenarios.
Category: Relativity and Cosmology

[165] ai.viXra.org:2602.0100 [pdf] replaced on 2026-02-22 10:03:37

The 1.5-pi Symmetry: A Lorentz-Conformant Scaling Law for the Dark Sector in Informational Space Genesis (ISG)

Authors: F. Rücker
Comments: 7 Pages. V1.1: Formalized baryonic base (1.5-pi symmetry). Refined scaling potential for full Lorentz Conformity. CC BY 4.0 license.

Current cosmological models, primarily the Lambda-CDM framework, face significant challenges in explaining Dark Matter and Dark Energy. The Theory of Informational Space Genesis (ISG) proposes an ontological shift, defining space as an emergent byproduct of interaction processes. This paper details a four-stage signal denoising methodology to isolate the "Pristine Potential" (Ep) of approximately 2.15 meV. By correlating this potential with a baryonic base (beta) derived from the ensemble average of the ten most abundant atomic species, we identify a fundamental geometric attractor at 1.5 pi (approximately 4.71). To ensure Lorentz Conformity, the scaling potential is formalized as a dimensionless ratio against the Cosmic Microwave Background peak energy, yielding a normalized invariant exponent of approximately 2.164. This geometric-informational scaling law predicts a Dark Matter density of approximately 28.6 percent, aligning with Planck observational data with high precision.
Category: Astrophysics

[164] ai.viXra.org:2602.0099 [pdf] replaced on 2026-03-02 04:12:15

Tying the Universe Together: Quantum Balloon Interface Theory (QBIT)

Authors: Sean McCallum
Comments: 10 Pages. Creative Commons Attribution 4.0 International (CC BY 4.0)

The vacuum is not an empty stage. It is a stretchy quantum balloon - a dynamical condensate of overlapping standing-wave fields - with every massive particle a permanent topological knot tied into its fabric. Gravity emerges as local strain, forces as the rubber trying to smooth itself, and the entire Standard Model arises from zero modes trapped inside these knots.

Beneath the balloon lies an eternal pre-geometric qubit substrate that never disappeared after the Big Bang condensation. A tiny, Planck-suppressed portal couples the substrate to the rubber, naturally feeding zero-point energy and producing a mild dynamical dark energy with predicted equation-of-state drift w_a approx +7.2 x 10^{-5}.

With only two fundamental parameters (f and e), the QBIT framework quantitatively reproduces:

  • Nuclear binding energies across the entire periodic table to better than 0.08% accuracy (maximum relative error 0.079% at 28Si);
  • Quark and lepton masses and mixing matrices (CKM and PMNS) at the percent level or better;
  • The observed baryon asymmetry eta approx 6.1 x 10^{-10};
  • A tiny positive cosmological constant rho_Lambda approx 1.02 x 10^{-120} rho_Planck;
  • A natural first-principles resolution of the Hubble tension via modified early-universe expansion driven by the substrate condensation front.

Black-hole interiors are regular topological cores with no singularities, the information paradox is resolved by substrate-mediated leakage, and quantum paradoxes (double-slit, delayed-choice eraser, Hardy's, Zeno, Elitzur-Vaidman, Schrodinger's cat) receive natural resolutions via rubber ripples and topological protection.

Lattice simulations and analytic results confirm topological stability, emergent curvature, multi-knot binding, and the substrate portal's predictions. QBIT thus provides a conceptually elegant, quantitatively successful, and experimentally testable candidate for a unified theory of particles, gravity, and cosmology - all emerging from one stretchy quantum balloon and its eternal qubit substrate.
Category: Quantum Gravity and String Theory

[163] ai.viXra.org:2602.0088 [pdf] replaced on 2026-02-19 12:11:33

Exponential Clustering and Mass Gap for Four-Dimensional SU(N) Lattice Yang--Mills Theory Via Balaban's Renormalization Group and Multiscale Correlator Decoupling

Authors: Lluis Eriksson
Comments: 21 Pages.

We establish exponential clustering with a strictly positive mass gap for four-dimensional pure SU(N) lattice Yang--Mills theory with Wilson's action, uniformly in lattice spacing $\eta$ and physical volume $L_{\mathrm{phys}}$: $|\mathrm{Cov}_{\mu_\eta}(\mathcal{O}(0),\mathcal{O}(x))| \leq C\,e^{-m\,|x|/a_*}$, with $m > 0$ and $a_* \sim \Lambda_{\mathrm{YM}}^{-1}$. The proof assembles three ingredients: (1) Balaban's rigorous renormalization group for lattice gauge theories (CMP 1984--1989), which produces effective densities with local polymer decompositions and exponentially decaying activities; (2) a terminal-scale polymer cluster expansion (imported from Balaban's convergent renormalization expansions), which implies exponential clustering for the effective terminal measure; and (3) a multiscale correlator decoupling identity (this paper), which separates ultraviolet fluctuations from infrared physics and yields uniform UV suppression. The coupling control required by Balaban's framework -- that the effective couplings remain in the perturbative regime throughout the RG iteration -- is established via an inductive argument using Cauchy bounds on the analyticity of the effective action. We also verify the Osterwalder--Schrader axioms OS0, OS2, OS3, and OS4 for subsequential continuum limits, and establish vacuum uniqueness and non-triviality. The remaining axiom OS1 (full O(4) Euclidean covariance) is not established here; we prove covariance under lattice translations and the hypercubic group $\mathcal{W}_4$, and show that if O(4) covariance holds in the continuum limit, the reconstructed Wightman theory is a non-trivial relativistic quantum field theory with mass gap $\Delta_{\mathrm{phys}} \geq c_N\,\Lambda_{\mathrm{YM}} > 0$, where $c_N > 0$ depends only on $N$ (and is independent of $\eta$ and $L_{\mathrm{phys}}$).
Category: Mathematical Physics

[162] ai.viXra.org:2602.0087 [pdf] replaced on 2026-02-19 12:12:41

Irrelevant Operators, Anisotropy Bounds, and Operator Insertions in Balaban's Renormalization Group for Four-Dimensional SU(N) Lattice Yang--Mills Theory: Symanzik Classification and Quantitative Irrelevance of O(4)-Breaking Operators

Authors: Lluis Eriksson
Comments: 18 Pages.

We classify gauge-invariant local lattice operators of classical dimension 6 on the four-dimensional hypercubic lattice into O(4)-invariant, hypercubic-invariant but O(4)-breaking (anisotropic), and on-shell-redundant components, following the Symanzik improvement programme and the on-shell improvement technique of Luscher--Weisz (1985). Inside Balaban's renormalization group framework for SU(N) lattice Yang--Mills theory, we extract the anisotropic projection of the effective action via local Taylor expansion of polymer activities in the small-field regime and prove a quantitative quadratic scale bound for the anisotropic coefficient: for every RG step $k \leq k_*$ with effective coupling $g_k \leq \gamma_0$, the coefficient of the (one-dimensional) anisotropic sector in the classical dimension-6 Symanzik expansion satisfies $|c_{6,\mathrm{aniso}}^{(k)}| \leq C\,a_k^2$, uniformly in lattice spacing $\eta$, physical volume $L_{\mathrm{phys}}$, and RG step $k$. We further prove a quantitative insertion integrability estimate for connected correlators with one insertion of the anisotropic operator. When combined with the rotational Ward identity derived in the companion paper, this yields that the corresponding breaking distribution tested against Schwartz functions is $O(\eta^2\,|\log((\Lambda_{\mathrm{YM}}\eta)^{-1})|)$ and hence vanishes as $\eta \to 0$.
Category: Mathematical Physics

[161] ai.viXra.org:2602.0076 [pdf] replaced on 2026-02-22 17:48:08

The Topological Inversion Model (TIM): A Unified Non-Singular Framework for Cosmology, Quantum Topology, and Gravitational-Wave Phenomenology

Authors: Kobie Janse van Rensburg
Comments: 13 Pages. Fixed ANSI characters and Citations.

We present Version 5 of the Topological Inversion Model (TIM), a unified non-singular cosmological framework based on a Planck-scale reciprocal inversion R = L_p²/r. Six results are now established. (1) A topological snap at r = L_p generates an elastic recoil field Ψ whose self-consistent scattering amplitude gives echo reflectivity Γ_inner ≈ 0.42 at ξ_Ψ = 0.002; the observable Fabry-Pérot echo is Γ_echo ≈ 0.125 (Section 11b). (2) A calibrated BBN fit yields ψ_BBN = 1.059 ± 0.012, partially mitigating the ⁷Li problem. (3) An overshoot texture mechanism raises the predicted H0 from 67.4 to 71.1 km s⁻¹ Mpc⁻¹, reducing the Hubble tension from 5.0σ to 1.7σ and forward-predicting the NANOGrav 15-year PTA signal. (4) Polymer one-loop analysis confirms UV finiteness without counterterms and vacuum stability (m²_eff = 1.016 m²_Ψ > 0). (5) Dark matter is identified with the gradient energy of the thinning field δΨ(r) = Ψ_eq j₀(m_Ψ r): no new particle is required, the density profile is naturally cored (ρ_DM ∝ r⁰ at r → 0, resolving the core-cusp problem), rotation curves are flat, and the required field mass m_Ψ ≈ 3.8 × 10⁻²³ eV c⁻² falls within the observational window for ultralight dark matter. (6) The Standard Model is shown to emerge as the low-energy effective theory on the post-Snap manifold: topological defects of the Skyrme field carry quantum numbers matching the observed fermion spectrum, and integrating out high-momentum modes recovers L_SM exactly. All results are reproducible from the provided Python code.
Category: Quantum Gravity and String Theory
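The "cored" claim in result (5) above follows from the shape of the spherical Bessel function: j₀(x) = sin(x)/x tends to 1 with zero slope as x → 0, so a profile δΨ(r) = Ψ_eq j₀(m_Ψ r) is flat at the centre and its gradient energy vanishes there. The sketch below is a minimal numerical check of that behavior, not the paper's provided code; the unit values of m_Ψ and Ψ_eq are placeholders.

```python
import math

def j0(x):
    # Spherical Bessel function j0(x) = sin(x)/x; the series value
    # 1 - x^2/6 near x = 0 avoids the 0/0 indeterminacy.
    return 1.0 - x * x / 6.0 if abs(x) < 1e-8 else math.sin(x) / x

def delta_psi(r, m_psi=1.0, psi_eq=1.0):
    # Thinning-field profile from the abstract: delta_psi(r) = psi_eq * j0(m_psi * r).
    # Unit m_psi and psi_eq are illustrative placeholders.
    return psi_eq * j0(m_psi * r)

h = 1e-6  # step for a central-difference derivative
for r in (1e-4, 1e-2, 1.0):
    slope = (delta_psi(r + h) - delta_psi(r - h)) / (2 * h)
    print(f"r={r:8.4f}  delta_psi={delta_psi(r):.8f}  slope={slope:+.6e}")
```

As r decreases the slope goes to zero (j₀'(x) ≈ −x/3 for small x), which is the flat central core; an NFW-like cusp would instead have a divergent inner gradient.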

[160] ai.viXra.org:2602.0076 [pdf] replaced on 2026-02-22 02:57:25

The Topological Inversion Model (TIM): A Unified Non-Singular Framework for Cosmology, Quantum Topology, and Gravitational-Wave Phenomenology

Authors: Kobie Janse van Rensburg
Comments: 16 Pages. (Note by ai.viXra.org Admin: Many symbols are not legible/blocked; please cite listed scientific references)

We present Version 3 of the Topological Inversion Model (TIM), a unified non-singular cosmological framework based on a Planck-scale reciprocal inversion R = L_p²/r. This version adds a fourth substantive result to the three established in Version 2. First, the echo reflectivity G(Y, M) is derived from a self-consistent WKB boundary condition. Second, ψ_BBN and Leff(z) are shown to be physically independent, with best-fit ψ_BBN = 1.059 ± 0.012 partially mitigating the Li-7 problem. Third, TIM is compared against LQC, Asymptotic Safety, and ECO models. Fourth (new), we propose that a slight underdamping of the topological snap generates a network of texture defects via the Kibble-Zurek mechanism. These textures act as early dark energy, shrinking the pre-recombination sound horizon and raising the predicted H0 from 67.4 to 71.1 km s⁻¹ Mpc⁻¹ — a substantial partial resolution of the Hubble tension. A single new parameter e (overshoot amplitude) is constrained jointly by the BBN expansion rate and the CMB early dark energy fraction, giving e = 0.115 ± 0.02. The texture collapse additionally predicts a stochastic gravitational-wave background with Ω_GW h² ~ 1.5 × 10⁻⁹ at 3 nHz — consistent with the NANOGrav 15-year signal. All results are reproducible from the provided Python code.
Category: Quantum Gravity and String Theory

[159] ai.viXra.org:2602.0071 [pdf] replaced on 2026-02-18 02:42:09

Temporal Necessity in Relational Mathematical Realism: A Gödelian Argument Against the Block Universe

Authors: Jason Merwin
Comments: 10 Pages.

A foundational question in the philosophy of physics is whether time is a fundamental dimension of reality or an emergent phenomenon. The standard Block Universe interpretation of general relativity treats time as a static dimension, with the passage of time relegated to psychological illusion. In this paper, we present a novel argument against the Block Universe derived from the framework of Relational Mathematical Realism (RMR), which identifies physical existence with mathematical structure. We demonstrate that if reality is a sufficiently complex, locally consistent mathematical structure, then Gödel's First Incompleteness Theorem renders a static, completed universe logically impossible. The resolution of this impossibility requires the structure to undergo a non-terminating sequence of state extensions, which we identify with the passage of time. We conclude that time is not a dimension within which the universe exists, but rather the logically necessary process by which a complex mathematical structure maintains consistency. This result, if sound, constitutes the first derivation of temporal passage from mathematical logic and ontology alone.
Category: History and Philosophy of Physics

[158] ai.viXra.org:2602.0063 [pdf] replaced on 2026-02-14 08:18:38

Conditional Continuum Limit of 4d SU(N_c) Yang-Mills Theory via Two-Layer Architecture, RG-Cauchy Uniqueness, and Step-Scaling Confinement

Authors: Lluis Eriksson
Comments: 19 Pages.

Building on the lattice results established in Papers [E26I]-[E26IX], we give a conditional construction of a scaling-limit state for pure SU(N_c) lattice Yang-Mills theory in four Euclidean dimensions, along dyadic lattice spacings a_k = a_0 2^{-k}. The construction proceeds via a two-layer architecture. Layer 1 (Local fields): For bounded gauge-invariant local observables (Wilson loops, normalized plaquette traces), expectations converge without extracting subsequences to a unique limit. Precompactness of expectations at fixed physical side length L is trivial since |_{a,L}| <= 1. Uniqueness follows from a multiscale RG-Cauchy estimate that bounds the change of local expectations under a single RG step. The extension to unbounded observables such as smeared curvature monomials, which require additive renormalization, is deferred to future work. Layer 2 (Confinement): The physical string tension sigma_phys > 0 is established through step-scaling of Creutz ratios evaluated on Wilson loops whose physical dimensions R x T are held fixed as a -> 0. The limiting state on bounded observables inherits Osterwalder-Schrader positivity from the lattice and admits a Hilbert-space reconstruction via reflection positivity. SO(4) rotational invariance is expected in the continuum (the hypercubic breaking being O(a^2), subject to standard operator classification and construction of renormalized Schwinger functions). The mass gap is established conditionally via uniform exponential clustering of connected correlators -- an input from a uniform physical transfer-matrix spectral gap -- and the reconstruction theorem. Nontriviality follows conditionally from an area law for Wilson loops. Key dependencies on prior papers: uniform LSI inputs [E26I]-[E26IX]; Balaban multiscale effective action [E26III]-[E26V]; DLR-LSI [E26VII]; unconditional lattice closure inputs [E26IX].
Category: Mathematical Physics

[157] ai.viXra.org:2602.0046 [pdf] replaced on 2026-02-12 19:04:00

Ricci Curvature of the Orbit Space of Lattice Gauge Theory and Single-Scale Log-Sobolev Inequalities

Authors: Lluis Eriksson
Comments: 11 Pages.

We establish that the orbit space B = A/G of SU(N_c) lattice gauge theory satisfies the Riemannian curvature-dimension condition RCD*(N_c/4, dim A); in particular, it satisfies CD(N_c/4, infinity) in the sense of Lott-Villani-Sturm. The proof proceeds by showing that the configuration space A = SU(N_c)^{|B_1(Lambda)|}, equipped with the bi-invariant product metric <X, Y> = -2 tr(XY), is an Einstein manifold with Ric_A = (N_c/4) g_A, and then applying the stability of the RCD* condition under quotients by compact groups of measure-preserving isometries (Galaz-Garcia-Kell-Mondino-Sosa). This approach bypasses the need for explicit O'Neill curvature computations and handles the singular stratum (reducible connections) automatically. As a consequence, we derive a conditional log-Sobolev inequality for measures on B of the form d mu = e^{-Phi} d nu / Z with constant alpha = (N_c/4) e^{-osc(Phi)}. All constants are computed explicitly for SU(2) and SU(3). This provides the geometric input in a program aiming at a volume-uniform log-Sobolev inequality for SU(N_c) lattice Yang-Mills theory at weak coupling; the complementary analytic input (uniform bounds on the effective potential oscillation, via Balaban's renormalization group) is the subject of ongoing work.
Category: Mathematical Physics
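The conditional log-Sobolev constant quoted in the abstract above, alpha = (N_c/4) e^{-osc(Phi)}, combines the curvature bound Ric >= N_c/4 with a Holley-Stroock-type perturbation in the oscillation of Phi. A minimal sketch of evaluating it for SU(2) and SU(3) follows; the osc(Phi) value used is a placeholder for illustration, not a number from the paper.

```python
import math

def lsi_constant(n_c, osc_phi):
    # Conditional log-Sobolev constant from the abstract:
    # alpha = (N_c / 4) * exp(-osc(Phi)).
    # The curvature part N_c/4 is the Bakry-Emery input; the exponential
    # factor is a Holley-Stroock-type penalty for the potential oscillation.
    return (n_c / 4.0) * math.exp(-osc_phi)

# osc(Phi) = 0.5 below is a placeholder, not a value computed in the paper.
for n_c in (2, 3):
    print(f"SU({n_c}): alpha = {lsi_constant(n_c, 0.5):.4f}")
```

The formula makes the trade-off explicit: the constant improves linearly with N_c but degrades exponentially with osc(Phi), which is why the programme needs uniform oscillation bounds from the renormalization group.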

[156] ai.viXra.org:2602.0041 [pdf] replaced on 2026-02-12 19:02:28

Uniform Log-Sobolev Inequality and Mass Gap for Lattice Yang--Mills Theory

Authors: Lluis Eriksson
Comments: 24 Pages.

We establish that SU(N_c) lattice Yang-Mills theory in d=4 dimensions with Wilson action at sufficiently weak coupling (beta = 2N_c/g^2 >= beta_0) satisfies a log-Sobolev inequality with constant alpha_* > 0 uniform in the lattice size L_vol. Combined with reflection positivity of the Wilson action and the DLR-LSI extension plus Stroock-Zegarlinski mixing route, this yields a mass gap Delta_phys > 0 uniform in L_vol without additional assumptions. The proof combines three ingredients: (i) Balaban's constructive renormalization group, which produces controlled effective actions at all scales; (ii) the orbit space Ricci curvature bound Ric_B >= N_c/4, which gives a uniform log-Sobolev constant for conditional measures of fast modes at each RG scale via the Bakry-Emery criterion; and (iii) a multiscale entropy decomposition with sweeping-out bounds, where the geometric scaling factor ||Q_(k)*||^2 = 2^{-(d-1)k} of transversal block averaging ensures summability of cross-scale errors.
Category: Mathematical Physics

[155] ai.viXra.org:2602.0024 [pdf] replaced on 2026-02-11 18:04:59

Gravitational Lensing and the Bullet Cluster via Spacetime Creep and Stress Relaxation

Authors: Chang-Sik Kim
Comments: 3 Pages.

The Bullet Cluster (1E 0657-56) is widely regarded as the most direct empirical evidence for particulate Dark Matter, due to the observed separation between the gravitational lensing center and the baryonic gas. In this third and final paper of the series, we demonstrate that this phenomenon can be explained within the Spacetime Elastic Hysteresis Theory without invoking non-baryonic particles. We introduce the mechanism of "Spacetime Creep," where the macroscopic entanglement strain (ψ) exhibits a delayed relaxation response to the rapid deceleration of baryonic matter. Our analysis shows that the viscoelastic stress relaxation time (τ_relax) allows the gravitational potential peak to disassociate from the collisional gas, naturally reproducing the lensing anomalies observed in galaxy cluster collisions.
Category: Quantum Gravity and String Theory

[154] ai.viXra.org:2602.0023 [pdf] replaced on 2026-02-11 18:04:08

Resolving the Hubble Tension and the Early Galaxy Problem via Viscoelastic Memory

Authors: Chang-Sik Kim
Comments: 3 Pages.

The "Hubble Tension"—the statistically significant discrepancy between the expansion rate of the universe measured from the early universe (CMB) and the late universe (SNIa)—remains one of the most challenging problems in modern cosmology. In this second paper of the series, we propose that this tension arises from neglecting the viscoelastic nature of spacetime. Building on Kim's Law (V ∝ ψ²) established in Part 1, we introduce the concept of "Spacetime Hysteresis," where the release of stored elastic energy is delayed by a characteristic time scale τ. Our MCMC analysis shows that a delayed elastic response (τ ≈ 0.15) naturally boosts the late-time expansion rate to H0 ≈ 73 km/s/Mpc, resolving the tension without breaking early-universe physics. Furthermore, this model implies a recalibrated cosmic age of 16.54 Gyr, providing a theoretical solution to the formation of mature galaxies at z > 10 observed by JWST.
Category: Astrophysics

[153] ai.viXra.org:2602.0016 [pdf] replaced on 2026-02-19 23:03:05

The Quantum Space Mechanism - The Origin of Forces

Authors: Joseph Koharski
Comments: 66 Pages. I added manuscript line numbers for easier reader reference and feedback. Corrected minor typographical errors and fixed intermediate arithmetic in the macroscopic dynamics calculations.

This document contains a compilation of five research papers detailing the Quantum Space Mechanism (QSM). These papers propose a unified framework where Inertia, Gravity, and Time emerge from the hydrodynamics of a viscous, dilatant vacuum substrate (the Higgs field). The series covers: (I) The Entropic Origin of Inertia and the Bridge Equation; (II) The Vacuum Yield Point and the Origin of Gravity; (III) The Geometry of Mass and Particle Generations via Finslerian Angles of Attack; (IV) Macroscopic Dynamics, Dark Matter as Metric Expansion, and Electromagnetism; and (V) The Origin of Time as Viscous Dissipation.
Category: Quantum Gravity and String Theory

[152] ai.viXra.org:2602.0013 [pdf] replaced on 2026-02-05 03:11:28

Towards a Local Minimum Time Resolution in Curved Spacetime

Authors: Priyanshu Rauth
Comments: 5 Pages.

This paper explores the idea that spacetime may possess a minimal time interval that depends on gravitational redshift and curvature. Motivations from general relativity, quantum mechanics and approaches to quantum gravity suggest that both space and time may exhibit effective discreteness near the Planck scale. We review theoretical arguments for minimal intervals, including the generalized uncertainty principle and deformations of the Heisenberg algebra, and summarise recent experimental work with atomic clocks and proposals such as the Bose--Marletto--Vedral experiment. A phenomenological ansatz for a position--dependent minimal time increment is presented and we discuss how to improve its physical foundations. The aim is not to propose a theory of everything but to offer a conservative, focused framework that could guide future experiments.
Category: Relativity and Cosmology

[151] ai.viXra.org:2601.0119 [pdf] replaced on 2026-02-20 22:24:30

The Fractal Substrate Equivalence Principle: A Unified Foundation for Quantum Mechanics and General Relativity

Authors: Steven E. Elliott
Comments: 15 Pages.

We introduce the Fractal Substrate Equivalence Principle (FSEP) as the foundational axiom for a unified theory of physics, asserting that in a fractal universe governed by magneto-hydrodynamics (MHD), physical laws, structures, and phenomena are exactly equivalent across all scales upon appropriate scaling transformations. Unlike prior fractal cosmologies that treat general relativity (GR) or quantum mechanics (QM) as fundamental, the FSEP posits these as emergent scale-dependent descriptions of a single electric fluid dynamics. This principle unequivocally asserts: stars are photons, black holes are atomic nuclei, and dark matter is electron orbital shells—not as analogies, but as exact physical identities across fractal layers. We demonstrate how this single MHD substrate reproduces the essential features of QM and GR, and explains the origin of dark matter, black hole-galaxy correlations, and the fine structure of atomic spectra, within a unified electric-fluid picture.
Category: Quantum Gravity and String Theory

[150] ai.viXra.org:2601.0118 [pdf] replaced on 2026-02-23 02:46:57

Emergent Gravity, Cosmology, and Quantum Field Theory from Anisotropic Flux Suppression: A Complete Framework

Authors: Frederick Manfredi
Comments: 106 Pages.

We propose an effective classical field theory where a single anisotropic suppression rule—favoring radial flux paths and penalizing perpendicular deviations: S(θ) = (1/ϕ^6) sin^4(θ) with density-softened feedback Seff(θ, ρ) = S(θ)/(1+βρ)—generates corrections to standard GR and QFT. In the high-ρ limit, the denominator softens suppression, recovering standard physics with regular interiors and milder effects. At low/intermediate ρ, we predict testable deviations. The framework mimics key features of general relativity, quantum field theory, and Bell correlations as an effective description. Anchored in semi-Dirac quasiparticle data from materials such as ZrSiS, the rule reproduces strong Bell violations in planar geometries (CHSH up to ∼2.62 rescaled with mild low-β feedback) while predicting significant dilution (CHSH ∼0.12-1.03 dynamically rescaled) in isotropic 3D configurations—an untested signature absent in current experiments. In high-density regimes, flux rebalancing mimics GR phenomenology (Newtonian limit, linearized Einstein equations, Schwarzschild-like metrics with softened interiors) without fundamental curvature or gravitons. Within the approximations of this model, the semi-Dirac soup reproduces quantum Bell correlations closely matching experiment via suppression nonlinearity (Appendix A). Small corrections are predicted in precision observables (e.g., g-2, black-hole ringdowns, CMB angular power), with the denominator softening high-ρ effects (milder deviations in dense interiors, QGP, nuclear cores). Falsifiable at next-generation facilities (LISA, Euclid, CMB-S4, quantum networks), the model offers a classical effective-theory alternative to standard unification frameworks, unifying gravity, quantum statistics, condensed matter, and cosmology through anisotropic flux dynamics and density softening.
Category: Quantum Gravity and String Theory
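Reading the abstract's rule as S(θ) = (1/ϕ^6) sin^4(θ), with ϕ assumed to be the golden ratio (the abstract does not define it), the bare and density-softened suppression can be sketched numerically; the value of β and the sample density below are purely illustrative, not values from the paper.

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # assuming ϕ is the golden ratio (not defined in the abstract)

def S(theta):
    """Bare anisotropic suppression: zero for radial paths (θ = 0),
    maximal for perpendicular paths (θ = π/2)."""
    return math.sin(theta) ** 4 / PHI ** 6

def S_eff(theta, rho, beta=1.0):
    """Density-softened suppression: the (1 + βρ) denominator suppresses
    the effect at high ρ, recovering near-standard physics there."""
    return S(theta) / (1 + beta * rho)

# Radial paths are unsuppressed; perpendicular paths hit the ceiling 1/ϕ^6.
assert S(0.0) == 0.0
print(S(math.pi / 2))              # ≈ 0.0557 (= 1/ϕ^6)

# A high sample density softens the suppression by orders of magnitude.
print(S_eff(math.pi / 2, rho=1e3))
```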

[149] ai.viXra.org:2601.0112 [pdf] replaced on 2026-02-25 02:23:01

Experimental Proposal: A Differential Test of One-Way Light Propagation Using a Macroscopic Accelerating Source in Vacuo

Authors: Eliyah Kilada
Comments: 12 Pages.

This paper proposes two complementary experimental configurations to test the independence of the one-way speed of light from source motion—a cornerstone of Special Relativity's second postulate. While foundational, earlier methods in the literature were constrained by round-trip averaging, material interactions triggering extinction resets, re-emission biases from mirrors and slits, quantum ambiguities from subatomic sources, limited interferometer range and resolution, and the inability to directly isolate one-way source dependence without synchronization assumptions. A key innovation is a differential measurement method using stationary detector arrays that eliminates the need for absolute clock synchronization, addressing a persistent issue in traditional one-way speed tests by avoiding any one-way synchronization circularity. Configuration A uses an accelerating macroscopic transmitter in vacuum with two stationary detectors, measuring how the arrival-time differential (∆t) varies with source velocity. Because both detectors remain at rest relative to each other, any synchronization offset is a static constant that cancels out when measuring changes in ∆t—we never need to know the "absolute" time at either detector, only how their relative measurements shift with source motion. Configuration B inverts this approach: a stationary light source with accelerating detector arrays, eliminating ambiguities about light emission in moving frames while maintaining the synchronization-free differential advantage. The designs avoid extinction concerns entirely by using dual direct-emission sources without intervening media, slits, or reflections, ensuring propagation occurs solely in ultra-high vacuum (UHV, 10^-9 Torr).
By presenting both configurations—one with an accelerating source and stationary detectors (Config A), the other with a stationary source and accelerating detectors (Config B)—we provide complementary paths to testing the second postulate, sidestepping potential ambiguities related to light emission or propagation in moving frames. Modern femtosecond timing and ultra-high vacuum technology, along with cost-effective components like off-the-shelf lasers and acceleration systems, make these experiments feasible at a lab scale, with estimated costs under $300k.
Category: Relativity and Cosmology
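The cancellation argument at the heart of Config A is simple arithmetic and can be checked with a toy model (this is my sketch, not the authors' analysis): give the two stationary detectors an unknown static clock offset, let a hypothetical emission-theory parameter k make the one-way speed source-dependent, and observe that the offset drops out of the *change* in Δt with source velocity.

```python
# Toy model of the synchronization-free differential measurement. The names
# delta_t_measured, k, and eps are illustrative, not from the paper.

C = 299_792_458.0  # speed of light, m/s

def delta_t_measured(v, k, x1, x2, eps):
    """Arrival-time differential t2 - t1 between detectors at x1 and x2,
    including a fixed inter-detector clock offset eps. A nonzero k models a
    hypothetical source-velocity-dependent one-way speed c' = c + k*v."""
    c_eff = C + k * v
    return (x2 - x1) / c_eff + eps

x1, x2 = 10.0, 110.0  # illustrative detector positions, m
eps = 3.7e-9          # unknown 3.7 ns synchronization offset

# Special relativity (k = 0): Δt does not shift with source velocity at all.
sr_shift = delta_t_measured(1e4, 0.0, x1, x2, eps) - delta_t_measured(0.0, 0.0, x1, x2, eps)
# Emission-type model (k = 1): Δt shifts with velocity, and eps has dropped out.
em_shift = delta_t_measured(1e4, 1.0, x1, x2, eps) - delta_t_measured(0.0, 1.0, x1, x2, eps)

print(sr_shift)  # 0.0
print(em_shift)  # nonzero, independent of eps
```

The experiment measures only these shifts, which is why no claim about the "absolute" time at either detector is ever needed.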

[148] ai.viXra.org:2601.0112 [pdf] replaced on 2026-01-31 07:43:18

Experimental Proposal: A Differential Test of One-Way Light Propagation Using a Macroscopic Accelerating Source in Vacuo

Authors: Eliyah Kilada
Comments: 6 Pages.

This paper proposes two complementary experimental configurations to rigorously test the independence of the one-way speed of light from source motion—a cornerstone of Special Relativity's second postulate. The key innovation is a differential measurement method using stationary detector arrays that eliminates the need for absolute clock synchronization, a persistent philosophical problem in traditional one-way speed tests. Configuration A uses an accelerating macroscopic transmitter in vacuum with two stationary detectors, measuring how the arrival-time differential (Δt) varies with source velocity. Because both detectors remain at rest relative to each other, any synchronization offset is a static constant that cancels out when measuring changes in Δt—we never need to know the "absolute" time at either detector, only how their relative measurements shift with source motion. Configuration B inverts this approach: a stationary light source with accelerating detector arrays, eliminating ambiguities about light emission in moving frames while maintaining the synchronization-free differential advantage. Critically, we examine the Ewald-Oseen extinction theorem's applicability to the emission process itself (rather than merely propagation through media), revealing that this commonly cited objection may be overstated. By presenting both configurations, we provide a path to definitive testing regardless of how extinction applies at the emission level. Modern femtosecond timing and ultra-high vacuum technology make these experiments feasible.
Category: Relativity and Cosmology

[147] ai.viXra.org:2601.0107 [pdf] replaced on 2026-01-27 19:29:12

Emergence of Spacetime and a Big-Bang-Like Singularity from Octonionic Algebra

Authors: Rüdiger Giesel
Comments: 9 Pages.

We present a conceptual framework in which the octonionic division algebra is taken as a pre-geometric fundamental structure. We demonstrate that the intrinsic non-associativity of the octonions excludes the existence of a fundamental global time and enforces a symmetry reduction in order to admit physically interpretable dynamics. This reduction occurs through an algebraically singular transition, naturally interpreted as a Big-Bang-like origin. Four-dimensional spacetime emerges as a stable associative subalgebra, while cosmological expansion arises as a necessary consequence of residual non-associative degrees of freedom. The framework is mathematically consistent but remains speculative and currently lacks experimental support.
Category: Relativity and Cosmology
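The load-bearing mathematical fact here, that octonion multiplication is non-associative while quaternion multiplication is associative, is standard algebra and can be verified directly with the textbook Cayley-Dickson construction (nothing below is specific to the paper's framework):

```python
# Cayley-Dickson construction over nested pairs: reals -> complexes ->
# quaternions -> octonions. An octonion is a depth-3 nesting of 8 floats.

def conj(x):
    if isinstance(x, tuple):
        a, b = x
        return (conj(a), neg(b))
    return x  # real conjugation is the identity

def neg(x):
    if isinstance(x, tuple):
        return (neg(x[0]), neg(x[1]))
    return -x

def add(x, y):
    if isinstance(x, tuple):
        return (add(x[0], y[0]), add(x[1], y[1]))
    return x + y

def mul(x, y):
    if isinstance(x, tuple):
        a, b = x
        c, d = y
        # Cayley-Dickson product: (a,b)(c,d) = (ac - d*b, da + bc*)
        return (add(mul(a, c), neg(mul(conj(d), b))),
                add(mul(d, a), mul(b, conj(c))))
    return x * y

def e(i):
    """Octonion basis unit e_i, encoded as nested pairs of 8 components."""
    comps = [0.0] * 8
    comps[i] = 1.0
    def build(lst):
        if len(lst) == 1:
            return lst[0]
        h = len(lst) // 2
        return (build(lst[:h]), build(lst[h:]))
    return build(comps)

# Associativity fails for octonions: (e1 e2) e4 = e7 but e1 (e2 e4) = -e7.
lhs = mul(mul(e(1), e(2)), e(4))
rhs = mul(e(1), mul(e(2), e(4)))
print(lhs == rhs)  # False
```

Restricting to the first four components (the quaternionic subalgebra) restores associativity, which mirrors the paper's claim that a stable associative subalgebra can be singled out inside the octonions.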