[126] ai.viXra.org:2602.0130 [pdf] replaced on 2026-04-14 07:23:22
Authors: Stephen P. Smith
Comments: 12 Pages.
The Standard Model is often celebrated as a nearly complete account of the strong, weak, and electromagnetic interactions, lacking only gravity. Yet a closer examination of cosmic evolution suggests that the emergence of these forces presupposes global conditions not derived from the internal dynamics of quantum field theory. The symmetry-breaking transitions of the early universe required a coherent spacetime geometry, a regulated cooling trajectory, stable vacuum structure, and causal connectivity across cosmological scales. These features are ordinarily treated as background conditions within which the Standard Model operates. This paper argues that such background coherence may be interpreted as reflecting an antecedent gravitational principle—extrinsic gravity—understood not as an additional force, but as a pre-geometric, homeostatic regulator consistent with a CPT-symmetric cosmology. Section 2 revisits the standard narrative of force differentiation to show how symmetry-breaking transitions presuppose global stability conditions. Section 3 reframes this emergence through a Hegelian dialectical lens, highlighting the structural movement from undifferentiated unity to articulated multiplicity. Section 4 then shows that the Weyl tensor already contains an intrinsic two-sided decomposition into self-dual and anti-self-dual parts, and that a variational principle enforcing their indistinguishability drives the Weyl tensor to vanish, selecting conformal flatness with Minkowski space as the stabilized representative. Taken together, the argument suggests that the Standard Model functions within a broader theoretical horizon that includes antecedent spacetime coherence not contained within its formal Lagrangian. Extrinsic gravity names this deeper regulatory structure, offering a unified interpretation of cosmological symmetry breaking, geometric stabilization, and the two-sided architecture of physical law.
Category: Relativity and Cosmology
[125] ai.viXra.org:2602.0129 [pdf] submitted on 2026-02-28 14:45:04
Authors: Joseph Shaffer
Comments: 10 Pages.
Relativity and entanglement are very different phenomena in spite of occupying the same galaxy. We find that galactic rotation curves are an excellent measure of the validity of assumptions made about the apparent velocity of interaction of entanglement variables. There is, of course, no motion at all, but it is convenient to describe it as such for computational purposes. For instance, we find that an assumption of a nonlocal instantaneous kernel gives superb alignment with observed galactic rotation curves. The work below records, for a number of rotation curves, the disparity between theory and observation.
Category: Astrophysics
[124] ai.viXra.org:2602.0128 [pdf] submitted on 2026-02-28 03:00:54
Authors: Sif Almaghrabi
Comments: 16 Pages.
We present a structured literature review synthesizing 72 publications across eight research streams to develop and evaluate the thesis that context length functions as an implicit inductive bias in large language models (LLMs). We formalize this claim through four operational diagnostics—output entropy, distributional shift under context perturbation, anchoring tendency, and search-space contraction—each defined as a measurable quantity derivable from the predictive distribution pθ(y | x, C). Five testable hypotheses are stated with explicit falsification conditions and graded against a three-point study-quality rubric. Four convergent patterns emerge: (i) robust non-monotonic accuracy as a function of context length across tasks, models, and experimental controls; (ii) predictable interactions between context length and reasoning depth, with a difficulty-dependent optimum; (iii) measurable search-space contraction quantifiable via semantic entropy; and (iv) formal parallels to classical inductive bias in overparameterized models. This paper does not introduce novel algorithms or experimental results; its contributions are a formal diagnostic framework, a quality-graded evidence matrix, a causal analysis of confounding factors limiting current claims, and a prioritized research agenda of six open problems with proposed experimental protocols.
Category: Artificial Intelligence
[123] ai.viXra.org:2602.0126 [pdf] submitted on 2026-02-28 01:17:13
Authors: Herman Herstad Nythe
Comments: 100 Pages.
This paper presents a discrete algebraic framework that models the Standard Model vacuum not as a continuous smooth manifold, but as a fault-tolerant topological surface code based on the genus-3 Klein quartic and its automorphism group, PSL(2,7). By reducing continuous phenomenological parameters to exact topological and finite-field invariants, we provide rigorous mathematical resolutions to several enduring anomalies in particle physics. First, the fine-structure constant is derived topologically as 137 and verified dynamically via Migdal's real-space renormalization on the F7 lattice, yielding an effective coupling of 136.724. Second, the existence of exactly three fermion generations is proven to be an unavoidable acoustic resonance: the exact 3 × 7 permutation triplicity of the 56-node vacuum adjacency spectrum. Third, the empirical Koide mass formula is structurally resolved; its amplitude parameter √2 emerges as the eigenvalue of the graph's holomorphic cusp forms, while its phase is the exact geometric invariant δ = 2/g² = 2/9 rad ≈ 12.732°. Finally, we demonstrate that the surface code constraints autonomously generate the adjoint representation of E6, while the Dirac sea (the negative eigenvalue sector) of the matter graph perfectly reconstructs the symmetric tensor of the octonionic automorphism group G2. By transitioning from continuous differential equations to discrete arithmetic, Galois Quantum Gravity suggests that the Standard Model is the macroscopic shadow of a finite-field quantum algorithm.
Category: High Energy Particle Physics
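Editor's note: the phase invariant quoted in the abstract above mixes radians and degrees (δ = 2/9 is a value in radians; 12.732 is its conversion to degrees). A minimal numerical check of that conversion, not of the paper's derivation:

```python
import math

# The abstract quotes delta = 2/g^2 for genus g = 3, i.e. 2/9 rad,
# and states this equals about 12.732 degrees.
g = 3
delta_rad = 2 / g**2                  # = 2/9 ≈ 0.2222 rad
delta_deg = math.degrees(delta_rad)   # radian-to-degree conversion

print(f"delta = {delta_rad:.4f} rad = {delta_deg:.3f} deg")  # 0.2222 rad = 12.732 deg
```

The quoted 12.732° therefore follows from 2/9 only if the phase is read in radians.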
[122] ai.viXra.org:2602.0125 [pdf] submitted on 2026-02-27 05:34:56
Authors: Chaiya Tantisukarom
Comments: 9 Pages.
This study formalizes the Prime Gear Geometry (PGG) as a dynamical system. We demonstrate that the Riemann Hypothesis (RH) is not a static property of numbers, but a structural necessity of a rolling engine. We identify the $m$-cutoff as the "Mechanical Secret" that governs the transition between discrete prime forging (Time Domain) and spectral stability (Frequency Domain).
Category: Number Theory
[121] ai.viXra.org:2602.0124 [pdf] submitted on 2026-02-27 16:51:40
Authors: Christian B. Mueller
Comments: 19 Pages. (Note by ai.viXra.org Admin: This submission may not be written in a scholarly manner as required - Please conform by using standard scholarly terms, citing listed scientific references etc.)
This work attempts to discuss central inconsistencies in modern physics from the perspective of a limited rate of change, relying solely on observation and logical deduction. Starting from a reorganisation balance within the observable world, it constructs a minimal and plausible geometry of a higher-dimensional state space, thereby establishing a self-consistent model. By examining the projection into the observation space, this approach allows for a reappraisal of the symmetries of space and time, the compatibility of relativity and quantum mechanics via the fine-structure constant, and the possibility of a deterministic digital physics as a whole.
Category: Relativity and Cosmology
[120] ai.viXra.org:2602.0123 [pdf] submitted on 2026-02-26 21:38:33
Authors: Bertrand Jarry
Comments: 6 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0)
I derive the vacuum energy density in de Sitter space from entanglement entropy of the cosmological horizon, obtaining ρ_v^em = αH^4 with α = ħ/(260π²c³) [etc.]
Category: Relativity and Cosmology
[119] ai.viXra.org:2602.0122 [pdf] submitted on 2026-02-26 10:02:56
Authors: Sif Almaghrabi
Comments: 22 Pages.
We present a structured meta-analysis examining the relationship between chain-of-thought (CoT) reasoning trace length and task accuracy across 22 large language models spanning five provider families and 14 benchmarks covering mathematics, code generation, scientific reasoning, and general knowledge. All results are drawn from published technical reports, system cards, and peer-reviewed evaluations; no new experiments are conducted. We aggregate over 300 model-benchmark data points, though we note that cross-source comparisons are subject to protocol heterogeneity that limits strict commensurability. We document five principal observational patterns: (1) Reasoning-augmented models consistently outperform their standard counterparts on hard multi-step tasks, with reported accuracy differences of 40-81 pp on competition mathematics, though these differences confound reasoning-specific gains with concurrent architecture and training improvements; (2) Within the single controlled setting where token-budget data are available (Claude 3.7 Sonnet on AIME 2024, n = 30 test items), the accuracy-token relationship is well described by a logarithmic fit (R² = 0.97, n = 7 reconstructed data points), though this fit cannot be statistically distinguished from several alternative functional forms given the small sample and measurement uncertainty; (3) The observed accuracy differences are strongly domain-dependent, ranging from large positive gains on competition math to negative effects on factual recall; (4) Estimated per-query costs increase nonlinearly near the accuracy frontier, though cost estimates carry substantial uncertainty from token accounting and pricing volatility; and (5) Published faithfulness studies report that visible CoT reflects actual model reasoning in only 25-39% of probed cases. We propose formal efficiency metrics, discuss their limitations, and provide a practitioner-oriented deployment framework. All data tables are released. We classify our conclusions as observational rather than causal, and discuss the confounds that prevent stronger inference.
Category: Artificial Intelligence
[118] ai.viXra.org:2602.0121 [pdf] submitted on 2026-02-26 21:18:10
Authors: Daniel Speckmann
Comments: 5 Pages.
The Speckmann Lattice Resonance Theory (SLRT) presents a novel unified framework for particle physics and cosmology based on an Inverse Fractal Block Universe (IFB). We propose that the three generations of elementary leptons and neutrinos are not fundamental point particles, but discrete resonance modes of a $3 \times 3$ toroidal lattice geometry. By mapping the transition between cubic ($\Gamma_{EM}$) and hexagonal ($\Gamma_{W}$) lattice symmetries, we provide a purely geometric derivation of the Koide mass relation and the electroweak mixing angle ($\sin^2 \theta_W \approx 1/3$). Furthermore, we identify Dark Energy as the residual elastic tension (frustration) of the lattice, attenuated by a fractal inversion factor across 120 orders of magnitude. Dark Matter is characterized as non-resonant, high-order lattice oscillations. This model reduces the 26 free parameters of the Standard Model to fundamental geometric constants, offering a deterministic solution to the cosmological constant problem and the hierarchy problem.
Category: High Energy Particle Physics
[117] ai.viXra.org:2602.0120 [pdf] submitted on 2026-02-26 16:27:44
Authors: P. Music
Comments: 7 Pages.
We compute the Hessian of the G2 3-form restricted to Gr(3,R^6) at the flavour-symmetric point, obtaining eigenvalues 0^5, (-2 phi_0)^3, (-3 phi_0)^1. The geometric ratio |lambda_{Lambda^2}| / |Delta f / phi_0| = 2/9 matches the empirical Brannen-Koide phase to 0.02%. We prove two obstruction theorems: (1) since cos(2/3) is transcendental (Lindemann-Weierstrass), no symmetric polynomial in the mass eigenvalues with algebraic coefficients can select delta = 2/9 as an extremum; (2) any topological flux quantisation mechanism produces phases that are rational multiples of pi, for which cos(3 delta) is algebraic, and is therefore also excluded. A one-loop effective potential calculation independently rules out perturbative selection. These results establish that the Koide phase, if exactly 2/9, cannot arise from any polynomial potential, standard topological mechanism, or perturbative dynamics.
Category: High Energy Particle Physics
[116] ai.viXra.org:2602.0119 [pdf] submitted on 2026-02-26 21:15:31
Authors: Jessica Bower
Comments: 17 Pages.
Unified Chordal Resonance Theory (UCRT) proposes that gravity, the Standard Model of particle physics, dark matter, dark energy, and cosmology all emerge from a single multitonal vibrational scalar field $\Psi_{\text{total}}$ governed by one dominant nonlinear coupling $\lambda_{\text{nl}}$. The ever-present baseline hum $\Psi_{\text{total},0}$ at near-Planck frequencies undergoes beat interference and resonance cascades, dynamically generating fermion mass hierarchies, gauge symmetries via rotations on a compact internal manifold ($S^3$), fuzzy black hole cores, hybrid dark matter halo profiles with exponential cores and subhalo suppression below $\sim 10^8\,M_\odot$, and late-time dark energy evolution from phase diffusion. UCRT resolves black hole singularities through diffusive cores, the missing satellites problem via destructive tone interference, and the matter-antimatter asymmetry through phase-driven CP violation, while remaining consistent with Planck CMB, DESI BAO, JWST ultra-faint dwarf observations, and precision flavor data. Quantum corrections via vibrino loops preserve unitarity, and RG flows reach an asymptotic safety fixed point for UV completeness. Near-term falsifiability is provided by amplified GW echo trains and sidebands (LISA), chiral low-$\ell$ B-modes (CMB-S4), hybrid DM cores/subhalo counts (JWST/LSST), $\sim$10 TeV resonances (FCC), and phonon analog signatures in tabletop experiments.
Category: Quantum Gravity and String Theory
[115] ai.viXra.org:2602.0118 [pdf] submitted on 2026-02-26 21:12:57
Authors: Vladyslav Hruznov
Comments: 10 Pages.
We propose a single dynamic parameter γ(x, t) that controls the effective dimensionality of spacetime in a scale- and density-dependent manner. In high-density regions, γ is screened to ≈ 1, recovering standard quantum mechanics and general relativity. In extremely low-density environments, γ approaches ≈ 1.10, yielding d_eff ≈ 4.1, weakened effective gravity, and emergent dark energy. The framework is realized via a density-dependent measure v(ρ_el) in the action, leading to modified Friedmann equations and variable-order fractional quantum mechanics. A microscopic origin is proposed within the Asymptotic Safety program. The model is consistent with current precision constraints and makes clear falsifiable predictions for upcoming experiments.
Category: Relativity and Cosmology
[114] ai.viXra.org:2602.0117 [pdf] replaced on 2026-02-27 02:16:52
Authors: Lluis Eriksson
Comments: 33 Pages.
This document is an experiment-first audit report for a companion-paper programme claiming a constructive solution of the 4D $\mathrm{SU}(N)$ Yang-Mills existence and mass gap problem. It specifies a runnable mechanical audit suite of 29 deterministic tests, defines pass/fail criteria, and presents outputs in a compilation-safe format. The report contains: (i) an explicit non-triviality proof showing the Wightman functions do not factorize trivially; (ii) a toy-model validation recovering the exact 2D $\mathrm{SU}(2)$ Yang-Mills mass gap to machine precision; (iii) a Bałaban bridge appendix reproducing the critical inductive step of his renormalization group in simplified form; (iv) a reproducibility repository with 3-line setup instructions; (v) a core proof chain audit mechanically verifying the load-bearing theorems of Papers 86-90, covering terminal Kotecký-Preiss convergence, UV suppression, one-dimensionality of the anisotropic sector, Cauchy bounds on polymer jets, the OS1 vanishing rate $O(\eta^2 \log \eta^{-1})$, Lie-algebra annihilation, and KP margin sensitivity. Beyond the 17 core tests, the suite includes a lattice gauge proxy layer (plaquette expansion, Polyakov-loop centre symmetry, Creutz ratio; 3 tests), an infrastructure layer (Bakry-Émery curvature seed $\mathrm{Ric}_{\mathrm{SU}(N)} = N/4$, the $2^{4k}$ cancellation in $d=4$, heat-kernel column bound; 3 tests), a UV-flow/heat-kernel layer (Parseval identity, diagonal decay exponent $d/2 = 2$, flow-reflection commutation; 3 tests), a non-triviality test (Haar Monte Carlo on $\mathrm{SU}(2)$ and $\mathrm{SU}(3)$; 1 test), a toy-model validation (2D Yang-Mills transfer matrix; 1 test), and an algebraic QFT layer (Petz recovery fidelity bound $1-F \leq C\,e^{-2mr}$ from the Split Property; 1 test). All 29 tests pass; the full suite completes in ${\approx}70\,\mathrm{s}$ on a Google Colab CPU. The complete inter-paper dependency DAG is acyclic and explicitly recorded.
All code, data, and artifacts are available at https://github.com/lluiseriksson/ym-audit. The companion papers are archived at https://ai.vixra.org/author/lluis_eriksson.
Category: Mathematical Physics
[113] ai.viXra.org:2602.0116 [pdf] submitted on 2026-02-25 14:27:29
Authors: Jason Merwin
Comments: 12 Pages. This is the third paper in a four part series.
The 5σ discrepancy between local (H0 ≈ 73 km s⁻¹ Mpc⁻¹) and early-universe (H0 ≈ 67 km s⁻¹ Mpc⁻¹) measurements of the Hubble constant constitutes one of the most significant challenges to the standard ΛCDM cosmological model. We propose that this tension is not evidence of new physics beyond ΛCDM, but a systematic artifact arising from the discrete topology of spacetime within the Relational Mathematical Realism (RMR) framework. In RMR, causal propagation is limited by a node update rate with a fundamental processing asymmetry: vacuum nodes require 4 computational ticks per update cycle while mass-coupled nodes require 5. We derive a geometric correction factor Γ = (5/4)^{1/3} ≈ 1.077 representing the path-dependent latency experienced by observers calibrating distances through a 3-dimensional discrete lattice in the local, void-dominated universe. Applied to the Planck CMB determination (67.36 ± 0.54 km s⁻¹ Mpc⁻¹), this yields a predicted local measurement of H_local = 72.56 km s⁻¹ Mpc⁻¹, agreeing with the SH0ES value (73.04 ± 1.04 km s⁻¹ Mpc⁻¹) to within 0.46σ with zero free parameters. Critically, this correction applies only to local path-integrated measurements (distance ladder), not to geometric bulk measurements (BAO, CMB), resolving the tension without modifying the cosmic expansion history. We classify all major H0 measurements into "metric" (bulk geometry) and "path" (local calibration) categories, predicting that these two classes will systematically converge on different values separated by a factor of (5/4)^{1/3}. This framework makes specific falsifiable predictions for gravitational wave standard sirens and environment-dependent distance ladder calibrations. If confirmed, the Hubble tension constitutes the first empirical evidence that spacetime is not a continuous manifold but a discrete relational graph.
Category: Relativity and Cosmology
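Editor's note: the arithmetic this abstract quotes is easy to verify independently; a minimal sketch checking the stated correction factor and predicted local H0 (only the quoted numbers are checked, not the underlying model):

```python
# Apply the claimed correction factor Gamma = (5/4)^(1/3) to the quoted
# Planck value of H0 and compare with the quoted SH0ES value.
H0_planck = 67.36              # km/s/Mpc, as quoted from Planck
H0_shoes, shoes_err = 73.04, 1.04  # km/s/Mpc, as quoted from SH0ES

gamma = (5 / 4) ** (1 / 3)     # ≈ 1.0772
H0_local_pred = H0_planck * gamma
sigma = (H0_shoes - H0_local_pred) / shoes_err  # tension in SH0ES sigmas

print(f"Gamma = {gamma:.4f}")                       # 1.0772
print(f"predicted local H0 = {H0_local_pred:.2f}")  # 72.56
print(f"tension = {sigma:.2f} sigma")               # 0.46
```

The quoted figures (Γ ≈ 1.077, H_local = 72.56, 0.46σ) are internally consistent.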
[112] ai.viXra.org:2602.0115 [pdf] submitted on 2026-02-25 21:17:52
Authors: Paul Dangelo
Comments: 6 Pages.
The paper proposes a novel experimental protocol to test a fundamental question at the intersection of quantum mechanics and general relativity: does a quantum system's ability to generate (source) a gravitational field depend on its quantum state history—specifically, its accumulated decoherence? Standard semiclassical gravity assumes that a mass sources gravity based purely on its mass-energy expectation value, regardless of whether it is in a coherent quantum superposition or a classical mixture. This paper introduces an "Activation Hypothesis," suggesting that a system must build up irreversible entanglement with its environment (accumulated decoherence) to "activate" its gravitational sourcing capacity.
Category: Quantum Gravity and String Theory
[111] ai.viXra.org:2602.0114 [pdf] submitted on 2026-02-25 01:52:08
Authors: Dainis Zeps
Comments: 20 Pages. (Note by ai.viXra.org Admin: This article may be outside the scope of ai.viXra.org & is subject to withdrawal by the Admin)
I am comparing my situation with that of Schrödinger's cat: the cat was neither dead nor alive, while I am suspended between being banned and not banned from my rights to access my X/Twitter account. My sentence "After Trump: The Electric Chair For Trump, Biden, Most Republicans, For Many Democrats" is a nominal sentence and so may be interpreted in many ways, both violating and not violating X/Twitter admins' rules.
Category: Social Science
[110] ai.viXra.org:2602.0113 [pdf] submitted on 2026-02-24 08:01:09
Authors: Steven E. Elliott
Comments: 6 Pages.
Standard physics contains formal contradictions when judged as self-consistent physical ontologies. The Einstein Equivalence Principle (EEP) embeds ε-δ processes requiring internal laboratory realization, yet General Relativity's dynamics destroy all realizers in finite time: empty spacetime (no labs exist), few-body systems (radiation erosion), or cosmological evolution (de Sitter horizons). MATHICCS (Mathematics + Physics + Computational Consistency Substrate)—a higher-order meta-logic—deems axioms whose mathematical processes lose internal persistence invalid for physical ontology. GR asserts EEP-validity while deriving EEP-invalidity, yielding P ∧ ¬P. The first MATHICCS-valid ontology is the Fractal Substrate Equivalence Physics (FSEP) [viXra:2602.0107], where eternal Apollonian boundary dynamics persist across infinite recursive scales via Möbius inversion, discrete scale flips (r ↦ r/λ), and angular-momentum conservation. FSEP derives Newtonian gravity and the inverse-square law from local quadratic expansion of spherical inversion; constant finite light speed from the linear term + pole ejection + scale compression; the dark matter fraction (≈ 84%; baryons ≈ 16%) as a geometric series from the 3D Apollonian fractal dimension (D ≈ 2.473946 [1]) and radius ratio (β ≈ 0.72); and the fine structure constant (α ≈ 1/137.035999 [3]) emergently from bipolar pole aperture geometry (λ_local ≈ 21.81), unifying it with observed quasar jet collimation angles (θ_jet ≈ 5.2° [4])—all parameter-free except the geometric self-consistency of the persistent substrate. MATHICCS demands all physics reconstruct its mathematics from within via persistent internal processes. GR explodes; FSEP survives.
Category: Mathematical Physics
[109] ai.viXra.org:2602.0112 [pdf] submitted on 2026-02-25 01:37:12
Authors: Soo-Hyun Kim
Comments: 3 Pages.
The chromatic flux ratio anomaly observed in the strong lens system Q2237+0305 strictly contradicts the achromatic prediction of General Relativity (GR). Conventional models invoking differential dust extinction (λ⁻⁴) fail to fit the empirical data at short wavelengths (χ² = 1056). We propose an alternative macro-optical framework in which the lensing galaxy is surrounded by a non-conservative fluid medium with a radial density gradient (∇ρ). Applying the Gladstone-Dale relation and Cauchy's dispersion equation, we demonstrate that cosmic fluid refraction naturally follows a λ⁻² dependence. Our fluid model closely reproduces the observed multi-wavelength photometry of Q2237+0305 (R² = 0.995, χ² = 1.01). This over 1,000-fold statistical improvement strongly suggests that gravitational lensing incorporates direct fluid-optical refraction.
Category: Astrophysics
[108] ai.viXra.org:2602.0111 [pdf] submitted on 2026-02-25 01:39:43
Authors: Soo-Hyun Kim
Comments: 11 Pages.
This study redefines black holes not merely as critical points of mass contraction, but as engines that generate and inject spatial fluid, proposing a novel space generation formula V_gen = 27 × (M_BH / ρ_vac). The key constant k = 3 represents both the geometric integer of three-dimensional space and the physical correspondence to three generations of neutrinos, aligning with the coupling coefficient k ≈ 3 reported by Farrah et al. (2023). Applying this formula to the McConnell & Ma (2013) dataset, we confirmed a Pearson correlation coefficient of 0.9930, with an average agreement rate of 88.0% in the standard galaxy mass range. Local deviations in systems like M31-M32 are successfully explained by hydrodynamic pressure equilibrium and the 4 kpc truncation phenomenon. Furthermore, the basal fluid density (10^-30 kg/m³) resolves the Hubble tension by reconciling the early universe Hubble constant with local measurements through an 8.31% volume injection rate (f_inj). We conclude that dark matter and dark energy are different dynamic density states of the neutrino fluid, unifying cosmic expansion and galactic dynamics.
Category: Astrophysics
[107] ai.viXra.org:2602.0110 [pdf] submitted on 2026-02-25 01:32:44
Authors: S. I. Kublanovsky
Comments: 4 Pages.
This paper presents the Law of Harmony, based on the principle of universal quantum synchronization of masses. According to this law, any body — from a symmetrical pulsar to a human being or the Universe itself — acts as a source of coherent gravitational radiation, the period of which is determined by its mass and dimensions. Calculations based on the Law of Harmony indicate that the full life cycle of the Universe is 123.5 billion years. This contradicts the 2025 forecast by Henry Tye's group, which predicted a collapse in 20 billion years, and suggests that humanity has approximately 110 billion years of stable development ahead. The derived period mathematically corresponds to a graviton mass of 1.89*10^(-69) kg. Furthermore, the graviton mass obtained in this study closely aligns with the value of 1.909*10^(-69) kg derived from the holographic principle by Haranas and Gkigkitzis (2014). At the biological level, the Law of Harmony establishes the necessity of resonance between human mass and the Earth's diurnal rhythm (24 hours).
Category: Relativity and Cosmology
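Editor's note: the abstract above does not state how the 123.5-billion-year cycle maps to the quoted graviton mass. A Compton-type relation m = h/(c²T) does reproduce the quoted figure; this formula is my assumption for the check, not taken from the paper:

```python
# Hypothetical check (relation m = h / (c^2 * T) assumed, not stated in
# the abstract): does the quoted 123.5-Gyr period give the quoted mass?
h = 6.62607015e-34            # Planck constant, J s
c = 2.99792458e8              # speed of light, m/s
T = 123.5e9 * 3.15576e7       # 123.5 billion Julian years, in seconds

m_g = h / (c**2 * T)          # Compton-type mass for period T
print(f"m_g = {m_g:.3g} kg")  # 1.89e-69 kg, matching the abstract
```

Under this assumed relation the quoted period and mass are mutually consistent; whether the paper uses this relation is not determinable from the abstract.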
[106] ai.viXra.org:2602.0109 [pdf] submitted on 2026-02-25 01:07:15
Authors: Siqi Liu
Comments: 2 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)
This paper provides a formal mathematical proof of the non-existence of non-trivial loops in the Collatz Conjecture, using the main algebraic tools of 2-adic constraints.
Category: Number Theory
[105] ai.viXra.org:2602.0108 [pdf] submitted on 2026-02-23 11:26:21
Authors: P. Music
Comments: 4 Pages.
The Koide formula relates the masses of the three charged leptons through the parameter Q = 2/3 and an angle theta ~ 0.2222. We derive theta = 2/9 as the ratio of quadratic Casimir invariants C_2(3)/C_2(Sym^3(3)) = (4/3)/6 within the natural embedding SU(3) in G_2 = Aut(O), where the G_2 associative 3-form evaluated on the fermion 3-plane determines cos(3*theta). The agreement with PDG data is 0.009% (< 1 sigma). Extending the construction to neutrinos via the adjoint representation, we conjecture theta_nu = C_2(8)/C_2(Sym^3(3)) = 1/2, predicting Sum(m_i) = 70.9 +/- 0.4 meV in normal hierarchy, testable by Euclid, CMB-S4, LEGEND, and nEXO within the coming years.
Category: High Energy Particle Physics
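Editor's note: the Koide relation invoked above (Q = 2/3) is straightforward to check against the PDG central values for the charged-lepton masses (inserted below for the check; only the empirical relation is verified, not the paper's G_2 construction):

```python
import math

# PDG central values for the charged-lepton masses, in MeV.
m_e, m_mu, m_tau = 0.51099895, 105.6583755, 1776.86

# Koide ratio Q = (sum of masses) / (sum of sqrt-masses)^2;
# the Koide formula asserts Q = 2/3 exactly.
Q = (m_e + m_mu + m_tau) / (
    math.sqrt(m_e) + math.sqrt(m_mu) + math.sqrt(m_tau)
) ** 2

print(f"Q = {Q:.6f} (2/3 = {2/3:.6f})")
```

With these inputs Q agrees with 2/3 to a few parts in 10⁵, consistent with the sub-percent agreement the abstract cites.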
[104] ai.viXra.org:2602.0107 [pdf] submitted on 2026-02-23 13:45:27
Authors: Steven E. Elliott
Comments: 19 Pages.
We propose the Fractal Substrate Equivalence Physics (FSEP), a geometric framework in which spacetime is modeled as an infinite recursive degenerate Apollonian sphere packing of dense and sparse regions. We argue that in spacetimes evolving toward $t \to \infty$, the global domain of validity of the Einstein Equivalence Principle (EEP) contracts to measure zero, forcing a breakdown of smooth-manifold descriptions at the dense-sparse interface. The maximal geometric covering of this interface is an Apollonian sphere packing, which we take as fundamental rather than emergent. At each tangency boundary, physical evolution is governed by spherical inversion, a Möbius transformation, a discrete scale flip $r \mapsto r/\lambda$ (with $\lambda \gg 1$), and strict angular-momentum conservation. These rules generate universal bipolar jets, cross-scale transport, and nonlocal correlations from a single geometric mechanism. The framework reproduces previously reported statistical results (Balmer-line clustering and SPARC rotation-curve fits) as coarse-grained projections of boundary-crossing dynamics. FSEP yields several falsifiable predictions: (i) correlated AGN variability across cosmic voids with lags scaling linearly with void diameter ($\tau \propto D_{\mathrm{void}}$); (ii) systematic dark-matter-fraction depletion in merging galaxy pairs relative to isolated systems; (iii) jet opening angles directly measuring the local scale ratio $\lambda_{\mathrm{local}}$; and (iv) potential spectral-distortion signatures in the cosmic microwave background tied to hydrogen recombination harmonics rather than $\mu$/$y$-type thermal relic distortions. This paper stands in relation to its predecessor (viXra:2601.0119) as a foundational extension: where that work derived fractal statistical structure as an emergent consequence of known physics, the present work takes the fractal as the primary geometric substrate from which known physics emerges.
Category: Relativity and Cosmology
[103] ai.viXra.org:2602.0106 [pdf] submitted on 2026-02-22 07:48:50
Authors: Andrew Ebanks
Comments: 11 Pages.
This paper proposes the Fibonacci-Tetrahedral Lattice (FTL), a discrete geometric substrate for the vacuum derived from an E8-to-3D projection. By treating space as a quasicrystalline packing rather than a smooth continuum, we identify a foundational "Information-Ontology" where the 8D lattice serves as the geometric source and 3D reality is the projected result. We demonstrate that the transition from 8D symmetry to 3D packing necessitates a 7.356° topological deficit (the Aristotle Gap). This geometric frustration manifests macroscopically as an entropic pressure, providing a zero-parameter resolution to galactic rotation curves ("Dark Matter") without requiring new particles. Furthermore, by applying Holographic Scaling (N^{2/3}) to the lattice nodes, we resolve the "Vacuum Catastrophe," deriving an observed energy density of ≈ 10^{-27.33} kg/m³ (matching Λ) from the theoretical Planck baseline.
Category: Quantum Gravity and String Theory
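Editor's note: the 7.356° "Aristotle Gap" quoted above is the classical angular deficit left when five regular tetrahedra (dihedral angle arccos(1/3)) share a common edge; that identification can be confirmed directly:

```python
import math

# Dihedral angle of a regular tetrahedron: arccos(1/3) ≈ 70.53 degrees.
dihedral = math.degrees(math.acos(1 / 3))

# Five tetrahedra packed around one edge fall short of a full turn,
# leaving the classical "Aristotle Gap" angular deficit.
gap = 360 - 5 * dihedral

print(f"dihedral = {dihedral:.4f} deg")  # 70.5288 deg
print(f"gap = {gap:.3f} deg")            # 7.356 deg
```

So the quoted deficit matches the standard tetrahedral-packing gap; the abstract's link from this gap to an E8 projection is the paper's own claim.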
[102] ai.viXra.org:2602.0105 [pdf] submitted on 2026-02-22 19:46:41
Authors: Eduardo Rodolfo Borrego Moreno
Comments: 16 Pages.
We propose that the Cosmological Dissipative Residual (CDR), previously introduced as a late-time dissipative mechanism resolving the $H_0$ and $S_8$ tensions, originates from continuous production and accumulation driven by high-energy cosmic events. Beginning with the primordial Big Bang as an initial entropy burst, the residual is further generated throughout cosmic history by stellar formation, core-collapse supernovae, AGN jets, and black hole mergers. The production rate $\beta_{\mathrm{prod}}(z)$ peaks near cosmic noon ($z \sim 2$), and numerical integration calibrated to SFRD and jet/merger observations yields a cumulative contribution of $\sim 30\%$ to the observed dark energy density. The residual evolves according to $\dot{\rho}_{\mathrm{res}} + 3H(1 + w_{\mathrm{res}})\rho_{\mathrm{res}} = \Gamma(t)\rho_m + \beta_{\mathrm{prod}}(t)$, with $w_{\mathrm{res}} \gtrsim -1$ naturally emerging from the balance between production and dissipation, yielding $w_{\mathrm{res}}(z=1) \approx -0.92 \pm 0.03$, consistent with DESI 2024 BAO hints ($w_0 \approx -0.9$). Rapid homogenization via relativistic sound speed ensures uniformity, while localized anisotropic stresses account for gravitational effects in clusters and galaxies. The residual functions as an adaptive regulator through negative feedback, suppressing late-time growth to $\sigma_8^{\mathrm{CDR}} \approx 0.76$--$0.80$. This framework unifies the origin, distribution, and role of dark energy as emergent from the universe's energetic history, with falsifiable predictions for $w(z)$ evolution and subtle event-density correlations testable with DESI, Euclid, and ngEHT.
Category: Astrophysics
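The evolution law quoted in the CDR abstract above can be integrated numerically once the source terms are specified. The sketch below is a minimal forward-Euler integration of the stated equation; the functions `H`, `Gamma`, `beta_prod`, and `rho_m` are illustrative placeholders only, not the paper's calibrated SFRD/jet inputs, and the value of `w_res` is taken from the abstract's quoted figure at z = 1.

```python
import numpy as np

# Sketch of the CDR continuity equation from the abstract:
#   d(rho_res)/dt + 3 H (1 + w_res) rho_res = Gamma(t) rho_m + beta_prod(t)
# All source functions below are hypothetical toy inputs, not the paper's data.

def integrate_cdr(t, H, w_res, Gamma, beta_prod, rho_m, rho0=0.0):
    """Forward-Euler integration of the residual density rho_res(t)."""
    rho = np.empty_like(t)
    rho[0] = rho0
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        drho = (-3.0 * H(t[i - 1]) * (1.0 + w_res) * rho[i - 1]
                + Gamma(t[i - 1]) * rho_m(t[i - 1]) + beta_prod(t[i - 1]))
        rho[i] = rho[i - 1] + dt * drho
    return rho

# Toy run: constant H, diluting rho_m, a production rate peaking mid-history
# (a stand-in for the "cosmic noon" peak described in the abstract).
t = np.linspace(0.0, 10.0, 2001)
rho = integrate_cdr(
    t,
    H=lambda s: 0.1,
    w_res=-0.92,                                   # abstract's quoted value at z = 1
    Gamma=lambda s: 0.01,
    beta_prod=lambda s: np.exp(-(s - 5.0) ** 2),   # toy mid-history production peak
    rho_m=lambda s: np.exp(-0.3 * s),
)
```

With any positive source terms and $w_{\mathrm{res}} > -1$, the residual density accumulates monotonically until dilution balances production, which is the qualitative behaviour the abstract describes.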
[101] ai.viXra.org:2602.0104 [pdf] submitted on 2026-02-22 02:50:56
Authors: M. I. Gallardo Nicolalde
Comments: 5 Pages. (Note by ai.viXra.org Admin: Part of the texts are cutoff)
We introduce Fractalic Field Theory (FFT), a complete framework where the fractal geometry of spacetime at quantum gravity scales determines all physical laws. The central object is the fractalic dimension D = 2.7268, emerging from quantum gravity as a fixed point of the renormalization group flow. From this single geometric invariant, we derive: (i) α⁻¹ = 137.036, (ii) α_s(M_Z) = 0.1181, (iii) sin²θ_W = 0.2314, (iv) the Kaluza-Klein spectrum m_n = n^{1/D}/R with m_1 = 9.73 TeV, and (v) cosmological parameters including Ω_DM h² = 0.120 and n_s = 0.965. FFT represents a paradigm shift from symmetry-based to geometry-based unification, predicting 32 experimentally testable quantities with unprecedented accuracy using zero adjustable parameters.
Category: High Energy Particle Physics
[100] ai.viXra.org:2602.0103 [pdf] replaced on 2026-02-28 23:42:37
Authors: Steven B. Thompson
Comments: 8 Pages.
We propose a minimal dynamical framework for the origin of the universe in which absolute non-existence—a state with no spacetime, matter, fields, or classical time—is intrinsically unstable under quantum-mechanical and gravitational principles. The Heisenberg uncertainty principle, applied to gravitational degrees of freedom, precludes a static null configuration: non-existence has "nowhere to go" but to collapse into itself, producing an effective Planck-scale density regime that undergoes a nonsingular quantum-gravitational transition. This collapse-driven process bootstraps the emergence of classical spacetime and the arrow of time, requiring no external causes, pre-existing substrates, boundary conditions, or auxiliary fields. Quantum gravity emerges here not as an imposed extension but as the inherent dynamical structure governing the instability and resolution. The mechanism refines and complements established proposals—such as Vilenkin’s quantum tunneling from nothing, the Hartle-Hawking no-boundary wavefunction, and recent developments in loop quantum cosmology bounces and quadratic gravity—by providing a purely mechanical interpretation that replaces probabilistic nucleation or Euclidean continuation with an intrinsic collapse bootstrap. It aligns with ongoing refinements of no-boundary states, curvature bounces, and geometric "from nothing" models in the 2025–2026 literature. This framework offers a parsimonious dynamical resolution to the question of why there is something rather than nothing, transforming it into a consequence of quantum gravity’s structure. Potential observational implications include consistency with cosmic microwave background data and distinguishable signatures in primordial gravitational waves or large-scale structure that may differentiate collapse-initiated emergence from conventional inflationary scenarios.
Category: Relativity and Cosmology
[99] ai.viXra.org:2602.0102 [pdf] submitted on 2026-02-22 01:52:53
Authors: Mark Jerome Growden
Comments: 2 Pages.
This paper proposes that several of the world's most widespread musical scales arise naturally from the acoustic physics of simple bilabial end-blown cylindrical flutes with a single tone hole. Through hands-on construction and performance of one-hole flutes made from uniform cylindrical tubing, the author demonstrates that placing a hole at a minor second interval above the fundamental, combined with the overtone series, generates the Freygish (Hijaz) scale. A hole at a major second generates a pentatonic scale in the more accessible registers, and a Lydian Dominant scale when played into the higher, more demanding partials. When the end-covering technique of overtone flutes is combined with the single-hole technique, additional scale systems emerge from the same instrument. These findings suggest that culturally diverse scale systems may share a common origin not in theory, cultural exchange, or aesthetic preference, but in the physical constraints of elementary wind instrument construction. The implications extend to ethnomusicology, organology, and the origins of tonal music.
Category: Classical Physics
[98] ai.viXra.org:2602.0101 [pdf] submitted on 2026-02-21 19:14:58
Authors: Michael Zot
Comments: 8 Pages. (Note by ai.viXra.org Admin: Please cite all listed scientific references)
Multi-turn dialogue is where large language models (LLMs) are most useful, and also where they most often "get lost". Prior work reports that average performance drops substantially from single-turn to multi-turn settings, and argues that the dominant driver is increased unreliability rather than a large loss of peak capability. We replicate and extend this picture using a quantile-based analysis over thousands of stochastic generations, with an emphasis on distribution shape rather than averages. Across seven jobs we analyze N=5,100 scored generations: 30 instructions per job, 10 stochastic runs per instruction, and 1 to 3 turns per run. For each instruction and turn we compute (i) aptitude A90, the 90th percentile of score across runs, and (ii) unreliability U90-10, the 90th-to-10th-percentile spread. Our core result is a heavy-tailed fragility surface: most instructions remain perfectly stable with U=0, while a small minority contribute most of the unreliability at later turns. Across multi-turn replications, the top 3 most fragile instructions at turn 2 explain 54% to 91% of total unreliability. This yields a practical taxonomy of dialogue dynamics (stable, monotone degradation, and instability then recovery) and suggests new training and evaluation targets: recovery and variance control, not just average accuracy.
Category: Artificial Intelligence
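The two per-instruction metrics defined in the abstract above are straightforward to compute. A minimal sketch, assuming `scores` holds the stochastic-run scores for one instruction at one turn (the example score lists are invented for illustration):

```python
import numpy as np

# A90: "aptitude", the 90th percentile of score across runs.
# U90-10: "unreliability", the 90th-to-10th-percentile spread.
# Definitions follow the abstract; the data below is illustrative only.

def aptitude_a90(scores):
    return float(np.percentile(scores, 90))

def unreliability_u90_10(scores):
    return float(np.percentile(scores, 90) - np.percentile(scores, 10))

# A perfectly stable instruction has U = 0 regardless of its score level,
# matching the abstract's "most instructions remain perfectly stable" case.
stable = [0.8] * 10
fragile = [1.0, 1.0, 1.0, 0.9, 0.8, 0.5, 0.2, 0.0, 0.0, 0.0]
```

Note the separation the abstract relies on: `stable` and `fragile` can have similar peaks (A90), yet only the spread (U90-10) distinguishes them, which is why averages alone hide the heavy tail.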
[97] ai.viXra.org:2602.0100 [pdf] replaced on 2026-03-14 16:37:58
Authors: F. Rücker
Comments: 35 Pages. English. Licensed under Creative Commons Attribution 4.0 International (CC BY 4.0)
The Lambda-CDM framework currently faces significant empirical challenges, most notably the 5-sigma "Hubble Tension" and the observation of "impossible" early galaxies by the James Webb Space Telescope (JWST). This paper proposes the Theory of Informational Space Genesis (ISG), a model that treats the universe as a 4D-rotational manifold unfolding into 3D space. Methodologically, we establish a cross-disciplinary convergence between vacuum thermodynamics and 4D-geometry. By isolating a vacuum "Pristine Potential" (approx. 2.03 meV) through a signal-denoising procedure, we derive a Lorentz-conformant scaling law for expansion. This energetic baseline is independently validated by a geometric master equation based on the Tesseract's structural ratio (Psi = 1.5) and a universal vacuum tension of 1/24. Both pathways synchronize at a predicted Hubble constant of H0 approx. 73.55 km/s/Mpc and an effective cosmic age of approx. 15.165 Gyr. We identify the transition from Lead (Z=82) to Bismuth (Z=83) as the macroscopic saturation point of this 4D-grid, providing a nuclear-physics anchor for cosmological evolution. The model explains early galaxy maturity via a kinetic leak at redshift z approx. 7.27 and resolves the Hubble tension through a 1.85% primordial BBN-offset. These findings suggest that the evolution of the cosmos and the limits of nuclear stability are governed by a single, unified geometric descent toward a 1.5-Pi resonance equilibrium.
Category: Astrophysics
[96] ai.viXra.org:2602.0099 [pdf] replaced on 2026-03-02 04:12:15
Authors: Sean McCallum
Comments: 10 Pages. Creative Commons Attribution 4.0 International (CC BY 4.0)
The vacuum is not an empty stage. It is a stretchy quantum balloon - a dynamical condensate of overlapping standing-wave fields - with every massive particle a permanent topological knot tied into its fabric. Gravity emerges as local strain, forces as the rubber trying to smooth itself, and the entire Standard Model arises from zero modes trapped inside these knots.
Beneath the balloon lies an eternal pre-geometric qubit substrate that never disappeared after the Big Bang condensation. A tiny, Planck-suppressed portal couples the substrate to the rubber, naturally feeding zero-point energy and producing a mild dynamical dark energy with predicted equation-of-state drift w_a ≈ +7.2 × 10⁻⁵.
With only two fundamental parameters (f and e), the QBIT framework quantitatively reproduces a wide range of observables.
Black-hole interiors are regular topological cores with no singularities, the information paradox is resolved by substrate-mediated leakage, and quantum paradoxes (double-slit, delayed-choice eraser, Hardy's, Zeno, Elitzur-Vaidman, Schrodinger's cat) receive natural resolutions via rubber ripples and topological protection.
Lattice simulations and analytic results confirm topological stability, emergent curvature, multi-knot binding, and the substrate portal's predictions. QBIT thus provides a conceptually elegant, quantitatively successful, and experimentally testable candidate for a unified theory of particles, gravity, and cosmology - all emerging from one stretchy quantum balloon and its eternal qubit substrate.
Category: Quantum Gravity and String Theory
[95] ai.viXra.org:2602.0098 [pdf] submitted on 2026-02-21 01:40:38
Authors: Darcy Facundo
Comments: 1 Page. "Presents a new informational ontology (Dadatic Monism) to unify General Relativity and Quantum Field Theory through a redefined mass-energy equivalence formula."
1. ABSTRACT. This essay proposes a resolution to the long-standing dichotomy between General Relativity (GR) and Quantum Field Theory (QFT) through the paradigm of Dadatic Monism. It postulates that the Universe is a real-time information processing system where matter is not an inert substance, but a Data Density Anomaly (Ψ) within an Informational Plenum. By formulating the equation E_D = γ·Ψ·V_if², we establish a causal bridge between total system energy, the fundamental limit of data propagation, and metric coherence protocols (Lorentz). The theory concludes that physical reality and consciousness are emergent byproducts of distinct levels of transduction and attenuation of this universal data stream.
2. THE FUNDAMENTAL DADATIC EQUATION. The core of this theory lies in the redefinition of mass. In Dadatic Monism, mass is the measure of "informational load" at a specific logical address of spacetime: E_D = γ·Ψ·V_if². E_D (Dadatic Energy): the total processing magnitude or computational work contained within an event. Ψ (Radiant Data Mass): the volume of information packets defining the "presence" of a particle or body. V_if (Propagation Velocity): the constant 3×10⁸ m per second, interpreted as the maximum bandwidth of the vacuum (Plenum).
3. DYNAMICS AND THE LORENTZ PROTOCOL. The theory’s validity in non-static systems is ensured by the integration of the Lorentz Factor (γ). In our ontology, γ is not merely a geometric distortion but a Buffer Management Protocol: E_D = γ·Ψ·V_if². As the velocity of data (v) approaches V_if, the system executes different reference frames.
4. UNIFICATION AND TRANSDUCTION. Gravity: interpreted as the "processing lag" imposed by high data density (Ψ). Spacetime curves not due to a force, but to accommodate heavy informational loads within the logical metric. Quantum Mechanics: entanglement serves as evidence that data is unitary and shared at the source-code level, rendering physical distance irrelevant for state synchronization. Biology and Consciousness: the Cortex is postulated as a Quantum-Informational Transducer. The "reality" we experience is an attenuated version of the Plenum, filtered to be processable by biological hardware (Soma).
5. CONCLUSION. Dadatic Monism offers a "naked," demystified view of physics. It replaces the search for hypothetical particles with an understanding of the laws of emittance and processing. The Universe does not "exist" in a static sense; it is causally rendered at every Planck interval.
KEYWORDS: Dadatic Monism; Information Physics; Theoretical Physics; Unification; Lorentz Factor; Biological Transduction.
Category: Quantum Gravity and String Theory
[94] ai.viXra.org:2602.0097 [pdf] submitted on 2026-02-20 18:28:06
Authors: Vinicius F. S. Santos
Comments: 19 Pages.
We introduce the Secular Replicator Flow, a finite-dimensional algebraic dynamical system inspired by the turbulent energy cascade of the Navier–Stokes equations, built from the spectral theory of golden resolvent operators on discrete network graphs [9]. The continuous mechanics of fluid turbulence—incompressibility, nonlocal pressure, nonlinear advection, and viscous dissipation—find precise algebraic counterparts in the constraints of a replicator equation evolving on the simplex of spectral participation weights, governed by a global secular equation. Within this framework we establish three principal results. First, the Variance Law: the macroscopic coupled eigenvalue λ∗(t) evolves monotonically according to Fisher’s Fundamental Theorem, acting as a strict Lyapunov function (between excision events) whose rate of increase equals the fitness variance of the active spectrum. Second, the Spectral Selection Theorem: the fitness landscape is a strict bipolar U-shape in the base eigenvalue μ, guaranteeing that the replicator flow annihilates mid-spectrum noise and funnels all energy into the extreme macroscopic topologies of the network. Third, Global Regularity: as the system approaches a structural resonance (transparent pole), the fitness plunges to −∞, triggering an auto-excision mechanism that exponentially starves the dangerous channel, rendering every pole singularity removable. The resulting dynamics form a Sawtooth Cascade of smooth climbs interrupted by discontinuous structural snaps whose direction is controlled by the residual load of the excised channel. We classify the sole remaining failure mode as a thermodynamic phase escape at the r = 2 Chebyshev boundary, where the discrete algebraic structure of the network undergoes a global phase transition into unbounded hyperbolic space—a phenomenon fundamentally different from the localised velocity blowup sought by PDE analysis.
All regularity results herein apply to this model; implications for the full Navier–Stokes equations in R³ remain open.
Category: Mathematical Physics
[93] ai.viXra.org:2602.0096 [pdf] submitted on 2026-02-20 06:04:55
Authors: Lluis Eriksson
Comments: 21 Pages.
This paper is a hostile-review navigation guide and audit manifesto for a companion-paper programme claiming a constructive solution of the four-dimensional SU(N) Yang-Mills existence and mass gap problem in the Osterwalder-Schrader (OS) framework, reconstructed as a Poincare-covariant Wightman QFT with strictly positive mass gap. The guide provides: (i) an explicit dependency graph and Clay/Jaffe-Witten checklist; (ii) an explicit threat model listing standard failure modes targeted by hostile review (black-box dependence on Balaban, interface friction between gradient flow and the Balaban measure, diagonal-limit non-uniformity, and operator-mixing residues); (iii) an explicit four-pillar defensive architecture resolving each attack with structural (not merely quantitative) shields; (iv) the preventive lock: a triangular renormalization-mixing structure that blocks upward anisotropic flow into the marginal (d=4) sector, neutralizing the standard a^2 x a^{-2} -> O(1) objection; (v) a mechanical audit trail mapping load-bearing hypotheses to primary sources; and (vi) a complete linked index of all supporting preprints for traceability. External mathematics is explicitly declared: abstract polymer cluster expansion (Kotecky-Preiss), OS reconstruction (Osterwalder-Schrader), and lattice reflection positivity (Osterwalder-Seiler). Furthermore, this guide introduces the Triangular Mixing Preventive Lock: a structural algebraic mechanism showing that the operator mixing matrix has no anisotropic marginal d=4 sink in the gauge-invariant W_4-scalar sector. Consequently, the standard O(a^2) x O(a^{-2}) -> O(1) operator-mixing residue attack is blocked structurally: any quadratic divergence is forced to renormalize only O(4)-invariant d=4 data (the isotropic coupling), leaving the O(4)-breaking channel suppressed. This paper is not a claim of institutional validation; it is an audit map prescribing the check order and falsification points for the companion-paper chain.
Category: Mathematical Physics
[92] ai.viXra.org:2602.0095 [pdf] submitted on 2026-02-19 19:48:12
Authors: J. W. McGreevy
Comments: 16 Pages.
We present Arithmetic Relativistic Emergence (ARE) as a "General Relativity of Numbers" — a framework in which the Standard Model, quantum mechanics, classical 3+1 Lorentzian spacetime, and fundamental constants emerge tautologically from the arithmetic geometry of Q. The Riemann zeta function ζ(s) constitutes the maximally symmetric pregeometric vacuum. Its functional-equation symmetry around Re(s) = 1/2, combined with the pole at s = 1, forces spontaneous symmetry breaking via the weight-12 modular discriminant ∆(τ) = η(τ)²⁴ at the s = 6 harmonic threshold. This breaking disperses the vacuum into Archimedean divergence (F_div, smooth curvature density) and non-Archimedean curl (H_curl, discrete torsion at p-adic fibers). The emergent geometry is governed by Arakelov curvature on the arithmetic surface Spec Z ∪ {∞}, where Weierstrass weights act as "mass/energy density" (algebraic rigidity) and the hyperbolic/Bergman metric plays the role of spacetime. Modular transformations toward cusps correspond to Lorentz rapidity, yielding an equivalence principle analogue between inertial (modular flow resistance) and gravitational (metric warping) responses. The adelic spectral triple (KO-dimension 6, finite algebra C ⊕ H ⊕ M₃(C)) induces symplectic deformation of phase space, with the non-trivial zeros of zeta providing the Dirac spectrum (Hilbert–Pólya realized). The Minkowski interval ds² = −c²dt² + dx⃗² emerges as the unique adelic-invariant quadratic form, with the light cone as the resolved cusp boundary (holographic screen). The spectral action Tr f(D/Λ) recovers Einstein–Cartan gravity with non-Abelian Yang–Mills, where generalized Rainich conditions (quadratic invariants involving structure constants f^{abc}) are satisfied at s = 6, with torsion (H_curl) regularizing self-interactions. The full SM gauge group SU(3)_c × SU(2)_L × U(1)_Y and three chiral generations emerge from adelic place ramification and the Leech lattice Z₂-orbifold.
Constants (α⁻¹ ≈ 137 from Petersson + torsion residues, ℏ from Lambert-Planck suppression, G from unification suppression, Λ ∼ e^{−288}) are inevitable invariants. Langlands functoriality acts as the holographic dictionary mapping prime rigidity to bulk physics. ARE thus unifies physics as the macroscopic shadow of arithmetic rigidity, with the Riemann Hypothesis as a necessary stability condition for the emergent universe.
Category: Mathematical Physics
[91] ai.viXra.org:2602.0094 [pdf] submitted on 2026-02-19 17:15:36
Authors: Vinicius F. S. Santos
Comments: 45 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
We develop the spectral theory of operators whose eigenvalue structure is governed by the golden ratio φ = (1 + √5)/2. The foundation is the golden resolvent factorisation: for any real symmetric matrix A, λ²I − λA − A² = (λI − φA)(λI + A/φ). This identity controls the spectrum of golden companion operators, decomposing eigenvalues into transparent pairs (μφ and −μ/φ) and coupled modes (roots of an explicit secular equation), governed by the Galois conjugation φ ↦ −φ⁻¹ of Q(√5)/Q. We establish six main results: (1) the Golden Amplification Theorem, producing the transparent eigenvalue pairs and their eigenvectors; (2) the Secular Equation, a closed-form characteristic polynomial for coupled modes; (3) the Secular Sensitivity Theorem, identifying the secular weight with the coupled eigenvector norm and establishing Lipschitz continuity of coupled eigenvalues in the coupling vector; (4) the Positive Boost Inequality, bounding submatrix spectral radii via nodal-domain restrictions; (5) the Galois Transfer Principle, exactly classifying partition transfer for conjugate eigenvector pairs via the Transfer/Pareto Trichotomy, with automatic spectral dominance for transparent modes and unavoidable Pareto regimes for coupled modes; (6) the Chebyshev Ladder, generalising the amplification ratio to 2 cos(π/(2p+3)) through the cyclotomic fields Q(ζ_{2p+3})⁺. The spectral spread of the golden pair equals the generator of the different ideal of Z[φ], connecting the framework to the arithmetic of Q(√5).
Category: Combinatorics and Graph Theory
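The factorisation identity quoted in the abstract above is elementary to check numerically, since φ − 1/φ = 1 makes the cross terms in the product cancel to −λA. A minimal numpy verification; the random symmetric matrix and the scalar λ are arbitrary illustrative choices:

```python
import numpy as np

# Check the golden resolvent factorisation from the abstract:
#   lam^2 I - lam A - A^2 = (lam I - phi A)(lam I + A / phi)
# for a real symmetric A, using phi = (1 + sqrt(5)) / 2.
phi = (1.0 + np.sqrt(5.0)) / 2.0

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2.0          # real symmetric matrix, as the abstract requires
lam = 0.7                    # arbitrary scalar
I = np.eye(5)

lhs = lam**2 * I - lam * A - A @ A
rhs = (lam * I - phi * A) @ (lam * I + A / phi)
```

Expanding the right-hand side gives λ²I + λA(1/φ − φ) − A², and 1/φ − φ = −1, so the two sides agree identically, not just for symmetric A; symmetry is what the abstract's spectral decomposition then exploits.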
[90] ai.viXra.org:2602.0093 [pdf] submitted on 2026-02-19 07:09:46
Authors: Jason Merwin
Comments: 13 Pages. This is the second paper in a four paper series.
In a companion paper, we established the Theorem of Temporal Necessity within the framework of Relational Mathematical Realism (RMR), demonstrating that a sufficiently complex, locally consistent mathematical structure cannot exist as a static object but must undergo a non-terminating sequence of state extensions identified with physical time. In this paper, we extend the framework to quantum mechanics. We argue that quantum indeterminacy is not a fundamental property of nature but an epistemic consequence of observers being embedded subsystems within an evolving relational structure. The "hidden variable" determining quantum outcomes is the global relational topology of the present state S_t, which is non-local by definition and inaccessible to any embedded observer. We show that this framework survives Bell’s theorem by violating measurement independence through synchronic topological constraint rather than diachronic conspiratorial fine-tuning, and we resolve the measurement problem by identifying wavefunction collapse with the topological update of the observer’s local subgraph. We further conjecture that the Born rule (P = |ψ|²) arises as a geometric property of the Gödelian boundary—specifically, that probability scales with the combinatorial cross-sectional area of relational bundles at the logical horizon, unifying quantum probability with Bekenstein-Hawking entropy under a single geometric principle. Finally, we propose that the renormalization group flow of quantum field theory is the graph-theoretic coarse-graining of the universal relational structure, and that the hierarchy between gravitational and gauge force strengths reflects the ratio of global connectivity to local clustering density in the universal graph.
Category: History and Philosophy of Physics
[89] ai.viXra.org:2602.0092 [pdf] submitted on 2026-02-19 12:13:41
Authors: Lluis Eriksson
Comments: 9 Pages.
We complete the rigorous construction of four-dimensional Euclidean SU(N) Yang--Mills quantum field theory and establish the existence of a mass gap. Building on the companion papers -- which unconditionally establish exponential clustering with mass gap, the Osterwalder--Schrader axioms OS0, OS2, OS3, OS4, and quantitative irrelevance of O(4)-breaking lattice operators -- we derive a lattice Ward identity for infinitesimal Euclidean rotations, identify the breaking term as a dimension-6 anisotropic operator insertion, and prove that the breaking distribution vanishes as $O(\eta^2\,|\log((\Lambda_{\mathrm{YM}}\eta)^{-1})|) \to 0$ in the continuum limit, establishing axiom OS1 (full O(4) Euclidean covariance). Combined with the Osterwalder--Schrader reconstruction theorem, this yields a non-trivial Poincaré-covariant Wightman quantum field theory with mass gap $\Delta_{\mathrm{phys}} \geq c_N\,\Lambda_{\mathrm{YM}} > 0$ for each $N \geq 2$.
Category: Mathematical Physics
[88] ai.viXra.org:2602.0091 [pdf] submitted on 2026-02-19 12:16:47
Authors: Lluis Eriksson
Comments: 9 Pages.
This paper has two goals. Part I (terminal KP bound). We provide a verifiable, citation-driven derivation of the terminal-scale Kotecky--Preiss (KP) smallness bound used in the companion paper on exponential clustering and mass gap. Rather than re-deriving the full multiscale renormalization group (RG), we isolate explicit hypotheses (H1)--(H3) on the terminal polymer activities and prove that they imply the KP convergence criterion. We then verify (H1)--(H3) by mapping them to specific statements in Balaban's published primary sources (CMP 116, 119, 122), with an audited notation bridge recorded in the structural package companion paper. Part II (assembly map + Clay checklist). We give an explicit dependency graph assembling the companion papers together with the KP input proved here. We provide a checklist matching the Clay/Jaffe--Witten formulation of the Yang--Mills existence and mass gap problem to the theorems across the paper sequence (OS0--OS4, OS1, and the mass gap). Scope / external mathematics. The argument uses the abstract KP cluster expansion theorem (Kotecky--Preiss 1986) and the Osterwalder--Schrader reconstruction theorem (1975). It relies on the terminal polymer representation and activity bounds as proved in Balaban's CMP papers cited above.
Category: Mathematical Physics
[87] ai.viXra.org:2602.0090 [pdf] submitted on 2026-02-19 17:42:14
Authors: Yun Seok Choe
Comments: 8 Pages. (Note by ai.viXra.org Admin: This submission mainly contains speculations and may not be written in a complete/scholarly manner - Please cite and list scientific references)
[Paper 1] This foundational paper establishes the "Relativity of Focus" as a new physical principle. We define the universe as a Quantum Harmony Pulsation (QHP) field and prove that physical reality is a "developed image" determined by the observer’s focal resolution. We derive the c² constant as a dynamic pulsation rate and establish the mathematical framework for the focus operator (Γ).
[Paper 2] Based on the foundational principles of Quantum Harmony Pulsation (QHP) established in Part 1, this paper proposes a Grand Unified Theory (GUT) by redefining ’Force’ as a manifestation of pulsation density gradients. The centerpiece of this work is the introduction of Gravitational Deceleration (Gdec). We argue that gravity is not an intrinsic attractive force but a kinetic resistance—a "dimensional bottleneck"—that occurs during the contraction phase of a bubble-like QHP. Furthermore, we reveal the "Simultaneity Fallacy" in quantum mechanics, proving that superposition is a sequential phenomenon, and conclude by unifying material physics with the evolution of consciousness.
[Paper 3] Based on the foundational principles of Quantum Harmony Pulsation (QHP) established in Part 1, this paper proposes a Grand Unified Theory (GUT) by redefining ’Force’ as a manifestation of pulsation density gradients. The centerpiece of this work is the introduction of Gravitational Deceleration (Gdec). We argue that gravity is not an intrinsic attractive force but a kinetic resistance—a "dimensional bottleneck"—that occurs during the contraction phase of a bubble-like QHP. Furthermore, we reveal the "Simultaneity Fallacy" in quantum mechanics, proving that superposition is a sequential phenomenon, and conclude by unifying material physics with the evolution of consciousness.
[Paper 4] As the final installment of the ’Focus Science’ trilogy, this paper provides the numerical and geometric evidence for the Relativity of Focus. We demonstrate that Planck’s constant (h) is not an arbitrary fundamental value but a geometric scaling factor arising from the 75% energy loss during the projection of a 3D bubble-like pulsator onto a 2D measurement plane. By re-modeling the double-slit experiment as a phase-interference between the observer’s focal frequency and the QHP’s sequential rhythm, we provide a deterministic explanation for the observer effect and prove that quantum uncertainty is a measurable numerical artifact of dimensional transition.
Category: Quantum Physics
[86] ai.viXra.org:2602.0089 [pdf] submitted on 2026-02-18 09:21:04
Authors: Lluis Eriksson
Comments: 17 Pages.
We establish two independent rigorous results for four-dimensional SU(N) pure-gauge lattice Yang–Mills theory with Wilson action, at fixed lattice spacing η > 0 and weak coupling g₀ ≤ g_*, uniformly in the spatial volume L. (A) Uniform Log-Sobolev Inequality. The Wilson measure μ_L satisfies Ent_{μ_L}(f²) ≤ (2/ρ̂) E_L(f,f) with constant ρ̂ > 0 independent of L, where E_L is the natural Dirichlet form on SU(N)^{|E(Λ)|}. (B) Uniform Mass Gap. The Osterwalder–Seiler Hamiltonian H_L has a spectral gap m_gap ≥ m₀ > 0, uniformly in L. Both theorems share a single input, the Dobrushin–Shlosman complete analyticity (CA) condition, verified via Balaban's renormalization group program, but follow logically independent paths. Theorem A is derived through Cesi's quasi-factorization of entropy, seeded by a Bakry–Émery local log-Sobolev inequality on SU(N)^{|E(Σ)|}; the Ricci curvature Ric_{SU(N)} = (N/4)g plays a key role. Theorem B is derived through exponential clustering of temporal correlations (a consequence of CA via Dobrushin contraction) combined with the Osterwalder–Seiler transfer-matrix construction and the Krein–Rutman theorem. We further prove (C) that {μ_L} converges weakly to a unique, translation-invariant infinite-volume Gibbs state μ_∞ satisfying the DLR consistency equations, whose reconstructed Hamiltonian H_∞ inherits the mass gap m₀. All constants are explicit in N, g₀, and η. The present results hold at fixed lattice spacing; the continuum limit η → 0 is addressed in a companion paper.
Category: Mathematical Physics
[85] ai.viXra.org:2602.0088 [pdf] replaced on 2026-02-19 12:11:33
Authors: Lluis Eriksson
Comments: 21 Pages.
We establish exponential clustering with a strictly positive mass gap for four-dimensional pure SU(N) lattice Yang--Mills theory with Wilson's action, uniformly in lattice spacing $\eta$ and physical volume $L_{\mathrm{phys}}$: $|\mathrm{Cov}_{\mu_\eta}(\mathcal{O}(0),\mathcal{O}(x))| \leq C\,e^{-m\,|x|/a_*}$, with $m > 0$ and $a_* \sim \Lambda_{\mathrm{YM}}^{-1}$. The proof assembles three ingredients: (1) Balaban's rigorous renormalization group for lattice gauge theories (CMP 1984--1989), which produces effective densities with local polymer decompositions and exponentially decaying activities; (2) a terminal-scale polymer cluster expansion (imported from Balaban's convergent renormalization expansions), which implies exponential clustering for the effective terminal measure; and (3) a multiscale correlator decoupling identity (this paper), which separates ultraviolet fluctuations from infrared physics and yields uniform UV suppression. The coupling control required by Balaban's framework -- that the effective couplings remain in the perturbative regime throughout the RG iteration -- is established via an inductive argument using Cauchy bounds on the analyticity of the effective action. We also verify the Osterwalder--Schrader axioms OS0, OS2, OS3, and OS4 for subsequential continuum limits, and establish vacuum uniqueness and non-triviality. The remaining axiom OS1 (full O(4) Euclidean covariance) is not established here; we prove covariance under lattice translations and the hypercubic group $\mathcal{W}_4$, and show that if O(4) covariance holds in the continuum limit, the reconstructed Wightman theory is a non-trivial relativistic quantum field theory with mass gap $\Delta_{\mathrm{phys}} \geq c_N\,\Lambda_{\mathrm{YM}} > 0$, where $c_N > 0$ depends only on $N$ (and is independent of $\eta$ and $L_{\mathrm{phys}}$).
Category: Mathematical Physics
[84] ai.viXra.org:2602.0087 [pdf] replaced on 2026-02-19 12:12:41
Authors: Lluis Eriksson
Comments: 18 Pages.
We classify gauge-invariant local lattice operators of classical dimension 6 on the four-dimensional hypercubic lattice into O(4)-invariant, hypercubic-invariant but O(4)-breaking (anisotropic), and on-shell-redundant components, following the Symanzik improvement programme and the on-shell improvement technique of Lüscher--Weisz (1985). Inside Balaban's renormalization group framework for SU(N) lattice Yang--Mills theory, we extract the anisotropic projection of the effective action via local Taylor expansion of polymer activities in the small-field regime and prove a quantitative quadratic scale bound for the anisotropic coefficient: for every RG step $k \leq k_*$ with effective coupling $g_k \leq \gamma_0$, the coefficient of the (one-dimensional) anisotropic sector in the classical dimension-6 Symanzik expansion satisfies $|c_{6,\mathrm{aniso}}^{(k)}| \leq C\,a_k^2$, uniformly in lattice spacing $\eta$, physical volume $L_{\mathrm{phys}}$, and RG step $k$. We further prove a quantitative insertion integrability estimate for connected correlators with one insertion of the anisotropic operator. When combined with the rotational Ward identity derived in the companion paper, this yields that the corresponding breaking distribution tested against Schwartz functions is $O(\eta^2\,|\log((\Lambda_{\mathrm{YM}}\eta)^{-1})|)$ and hence vanishes as $\eta \to 0$.
Category: Mathematical Physics
[83] ai.viXra.org:2602.0086 [pdf] submitted on 2026-02-18 02:34:37
Authors: Moninder Singh Modgil, Dnyandeo Dattatray Patil
Comments: 31 Pages.
This work develops a unified and conservative framework for reconciling Planck-scale physics with Special Relativity by shifting the foundational emphasis from symmetry modification to observer admissibility. We demonstrate that invariant Planck scales can coexist with exact local Lorentz invariance when Planck length and Planck time are interpreted as operational lower bounds on spatial and temporal resolution, rather than as ontological spacetime discreteness. Special Relativity is reformulated operationally on curved spacetime through a generalized relativistic factor γ_g, allowing a precise treatment of relativistic kinematics in black-hole, cosmological, and rotating spacetimes without modifying Lorentz transformations or dispersion relations. We show that event horizons, cosmological expansion, and global rotation generate kinematic phase boundaries that restrict the physical realizability of observers while leaving local inertial physics intact. Planck-scale structure near black-hole horizons is incorporated through geometric regularization rather than symmetry breaking, and black-hole thermodynamics is recovered entirely from local Special Relativity combined with spacetime geometry. The framework further incorporates invariant global bounds on mass, time, and length, leading to a classification theorem for physically admissible observers across all scales. Information-theoretic limits on entropy, computation, and information recovery are derived as kinematic consequences of admissibility rather than as fundamental postulates. Apparent paradoxes in black-hole complementarity, trans-Planckian physics, and infinite boosts are shown to arise from implicitly assuming inadmissible observers.
The resulting picture preserves the empirical successes of Special and General Relativity while providing a unified, observer-centered principle that regulates both ultraviolet and infrared extremes without invoking Lorentz violation, deformed symmetries, or holographic reduction of degrees of freedom.
Category: Relativity and Cosmology
[82] ai.viXra.org:2602.0085 [pdf] submitted on 2026-02-17 16:18:50
Authors: Lluis Eriksson
Comments: 21 Pages.
We prove that Wilson-loop expectations in four-dimensional Euclidean lattice Yang--Mills theory with compact gauge group G admit a universal continuum limit, independent of the lattice approximation scheme, for every contractible loop and all values of the coupling. The proof proceeds by a multiscale decomposition that combines Balaban's renormalization-group framework with a quantitative gradient-flow smoothing step at each scale. For an observable living at lattice scale k, the Yang--Mills gradient flow is run for a time proportional to the squared lattice spacing a_k^2; a deterministic contraction estimate (Theorem 3.5) shows that this reduces the single-link oscillation of the flowed observable by a factor L^{-2k}, where L is the blocking factor. The resulting geometric series is summable and yields the desired uniform bound. The two main inputs are: (i) a pointwise domination lemma (Lemma 3.3) that controls the gradient of the flowed observable by a scalar heat kernel on the link graph, exploiting the contractivity of parallel transport; and (ii) a Duhamel interpolation formula (Lemma 4.1) that converts each change-of-measure error into a covariance with the irrelevant part of the effective action, bounded via a Poincaré-type inequality. Together these close the Balaban--Doob inductive circuit under a quantitative blocking hypothesis that is verified in a companion paper. As a corollary, we establish Osterwalder--Schrader reflection positivity for the gradient-flow-smoothed Wilson-loop observable, which together with the continuum limit yields a construction of the physical Hilbert space and a positive transfer matrix for the theory.
Category: Mathematical Physics
[81] ai.viXra.org:2602.0084 [pdf] submitted on 2026-02-17 19:43:23
Authors: Lluis Eriksson
Comments: 15 Pages.
We establish quantitative almost-reflection positivity (almost-RP) for a family of flowed observables in finite-volume lattice Yang-Mills theory on the four-dimensional Euclidean torus T_L^4 with structure group G = SU(N). The lattice Wilson flow - the lattice counterpart of the Yang-Mills gradient flow - acts as a gauge-covariant smoothing that suppresses ultraviolet fluctuations. By combining three ingredients: (i) a Gaussian localization bound that controls the variance of flowed observables via an Efron-Stein-type inequality, (ii) Jacobian estimates for the lattice Wilson flow that yield exponential decay of trans-plane influence, and (iii) the exact lattice reflection positivity of the Wilson action, we show that the failure of RP for flowed observables is exponentially small in the ratio epsilon_0^2 / t, where epsilon_0 is the physical separation between the observable's support and the reflection plane (minus the diffusion scale sqrt(8t)), and t > 0 is the flow time. We record the standard Osterwalder-Schrader reconstruction as a conditional statement: exact reflection positivity on a positive-time algebra implies a Hilbert space, a vacuum, and a non-negative Hamiltonian. Our approach is non-perturbative, holds for all values of the lattice coupling, and requires no cluster expansion or infinite-volume limit.
Category: Mathematical Physics
[80] ai.viXra.org:2602.0083 [pdf] replaced on 2026-03-28 00:34:17
Authors: Adrian Rohr
Comments: 23 Pages.
Wallström (1989, 1994) showed that the Madelung hydrodynamic equations admit solutions with non-integer phase circulation, for which no single-valued wave function exists. Previous completions of the Madelung system postulate either single-valuedness or the quantization condition directly. In this paper we consider the regularity of the probability current j = ρ∇S/m at nodal zeros within the Onsager-Machlup stochastic variational framework. We find that requiring j ∈ C^∞, combined with the Hamilton-Jacobi constraint at zeros of ρ, implies integer phase circulation. Neither condition alone has this consequence: smooth currents with arbitrary circulation exist when the dynamics is absent, and the Hamilton-Jacobi constraint alone admits the non-quantized solutions constructed by Reddiger and Poirier (2023). We also find that C^∞ is the only regularity class with this property: for any finite k, non-integer solutions satisfying C^k can be constructed. When the framework is applied with initial data satisfying ρ_0 > 0, the phase is single-valued by simple connectivity, the Schrödinger equation follows from the variational principle, and any nodes formed under subsequent evolution carry integer winding numbers. The variational principle degenerates at zeros of ρ, leaving the winding parameter undetermined, a feature that holds for any variational functional of the form ∫ρ G d^n x, not only the Onsager-Machlup action. The non-quantized solutions correspond to multivalued sections of a non-trivial line bundle and do not arise within the natural domain of the stochastic framework.
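The integer-circulation claim can be illustrated numerically (a minimal sketch, not from the paper): for a single-valued wave function ψ = (x + iy)^n with an n-th order node at the origin, the phase S = arg ψ winds by exactly 2πn around the node, so the circulation recovered from the phase is an integer multiple of 2π.

```python
import cmath
import math

def winding_number(psi, radius=1.0, samples=1000):
    """Accumulate unwrapped phase differences of psi around a circle about 0.

    For a single-valued wave function, the total circulation of the phase
    S = arg(psi) around a nodal zero is 2*pi*n with n an integer.
    """
    total = 0.0
    prev = cmath.phase(psi(radius, 0.0))
    for k in range(1, samples + 1):
        theta = 2 * math.pi * k / samples
        cur = cmath.phase(psi(radius * math.cos(theta), radius * math.sin(theta)))
        d = cur - prev
        # unwrap the jump across the branch cut of arg (principal value)
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
        prev = cur
    return total / (2 * math.pi)

# psi = (x + iy)^3 has a third-order node at the origin: winding number 3
print(round(winding_number(lambda x, y: complex(x, y) ** 3)))
```

For a genuinely non-integer circulation no such single-valued ψ exists, which is exactly Wallström's obstruction.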
Category: Quantum Physics
[79] ai.viXra.org:2602.0082 [pdf] submitted on 2026-02-17 02:36:31
Authors: Tingfang Yi
Comments: 7 Pages.
We propose a minimal six-dimensional (6D) light null entity in which the six dimensions are intrinsic degrees of freedom of a null physical entity. The six dimensions consist of a two-dimensional null propagation geometry together with four intrinsic one-dimensional degrees of freedom of light: optical phase, polarization, frequency, and orientation along the null momentum generator. In this framework, all four-dimensional (4D) spacetime optical, electromagnetic, and quantum phenomena are understood as lower-dimensional projection or section measurements of a single higher-dimensional null entity.
Category: Mathematical Physics
[78] ai.viXra.org:2602.0081 [pdf] submitted on 2026-02-16 23:46:46
Authors: Pruk Ninsook
Comments: 43 Pages.
This work presents the theoretical framework of the Information-Geometric Physics System (IGPS), which elucidates the emergence of fundamental properties in leptonic particles through the structure of the Oloid manifold [17]. Under the constraints of C^2 continuity and seam coupling enforced by S^1 ⊥ S^1 symmetry, the fundamental constituents of matter are modeled as "closed information nodes." In this framework, the spectral mass is derived as a curvature integral and a normal bundle holonomy manifesting along the manifold's seam [13]. We prove that the moduli space of the seam under rigidity conditions yields an SO(3) symmetry group structure, naturally inducing a spin structure via the SU(2) double cover [18]. Furthermore, it is demonstrated that the geometric stiffness parameter β = 1/(√3π) emerges as a universal geometric invariant governing the internal strain scale [3]. These results suggest that the fermion mass spectrum is directly coupled to the topological structure of the informational manifold, providing a foundational basis for the extension into composite systems and strong interactions in subsequent work [9].
Category: Quantum Gravity and String Theory
[77] ai.viXra.org:2602.0080 [pdf] submitted on 2026-02-16 23:48:46
Authors: Pruk Ninsook
Comments: 25 Pages.
This research extends the scope of the Information-Geometric Physics System (IGPS) from single-node systems to composite nuclear structures via the Oloid Trinity Configuration, elucidating the topological origin of mass and the statistical properties of baryons [8, 9]. We introduce the Dimensional Jump phenomenon, representing an informational scale transition from planar scaling on the manifold's seam to the sweep volume of entangled folded manifolds. This transition results in the emergence of a universal geometric multiplier G = 4/(3π^2), which systematically bridges leptonic mass to the nuclear mass scale. Under rigidity constraints and SU(3) symmetry, we prove that the extrinsic/interaction strain is fixed at Δ = 2.5 through the 5/2 Theorem, enabling the master equation to predict the proton mass with 99.99% accuracy relative to CODATA standards [10]. Furthermore, it is demonstrated that Fermi-Dirac statistics and fractional spin 1/2 emerge directly from the preservation of C^2 continuity on manifolds entangled through the SU(2) double-covering structure. Residual analysis confirms that manifest discrepancies align with the order of radiative corrections in Quantum Electrodynamics (QED). These results confirm that baryonic structure represents the most stable volumetric organization of information, effectively achieving structural closure for the origin of matter within the IGPS framework.
Category: Quantum Gravity and String Theory
[76] ai.viXra.org:2602.0079 [pdf] submitted on 2026-02-16 23:53:19
Authors: Wiroj Homsup
Comments: 3 Pages.
A new twin prime sieve based on a modified sieve of Sundaram is introduced. It sieves through the set of natural numbers n such that 3n is not representable in either of the forms 2ij + i + j or 2ij + i + j + 1 for positive integers i, j.
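To see why the criterion works: by Sundaram's classical argument, 2m + 1 is composite exactly when m = 2ij + i + j for some i, j ≥ 1. Since 6n + 1 = 2(3n) + 1 and 6n − 1 = 2(3n − 1) + 1, excluding 3n from both quoted forms leaves exactly those n for which 6n − 1 and 6n + 1 are both prime. A minimal Python sketch of the stated sieve (an illustration, not the author's implementation):

```python
def representable(m):
    """Is m = 2ij + i + j for some i, j >= 1? (Sundaram's composite form)"""
    i = 1
    while 2 * i * i + 2 * i <= m:  # WLOG i <= j, so m >= 2i^2 + 2i
        # solve 2ij + i + j = m for j: j = (m - i) / (2i + 1)
        if (m - i) % (2 * i + 1) == 0 and (m - i) // (2 * i + 1) >= 1:
            return True
        i += 1
    return False

def twin_candidates(limit):
    """All twin prime pairs (6n - 1, 6n + 1) found by the modified sieve."""
    out = []
    for n in range(1, limit + 1):
        m = 3 * n
        if not representable(m) and not representable(m - 1):
            out.append((6 * n - 1, 6 * n + 1))
    return out

print(twin_candidates(10))
```

For n up to 10 this yields (5, 7), (11, 13), (17, 19), (29, 31), (41, 43), and (59, 61), matching the twin prime pairs of the form 6n ± 1 in that range.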
Category: Number Theory
[75] ai.viXra.org:2602.0078 [pdf] submitted on 2026-02-15 17:57:53
Authors: Luisiana X Cundin
Comments: 6 Pages.
A formal, systematic approach for generating nonlinear partial differential equations is outlined, providing a more robust and reliable method than ad hoc construction. Additionally, formal methods provide a means to test the validity and veracity of proposed nonlinear partial differential equations, thereby potentially saving researchers precious time and effort.
Category: Mathematical Physics
[74] ai.viXra.org:2602.0077 [pdf] submitted on 2026-02-15 05:11:13
Authors: Lluis Eriksson
Comments: 14 Pages.
We prove that the continuum limit of pure SU(N) lattice Yang--Mills theory in four Euclidean dimensions exists on the algebra of blocked observables at fixed finite volume, conditional on a quantitative regularity hypothesis for the blocking map. The argument combines three components: Bałaban's rigorous renormalization group program, which provides polymer representations and ultraviolet stability; a Doob-martingale influence bound that controls covariance without product-measure assumptions; and a renormalization-group Cauchy summability framework converting per-scale oscillation decay into convergence. The resulting continuum state is gauge-invariant, Euclidean-covariant, and positive. Osterwalder--Schrader reconstruction, the thermodynamic limit, and the mass gap remain open.
Category: Mathematical Physics
[73] ai.viXra.org:2602.0076 [pdf] replaced on 2026-03-24 04:07:20
Authors: Kobie Janse van Rensburg
Comments: 27 Pages.
The Topological Inversion Model (TIM) derives the structure of the Standard Model from a single axiom: the self-negation of Absolute Nothing (//Gaunab), which generates a Z_2 involution. Combined with compactness, orientability, and minimality, this uniquely determines the spatial manifold as RP^3 = S^3/Z_2 (Theorem 1), deriving three spatial dimensions rather than assuming them. Version 7 supersedes previous versions with major extensions: (i) the complete logical chain from self-negation to the Standard Model gauge group in seven steps, with each step labelled as axiom, theorem, or derived; (ii) an RP^3 uniqueness theorem proving that the spatial manifold is the only compact orientable quotient consistent with the foundational Z_2; (iii) classification of flat gauge bundles for G_SM = SU(3) x SU(2) x U(1) on RP^3, yielding 4 physical Hosotani sectors with vector-like colour automatic in all sectors; (iv) Casimir energy computation showing that the topological vacuum prefers unbroken electroweak symmetry with sin^2(theta_W) = 1/4 as the bare ratio, while EWSB is driven by the Higgs potential at T_c ~ 150-320 GeV; (v) resolution of M* = 3689 +/- 200 GeV as a derived matching scale (topology plus SM running), not requiring a separate dynamical mechanism; (vi) closure of the alpha programme through six independent routes, establishing that topology determines ratios while absolute couplings require dynamical input; (vii) identification of the width parameter W on B_3 as the generation quantum number, explaining N_gen = N_colour = 3 as a topological identity; and (viii) geometric emergence of Z_3 colour structure from degree-3 Hopf preimages on S^3. The framework reduces the Standard Model's 19 free parameters to 18 (via the topological mass relation M_u = M_d + 3M_e) and produces 5 quantitative predictions: M* ~ 3.7 TeV, T_c ~ 150-320 GeV, R^{-1} ~ 150 GeV, sin^2(theta_W) = 1/4 (topological), and Casimir vacuum selection of the unbroken electroweak sector.
Three irreducible free parameters remain: alpha_em, the Higgs VEV v, and the Higgs mass m_H.
Category: Relativity and Cosmology
[72] ai.viXra.org:2602.0075 [pdf] submitted on 2026-02-14 07:02:02
Authors: Ryuhei Sato
Comments: 16 Pages.
Einstein's special relativity postulates the constancy of light speed c as an axiom but provides no explanation for its origin. We reformulate the universe as a discrete computational network operating at the Planck scale, wherein c emerges not as a fundamental constant but as the bandwidth limit of information processing. We demonstrate that the initial state of the universe, an infinite-dimensional regular simplex, possesses spectral properties (Laplacian eigenvalue λ_2 = N) that naturally enforce cosmic uniformity and clock synchronization without invoking inflationary expansion. Dimensional reduction from infinite to three dimensions generates an unavoidable informational collision, which we term informational Pauli repulsion, providing the physical driver for both the Big Bang and accelerated expansion. The deficit angle δ ≈ 7.36° inherent in the 600-cell tessellation, combined with Gauss's Theorema Egregium, guarantees spatial closure without external embedding dimensions, thereby establishing a decisive advantage over string-theoretic frameworks requiring 10 or 11 dimensions. We derive the Light-Speed Resource Allocation Principle (LRAP), c^2 = v^2 + τ^2, reinterpreting the Lorentz factor as a processing lag ratio rather than a coordinate transformation coefficient. Black hole singularities are redefined as computational arrest regions where 3D rendering fails, leaving 4D data in a frozen state, a paradigm shift that naturally subsumes string theory and the holographic principle as effective descriptions within these arrested zones. Finally, we prove that local resource allocation alone cannot resolve the global accumulation of geometric frustration, necessitating the hierarchical jamming transitions detailed in Part III. This work bridges the static geometry of Part I (derivation of G and Λ) with the thermodynamic hierarchy of Part III (122-digit vacuum energy suppression), completing the dynamical core of the Regular Simplex Hierarchical Gravity (RSHG) framework.
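If the τ in the LRAP relation is read as the proper-time processing rate in velocity units, τ = c/γ (an interpretation assumed here for illustration; the paper defines its own terms), then c^2 = v^2 + τ^2 is algebraically equivalent to the standard Lorentz factor γ = 1/√(1 − v^2/c^2). A quick numerical check of that equivalence:

```python
import math

def lorentz_gamma(v, c=1.0):
    """Standard special-relativistic Lorentz factor."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

def lrap_tau(v, c=1.0):
    """tau solving the LRAP budget c^2 = v^2 + tau^2 (assumed reading)."""
    return math.sqrt(c ** 2 - v ** 2)

# For v = 0.6c the budget splits as 0.36 + 0.64 = 1, and c/tau = gamma = 1.25.
v = 0.6
print(lorentz_gamma(v), 1.0 / lrap_tau(v))
```

The "resource allocation" reading is thus numerically indistinguishable from the usual kinematics; the reinterpretation concerns what γ means, not its value.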
Category: Relativity and Cosmology
[71] ai.viXra.org:2602.0074 [pdf] submitted on 2026-02-14 07:07:46
Authors: Ryuhei Sato
Comments: 12 Pages.
This paper presents a complete resolution of the cosmological constant problem within the Regular Simplex Hierarchical Gravity (RSHG) framework. The non-tessellating property of regular tetrahedra in 3D Euclidean space, characterized by a geometric residual (deficit angle δ ≈ 7.36°), induces recursive jamming transitions across six hierarchical scales spanning from 10^-15 m to 10^21 m (Fig. 1: six-stage cascade structure). Each hierarchy generates approximately 20 orders of magnitude of energy suppression. The cumulative suppression factor ϵ_total ≈ 10^-122.2 agrees with the observed cosmological constant Λ_obs within 0.2 orders of magnitude. Critically, this result contains no adjustable parameters; even the number of hierarchies N = 6 emerges as an arithmetic consequence of the target suppression (122 digits) divided by the single-stage suppression (~19.2 digits). Furthermore, operation near the jamming criticality (ϕ ≈ 0.62, Fig. 2: metastable operating point) enables the conversion of computational heat into structural entropy (computational encapsulation), thereby preventing thermal collapse. Three experimentally verifiable predictions are presented: H_4 symmetry in the CMB angular power spectrum (ℓ = 120n), an entropy ratio S_struct/S_thermal ≈ 0.2 in Bose-Einstein condensates, and tetrahedral coordinate preference in protein structures.
Category: Relativity and Cosmology
[70] ai.viXra.org:2602.0073 [pdf] submitted on 2026-02-14 09:06:13
Authors: Lluis Eriksson
Comments: 17 Pages.
We prove that expectations of blocked, bounded Lipschitz observables at a fixed physical scale ℓ > 0 form an absolutely summable telescoping sequence along a Balaban-matched renormalization trajectory in four-dimensional SU(N_c) lattice Yang--Mills theory with lattice spacings a_k = a_0 2^{-k}. In particular, the continuum limit state ω(O) := lim_{k→∞} ⟨O^{(k)}⟩_{Λ_k, β_k} exists for every O in the blocked observable algebra A_ℓ^{block}. The proof uses three ingredients: (i) an exact RG identity (law of iterated expectations), (ii) a one-step pushforward stability bound for blocked observables derived from Gaussian control of fast modes and an approximate centering property of the fluctuation field, and (iii) a measure-comparison lemma via Duhamel interpolation using polymer remainder bounds. No quantitative rate of asymptotic freedom is required beyond staying in the small-coupling regime where the RG estimates hold; summability follows from the geometric decay (a_k/ℓ)^2 = O(4^{-k}) together with the assumed summability of the large-field/truncation errors {τ_k}. We also state a conditional extension to "renormalized" observables (e.g. Creutz-type constructions) contingent on a nonperturbative Symanzik extraction from polymer expansions, and we discuss the relation to Osterwalder--Schrader reconstruction and the mass gap problem.
Category: Mathematical Physics
[69] ai.viXra.org:2602.0072 [pdf] submitted on 2026-02-14 11:43:03
Authors: Lluis Eriksson
Comments: 14 Pages.
We close the missing influence estimate — Assumption (B6) — required by the RG-Cauchy summability framework for blocked observables in four-dimensional SU(N_c) lattice Yang-Mills theory. The influence is measured by the Efron-Stein seminorm sigma_nu(f)^2 = sum_{e} E_nu[Var_e^nu(f)] that appears in the Duhamel interpolation lemma of the companion paper. We work in the small-field regime of Balaban's multiscale effective action and assume: (A1) a standard polymer representation for the irrelevant remainder V_k^{irr} = sum_X K_k(X); (A2) an explicit per-link oscillation bound for polymer activities carrying the correct irrelevance factor 2^{-2k}; (A3) a lattice-animal counting estimate. Under these three verifiable hypotheses — to be discharged from Balaban's historical work in a companion compendium paper — we prove sup_{t in [0,1]} sigma_{nu_{k,t}}(V_k^{irr}) <= C, where C = C(N_c, beta_0, kappa, C_osc, C_anim, p, L/a_0) is independent of the RG scale k. The proof uses only oscillation bounds and combinatorics: no log-Sobolev inequality, no mixing hypothesis, and no measure-dependent technology beyond the definition of conditional variance. This removes the only genuinely novel probabilistic input remaining in the UV block of the programme towards the Yang-Mills Millennium Prize.
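The Efron-Stein seminorm used above has a simple finite-dimensional prototype: for a product measure, the variance of any function is dominated by the sum of expected conditional variances, Var(f) <= sum_e E[Var_e(f)] = sigma(f)^2. A small exhaustive check on {0,1}^3 with i.i.d. Bernoulli coordinates (purely illustrative; unrelated to the gauge-theory measure):

```python
import itertools

def efron_stein_check(f, n=3, p=0.5):
    """Return (Var(f), sigma^2(f)) for i.i.d. Bernoulli(p) coordinates.

    Var_e(f) is the conditional variance over coordinate e with the other
    coordinates held fixed; summing its expectation over e gives the
    Efron-Stein seminorm sigma^2(f).
    """
    configs = list(itertools.product([0, 1], repeat=n))
    w = {x: p ** sum(x) * (1 - p) ** (n - sum(x)) for x in configs}
    mean = sum(w[x] * f(x) for x in configs)
    var = sum(w[x] * (f(x) - mean) ** 2 for x in configs)
    sigma2 = 0.0
    for e in range(n):
        for x in configs:
            if x[e] == 0:  # visit each conditioning class exactly once
                x1 = x[:e] + (1,) + x[e + 1:]
                # conditional mean and variance over coordinate e
                m_e = (1 - p) * f(x) + p * f(x1)
                v_e = (1 - p) * (f(x) - m_e) ** 2 + p * (f(x1) - m_e) ** 2
                # probability weight of the fixed remaining coordinates
                w_rest = w[x] / (1 - p)
                sigma2 += w_rest * v_e
    return var, sigma2

var, sigma2 = efron_stein_check(lambda x: x[0] * x[1] + x[2])
print(var, sigma2, var <= sigma2)
```

For f(x) = x_0·x_1 + x_2 at p = 1/2 this gives Var(f) = 0.4375 against sigma^2(f) = 0.5, so the inequality holds strictly (equality occurs only for additive functions).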
Category: Mathematical Physics
[68] ai.viXra.org:2602.0071 [pdf] replaced on 2026-02-18 02:42:09
Authors: Jason Merwin
Comments: 10 Pages.
A foundational question in the philosophy of physics is whether time is a fundamental dimension of reality or an emergent phenomenon. The standard Block Universe interpretation of general relativity treats time as a static dimension, with the passage of time relegated to psychological illusion. In this paper, we present a novel argument against the Block Universe derived from the framework of Relational Mathematical Realism (RMR), which identifies physical existence with mathematical structure. We demonstrate that if reality is a sufficiently complex, locally consistent mathematical structure, then Gödel's First Incompleteness Theorem renders a static, completed universe logically impossible. The resolution of this impossibility requires the structure to undergo a non-terminating sequence of state extensions, which we identify with the passage of time. We conclude that time is not a dimension within which the universe exists, but rather the logically necessary process by which a complex mathematical structure maintains consistency. This result, if sound, constitutes the first derivation of temporal passage from mathematical logic and ontology alone.
Category: History and Philosophy of Physics
[67] ai.viXra.org:2602.0070 [pdf] submitted on 2026-02-14 18:53:53
Authors: Lluis Eriksson
Comments: 7 Pages.
We prove a uniform Doob martingale influence bound for the irrelevant polymer remainder arising in multiscale renormalization group analyses of four-dimensional SU(N_c) lattice Yang-Mills theory at fixed physical volume. Our main tool is the Doob influence seminorm sigma_nu(f)^2 = sum_i E_nu[(Delta_i f)^2], which yields an exact covariance identity for arbitrary probability measures. Assuming a deterministic per-link oscillation estimate for polymer activities with a scale factor 2^{-2k} (imported from the Balaban renormalization group programme) and using a standard lattice-animal counting lemma (proved here), we obtain a bound sup_{t in [0,1]} sigma_{nu_{k,t}}(V_k^{irr}) <= C independent of the RG scale k. We then explain how this bound feeds into a Duhamel interpolation step used in RG-Cauchy convergence arguments.
Category: Mathematical Physics
[66] ai.viXra.org:2602.0069 [pdf] submitted on 2026-02-14 20:48:17
Authors: Lluis Eriksson
Comments: 23 Pages.
We provide a self-contained derivation of the three structural hypotheses -- polymer representation (A1), per-link oscillation bounds with geometric decay factor (A2), and large-field suppression (B5) -- that were assumed in Doob Influence Bounds for Polymer Remainders in 4D Lattice Yang--Mills Renormalization and in the RG-Cauchy Master Framework. All results are traced to precise equations in the primary sources: the series of papers by T. Bałaban (Commun. Math. Phys., 1984--1989) and the expository trilogy by J. Dimock (2011--2014). The translation from Bałaban's analytic norms on gauge-covariant function spaces to the per-link oscillation language used in the probabilistic framework is made explicit. Together with Doob Influence Bounds for Polymer Remainders in 4D Lattice Yang--Mills Renormalization, this completes the unconditional discharge of the UV structural inputs for the renormalization group approach to the Yang--Mills mass gap problem at finite volume.
Category: Mathematical Physics
[65] ai.viXra.org:2602.0068 [pdf] submitted on 2026-02-15 00:33:07
Authors: Chaiya Tantisukarom
Comments: 11 Pages.
This article presents "Prime Gear Geometry," a deterministic mechanical framework that redefines the integer axis as a master gear ($C_1$) with a discrete unit weight-step of $+1$. Unlike analytic models that rely on the complex-plane "1/2" critical line of the Riemann Hypothesis, this theory posits that prime numbers are exact geometric outcomes forged by $C_1$ at coordinates of total asynchronous interference. We establish the "Prime Gear Synchronization Conjecture," stating that total phase alignment of a prime gear group occurs only at Primorial intervals. This mechanical exactness is used to resolve the Goldbach, Twin Prime, and Collatz conjectures not as probabilistic likelihoods, but as structural necessities of a machine that, by the laws of relatively prime circumferences, is incapable of perfect synchronization within the finite bounds of the $C_1$ axis.
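The synchronization claim has a simple arithmetic core: gears with pairwise coprime circumferences (the primes) return to a common phase only after a number of unit steps equal to the lcm of their circumferences, which for the first k primes is exactly the primorial p_k#. A short sketch of that core fact (illustrative, not the author's construction):

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def first_sync(circumferences):
    """Smallest positive step count at which all gears return to phase 0."""
    t = 1
    for c in circumferences:
        t = lcm(t, c)
    return t

def first_sync_bruteforce(circumferences, limit=100000):
    """Brute-force cross-check: step until every gear is back at phase 0."""
    for t in range(1, limit):
        if all(t % c == 0 for c in circumferences):
            return t
    return None

primes = [2, 3, 5, 7]
primorial = 2 * 3 * 5 * 7  # p_4# = 210
print(first_sync(primes), first_sync_bruteforce(primes), primorial)
```

Because the circumferences are pairwise coprime, the lcm equals the product, so no total phase alignment can occur before the primorial step count.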
Category: Number Theory
[64] ai.viXra.org:2602.0067 [pdf] submitted on 2026-02-14 01:33:55
Authors: Cornelius Moore
Comments: 24 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
Modern theoretical physics employs distinct mathematical formalisms (Lagrangian mechanics, Hamiltonian dynamics, quantum amplitudes, statistical ensembles, field-theoretic path integrals) that, while empirically successful, lack a unified structural foundation. We present the Universal Mathematical System (UMS), a variational-probabilistic framework in which standard physical theories arise as limiting cases, projections (marginalizations), or constrained reductions of a single maximum-entropy measure over configuration spaces. The framework is built on an exponential-family measure µ[C] = Z^-1 exp(−Φ[C]), where Φ is a constraint functional encoding physical laws. We show, under standard regularity assumptions, how classical mechanics emerges via the Laplace principle (β → ∞), statistical mechanics is directly identified with the framework, and quantum mechanics corresponds to complex-weighted measures under standard path-integral formalism. Additionally, we formalize five distinct algebraic structures -- quantity (R^n), growth (semigroups), information (entropy), phase (U(1)), and ratio (R^+) -- clarifying that different physical questions inhabit different mathematical domains and that confusion arises from naive cross-domain interpretation. The framework is intended as a structural unification of existing formalisms rather than a proposal of novel fundamental ontology or new empirical predictions. We include a proof of a coarse-graining monotonicity theorem using the data-processing inequality, provide explicit reduction pipelines, and discuss extensions to chemistry, biology, neuroscience, and computation.
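The β → ∞ (Laplace) reduction can be sketched on a toy finite configuration space: under µ[C] ∝ exp(−βΦ[C]), the probability of the minimizer of Φ tends to 1, recovering deterministic least-action selection. The four configuration values below are illustrative assumptions, not from the paper:

```python
import math

def gibbs(phi, beta):
    """Normalized exponential-family measure mu[C] = exp(-beta*Phi[C]) / Z."""
    weights = [math.exp(-beta * p) for p in phi]
    z = sum(weights)
    return [w / z for w in weights]

phi = [0.7, 0.1, 1.3, 0.4]        # constraint functional on four configurations
argmin = phi.index(min(phi))      # the classical (Laplace) configuration

# As beta grows, probability mass concentrates on the minimizer of Phi.
for beta in (1.0, 10.0, 100.0):
    mu = gibbs(phi, beta)
    print(beta, round(mu[argmin], 4))
```

At β = 1 the measure is spread across configurations; by β = 100 essentially all mass sits on the Φ-minimizer, which is the claimed classical limit.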
Category: Functions and Analysis
[63] ai.viXra.org:2602.0066 [pdf] submitted on 2026-02-13 20:21:39
Authors: David Taylor
Comments: 28 Pages. (Note by viXra Admin: Please cite and list scientific references)
Finite local symbolic observation exhibits bounded vocabularies across diverse computational domains despite systematic increases in observational scale. We apply a fixed local symbolic encoding framework to 13 systems spanning quantum mechanics, fluid dynamics, thermodynamics, electromagnetism, chaos theory, number theory, combinatorial logic, and stochastic processes. Across all domains, observed symbolic vocabularies saturate, with a median final growth of 0.0% despite 100-1,000× increases in data volume, temporal extent, or problem size. Prime gap dynamics provides the strongest validation: an infinite, deterministic mathematical sequence with no physical dynamics saturates at 837 symbolic configurations across a 10,000× scale increase (100,000 to 1,000,000,000 primes, identical vocabulary), eliminating physical mechanisms as explanations. At one billion primes, each of the 837 patterns is reused approximately 1.2 million times. Ten domains achieve perfect saturation (0.0%), two near-perfect (<1%), and one strong (<20%). Symbolic space occupancy ranges from 0.08% (Schrödinger equation) to 92.35% (electromagnetic waves); both regimes nonetheless exhibit saturation. Saturation manifests independently of physical validity (thermodynamically invalid antidiffusion saturates identically to correct heat diffusion), determinism (chaotic and stochastic systems both saturate), and computational complexity (NP-complete 3-SAT collapses to eight symbolic patterns). These results indicate that bounded symbolic observability reflects properties of finite local observation applied to locally-constrained dynamics rather than intrinsic system complexity: a constraint on measurement, not nature. Quantitative vocabularies are specific to the observational architecture employed; the empirical claim concerns the cross-domain emergence of vocabulary saturation under fixed local symbolic observation.
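The mechanism can be illustrated generically: any fixed local encoding with alphabet size A and window length w admits at most A^w distinct patterns, so the observed vocabulary must saturate as data grows. The sketch below uses an assumed 3-symbol encoding (sign of successive prime-gap change) with window 3, giving at most 27 patterns; this is not the paper's 837-configuration architecture, only an analogue of the same saturation effect:

```python
def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = bytearray(len(sieve[i * i::i]))
    return [i for i, is_p in enumerate(sieve) if is_p]

def vocabulary(limit, window=3):
    """Distinct window-patterns of the sign of successive prime-gap changes."""
    ps = primes_up_to(limit)
    gaps = [b - a for a, b in zip(ps, ps[1:])]
    signs = ["+" if d > 0 else "-" if d < 0 else "0"
             for d in (g2 - g1 for g1, g2 in zip(gaps, gaps[1:]))]
    return {"".join(signs[i:i + window]) for i in range(len(signs) - window + 1)}

small, large = vocabulary(10000), vocabulary(1000000)
# The vocabulary is bounded by 3^3 = 27 patterns regardless of data volume.
print(len(small), len(large))
```

Scaling the prime range by 100× cannot enlarge the vocabulary beyond the combinatorial bound, which is the toy version of the saturation claim.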
Category: Artificial Intelligence
[62] ai.viXra.org:2602.0065 [pdf] submitted on 2026-02-13 05:45:31
Authors: Heath W. Mahaffey
Comments: 12 Pages.
The Informational Actualization Model (IAM) proposes that late-time cosmic expansion couples differently to photons versus matter, resolving the Hubble tension through sector-specific expansion rates. This dual-sector framework makes a critical, testable prediction: Type Ia supernovae (SNe), as matter-based distance indicators hosted in galaxies, should probe the matter sector. We test this hypothesis using the complete Pantheon+ dataset (1588 SNe, 0.01 < z < 2.26) through three independent analyses: (1) SNe with Planck (photon-sector) H0 prior, (2) SNe with SH0ES (matter-sector) H0 prior, and (3) SNe with no H0 constraint. Results unambiguously demonstrate that SNe reject the photon-sector expansion rate (H0 = 67.4 km s^-1 Mpc^-1, β → −0.30 at parameter boundary) and accept the matter-sector normalization (H0 = 73.04 km s^-1 Mpc^-1, β ≈ 0). Critically, SNe distances maintain ΛCDM geometric consistency (β_distance ≈ 0), validating IAM's prediction that sector-specific coupling primarily affects structure growth (fσ_8) rather than photon propagation geometry. This empirical validation establishes that dual-sector expansion is data-driven, not theoretically assumed, and demonstrates that Planck (H0 = 67.4, photon sector) and SH0ES (H0 = 73.04, matter sector) both measure correctly: they probe different physical quantities. The dual-sector phenomenology maps directly onto the standard modified gravity parametrization: matter perturbations obey μ(a) = H^2_ΛCDM(a) / [H^2_ΛCDM(a) + β_m E(a)] < 1 (suppressed growth), while photon deflection remains unmodified (Σ = 1), preserving CMB consistency. This μ < 1, Σ = 1 framework is independently testable with existing Boltzmann solvers (CAMB/CLASS) and upcoming survey parametrizations (DES, Euclid, CMB-S4).
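The quoted growth-suppression parametrization can be written down directly. In H0 units (so H^2_ΛCDM = E^2) and with assumed fiducial densities Ω_m = 0.3, Ω_Λ = 0.7 (β_m is the IAM coupling; values here are illustrative), μ(a) stays strictly below 1 for every scale factor whenever β_m > 0, while Σ = 1 is unchanged by construction:

```python
import math

OMEGA_M, OMEGA_L = 0.3, 0.7   # assumed fiducial flat-LCDM densities

def E(a):
    """Dimensionless Hubble rate H(a)/H0 for flat LCDM."""
    return math.sqrt(OMEGA_M * a ** -3 + OMEGA_L)

def mu(a, beta_m):
    """Growth modification mu(a) = E^2 / (E^2 + beta_m * E), in H0 units."""
    e2 = E(a) ** 2
    return e2 / (e2 + beta_m * E(a))

# With beta_m > 0, growth is suppressed (mu < 1) at every scale factor,
# while photon deflection is left unmodified (Sigma = 1) by construction.
for a in (0.25, 0.5, 1.0):
    print(a, mu(a, beta_m=0.1))
```

Setting β_m = 0 recovers μ = 1 exactly, i.e. standard ΛCDM growth, which is the limit the SNe-distance consistency check (β_distance ≈ 0) corresponds to.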
Category: Relativity and Cosmology
[61] ai.viXra.org:2602.0064 [pdf] submitted on 2026-02-13 20:09:31
Authors: Bertrand Jarry
Comments: 8 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite listed scientific references)
For a century, physicists have sought to unify general relativity and quantum mechanics. We show that this quest rests on a fundamental conceptual error: gravity never needed to be "quantized" because it is already a quantum phenomenon emerging from the vacuum. By recognizing the quantum vacuum as the fundamental substrate, unification becomes trivial. All forces, including gravity and relativity, are manifestations of the same quantum vacuum. The problem was not to find a theory of quantum gravity, but to recognize that it already existed.
Category: Quantum Gravity and String Theory
[60] ai.viXra.org:2602.0063 [pdf] replaced on 2026-02-14 08:18:38
Authors: Lluis Eriksson
Comments: 19 Pages.
Building on the lattice results established in Papers [E26I]-[E26IX], we give a conditional construction of a scaling-limit state for pure SU(N_c) lattice Yang-Mills theory in four Euclidean dimensions, along dyadic lattice spacings a_k = a_0 2^{-k}. The construction proceeds via a two-layer architecture. Layer 1 (Local fields): For bounded gauge-invariant local observables (Wilson loops, normalized plaquette traces), expectations converge without extracting subsequences to a unique limit. Precompactness of expectations at fixed physical side length L is trivial since |_{a,L}| <= 1. Uniqueness follows from a multiscale RG-Cauchy estimate that bounds the change of local expectations under a single RG step. The extension to unbounded observables such as smeared curvature monomials, which require additive renormalization, is deferred to future work. Layer 2 (Confinement): The physical string tension sigma_phys > 0 is established through step-scaling of Creutz ratios evaluated on Wilson loops whose physical dimensions R x T are held fixed as a -> 0. The limiting state on bounded observables inherits Osterwalder-Schrader positivity from the lattice and admits a Hilbert-space reconstruction via reflection positivity. SO(4) rotational invariance is expected in the continuum (the hypercubic breaking being O(a^2), subject to standard operator classification and construction of renormalized Schwinger functions). The mass gap is established conditionally via uniform exponential clustering of connected correlators -- an input from a uniform physical transfer-matrix spectral gap -- and the reconstruction theorem. Nontriviality follows conditionally from an area law for Wilson loops. Key dependencies on prior papers: uniform LSI inputs [E26I]-[E26IX]; Balaban multiscale effective action [E26III]-[E26V]; DLR-LSI [E26VII]; unconditional lattice closure inputs [E26IX].
Category: Mathematical Physics
[59] ai.viXra.org:2602.0062 [pdf] submitted on 2026-02-13 20:05:47
Authors: Bertrand Jarry
Comments: 8 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite listed scientific references)
We propose a radically new conceptual framework for understanding gravity and relativity, not as geometric curvatures of spacetime, but as physical modifications of the quantum vacuum. Gravity results from a static osmotic depression, while special relativity emerges from the dynamic compression of the vacuum by motion. This approach naturally resolves the problem of unification with quantum mechanics and makes experimentally testable predictions. Unlike attempts to "quantize geometry" (strings, loops), our theory recognizes that gravity and relativity are already quantum phenomena in nature, emerging from the properties of the vacuum.
Category: Quantum Gravity and String Theory
[58] ai.viXra.org:2602.0061 [pdf] submitted on 2026-02-13 20:04:26
Authors: Bertrand Jarry
Comments: 15 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite and list scientific references)
This theory unifies gravity, special relativity, general relativity, and quantum mechanics by recognizing the quantum vacuum as the single fundamental substrate. Gravity emerges as a static osmotic depression of the vacuum, special relativity as a dynamic compression, and all forces as manifestations of the same vacuum. This unification is not achieved through the quantization of geometry, but through the recognition that gravity was already quantum.
Category: Quantum Gravity and String Theory
[57] ai.viXra.org:2602.0060 [pdf] submitted on 2026-02-13 20:27:17
Authors: Michel Monfette
Comments: 14 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)
We study the modular dynamics of prime numbers through the families SG(k) = {p ∈ P : kp + 1 ∈ P}. We present extensive computational evidence (up to 100 million Sophie Germain primes) for a modular dynamical structure in SG primes modulo 30. Two fundamental theorems establish the triangular residue class {11, 23, 29} and gap constraints. Eight conjectures emerge, including a novel "least-gap principle" and a harmonic attractor at 60. SG(k) families exhibit distinct phases in the entropy/detailed-balance plane, with period-9 resonance and asymmetric sexy orbits. A third grammatical dynamic (G3) classifies all anomalies into three energy regimes. These patterns suggest an underlying arithmetic grammar and self-organizing behavior in Sophie Germain primes.
Category: General Mathematics
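The residue classes {11, 23, 29} mod 30 follow from elementary congruences: if p > 5 and both p and 2p + 1 are prime, then p must be coprime to 30 and must also keep 2p + 1 off the multiples of 3 and 5, which eliminates every other residue. A quick independent check (a sketch, not the paper's own code):

```python
def is_prime(n):
    # simple trial division, sufficient for this range
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

# Sophie Germain primes p > 5: both p and 2p+1 prime.
sg = [p for p in range(7, 100000) if is_prime(p) and is_prime(2 * p + 1)]

# Every such p falls in the residue classes {11, 23, 29} mod 30.
assert all(p % 30 in {11, 23, 29} for p in sg)
print(len(sg), sorted({p % 30 for p in sg}))
```

This confirms the residue-class statement itself; the paper's dynamical claims (phases, resonances, attractors) are separate and not tested here.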
[56] ai.viXra.org:2602.0059 [pdf] submitted on 2026-02-12 19:18:54
Authors: J. W. McGreevy
Comments: 18 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
We present the Arithmetic Relativistic Emergence (ARE) framework, in which the fundamental symmetries of General Relativity, Einstein-Cartan gravity with torsion, and quantum field theory (Standard Model sectors) emerge tautologically from pure number theory via the arithmetic geometry of the rational numbers Q. The Riemann zeta function ζ(s) represents the maximally symmetric pre-geometric vacuum phase, with perfect functional-equation symmetry around Re(s) = 1/2 and pole at s = 1 as the unified source of arithmetic energy/information. Spontaneous symmetry breaking induced by the weight-12 modular discriminant Δ(τ) = η(τ)^24 = (2π)^12 (E4(τ)^3 − E6(τ)^2)/1728 disperses this background into Archimedean divergence (smooth analytic curvature density) and non-Archimedean curl (torsion/spin density at p-adic fibers), with the functional-equation mirror s = 6 enforcing variational balance of the arithmetic degree deg(L). The emergent 4D Lorentzian manifold M carries an adelic principal Lorentz/Spin frame bundle decomposed via the adele ring A_Q. An effective Chern-Weil homomorphism, employing Bott-Chern forms at infinity and classical invariant polynomials at finite places, maps split curvature forms (F_div, H_curl) to arithmetic characteristic classes in Arakelov Chow groups. These classes are stationary under metric variations (δ_g deg = 0 at s = 6), providing rigid global topological invariants (Pontryagin-like, Euler-like, torsion-twisting) preserved in the broken phase: the inevitable geometric and topological labels of arithmetic symmetry breaking. Heaviside synchronization (τ_div = τ_curl) at s = 6 renders the arithmetic medium transparent, yielding distortionless propagation and unified causality. The Rankin-Selberg self-convolution L(Δ×Δ, s) contains ζ(s) factors, allowing recombination to the symmetric vacuum.
The emergent metric determinant √−g serves as the physical scalar whose arithmetic balancing across places enforces general covariance, proper volume preservation, and integration of curvature invariants. Fermions (12 Weyl per generation from a Leech lattice Z2-orbifold), gauge sectors (finite algebra C ⊕ H ⊕ M3(C)), and constants (α^−1 ≈ 137.036, Λ ∼ 10^−122 M_Pl², G ∼ 10^−38 m^−2) emerge via the spectral action and adelic convolution. ARE offers a tautological origin: physical laws are the minimal effective description preserving arithmetic consistency post-symmetry breaking.
Category: Number Theory
[55] ai.viXra.org:2602.0058 [pdf] submitted on 2026-02-12 19:09:17
Authors: Kelly Sonderegger
Comments: 31 Pages. CC BY 4.0 License
The quantum measurement problem—how definite outcomes emerge from quantum states—has resisted solution for nearly a century. We propose that the resolution lies in recognizing that quantum systems exist as extended waves until environmental coupling drives a phase transition to localized particles. There is no "superposition" in the conventional sense—the wave state is the fundamental reality. This Anchored Causality Theory (ACT) applies quantum field theory's own ontology consistently through measurement: fields are fundamental, particles are emergent localized excitations, and measurement is the physical process by which extended field configurations anchor into particle modes. ACT completes what QFT started—taking field ontology seriously all the way through the measurement process. Remarkably, QFT's mathematical structure already encodes this wave-particle phase transition. The Lagrangian formulation (action principle, path integrals) is the natural language of waves—extended field configurations exploring spacetime. The Hamiltonian formulation (definite states, observable eigenvalues) is the natural language of particles—localized excitations evolving in time. The Legendre transform connecting them is the mathematical shadow of anchoring. What we call "superposition" is simply Fourier decomposition—one wave represented in different bases, not ontological multiplicity.
The mathematics was telling us this all along; we needed only to read it correctly. Measurement is progressive phase diffusion driven by coupling to environmental quantum fields, with rates determined by particle mass through the Higgs mechanism. ACT emerges from three distinct physical processes: (1) Higgs-generated mass establishes the structural capacity for temporal participation and sets coupling strength, (2) gauge fields and phonons provide infrared noise spectra that drive decoherence dynamics, and (3) definite outcomes emerge when the anchoring functional Φ ≳ 1, marking an irreversible phase transition from wave to particle. We derive explicit anchoring rates from quantum Brownian motion theory, showing Γ_A ∝ m² × T × η_env, where the mass-squared scaling follows from the Yukawa coupling structure. The theory explains all existing decoherence phenomena—mass dependence, temperature scaling, environmental density effects, observable-specific rates, and persistence at zero temperature—while making a unique testable prediction: an isotope mass dependence of 15-20% in coherence times, distinguishable from environmental decoherence models (0%) and competing collapse models (∼8%). Standard Model Effective Field Theory analysis establishes a viable parameter window spanning 15 orders of magnitude. Quantum randomness is explained as stochastic noise from environmental fields (thermal and vacuum fluctuations), not mysterious collapse—calculable via the fluctuation-dissipation theorem. ACT provides mechanism, ontology, and testable predictions using only established physics.
Category: Quantum Physics
[54] ai.viXra.org:2602.0057 [pdf] submitted on 2026-02-12 19:06:11
Authors: Lluis Eriksson
Comments: 22 Pages.
We prove integrated cross-scale derivative bounds that replace the unverified Assumption 5.4 of a companion paper. Combined with two explicit large-field inputs (a residual pointwise derivative bound and a Balaban-type conditional large-field suppression) and conditional inequalities from the orbit space Ricci curvature, this yields a uniform (volume-independent) log-Sobolev inequality for the Wilson lattice gauge measure at sufficiently weak coupling (large beta). The key innovation is a decomposition into small-field and large-field contributions: the former is controlled by Balaban's polymer expansion, while the latter is handled by a pointwise gradient bound combined with exponential measure suppression. We provide a self-contained verification of the unconditional large-field tail mechanism for SU(2) in d=2, together with numerical validation.
Category: Mathematical Physics
[53] ai.viXra.org:2602.0056 [pdf] submitted on 2026-02-12 19:07:45
Authors: Lluis Eriksson
Comments: 22 Pages.
We verify the large-field hypothesis (Hypothesis 4.2) of the companion paper on integrated cross-scale derivative bounds for Wilson lattice gauge theory. The proof rests on three ingredients: (i) a dictionary lemma translating the Hilbert-Schmidt large-field condition on plaquette holonomies into Balaban's Lie-algebra formulation; (ii) an interface lemma connecting conditional measures with Balaban's T-operation and its uniform small-factor bound on admissible background fields (Eq. (1.89) of Balaban, Commun. Math. Phys. 122 (1989)); (iii) the uniformity estimate (Eq. (1.75) of the same reference) ensuring that slow-field dependence contributes only an O(1) multiplicative constant. For d=2, we give an independent proof via character-positive convolutions that avoids the Balaban machinery entirely. Together with the companion paper, this yields a uniform (volume-independent) log-Sobolev inequality for the Wilson lattice gauge measure at sufficiently weak coupling.
Category: Mathematical Physics
[52] ai.viXra.org:2602.0055 [pdf] submitted on 2026-02-12 19:09:28
Authors: Lluis Eriksson
Comments: 10 Pages.
We prove that the Wilson lattice gauge measure for SU(N_c) in dimension d >= 3 at sufficiently weak coupling (beta >= beta_wc) satisfies a log-Sobolev inequality with constant alpha_* > 0 independent of the lattice volume. This completes the multiscale program initiated in Paper I by verifying Hypothesis 3.2 of Paper III, the last remaining analytic input. The verification uses three ingredients: (i) the locality of polymer functionals, which restricts the sum over polymers to those intersecting a fixed link; (ii) Cauchy estimates on Balaban's analytic domains for polymer activities and boundary terms; and (iii) a combinatorial counting bound for connected polymers containing a given link, which is independent of the lattice volume. Combined with the synthetic Ricci curvature bound of Paper II, the integrated cross-scale derivative bounds of Paper III, and the large-field suppression established in Paper IV, this yields the uniform log-Sobolev inequality unconditionally.
Category: Mathematical Physics
[51] ai.viXra.org:2602.0054 [pdf] submitted on 2026-02-12 19:10:34
Authors: Lluis Eriksson
Comments: 16 Pages.
We prove that the one-step transfer operator of SU(N_c) lattice Yang-Mills theory in dimension d >= 3 has a spectral gap Delta_phys > 0 uniformly in the lattice volume (for even side length L), for all sufficiently large inverse coupling beta >= beta_0. The proof combines four ingredients: (i) the uniform log-Sobolev inequality on periodic tori established in a companion paper; (ii) a verification that the multiscale RG outputs needed for the LSI argument are uniform in frozen boundary conditions (Section 4), yielding the full DLR-LSI property (Section 5); (iii) the Stroock-Zegarlinski equivalence theorem, which in its standard formulation deduces Dobrushin-Shlosman mixing and exponential clustering from DLR-LSI; and (iv) Osterwalder-Seiler reflection positivity of the Wilson action, which translates temporal exponential clustering into a spectral gap of the transfer operator.
Category: Mathematical Physics
[50] ai.viXra.org:2602.0053 [pdf] submitted on 2026-02-12 19:11:43
Authors: Lluis Eriksson
Comments: 14 Pages.
We prove that for SU(N_c) lattice Yang-Mills theory in d >= 3 dimensions at sufficiently weak coupling (beta >= beta_0), the conditional Gibbs specification satisfies a DLR-uniform log-Sobolev inequality: for every finite sub-lattice Lambda' subset of Z^d and every boundary condition omega, the conditional measure mu_{Lambda'}^{omega} satisfies LSI(alpha_*) with a constant alpha_* > 0 independent of Lambda' and omega. The proof combines three ingredients: (i) the multiscale entropy decomposition developed in our earlier work (Papers I-V), which establishes a uniform log-Sobolev inequality on periodic tori; (ii) a uniform fiber oscillation lemma showing that frozen boundary links -- treated as external parameters in Balaban's renormalization group -- do not increase the per-block oscillation of the conditional fast potential, thanks to compactness of SU(N_c) and the locality of the polymer expansion; (iii) a refined large-field event restricted to dynamical (non-frozen) plaquettes, which ensures that the large-field suppression mechanism extends uniformly to boundary blocks. As a consequence, the Stroock-Zegarlinski equivalence theorem yields Dobrushin-Shlosman mixing, exponential clustering of gauge-invariant correlations, and -- via Osterwalder-Seiler reflection positivity -- a strictly positive mass gap Delta_phys >= m(beta, N_c, d) > 0 for the transfer matrix on the periodic torus (Z/LZ)^d, uniformly in even L. This removes the Dobrushin-type Assumption 6.3 of Paper I and the boundary-uniformity Assumption 3.1 of Paper VI, rendering the lattice mass gap unconditional at weak coupling.
Category: Mathematical Physics
[49] ai.viXra.org:2602.0052 [pdf] submitted on 2026-02-12 19:21:09
Authors: Lluis Eriksson
Comments: 11 Pages.
We establish three interface lemmas that close the remaining gaps in the proof chain for the mass gap of SU(N_c) lattice Yang-Mills theory at weak coupling (beta >= beta_0) in dimension d >= 3. Lemma A (Horizon Transfer) establishes a uniform conditional large-field suppression bound mu_k(Z_k(B) | G_{k+1}) <= exp(-c p_0(g_k)) holding mu_beta-a.s., without any admissibility restriction on the background field. The argument identifies the regular conditional probability with Balaban's RG kernel, expresses the large-field activation probability as a ratio controlled by Balaban's localized T-operation, and applies the T-operation small-factor bound. Lemma B extracts from Balaban's inductive scheme that the boundary terms B^{(k)}(X) share the same uniform analyticity domain as the polymer activities R^{(k)}(X), with radius hat{alpha}_1(gamma) > 0 independent of k. Lemma C extends the multiscale LSI to finite volumes with arbitrary frozen boundary conditions omega via tensorization-plus-perturbation, replacing the unverified Dobrushin block condition of Paper VII. Combined with Papers I-VII, these lemmas render the lattice mass gap theorem unconditional.
Category: Mathematical Physics
[48] ai.viXra.org:2602.0051 [pdf] submitted on 2026-02-12 19:22:19
Authors: Lluis Eriksson
Comments: 16 Pages.
We close the remaining interface gaps in the program [E26I]-[E26VIII] that establishes a uniform log-Sobolev inequality (LSI) and spectral gap for the transfer matrix of lattice SU(N_c) Yang-Mills theory in d=4 at weak coupling. Four technical gaps are identified and resolved: (G1) the Balaban small-factor bound for the T-operation is shown to hold pointwise for every real background by auditing Balaban's proof and verifying that it uses only the uniform inductive conditions; (G2) we establish a uniform small-field coercivity estimate (Hessian lower bound) for the effective action and use it, together with Balaban's small-factor mechanism, to control the conditional inequalities in the multiscale entropy decomposition -- circumventing the need for a global fiber LSI with constant O(beta); (G3) uniform analyticity of boundary terms is extracted from Balaban's inductive scheme; (G4) a quantitative bootstrap verifies the simultaneous compatibility of all constants for a single choice of beta_0. Combined with [E26I]-[E26VIII], these closures yield an unconditional proof that Delta_phys(beta,L) >= c(N_c,beta_0) > 0 uniformly in the volume L for beta >= beta_0.
Category: Mathematical Physics
[47] ai.viXra.org:2602.0050 [pdf] submitted on 2026-02-11 17:48:50
Authors: Scott Long
Comments: 6 Pages. (Zenodo DOI: 10.5281/zenodo.18601428)
The standard ΛCDM cosmological model is currently challenged by a 5σ discrepancy in Hubble Constant (H0) measurements and a vacuum energy density error of 122 orders of magnitude. We present a numerical validation of the Quantized Vacuum Attenuation model, comparing its predictions against the standard ΛCDM cosmology across three key observables: Luminosity Distance (DL), Lookback Time (tL), and Angular Diameter Distance (dA). Utilizing a vacuum attenuation coefficient of α ≈ 8.26×10^−27 m^−1, derived from the local Hubble flow, our simulations resolve three primary tensions. (1) Dark Energy: the model demonstrates that the high-redshift luminosity modulus follows a logarithmic attenuation profile, eliminating the need for Dark Energy to explain Type Ia Supernovae dimming. (2) Early Galaxy Paradox: the model removes the Big Bang singularity, reinterpreting high-redshift galaxies (e.g., JADES-GS-z13) as objects seen through ≈35 Gyr of static vacuum transmission, allowing for infinite formation time. (3) Little Red Dots: the angular diameter distance in a discrete manifold correctly predicts the observation of compact high-redshift morphologies that contradict the angular magnification predicted by expanding metric models. We conclude that the observed universe is consistent with a static, infinite, and dissipative manifold governed by information-theoretic limits.
Category: Relativity and Cosmology
[46] ai.viXra.org:2602.0049 [pdf] submitted on 2026-02-11 17:57:00
Authors: Moninder Singh Modgil, Dnyandeo Dattatray Patil
Comments: 33 Pages.
This paper presents a novel synthesis of ancient religious cosmologies, particularly Sikh, Islamic, and Hindu scriptural verses, with modern theoretical physics, including general relativity, quantum cosmology, and higher-dimensional field theories. Beginning with interpretations of key hymns such as those from the Japji Sahib and Kirtan Sohila, the paper constructs conformally compactified spacetime metrics aligned with spiritual metaphors. Gödel-like rotating universes are modeled to reflect daily and annual solar motions, incorporating tunneling transitions, scalar curvature collapses, and spinor bundles to represent evolving consciousness. Through symbolic AI, scriptural syntax is translated into candidate gravitational Lagrangians, wavefunctionals, and field strength tensors that encode karmic memory. This integration of metaphysical semantics and mathematical physics allows new formulations of cosmological duality, including Janus time-symmetric models and magneto-causal holography, offering profound insights into the structure of the universe and the soul's evolution within it.
Category: Religion and Spiritualism
[45] ai.viXra.org:2602.0048 [pdf] submitted on 2026-02-11 17:59:17
Authors: L. A. Serebrennikov
Comments: 4 Pages.
In the standard model, gravity is described either as a fundamental interaction (Newton's law of universal gravitation) or as a geometric property of spacetime (Einstein's general theory of relativity). The hypothesis proposed here considers observable gravitational attraction as a consequence of uneven cosmological expansion. Massive objects, possessing higher energy density, cause more intense local expansion of space, leading to effective repulsion of less massive bodies toward zones of increased expansion. The paper outlines the main postulates of the model, its potential implications for the dark matter problem, and proposes specific paths for experimental falsification.
Category: Relativity and Cosmology
[44] ai.viXra.org:2602.0047 [pdf] submitted on 2026-02-10 20:35:45
Authors: Guiffra Patrick
Comments: 21 Pages.
In this article, we demonstrate the emergence of a fundamental structure inherent to prime numbers that embeds its signature within the molecular architecture of DNA. By constructing a harmonic field Ω(x, q) based on modular inverses of prime numbers, we show that helical periodicities of biological structures emerge naturally from universal arithmetic properties. Our main results establish that: (1) the prime number p = 11 acts as a universal pivot, generating through its modular inverse 11^−1 ≡ 2 the characteristic wavelengths of protein α-helices (λ = 7/2 = 3.5, 2.68% error vs the observed 3.6) and B-form DNA (λ = 21/2 = 10.5, 0.10% error vs the observed 10.5). (2) Analysis of human chromosome 1 reveals a statistically significant 36% enrichment of prime-length tandem repeats (p < 3.2 × 10^−7). (3) We establish a universal scaling law relating the prime harmonic coherence coefficient χ to genomic fractal dimension D according to D = 1 − 0.86χ, validated across five organisms spanning three domains of life (bacteria, yeast, plant, insect, mammal) with near-perfect correlation (r = −0.9974, p < 10^−4, errors < 1%). The numerical evidence is compelling. To claim that prime numbers constitute the fundamental template of DNA requires only one more step: experimental laboratory demonstration. We propose testable protocols involving synthesis of DNA optimized according to prime harmonic principles, with quantitative predictions for thermal stability, mutation rates, and spectral properties. If validated, these findings suggest that number theory encodes universal architectural constraints on biological self-assembly.
Category: Physics of Biology
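The quoted inverse 11^−1 ≡ 2 can be checked with Python's three-argument pow. The abstract does not state the moduli; assuming they are 7 and 21, the denominators of the two quoted wavelengths, the arithmetic works out:

```python
# Hypothetical reading: the moduli 7 and 21 are inferred from the
# quoted wavelengths 7/2 and 21/2; the abstract does not state them.
inv_mod_7 = pow(11, -1, 7)    # modular inverse of 11 modulo 7  (11*2 = 22 = 3*7 + 1)
inv_mod_21 = pow(11, -1, 21)  # modular inverse of 11 modulo 21 (11*2 = 22 = 21 + 1)
assert inv_mod_7 == 2 and inv_mod_21 == 2
# wavelengths as quoted: modulus / inverse
assert 7 / inv_mod_7 == 3.5 and 21 / inv_mod_21 == 10.5
```

This only verifies the modular arithmetic under the assumed moduli; the biological claims are a separate, empirical matter.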
[43] ai.viXra.org:2602.0046 [pdf] replaced on 2026-02-12 19:04:00
Authors: Lluis Eriksson
Comments: 11 Pages.
We establish that the orbit space B = A/G of SU(N_c) lattice gauge theory satisfies the Riemannian curvature-dimension condition RCD*(N_c/4, dim A); in particular, it satisfies CD(N_c/4, infinity) in the sense of Lott-Villani-Sturm. The proof proceeds by showing that the configuration space A = SU(N_c)^{|B_1(Lambda)|}, equipped with the bi-invariant product metric
Category: Mathematical Physics
[42] ai.viXra.org:2602.0045 [pdf] submitted on 2026-02-10 20:42:00
Authors: Rüdiger Giesel
Comments: 9 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
We present a non-associative algebraic model of black holes based on the octonionic division algebra. Geometry is not postulated as fundamental. Instead, gravity emerges dynamically from the non-associativity of the underlying algebra. Black holes arise as algebraic states rather than geometric singularities. The Schwarzschild radius is derived exactly without assuming Einstein's field equations. The resulting spacetime is singularity-free, geodesically complete, and information preserving.
Category: Relativity and Cosmology
[41] ai.viXra.org:2602.0044 [pdf] submitted on 2026-02-10 21:10:05
Authors: Leon Barbour
Comments: 19 Pages. Licensed under CC BY 4.0 (Note by ai.viXra.org Admin: For the last time, please cite and list scientific references)
This paper presents a compact "tooling derivation" for the first closed operational loop in Causal Mechanical Cosmology (CMC): (A) spherical void control geometry producing a dominant dipole anisotropy for an off-centre observer, (B) an elongated/sheared void geometry producing quadrupole anisotropy via a local Hessian decomposition, and (C) an observable closure mapping the structural line-of-sight (LOS) velocity signature to the atomic fractional frequency shift Δf/f as the measurement endpoint. Canonical equations (E1-E8) are preserved and used as the spine for derivations and diagnostics. The local Hessian kernel yields the leading even-parity multipole structure (monopole/quadrupole); dipole structure in the spherical off-centre control case is treated as arising from the global sampling asymmetry and, in a strict local expansion, enters through higher-order terms beyond the first Hessian approximation.
Category: Relativity and Cosmology
[40] ai.viXra.org:2602.0043 [pdf] submitted on 2026-02-10 02:21:23
Authors: J. W. McGreevy
Comments: 19 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
We present a rigorous synthesis in which the fundamental symmetries of General Relativity and quantum field theory emerge from the axioms of arithmetic geometry and number theory. Central is the weight-12 modular discriminant Δ(τ) = η(τ)^{24} = (2π)^{12} (E_4^3 - E_6^2)/1728, interpreted as the vacuum potential. The arithmetic degree (total integrated curvature) must disperse equivalently across Archimedean (smooth, complex-analytic) and non-Archimedean (discrete, p-adic) places to maintain global consistency via the product formula. This dispersion is enforced at the critical mirror point s = 6 of L(Δ,s), where the functional equation symmetry balances openness and rigidity. The Hilbert-Pólya operator Ĥ = 1/2 + i (D_∞ ⊕ ∑p D_p) acts self-adjointly on the adelic Hilbert space, with eigenvalues corresponding to resonances tied to L(Δ,s) zeros. The 1728 frequency (12^3) serves as the universal gear ratio/adiabatic regulator. The 12-fermion matrix arises from the Leech lattice V_{Λ24} Z_2-orbifold, folding 24 bosonic dimensions into 12 Weyl fermions per generation via a Möbius twist. A 4D Lorentzian manifold emerges via noncommutative geometry (KO-dimension 6 adelic spectral triple), with the spectral action Tr f(D/Λ) yielding the Einstein-Hilbert term and stress-energy from p-adic torsion convolution. The master equation δ_g deg(L) = 0 at s = 6 recovers the Einstein field equations with cosmological constant Λ ≈ M_Pl^2 e^{-288} (double-twist entropy) and fine-structure constant α^{-1} ≈ 137.036 from Petersson norm corrections. This framework posits that GR and the Standard Model are stereographic projections of the weight-12 balanced modular form onto the Möbius-Planck manifold, providing a tautological origin for physical laws from number theory.
Category: Number Theory
[39] ai.viXra.org:2602.0042 [pdf] submitted on 2026-02-09 20:27:39
Authors: Ammar Alammar
Comments: 8 Pages. Creative Commons License
We propose a background-independent research programme modelling the universe as a dynamic, recursive causal network evolving from a single root vertex. By treating spacetime not as a fundamental container but as an emergent property of a stochastic directed acyclic graph (DAG), we demonstrate that foliation invariance (a necessary condition for Lorentz symmetry) and three-dimensional spatial geometry can emerge as statistical limits of graph topology governed by random walk recurrence (Polya's Theorem) and the Principle of Maximum Entropy. We formally derive the Einstein-Hilbert action in the thermodynamic limit of the graph's microstate statistics, identifying the cosmological constant with unimodular volume fluctuations (Λ ∼ V^−1/2). Furthermore, we identify fundamental particles as stable topological subgraphs (braids) protected by Pachner move invariance, and derive the Holographic Area Law from the Max-Flow Min-Cut theorem. This model offers a unified generative grammar for emergent gravitation and quantum interference, yielding a precise, falsifiable prediction for high-energy Lorentz violation, consistent with effective field theory expectations in discrete models.
Category: Quantum Gravity and String Theory
[38] ai.viXra.org:2602.0041 [pdf] replaced on 2026-02-12 19:02:28
Authors: Lluis Eriksson
Comments: 24 Pages.
We establish that SU(N_c) lattice Yang-Mills theory in d=4 dimensions with Wilson action at sufficiently weak coupling (beta = 2N_c/g^2 >= beta_0) satisfies a log-Sobolev inequality with constant alpha_* > 0 uniform in the lattice size L_vol. Combined with reflection positivity of the Wilson action and the DLR-LSI extension plus Stroock-Zegarlinski mixing route, this yields a mass gap Delta_phys > 0 uniform in L_vol without additional assumptions. The proof combines three ingredients: (i) Balaban's constructive renormalization group, which produces controlled effective actions at all scales; (ii) the orbit space Ricci curvature bound Ric_B >= N_c/4, which gives a uniform log-Sobolev constant for conditional measures of fast modes at each RG scale via the Bakry-Emery criterion; and (iii) a multiscale entropy decomposition with sweeping-out bounds, where the geometric scaling factor ||Q_(k)*||^2 = 2^{-(d-1)k} of transversal block averaging ensures summability of cross-scale errors.
Category: Mathematical Physics
[37] ai.viXra.org:2602.0040 [pdf] submitted on 2026-02-08 19:36:55
Authors: Lluis Eriksson
Comments: 11 Pages.
We prove that the lattice Yang-Mills measure with gauge group SU(Nc) in d=4 dimensions at sufficiently large β = 2Nc/g² satisfies a Poincaré inequality with constant α* > 0 uniform in the lattice size L. The proof uses three ingredients: (i) the Ricci curvature bound Ric_B ≥ Nc/4 for the gauge orbit space, giving a uniform spectral gap for conditional measures of fast modes at each renormalization group scale; (ii) Balaban's constructive RG with polymer derivative bounds, controlling the residual coupling between scales; and (iii) a multiscale martingale variance decomposition that avoids recursive composition losses, with a commutator coefficient D_k ≤ C e^−2κ 2^−3k made summable by the geometric scaling factor of transversal block averaging. Under an RG-normalized disintegration consistent with Balaban's absorption structure, only exponentially decaying polymer residuals contribute to D_k, ensuring Σ_k D_k << c_0. The resulting uniform Poincaré inequality gives volume-independent control of the variance-to-energy ratio for gauge-invariant observables.
Category: Mathematical Physics
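The summability claim in the abstract above is a geometric series in the RG scale k. A minimal numerical check of the closed form Σ_{k≥1} 2^{-3k} = 1/7; the constants C and κ below are illustrative placeholders, since the abstract leaves them unspecified:

```python
import math

# Bound on the commutator coefficients: D_k <= C * exp(-2*kappa) * 2**(-3*k).
# C and kappa are illustrative placeholders, not values from the paper.
C, kappa = 1.0, 1.0

# Partial sum of the bound over RG scales k = 1, 2, ...
bound = [C * math.exp(-2 * kappa) * 2 ** (-3 * k) for k in range(1, 60)]
total = sum(bound)

# Closed form: sum_{k>=1} 2^{-3k} = (1/8) / (1 - 1/8) = 1/7.
closed_form = C * math.exp(-2 * kappa) / 7
print(total, closed_form)
```

The point is only that the 2^{-3k} scaling factor makes the cross-scale sum converge to a small constant, as the abstract asserts.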
[36] ai.viXra.org:2602.0039 [pdf] submitted on 2026-02-08 04:40:53
Authors: Isaiah Nwukor
Comments: 15 Pages.
Individual artificial intelligence systems face an inherent trade-off between plasticity and stability under resource constraints. I propose that general intelligence emerges from networks of specialized agents applying a structured reasoning cycle to answer four fundamental questions. Agents ground abstract patterns through affective valence embeddings and coordinate via a shared database of credibility-weighted knowledge packages. I formalize a five-stage reasoning engine (Salience Detection → Hypothesis Generation → Experimentation → Structural Correspondence → Generalization) where agents at different stages specialize in different questions, enabling zero-shot cross-domain transfer. Using ARC-AGI task "as66" as demonstration, I show 276 generations of evolutionary learning where complementary specialization yields a current maximum of Level 4 performance across agents [20]. This framework provides testable predictions for performance scaling, transfer capability, and behavioral signatures of reasoning integration.
Category: Artificial Intelligence
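The five-stage reasoning cycle named in the abstract above can be sketched as a simple pipeline. Everything below (function names, the toy list-based state) is hypothetical scaffolding for illustration, not code from the paper:

```python
from typing import Callable, List

# The five stages named in the abstract, in order. Each stage is modeled as a
# function transforming a working state; in the proposed framework, different
# specialized agents would own different stages.
def salience_detection(state): return state + ["salient pattern"]
def hypothesis_generation(state): return state + ["candidate rule"]
def experimentation(state): return state + ["test outcome"]
def structural_correspondence(state): return state + ["mapped structure"]
def generalization(state): return state + ["general rule"]

PIPELINE: List[Callable] = [
    salience_detection,
    hypothesis_generation,
    experimentation,
    structural_correspondence,
    generalization,
]

def run_cycle(observation: list) -> list:
    """Run one pass of the structured reasoning cycle over an observation."""
    state = list(observation)
    for stage in PIPELINE:
        state = stage(state)
    return state

print(run_cycle(["raw grid"]))
```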
[35] ai.viXra.org:2602.0038 [pdf] submitted on 2026-02-08 11:01:58
Authors: Lluis Eriksson
Comments: 13 Pages.
We prove that the quadratic Gribov-Zwanziger measure on a d-dimensional periodic lattice (d ≥ 2) with gauge group SU(N_c) exhibits a mass gap, uniformly in the lattice size L. The gluon propagator at zero momentum satisfies D(0) ≤ C_{d,N_c}/g² for all L ≥ 2 and all coupling g > 0. In the thermodynamic limit, m_gap = g[(d−1)N_c I_1/d²]^{1/2}, where I_1 = ∫ d^d k/(2π)^d 1/k̂² is a finite lattice constant (I_1 ≈ 0.155 in d = 4). For SU(3) at β = 6 the predicted mass scale is m_gap ≈ 0.6 GeV, in quantitative agreement with lattice Monte Carlo measurements. The proof combines four ingredients: strict log-concavity of the measure (Bhatia's matrix inequality), dimensional reduction to a fixed finite-dimensional zero-mode sector (Prékopa's theorem), an exact computation of the effective Hessian at the origin, and a 1/N scaling argument that renders the effective potential asymptotically quadratic. No perturbative expansion in the coupling constant is employed.
Category: Quantum Physics
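The numbers quoted in the abstract above can be checked independently. Below, a midpoint-grid estimate of I_1 (the Brillouin-zone average of 1/k̂², with k̂² = Σ_µ 4 sin²(k_µ/2)), followed by the quoted m_gap formula at SU(3), β = 6, where g² = 2N_c/β = 1. The conversion a⁻¹ ≈ 2 GeV at β = 6 is a standard rough value assumed here, not stated in the abstract:

```python
import numpy as np

# Midpoint grid over the Brillouin zone [-pi, pi]^4 (midpoints avoid k = 0,
# where 1/k_hat^2 is singular but integrable in d = 4).
N, d = 32, 4
k = -np.pi + (np.arange(N) + 0.5) * (2 * np.pi / N)
sin2 = 4 * np.sin(k / 2) ** 2                      # 4 sin^2(k_mu/2) per axis

# k_hat^2 = sum over the 4 directions, built by broadcasting.
khat2 = (sin2[:, None, None, None] + sin2[None, :, None, None]
         + sin2[None, None, :, None] + sin2[None, None, None, :])

# I_1 is the zone average of 1/k_hat^2 (the d^4k/(2*pi)^4 measure).
I1 = float(np.mean(1.0 / khat2))

# m_gap = g * sqrt((d-1) * Nc * I1 / d^2); SU(3) at beta = 6 gives g^2 = 1.
Nc, g = 3, 1.0
m_lat = g * np.sqrt((d - 1) * Nc * I1 / d ** 2)    # lattice units
m_gev = m_lat * 2.0                                 # assuming 1/a ~ 2 GeV at beta = 6
print(round(I1, 3), round(float(m_lat), 3), round(float(m_gev), 2))
```

This reproduces I_1 ≈ 0.155 and a mass scale of roughly 0.3 in lattice units, i.e. about 0.6 GeV under the assumed scale setting, consistent with the abstract.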
[34] ai.viXra.org:2602.0037 [pdf] submitted on 2026-02-08 19:04:35
Authors: Tehzeeb Ali
Comments: 14 Pages.
Modern public key cryptosystems rely on two fundamental computational hardness assumptions: integer factorization (RSA) and the discrete logarithm problem (elliptic curve cryptography). These problems, formulated using modular arithmetic and algebraic geometry, have withstood four decades of cryptanalytic attacks. However, their inherent algebraic structures and periodicity properties make them vulnerable to quantum algorithms, particularly Shor’s algorithm (1994), which achieves polynomial-time complexity on quantum computers. This research presents an extensive mathematical comparison between classical cryptographic systems and quantum-resistant alternatives, with particular emphasis on lattice-based cryptography. We focus on the Learning With Errors (LWE) problem and its variants (Ring-LWE, Module-LWE), demonstrating through rigorous mathematical analysis why these lattice problems lack the periodicity that quantum algorithms exploit. We provide formal security reductions for LWE problems relative to worst-case lattice problems and present mathematical proofs of quantum resistance. For cryptocurrency systems, this analysis reveals critical vulnerabilities: current ECDSA algorithms used for transaction signing will become cryptographically insecure within 10-30 years, potentially compromising over $100 billion in digital assets. This work bridges mathematical foundations, security analysis, and practical implications for real-world systems, providing proof-based recommendations for the transition to post-quantum cryptographic standards in blockchain technologies.
Category: Digital Signal Processing
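To make the LWE mechanism discussed above concrete, here is a toy Regev-style bit encryption. The parameters are illustrative and far too small for real security; this is a sketch of the textbook construction, not the paper's scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 32, 128, 3329        # toy dimensions; real parameters are much larger

# Key generation: secret s, public key (A, b = A s + e mod q) with small error e.
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-2, 3, m)                 # small noise in {-2, ..., 2}
b = (A @ s + e) % q

def encrypt(bit: int):
    """Sum a random subset of LWE samples and embed the bit in the high part."""
    r = rng.integers(0, 2, m)              # random subset selector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v) -> int:
    """Recover the bit: v - <u, s> is near 0 for bit 0, near q/2 for bit 1."""
    d = (v - u @ s) % q
    return int(q // 4 < d < 3 * q // 4)

print([decrypt(*encrypt(bit)) for bit in (0, 1, 1, 0)])  # → [0, 1, 1, 0]
```

Decryption is exact here because the accumulated noise |r·e| ≤ 2m = 256 stays below q/4; the absence of any hidden periodic structure in (A, b) is precisely what the abstract argues quantum period-finding cannot exploit.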
[33] ai.viXra.org:2602.0036 [pdf] submitted on 2026-02-08 13:06:34
Authors: Lluis Eriksson
Comments: 15 Pages.
We establish three results for the SU(N_c) lattice Yang-Mills mass gap. First, the function U → -Re Tr U is strictly geodesically convex on B_{π/2}(I) ⊂ SU(N_c), with an explicit Riemannian Hessian. Second, the orbit space B = A/G has Ricci curvature Ric_B ≥ N_c/4, giving a spectral gap λ_1(Δ_B) ≥ N_c/4 uniform in lattice size, making rigorous an argument of Mondal. Third, and most importantly, we prove that the Yang-Mills-Faddeev-Popov potential is not geodesically convex at the trivial vacuum in zero-mode directions, for any value of the coupling in d ≥ 3. This shows that convexity-based methods (Brascamp-Lieb, Bakry-Émery, Dobrushin, Prékopa) cannot establish the mass gap through the Hessian of the full potential. We argue that the physical mass gap O(e^{-c/g²}) requires the global topology of B, accessible via the Witten-Helffer-Sjöstrand framework.
Category: Quantum Physics
[32] ai.viXra.org:2602.0035 [pdf] submitted on 2026-02-08 14:26:38
Authors: Lluis Eriksson
Comments: 11 Pages.
We establish three results toward the SU(N_c) lattice Yang-Mills mass gap. First, the Wilson potential on the gauge orbit space B=A/G is Morse-Bott with critical manifold M_flat (the flat connections), and we derive the Born-Oppenheimer effective Hamiltonian on M_flat. Second, we prove that the Faddeev-Popov obstruction identified in Paper II applies to the path integral but not to the Hamiltonian on B: since V_pot = S_YM >= 0 has non-negative Hessian at its minimum, the Bakry-Emery framework gives an unconditional mass gap m >= c(L,N_c,d) g^2 > 0 for each fixed lattice size L. Third, we show that the physical mass gap m ~ exp(-C/g^2) follows if the spectral gap at Balaban's terminal renormalization scale is bounded below. We identify this as the single remaining step toward the Yang-Mills Millennium Problem on the lattice.
Category: Quantum Physics
[31] ai.viXra.org:2602.0033 [pdf] submitted on 2026-02-08 15:01:30
Authors: Lluis Eriksson
Comments: 6 Pages.
We prove that SU(N_c) lattice Yang-Mills theory in d=4 dimensions with Wilson action at sufficiently weak coupling has a positive mass gap m >= c(N_c) exp(-C(N_c)/g^2) > 0 in lattice units, uniformly in the lattice size L up to the correlation length. The proof is self-contained modulo Balaban's constructive renormalization group (Comm. Math. Phys., 1984-1989) and combines three ingredients proved here: (i) a Ricci curvature bound Ric_B >= N_c/4 for the gauge orbit space, via O'Neill's submersion formula; (ii) a Holley-Stroock spectral gap estimate at Balaban's terminal renormalization scale; (iii) a transfer-matrix trace identity, with controlled errors from the non-nearest-neighbor couplings in Balaban's effective action, showing that the physical mass gap is approximately scale-invariant under the renormalization group.
Category: Quantum Physics
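Ingredient (i) in the abstract above leans on a standard fact worth recalling: for a Riemannian submersion, O'Neill's formula makes sectional curvature non-decreasing on horizontal planes, which is what lets a curvature bound on the orbit space B = A/G be inherited from the total space. Schematically (a textbook statement, not a formula from the paper):

```latex
% O'Neill's formula for a Riemannian submersion \pi : A \to B = A/G,
% with X, Y horizontal vector fields and [X,Y]^{\mathrm{v}} the vertical
% part of their bracket:
\sec_B(\pi_* X, \pi_* Y)
  \;=\; \sec_A(X, Y)
  \;+\; \tfrac{3}{4}\,
    \frac{\bigl\| [X, Y]^{\mathrm{v}} \bigr\|^2}
         {\|X\|^2 \|Y\|^2 - \langle X, Y \rangle^2}.
% The correction term is non-negative, so curvature (hence Ricci curvature)
% can only increase when passing to the quotient.
```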
[30] ai.viXra.org:2602.0032 [pdf] submitted on 2026-02-08 19:11:44
Authors: Lluis Eriksson
Comments: 10 Pages.
We prove that SU(N_c) lattice Yang-Mills theory in d = 4 dimensions with Wilson action at sufficiently weak coupling has a positive mass gap m_gap ≥ c(N_c) · e^{−C(N_c)/g²} > 0 in lattice units, uniformly in lattice sizes L ≤ C_0 e^{C/g²}. The proof is self-contained modulo Balaban's constructive renormalization group and combines: (i) a Ricci curvature bound Ric_B ≥ N_c/4 for the gauge orbit space, treating its orbifold singularities; (ii) a Witten Laplacian semiclassical spectral gap estimate at Balaban's terminal scale, using the Morse-Bott structure of the Wilson potential with all hypotheses of the Helffer-Sjöstrand theory explicitly verified; and (iii) a transfer-matrix trace identity with controlled errors from nonlocal temporal couplings.
Category: Quantum Physics
[29] ai.viXra.org:2602.0031 [pdf] submitted on 2026-02-08 00:43:03
Authors: Cesar Henriques
Comments: 22 Pages.
We present a holographic framework in which four-dimensional spacetime emerges from quantum entanglement structure encoded on a brane embedded in asymptotically AdS space [1]. Mass is not fundamental but is identified with topological entanglement complexity—knot complexity C_k—quantifying irreducible multipartite correlations [3]. Gravitational interaction emerges as the macroscopic response to gradients in entanglement tension along the holographic direction [4], reducing to Einstein's equations at low complexity with an additional non-local stress contribution from diffuse entanglement structure [5]. We propose that classical spacetime singularities signal saturation of entanglement capacity at a critical threshold C_{k,max}, analogous to the Bekenstein-Hawking bound [6, 7]. Beyond this regime, the system undergoes a topological transition—brane secession—whereby the saturated region disconnects from the parent structure and nucleates an independent spacetime [2, 8]. This provides a natural regularization mechanism: from the exterior perspective, a black hole forms; from the interior perspective, a smooth cosmological expansion emerges, unifying collapse and cosmogenesis as dual descriptions of a single entanglement reorganization. We demonstrate structural consistency through explicit toy-model calculations in finite tensor networks (Appendices A-B) and present a simplified phenomenological realization showing how diffuse entanglement tension reproduces flat galactic rotation curves without dark matter particles (Section 5.3). The framework offers a unified interpretive scheme for mass, gravity, and dark-sector phenomenology as emergent consequences of quantum correlation structure, while remaining compatible with established results in holography, semiclassical gravity, and observational cosmology [1, 2, 6].
Category: High Energy Particle Physics
[28] ai.viXra.org:2602.0029 [pdf] submitted on 2026-02-08 00:47:19
Authors: Alimi Ayomide Olamilekan
Comments: 6 Pages.
We demonstrate that the muon anomalous magnetic moment discrepancy (127 parts per billion) and the proton radius puzzle arise from a common discrete geometric origin within the Directed Dimensional Lattice (DDL) framework. By modeling spacetime as a 24-cell (F4) lattice rather than a continuous manifold, we derive the muon anomaly as a "polygon tax" from finite harmonic cycles of N = 3600 nodes. This value emerges naturally from the hierarchical partition of the 120-cell into 25 disjoint 24-cells, combined with 6-fold phase updates required by SO(4) holonomy. Concurrently, the 24-cell symmetry predicts the muonic proton radius as R_µ = R_e(1 − 1/24) = 0.841 fm. The exact integer ratio 3600/24 = 150 suggests phase-locked coupling between lepton dynamics and lattice geometry. These predictions are jointly testable via the MUonE experiment, providing definitive validation or falsification of the DDL framework without requiring beyond-Standard-Model particles.
Category: High Energy Particle Physics
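The integer bookkeeping and the radius prediction in the abstract above are easy to check. The electronic proton radius R_e = 0.8775 fm used below is the CODATA-style electron-scattering value, assumed here since the abstract does not state it:

```python
# Hierarchical count: 25 disjoint 24-cells in the 120-cell, times the 6-fold
# phase updates from SO(4) holonomy, gives the N = 3600 harmonic cycle.
n_nodes = 25 * 24 * 6
ratio = 3600 // 24            # the exact integer ratio quoted in the abstract

# Muonic radius prediction R_mu = R_e * (1 - 1/24).
# R_e = 0.8775 fm is an assumed (CODATA-style) input, not from the abstract.
R_e = 0.8775
R_mu = R_e * (1 - 1 / 24)
print(n_nodes, ratio, round(R_mu, 3))  # → 3600 150 0.841
```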
[27] ai.viXra.org:2602.0028 [pdf] submitted on 2026-02-08 00:48:47
Authors: Chang-Sik Kim
Comments: 7 Pages. Copyright © 2026 Chang-Sik Kim. All rights reserved.
Modern standard cosmology (ΛCDM) has successfully explained the composition of the universe through precision observations of the cosmic microwave background, but it now faces serious observational challenges. Dark matter particles have not been detected despite decades of searches, and the discrepancy in the cosmic expansion rate known as the Hubble Tension remains unresolved. Crucially, the massive, mature galaxies at redshift z > 10 observed by the James Webb Space Telescope (JWST) cannot be explained within the 13.8-billion-year cosmic age allowed by the standard model. To overcome this crisis, this paper proposes the Spacetime Elastic Hysteresis Theory, which redefines spacetime not as a mere geometric manifold but as a physical medium possessing memory and elasticity. We introduce Kim's Law (E = κψ), a constitutive equation stating that mass and gravity originate from the topological entanglement density (ψ) of the spacetime lattice. We construct an extended Lagrangian containing the entanglement scalar field (ψ) and derive modified Einstein field equations via the variational principle. The resulting Kim Tensor (K_μν) produces additional gravitational effects on galactic scales, fully explaining the flattening of galactic rotation curves without dark matter. Furthermore, by accounting for Elastic Redshift, an energy-dissipation effect arising from the viscoelasticity of the spacetime medium, we reconstruct the cosmic expansion history and obtain a true cosmic age of 16.54 billion years (16.54 Gyr). This new timeline naturally provides the time required for the formation and evolution of the early massive galaxies observed by JWST, offering a unified resolution of the outstanding problems of modern cosmology.
Category: Astrophysics
[26] ai.viXra.org:2602.0027 [pdf] submitted on 2026-02-08 00:50:33
Authors: Chang-Sik Kim
Comments: 7 Pages. Copyright © 2026 Chang-Sik Kim. All rights reserved.
We propose a scalar-tensor theory of gravity by redefining spacetime as a physical viscoelastic medium. Addressing the anomalies of the ΛCDM model, specifically the Hubble Tension and the early formation of massive galaxies observed by JWST, we introduce a constitutive relation, Kim's Law (E = κψ). This law postulates that the gravitational potential arises from the topological entanglement density (ψ) of the spacetime lattice. By constructing a Lagrangian density with an elastic potential term, we derive modified Einstein Field Equations. We explicitly demonstrate that the "missing mass" in galactic rotation curves is a manifestation of vacuum rigidity (K_µν) rather than non-baryonic Dark Matter. Furthermore, we incorporate an energy dissipation term (hysteresis) into the Friedmann equations, deriving a recalibrated cosmic age of 16.54 Gyr. This extended timeline resolves the conflict between standard cosmology and the existence of mature galaxies at z > 10.
Category: Quantum Gravity and String Theory
[25] ai.viXra.org:2602.0026 [pdf] submitted on 2026-02-08 00:53:04
Authors: Vinicius F S Santos
Comments: 17 Pages. (Note by viXra Admin: Parts of the texts are cut off!)
The Erdős-Lovász Tihany Conjecture (1968) asserts that every graph G with χ(G) ≥ s + t − 1 > ω(G) admits a vertex partition into parts with chromatic numbers ≥ s and ≥ t, respectively. We prove the conjecture for the infinite family of pairs (3, k−2) on Mycielski graphs M_k for all k ≥ 5. Our approach is spectral, centred on the golden ratio φ = (1+√5)/2. The pentagon C_5, the minimal graph with χ > ω, has adjacency eigenvalues {2, φ−1, φ−1, −φ, −φ}, and this golden spectral structure propagates through the Mycielski construction: the golden ratio's defining equation μ² − μ − 1 = 0 arises exactly from the Mycielski eigenvalue relation (Lemma 2.1). We prove the C_5-Peeling Existence Theorem: for every M_k (k ≥ 4), a direction in the golden eigenspace peels off a C_5 from M_k. The proof is constructive via spectral interferometry: two Mycielski lift paths span a four-dimensional subspace whose layer-control matrix has det = √5, enabling independent phase steering to select a diagonal lift C_5, the reverse cycle through alternating address layers. Computationally, the Hoffman margin of this partition is F = φ^{−3} = √5 − 2 exactly, verified for all k ≤ 12. The key advance is the Golden Sub-Induction (Theorem 1.2): for k ≥ 6, the 1/φ shadow attenuation forces the peeled C_5 into the original vertex block of M_k, so the remainder M_k − P_k contains Mycielski(M_{k−1} − P_{k−1}) as a subgraph, where (P_k)_{k≥5} is a coherent family of peeled pentagons. Since χ(Mycielski(G)) = χ(G) + 1 for any graph with an edge, this yields the inductive bound χ(M_k − P_k) ≥ k − 2. Combined with χ(C_5) = 3, this settles the Tihany conjecture for the pair (3, k−2) on M_k for every k ≥ 5, an infinite family of previously open cases.
Category: Combinatorics and Graph Theory
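The golden spectrum of the pentagon claimed in the abstract above is a one-liner to verify numerically (a sanity check only, not part of the paper's construction):

```python
import numpy as np

# Adjacency matrix of the 5-cycle C5.
A = np.zeros((5, 5))
for i in range(5):
    A[i, (i + 1) % 5] = A[(i + 1) % 5, i] = 1

eig = np.sort(np.linalg.eigvalsh(A))
phi = (1 + np.sqrt(5)) / 2

# Claimed spectrum {2, phi-1, phi-1, -phi, -phi}, sorted ascending.
expected = np.array([-phi, -phi, phi - 1, phi - 1, 2.0])
print(np.allclose(eig, expected))  # → True
```

This is the spectral signature (eigenvalues 2cos(2πj/5), j = 0..4) from which the golden-ratio structure propagates through the Mycielski lifts.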
[24] ai.viXra.org:2602.0025 [pdf] submitted on 2026-02-07 18:18:35
Authors: Chang-Sik Kim
Comments: 21 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
Modern cosmology faces a "dual crisis": the failure to detect dark matter particles and the Hubble Tension. This research proposes a paradigm shift, redefining spacetime as an "Elastic Medium with Memory" (Hysteresis). Central to this is Kim's Law (E = κψ), which defines mass as the emergent manifestation of Spacetime Entanglement Density (ψ). By correcting for Elastic Redshift (z_elastic), the model derives a true cosmic age of 16.5 billion years, resolving the paradox of mature early galaxies observed by the James Webb Space Telescope (JWST). The theory integrates a new elastic potential term, L_entangle (or L_Knot), into the standard Einstein-Hilbert action, yielding the Kim-Einstein equations.
Category: Astrophysics
[23] ai.viXra.org:2602.0024 [pdf] replaced on 2026-02-11 18:04:59
Authors: Chang-Sik Kim
Comments: 3 Pages.
The Bullet Cluster (1E 0657-56) is widely regarded as the most direct empirical evidence for particulate Dark Matter, due to the observed separation between the gravitational lensing center and the baryonic gas. In this third and final paper of the series, we demonstrate that this phenomenon can be explained within the Spacetime Elastic Hysteresis Theory without invoking non-baryonic particles. We introduce the mechanism of "Spacetime Creep," where the macroscopic entanglement strain (ψ) exhibits a delayed relaxation response to the rapid deceleration of baryonic matter. Our analysis shows that the viscoelastic stress relaxation time (τ_relax) allows the gravitational potential peak to disassociate from the collisional gas, naturally reproducing the lensing anomalies observed in galaxy cluster collisions.
Category: Quantum Gravity and String Theory
[22] ai.viXra.org:2602.0023 [pdf] replaced on 2026-02-11 18:04:08
Authors: Chang-Sik Kim
Comments: 3 Pages.
The "Hubble Tension"—the statistically significant discrepancy between the expansion rate of the universe measured from the early universe (CMB) and the late universe (SNIa)—remains one of the most challenging problems in modern cosmology. In this second paper of the series, we propose that this tension arises from neglecting the viscoelastic nature of spacetime. Building on Kim's Law (V ∝ ψ²) established in Part 1, we introduce the concept of "Spacetime Hysteresis," where the release of stored elastic energy is delayed by a characteristic time scale τ. Our MCMC analysis shows that a delayed elastic response (τ ≈ 0.15) naturally boosts the late-time expansion rate to H_0 ≈ 73 km/s/Mpc, resolving the tension without breaking early-universe physics. Furthermore, this model implies a recalibrated cosmic age of 16.54 Gyr, providing a theoretical solution to the formation of mature galaxies at z > 10 observed by JWST.
Category: Astrophysics
[21] ai.viXra.org:2602.0022 [pdf] submitted on 2026-02-07 18:23:48
Authors: Chang-Sik Kim
Comments: 25 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
Modern cosmology faces a "dual crisis": the failure to detect dark matter particles and the Hubble Tension. This research proposes a paradigm shift, redefining spacetime as an "Elastic Medium with Memory" (Hysteresis). Central to this is Kim's Law (E = κψ), which defines mass as the emergent manifestation of Spacetime Entanglement Density (ψ). By correcting for Elastic Redshift (z_elastic), the model derives a true cosmic age of 16.5 billion years, resolving the paradox of mature early galaxies observed by the James Webb Space Telescope (JWST). The theory integrates a new elastic potential term, L_entangle (or L_Knot), into the standard Einstein-Hilbert action, yielding the Kim-Einstein equations.
Category: Astrophysics
[20] ai.viXra.org:2602.0021 [pdf] submitted on 2026-02-07 16:49:07
Authors: Lluís Eriksson
Comments: 34 Pages.
We present a rigorous framework for the Yang-Mills mass gap problem, combining three independent lines of argument.
Result A (Unconditional). A new MaxEnt Clustering-Recovery Bridge: for any lattice gauge state with finite correlation length xi, the Petz recovery fidelity satisfies 1-F <= C e^{-r/xi}. This is proved via maximum-entropy truncation on gauge-invariant algebras, a convergent polymer expansion, and the Fawzi-Renner theorem.
Result B (Unconditional on the lattice, conditional for all couplings). For SU(N) lattice gauge theory (T=0, theta=0, d=3+1, N>=2): the algebraic phase exclusion, using the projective commutation relation of 1-form symmetry operators, unconditionally excludes the trivially gapped symmetric phase. Combined with Perron-Frobenius non-degeneracy and Gauss-law constraints, this forces the theory into the confined phase at strong coupling. The extension to all couplings relies on a single hypothesis: the absence of a bulk phase transition. Under this hypothesis, the uniform lattice mass gap Delta >= m_0 > 0 holds for all lattice spacings.
Result C (Conditional). Under the same hypothesis, the continuum limit of SU(N) Yang-Mills theory in d=3+1 exists as a Euclidean QFT satisfying all Osterwalder-Schrader axioms (OS0-OS4), with exponential clustering rate m_0 > 0 (mass gap).
Result D (Gradient Flow Reduction). Independently of the anomaly argument, we prove that the mass gap in d=4 is equivalent to a concrete spectral condition on the gradient flow beta-function being strictly negative for all g > 0, combined with a Tauberian regularity condition.
The proof architecture uses three main tools: (1) the algebraic structure of higher-form symmetry anomalies on the lattice, (2) backward error analysis of the lattice gradient flow combined with a new spectral calibration, and (3) the MaxEnt bridge from quantum information theory.
Exact diagonalisation of Z_2 lattice gauge theory on lattices up to 12 qubits and Z_3 clock models confirms all quantum-information predictions of the framework. This paper contains one explicitly stated hypothesis (absence of bulk phase transition) that is not proven. All conditional results are clearly marked.
Category: Quantum Physics
[19] ai.viXra.org:2602.0020 [pdf] submitted on 2026-02-07 16:50:59
Authors: Lluis Eriksson
Comments: 18 Pages.
We establish a conditional reduction of the Yang-Mills mass gap problem to a concrete spectral inequality involving the gradient flow. For pure SU(N) Yang-Mills theory, if the gradient flow beta-function satisfies a uniform strict asymptotic freedom condition |beta_{GF}(g)| >= delta g^3 for large g, and a Tauberian regularity condition holds for the spectral density, then: in d=3, the theory has a mass gap Delta > 0; in d=4, the infrared trace anomaly vanishes (a_{IR}=0), ruling out a conformal infrared fixed point, and reducing the mass gap to explicit spectral conditions. However, the spectral argument is marginal in d=4 and requires additional non-perturbative input. The proof uses three ingredients: (1) a spectral representation of the gradient flow energy E(t) and a monotonicity identity R'(t) = -2 Var_t(lambda) <= 0 for the ratio R(t) = F(t)/E(t); (2) the Komargodski-Schwimmer a-theorem constraining the IR behaviour; and (3) a gradient flow Poincare inequality connecting functional inequalities to exponential clustering of correlators. We verify all perturbative inputs: the free-field calibration gives R_{free}(t) = 2/t in d=4, and the one-loop correction has the correct sign (R(t) < 2/t for g > 0). We identify the non-perturbative obstruction (the indefiniteness of the Weitzenbock curvature term) as the precise technical barrier to closing the argument in d=4. This paper is a companion to the author's paper on anomaly algebra and quantum information methods for the mass gap. The two approaches are complementary and independent.
Category: Quantum Physics
[18] ai.viXra.org:2602.0019 [pdf] submitted on 2026-02-07 18:25:11
Authors: Chang-Sik Kim
Comments: 7 Pages.
Contemporary cosmology is grappling with a fundamental crisis: the persistent failure to detect Dark Matter particles and the escalating "Hubble Tension." Recent high-redshift observations from the James Webb Space Telescope (JWST) have revealed massive, mature galaxies that defy the 13.8 billion-year timeline of the standard Lambda-CDM model. This dissertation proposes a revolutionary paradigm shift by redefining the vacuum of spacetime as an "Elastic Medium with Memory" (Spacetime Hysteresis). We introduce Kim's Law (E = κψ), which identifies the origin of mass-energy as the topological entanglement density (ψ) of the spacetime lattice. By incorporating a new stress-energy tensor, the Kim Tensor, into Einstein's Field Equations, we demonstrate that galactic rotation curves and gravitational lensing anomalies are manifestations of spacetime's emergent rigidity, eliminating the need for hypothetical dark matter particles. Furthermore, we identify "Elastic Redshift"—energy dissipation within the strained spacetime medium—as a hidden variable in cosmic expansion. Correcting for this factor resolves the Hubble Tension and establishes a recalibrated cosmic age of 16.54 billion years. This theory provides the necessary temporal window for the evolution of early massive galaxies and offers a unified, dark-matter-free framework satisfying both quantum logic and general relativistic principles.
Category: Astrophysics
[17] ai.viXra.org:2602.0018 [pdf] submitted on 2026-02-06 09:22:11
Authors: Steven Hammon
Comments: 42 Pages.
The modern media ecosystem, driven primarily by profit-first engagement algorithms, has created significant problems in information distribution. This paper argues that the public is subjected to pervasive psychological manipulation, primarily through the exploitation of fear, hate, and polarization, with no simple way to opt out while remaining informed. This has cascading and concerning consequences, including measurable harm to the mental well-being of children, the fragmentation of social cohesion, and the systemic erosion of the democratic process. It also leaves manipulative content and misinformation to be vilified wholesale as a problem. The paper examines the unrealistic expectation that individuals should defend themselves against multi-billion-dollar manipulation engines. Censorship infringes on free speech and often fails to address the root cause. Instead, this paper proposes the implementation of an Ethical Journalism Standard (EJS), modeled on the MPAA film rating system, which has satisfactorily balanced free speech with the need to protect children and society since 1968. The EJS would not ban or suppress default feed content. It would provide a label for journalism that adheres to the established Journalist Code of Ethics and allow Ethical Journalism content to be selected, much as parents select G-rated content. This empowers the public's right to access factual information, to make informed choices about the information they consume and share, and to give informed consent about the future of their country. It also gives manipulation a place to be celebrated as a skill rather than suppressed. By elevating credible, ethical content, society can foster a healthier information environment, safeguard children, and restore the confidence in governments and media that is essential for a functioning democracy, while still embracing free speech.
Category: Social Science
[16] ai.viXra.org:2602.0017 [pdf] submitted on 2026-02-06 02:25:39
Authors: Geza Kovacs
Comments: 2 Pages.
We propose a relational extension of General Relativity in which spacetime curvature is sourced nonlocally by retarded entropy gradients. Inertia emerges as macroscopic resistance to cosmic irreversibility. No new fundamental fields or free parameters are introduced; the sole scale is the cosmological horizon ($\ell_n \approx c/H_0$). Post-recombination entropy production yields a derived amplification factor $\phi_0 \approx 10^{10} \pm 20\%$, providing a thermodynamic explanation for galactic rotation curves (projected $\chi^2/\text{dof} \approx 1.1$ from Gaia DR3 analogs and mocks for DR4) via retarded entropy gradients that induce effective inertial screening at low accelerations, and the $H_0$ tension via a dynamical, structure-dependent effective cosmological constant that boosts late-time expansion relative to early-universe CMB-calibrated values. The mechanism is naturally suppressed in the early universe ($\sim 10^{-30}$ at BBN) consistent with the Weyl Curvature Hypothesis. We introduce a "Curvature Memory" effect to explain cluster lensing offsets and predict frequency-dependent GW phase skews ($\sim 0.4$ ms) testable by LISA.
Category: Relativity and Cosmology
[15] ai.viXra.org:2602.0016 [pdf] replaced on 2026-02-19 23:03:05
Authors: Joseph Koharski
Comments: 66 Pages. I added manuscript line numbers for easier reader reference and feedback. Corrected minor typographical errors and fixed intermediate arithmetic in the macroscopic dynamics calculations.
This document contains a compilation of five research papers detailing the Quantum Space Mechanism (QSM). These papers propose a unified framework where Inertia, Gravity, and Time emerge from the hydrodynamics of a viscous, dilatant vacuum substrate (the Higgs field). The series covers: (I) The Entropic Origin of Inertia and the Bridge Equation; (II) The Vacuum Yield Point and the Origin of Gravity; (III) The Geometry of Mass and Particle Generations via Finslerian Angles of Attack; (IV) Macroscopic Dynamics, Dark Matter as Metric Expansion, and Electromagnetism; and (V) The Origin of Time as Viscous Dissipation.
Category: Quantum Gravity and String Theory
[14] ai.viXra.org:2602.0015 [pdf] submitted on 2026-02-04 21:19:15
Authors: Manuel Alejandro Hernández Madan
Comments: 5 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references!)
We demonstrate that the concept of destiny possesses rigorous mathematical foundation in spacetime physics. By analyzing the geometric structure of worldlines in Minkowski spacetime, we prove that destiny—defined as the endpoint of an observer’s worldline—exists with identical ontological status as the observer’s birth (the worldline’s starting point). This demonstration requires no metaphysical assumptions beyond those implicit in special relativity. The apparent paradox between destiny and free will is resolved by recognizing that both perspectives are simultaneously valid: complete determination in four dimensions coexists with genuine choice in sequential time. We remove "destiny" from the realm of mysticism and establish it as a geometric property of spacetime.
Category: History and Philosophy of Physics
[13] ai.viXra.org:2602.0014 [pdf] replaced on 2026-03-23 22:03:10
Authors: Ezadiin Redwan
Comments: 5 Pages.
We present a universal proof of the Strong Goldbach Conjecture by shifting the problem from arithmetic density to Geometrical Transformation. By defining primes as the structural building blocks of every integer via the Fundamental Theorem of Arithmetic, we map the interaction between addition and multiplication onto a vector space. We prove that the identity 2n cos(θ) = a + b is a structural requirement of this space. This non-constructive existence proof demonstrates that for every even integer 2n, a prime partition (a, b) is geometrically necessitated by the scalar projection of prime-based vectors, thereby resolving the parity problem through the Dot Product.
Category: Number Theory
[12] ai.viXra.org:2602.0013 [pdf] replaced on 2026-02-05 03:11:28
Authors: Priyanshu Rauth
Comments: 5 Pages.
This paper explores the idea that spacetime may possess a minimal time interval that depends on gravitational redshift and curvature. Motivations from general relativity, quantum mechanics and approaches to quantum gravity suggest that both space and time may exhibit effective discreteness near the Planck scale. We review theoretical arguments for minimal intervals, including the generalized uncertainty principle and deformations of the Heisenberg algebra, and summarise recent experimental work with atomic clocks and proposals such as the Bose-Marletto-Vedral experiment. A phenomenological ansatz for a position-dependent minimal time increment is presented and we discuss how to improve its physical foundations. The aim is not to propose a theory of everything but to offer a conservative, focused framework that could guide future experiments.
Category: Relativity and Cosmology
[11] ai.viXra.org:2602.0012 [pdf] submitted on 2026-02-03 20:03:32
Authors: Christian B. Mueller
Comments: 2 Pages. (Note by viXra Admin: Please cite and list scientific references)
This paper presents a novel framework for fundamental physics that replaces the geometric curvature of spacetime with a resource-based conservation law. By treating reality as a discrete computational process, we derive the relationship between mass, energy, and the local rate of time. The model progresses from a 3D-accounting perspective (LQRR) to a 4D-Euclidean resonance framework (TEQR), providing a unified explanation for gravity, time dilation, and projection-based phenomena often attributed to Dark Matter.
Category: Relativity and Cosmology
[10] ai.viXra.org:2602.0011 [pdf] submitted on 2026-02-03 09:54:47
Authors: Alaya Kouki
Comments: 44 Pages.
From Restraint Relativity it is possible to consider a corpuscle as a packet of strings. The variation of the length of this packet equals the phase speed of the corpuscle, regarded as a packet of waves, times a universal constant. In a system of units where ℏ=c=a=1 the string vector becomes equal to the wave vector. From quantum mechanics we deduced that a Planck oscillator can emit or absorb power only in quanta that are integer multiples of hν^2. This allows us to divide space-time into modular cells of action, momentum, energy, etc. in the phase-space geometry of the oscillator and to resolve the problem of the vacuum energy density. The Planck system of units serves only for black-hole topology. Space-time has four open dimensions; any dimension beyond the fourth should be curled up. Resolving the gravitational field by computer with a model using modular cells will be free of singularities. Einstein's equations of the gravitational field are valid only in a quasi-static, asymptotically flat Universe. The constants G and Λ of General Relativity are proportional to the inverse square of the Universe's radius.
Category: Quantum Gravity and String Theory
[9] ai.viXra.org:2602.0010 [pdf] submitted on 2026-02-03 19:55:50
Authors: Mohammad Saeed Alnatour
Comments: 172 Pages. (Note by ai.viXra.org Admin: For the last time, author name is required in the article after article title and please cite listed scientific references)
Divergent series and singular integrals arise naturally in analysis, geometry, and theoretical physics, yet their standard treatment relies on analytic continuation or limit-based regularization. While these methods successfully assign finite values, they necessarily suppress information about how infinity is traversed. This work proposes a structural framework—Interconnected Infinities Giant Sphere Space (IIGSS), together with an intrinsic regulator, the Discrete Laplace Regulator (DLR), in which divergence is treated as a boundary phenomenon rather than a failure of summation. DLR operates directly on discrete sequences by introducing controlled exponential damping and expanding the resulting kernel at a well-defined infinite-traversal gate. Divergence appears explicitly as algebraic pole terms or logarithmic singularities in the gate expansion, encoding growth class and traversal density, while a pole-invariant constant—the Convergence Momentum (CM)—emerges as a finite structural quantity. Valuation is performed exclusively through gate expansion followed by pole removal, without index shifting, limit evaluation, or analytic continuation. Within this framework, classical zeta and Dirichlet regularizations are recovered as special projections under standard traversal, while traversal-sensitive features—such as zero insertion, spacing modulation, and phase structure—remain distinguishable. The framework accommodates finite-gate and oscillatory sequences and clarifies the limitations of reconstruction from regularized values alone. In physical applications, CM functions as a retained boundary invariant: when applied to spectral mode sums, such as those appearing in the Casimir effect, the regulator preserves observable finite quantities while rendering the underlying divergence structure explicit. DLR thus provides a higher-resolution language for infinity, preserving established results while exposing structural information necessarily omitted by classical methods.
Category: Functions and Analysis
[8] ai.viXra.org:2602.0009 [pdf] submitted on 2026-02-03 19:44:56
Authors: Cameron Brogan-Higgins
Comments: 2 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)
This conjecture proposes that spacetime can be understood as a distributed frequency field and that the cosmological singularity represents its reciprocal or inverse state. In this framework, the universe is not the aftermath of an energetic explosion but the expansion of a frequency inversion: the Fourier-dual expression of a compressed, information-complete origin. Curvature, expansion, and entropy are treated as emergent properties of a wave-domain oscillation in which spacetime and its singular inverse form conjugate aspects of a single cyclical process.
Category: Relativity and Cosmology
[7] ai.viXra.org:2602.0008 [pdf] submitted on 2026-02-03 19:43:20
Authors: Bertrand Jarry
Comments: 7 Pages. Creative Commons Attribution 4.0 International (CC-BY 4.0) (Note by ai.viXra.org Admin: Please cite listed scientific references)
We present a complete bottom-up derivation of non-relativistic quantum mechanics, special relativity, induced gravity (Sakharov mechanism), Dirac fermions, and U(1) gauge interactions from nearest-neighbor unitary evolution on a 4D hypercubic lattice. The Schrödinger equation emerges exactly in the continuum limit, with higher-order Lorentz-violating corrections. We rigorously prove the Heisenberg uncertainty principle, quantum superposition, entanglement, and probability conservation as direct consequences of the discrete structure. Detailed calculations include Taylor expansions to order 10, Fourier dispersion to k^10, the exact commutator [x, p], variance, and norm conservation to order 8. The resulting scale, ~10^16 GeV, is compatible with the latest LHAASO constraints on GRB 221009A (2025–2026). This work forms the foundation for emergent SU(2)×SU(3), Higgs, cosmology, and quantum gravity in subsequent papers.
Category: Quantum Gravity and String Theory
[6] ai.viXra.org:2602.0007 [pdf] submitted on 2026-02-02 19:31:23
Authors: Lucas Aloisio
Comments: 5 Pages.
We revisit the classical Unit Distance Problem posed by Erdős in 1946. While the upper bound of O(n^(4/3)) established by Spencer, Szemerédi, and Trotter (1984) is tight for systems of pseudo-circles, it fails to account for the algebraic rigidity inherent to the Euclidean metric. By integrating structural rigidity decomposition with the theory of Cayley-Menger varieties, we demonstrate that unit distance graphs exceeding a critical density must contain rigid bipartite subgraphs. We prove a "Flatness Lemma," supported by symbolic computation of the elimination ideal, showing that the configuration variety of a unit-distance K_{3,3} (and by extension K_{4,4}) in R^2 is algebraically singular and collapses to a lower-dimensional locus. This dimensional reduction precludes the existence of the amorphous, high-incidence structures required to sustain the n^(4/3) scaling, effectively improving the upper bound for non-degenerate Euclidean configurations.
Category: Combinatorics and Graph Theory
[5] ai.viXra.org:2602.0006 [pdf] submitted on 2026-02-02 19:03:23
Authors: Benjamin R. Carignan
Comments: 59 Pages.
In 1899, Max Planck introduced the fundamental constants that now bear his name, combining the gravitational constant G, the speed of light c, and the reduced Planck constant ħ to define natural units of length, time, and mass. The Planck length and Planck mass emerged as the scales at which quantum effects of gravity should become dominant — a transition zone where classical general relativity (GR) and quantum mechanics (QM) intersect. Planck himself viewed these units as theoretical curiosities, unaware that they hinted at a discrete structure underlying spacetime. We propose that spacetime at the Planck scale is composed of a grid — a lattice — of rhombic dodecahedral voxels with fractal boundaries. The unique properties of this shape endow it with the specific qualities and characteristics of the observed universe through emergent geometric rules. Through this single geometric structure, the Fractal Planck Voxel model (FPV) fully realizes the transition zone that Planck’s work hinted at over 125 years ago. The FPV model derives general relativity, quantum mechanics, the Standard Model gauge groups, three generations, particle masses, and mixing angles, all from voxel symmetry and triangular modes on rhombic faces, with the cosmological constant serving as the energy of fractal subdivision. The model also addresses numerous longstanding problems and questions in physics — from the collapse of the wave function to neutrino masses, dark matter, dark energy, the origin of the Higgs, and more. The fact that the FPV model does all of this while remaining consistent with all current observations — without additional fields, particles, dimensions, or fine-tuned parameters — lends strong credence to its validity as a unifying theory.
Category: Quantum Gravity and String Theory
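The Planck units this abstract refers to follow from the three constants it names. As a quick reference, a minimal sketch computing them from standard CODATA-style values (variable names are ours):

```python
# Planck units from G, c, and hbar, as referenced in the abstract.
import math

G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m s^-1
hbar = 1.054571817e-34  # reduced Planck constant, J s

l_P = math.sqrt(hbar * G / c**3)  # Planck length
t_P = l_P / c                     # Planck time
m_P = math.sqrt(hbar * c / G)     # Planck mass

print(f"l_P = {l_P:.3e} m")   # ~1.616e-35 m
print(f"t_P = {t_P:.3e} s")   # ~5.391e-44 s
print(f"m_P = {m_P:.3e} kg")  # ~2.176e-8 kg
```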
[4] ai.viXra.org:2602.0005 [pdf] submitted on 2026-02-02 19:16:34
Authors: Yunsheng Shu, Shun Yao, Zhiyong Yao
Comments: 54 Pages.
This paper proposes a unified effective field theory (EFT) framework for gravity based on atomic-scale quantum entanglement networks. In this framework, the gravitational field of any object—universal gravitation—arises from the superposition of micro-gravity domains (ffGD) at the atomic level, with the strength of gravity precisely related to the total number of atoms, thereby unifying macroscopic phenomena with microscopic interpretations (the superposition of micro-gravity domains (ffGD) at the atomic level forms a gradient material field, and the gradient material field constitutes the geodesic of matter (planets)). Einstein’s theory of curved spacetime serves as an equivalent description at large scales, enabling the model to explain all classical gravitational effects while linking quantum origins to observable reality in an intuitive and engaging way: gravity can be imagined not as abstract curvature, but as the collective "pull" produced when atomic quantum threads are interwoven into the structure of space. The core of the framework is a topology-constrained effective field theory, which uses the emergent distance induced by mutual information and the Fisher information metric as the interface from discrete entanglement networks to continuous geometry, and has constrainable low-energy parameterization (e.g., ffff, ffff, cff, Λff). This paper emphasizes the paradigm of "falsifiability first", integrating strong-field endpoints (such as the effective reflection model of Planck entanglement nucleus boundary conditions) with weak-field, cosmological, and multi-messenger observations into a multi-channel decision matrix, systematically converting "non-detection" results into upper bounds on parameter space.
Category: Quantum Gravity and String Theory
[3] ai.viXra.org:2602.0003 [pdf] submitted on 2026-02-01 01:02:37
Authors: Joseph Wimsatt
Comments: 4 Pages. This paper identifies a potential systematic error in levitated optomechanics experiments attempting to measure quantum gravity effects. We show that electromagnetic field energy in optical traps gravitates through the stress-energy tensor
Recent advances in levitated optomechanics have enabled experiments probing gravitational interactions at unprecedented scales, with the goal of detecting quantum signatures of gravity. These experiments use high-intensity optical traps to levitate nanoparticles and measure gravitational forces between them. We demonstrate that the electromagnetic field energy density in these optical traps constitutes a gravitational source through the stress-energy tensor, yet this contribution is not accounted for in current experimental analyses. Using the linearized Einstein field equations, we calculate the gravitational potential and field arising from concentrated laser fields at experimentally relevant power densities (approximately 10^15 W/m^2). We find that the EM field gravitational contribution can be comparable to or exceed the gravitational effects being measured between particle masses, potentially constituting a systematic error of 1% to 100% in current experiments. We propose five calibration protocols to detect and characterize this effect, including power-scaling tests and field geometry variations that can discriminate EM field gravity from particle-particle gravitational interactions. If unaccounted for, this effect could compromise the interpretation of experiments seeking quantum signatures of gravity.
Category: Relativity and Cosmology
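The quantity at stake in this entry can be sketched at order-of-magnitude level: the energy density of a beam of intensity I is u = I/c, and its equivalent gravitating mass density is u/c^2. The sketch below uses the 10^15 W/m^2 figure quoted in the abstract; the ~1 µm^3 focal volume is our illustrative assumption, and none of this reproduces the paper's linearized-GR calculation.

```python
# Order-of-magnitude estimate of the effective gravitating mass of an
# optical trap's EM field. Focal volume is an assumed figure, not the
# paper's; this is a back-of-envelope sketch only.
G = 6.67430e-11   # m^3 kg^-1 s^-2
c = 2.99792458e8  # m/s

I = 1e15                   # W/m^2, power density quoted in the abstract
u = I / c                  # energy density of a propagating beam, J/m^3
rho_eff = u / c**2         # equivalent mass density via E = m c^2, kg/m^3

V_focal = 1e-18            # assumed ~1 um^3 focal volume, m^3
m_eff = rho_eff * V_focal  # equivalent mass of the trapped field, kg

print(f"u       = {u:.2e} J/m^3")    # ~3.3e6 J/m^3
print(f"rho_eff = {rho_eff:.2e} kg/m^3")  # ~3.7e-11 kg/m^3
print(f"m_eff   = {m_eff:.2e} kg")   # ~3.7e-29 kg
```

Whether this is comparable to the nanoparticle-nanoparticle signal depends on trap geometry and particle masses, which is precisely the comparison the paper undertakes.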
[2] ai.viXra.org:2602.0002 [pdf] submitted on 2026-02-02 02:05:03
Authors: Hassan Dawood Salman
Comments: 6 Pages. For numerical validation, see DOI 10.5281/zenodo.18382581
Recent observations by the James Webb Space Telescope (JWST) of massive galaxies at z > 10 reveal a profound tension with the standard ΛCDM structure formation paradigm. We propose a resolution rooted in Two-Boundary Quantum Cosmology (TBQC), where spacetime geometry emerges as a Holographic Quantum Error Correcting Code (HQECC). We formalize the selection of cosmic history using the Two-State Vector Formalism (TSVF), introducing a "Process Matrix" W_eff that acts as a teleological filter, selecting histories that maximize computational efficiency (δAC = 0) without violating causality. This mechanism effectively lowers the critical density threshold for gravitational collapse, achieving spectacular agreement with JWST stellar mass and UV luminosity functions (χ²/dof ≈ 0.06). Our model exhibits surgical precision: active only at z > 10 while preserving all late-time constraints. We predict a unique, falsifiable 1500% enhancement in the 21cm power spectrum at k∗ ≈ 1 Mpc⁻¹, detectable by HERA Phase II (2025–2027) with > 100σ confidence.
Category: Astrophysics
[1] ai.viXra.org:2602.0001 [pdf] submitted on 2026-02-01 00:02:58
Authors: Eduardo Rodolfo Borrego Moreno
Comments: 22 Pages.
We extend an effective rheological description of a unified dark sector into the strong-field regime of rotating black holes within standard general relativity. Building on prior work where dark-energy-like and dark-matter-like phenomena emerge as distinct dynamical phases of a single residual medium, we examine the role of anisotropic stress and dissipation in gravitational optics and near-horizon dynamics. We show that stress gradients in the activated rheological phase contribute directly to the gravitational optical potential, yielding lensing without additional collisionless mass components. In the near-horizon region, convergent flows and stress amplification drive a phenomenological conversion regime that preserves total energy-momentum conservation and produces relativistic outflows. Toy estimates demonstrate that this mechanism can account for observed jet powers in low-accretion systems such as M87* (~10^42-10^44 erg s^-1), predicting lepton-dominated outflows with elevated linear polarization fractions (Π_lin ~ 15%-35%), radial/poloidal morphology, and axis-aligned stability. These signatures are compatible with current EHT observations yet distinguishable from magnetically dominated models such as Blandford-Znajek. The framework provides a unified, testable description of dark phenomena across scales as phases of a single effective residual medium, without modifying Einstein gravity or introducing new degrees of freedom.
Category: Astrophysics