All Submission Categories

2512 Submissions

[104] ai.viXra.org:2512.0106 [pdf] submitted on 2025-12-31 10:02:49

Koide’s Formula as a Possible Signature of Anisotropy in a Space Medium

Authors: G. Furne Gouveia
Comments: 6 Pages.

Koide’s formula, which relates with remarkable precision the masses of the three charged leptons, remains without a convincing explanation within the framework of standard particle theories. In this article, we show that this relation admits a natural interpretation within an alternative paradigm in which space is described as a physical medium endowed with intrinsic mechanical properties. We propose that the three generations of charged leptons correspond to fundamental excitations associated with three directional rigidities of an anisotropic space-medium, subject to a global redistribution constraint. The trigonometric parametrization of Koide’s formula then appears as the general solution of a simple constitutive law under constraint. Within this framework, the observed instability and decay hierarchy of the heavier charged leptons arise naturally as a relaxation of higher-energy vibrational modes toward lower-rigidity directions. The cosmological and conceptual implications of this interpretation are discussed, including a possible connection with a cyclic multiverse dynamics based on non-singular bounces.
Category: High Energy Particle Physics
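The relation this entry builds on is the standard Koide ratio Q = (m_e + m_μ + m_τ)/(√m_e + √m_μ + √m_τ)² = 2/3, which can be checked directly against measured masses. A minimal sketch using PDG central values (an editorial illustration, not part of the paper):

```python
import math

# PDG central values for the charged-lepton masses, in MeV
m_e, m_mu, m_tau = 0.51099895, 105.6583755, 1776.86

def koide_q(masses):
    """Koide ratio Q = (sum of masses) / (sum of sqrt-masses)^2."""
    s = sum(masses)
    r = sum(math.sqrt(m) for m in masses) ** 2
    return s / r

Q = koide_q([m_e, m_mu, m_tau])
print(f"Q = {Q:.6f}  (Koide's prediction: 2/3 = {2/3:.6f})")
```

With current central values the ratio agrees with 2/3 to roughly one part in 10⁵.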

[103] ai.viXra.org:2512.0105 [pdf] submitted on 2025-12-31 20:11:52

Prefix-Path Bell Transport on IBM Quantum Hardware: High-Resolution Replication, Comparative Geometry Dependence, and Stress Tests of a Static—Dynamic Link

Authors: Lluis Eriksson
Comments: 11 Pages.

This note reports a replicated, high-resolution Bell-transport experiment on IBM Quantum superconducting hardware using a prefix-path protocol that controls spatial heterogeneity across transport lengths. A single physical qubit chain is fixed and increasing transport length L is realized via prefixes of that chain, so that L changes depth while keeping qubits nested rather than switching to different qubit subsets. We reconstruct the Bell-state fidelity F_{Phi+}(L) from Pauli correlators E_{XX}, E_{YY}, E_{ZZ} and apply a minimal drift correction using interleaved full-Phi+ control blocks. Beyond a single-chain sweep (high-resolution run), we perform a comparative geometry test across three disjoint physical chains on the same backend. The effective decay scale extracted from the same protocol differs significantly across chains (with >10 sigma separations), providing operational evidence that the transport decay scale is geometry-dependent under fixed compilation constraints. Motivated by the Rate Inheritance Principle (RIP) framing, we also investigate whether a phase-sensitive static correlation metric measured on idle chains can predict dynamical transport decay. A curated three-chain set exhibits an ordering agreement between a static Ramsey-X nearest-neighbor covariance metric and the transport decay scales mu measured on the same chains. However, scale-up studies over n=18 randomly sampled chains and a preregistered out-of-sample prediction test do not show statistically significant monotone association under permutation testing. Accordingly, we interpret the static—dynamic ordering agreement as conditional and geometry-specific under the present operationalization, while the geometry dependence of dynamical decay is robust.
Category: Quantum Physics
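For reference, the fidelity reconstruction mentioned in this abstract uses the standard identity F_{Φ+} = (1 + ⟨XX⟩ − ⟨YY⟩ + ⟨ZZ⟩)/4. The sketch below applies it and then extracts a decay scale μ from a toy exponential model F(L) = 1/4 + (3/4)e^{−L/μ} by a log-linear fit; the toy decay model and the fitting step are illustrative assumptions, not the paper's drift-corrected pipeline:

```python
import math

def bell_fidelity(e_xx, e_yy, e_zz):
    """Fidelity with |Phi+> reconstructed from Pauli correlators:
    F = (1 + <XX> - <YY> + <ZZ>) / 4."""
    return (1.0 + e_xx - e_yy + e_zz) / 4.0

# Ideal |Phi+>: <XX> = 1, <YY> = -1, <ZZ> = 1  ->  F = 1
assert bell_fidelity(1.0, -1.0, 1.0) == 1.0

# Toy decay model (illustrative assumption): F(L) = 1/4 + (3/4) exp(-L/mu)
mu_true = 7.0
lengths = list(range(1, 11))
fids = [0.25 + 0.75 * math.exp(-L / mu_true) for L in lengths]

# Log-linear least-squares fit of (F - 1/4) recovers the decay scale mu
ys = [math.log(f - 0.25) for f in fids]
n = len(lengths)
slope = (n * sum(x * y for x, y in zip(lengths, ys))
         - sum(lengths) * sum(ys)) / \
        (n * sum(x * x for x in lengths) - sum(lengths) ** 2)
mu_fit = -1.0 / slope
print(f"fitted decay scale mu = {mu_fit:.3f}")
```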

[102] ai.viXra.org:2512.0104 [pdf] submitted on 2025-12-31 20:39:31

Spherical Model of the Universe: From Quantum Fluctuation to the Present

Authors: Pavel Holub
Comments: 40 Pages. Copyright © 2025 Pavel Holub. All rights reserved.

This article presents the Spherical Model of the Universe (SMU) as a consistent and testable alternative to the standard cosmological model, ΛCDM [7]. The SMU describes the universe as an inhomogeneous, non-singular, spherically closed, energy-conservative, and cyclic system. The SMU explains the tension between H₀ ≈ 73 and H₀ ≈ 67 [1] as a logical consequence of this structure, rather than a measurement error. It thus shifts the problem from a "measurement error" to a "model error," which strongly supports the necessity of transitioning from the homogeneous ΛCDM to inhomogeneous models such as the SMU. The SMU alternative addresses the difficulties of ΛCDM by returning to Einstein’s original methodological principles. Standard cosmological analysis is burdened by a circular argument (ΛCDM proves itself). The cyclicity of the SMU is enabled by a fundamental principle: the outer event horizon (Φ-horizon) is defined by zero gravitational potential (Φ = 0), ensuring the energy closure of the system. A key implication of this cyclicity is that it is impossible to distinguish whether our contemporary universe is the initiating phase or the n-th cycle of its evolution. The SMU model assumes the existence of a pre-geometric structure that exists independently of matter, energy, and spacetime. In this phase, time, distance, and metric are not defined; there is no expansion or motion in the conventional sense. The structure represents only a set of permitted causal relations, not physical space.
Category: Astrophysics

[101] ai.viXra.org:2512.0103 [pdf] submitted on 2025-12-31 01:24:01

Trans-Dimensional Hydrodynamic Cosmological Framework: A Unification of Relativistic Fluids, Black Holes, and JWST Observations

Authors: Javier Manuel Martín Alonso
Comments: 9 Pages.

We propose a unified cosmological framework (TDHCF) that reinterprets the Big Bang as a Navier—Stokes blow-up (finite-time singularity of solutions) occurring in a higher-dimensional parent universe, while observable spacetime emerges as a brane embedded in a higher-dimensional bulk. The model treats spacetime as an effectively incompressible relativistic fluid and exploits fluid/gravity correspondence to motivate an effective mapping between extreme vorticity concentration and gravitational curvature singularities. Within this framework, the unexpectedly evolved high-redshift galaxy populations reported by the James Webb Space Telescope (JWST) are interpreted as pre-evolved inherited structures associated with strong gravitational time dilation near the parent black hole horizon, rather than as objects that formed "too early" within standard cosmic time. Dark matter and dark energy are modeled as effective terms induced by bulk-to-brane projection: dark matter as a gravitating contribution from bulk degrees of freedom, and dark energy as inter-brane pressure/tension. Additionally, apparently "orphan" high-energy gamma-ray bursts (GRBs) are reinterpreted as late-time injections linked to secondary Hawking-like emissions from the parent horizon. We derive quantifiable predictions testable with current and future observations while maintaining compatibility with established physics in regimes where it has been validated.
Category: Relativity and Cosmology

[100] ai.viXra.org:2512.0102 [pdf] submitted on 2025-12-31 01:35:33

Heat Kernel Methods and the Sign of Induced Gravity: Resolving Conventions via the Laplacian—Lichnerowicz Identity

Authors: Lluis Eriksson
Comments: 8 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We derive the local one-loop term proportional to the scalar curvature R in the Euclidean effective action induced by integrating out matter fields on a curved background. Using Schwinger proper-time regularization with cutoff ε = Λ^{-2} and the Seeley—DeWitt coefficient a1, we extract the quadratically divergent contribution proportional to ∫ d^4x √g R. We fix a single Euclidean convention for the Einstein—Hilbert action, specify the Laplacian sign convention, and make the Laplacian—Lichnerowicz identity explicit to remove sign ambiguities in the fermionic sector. We provide a unified bookkeeping coefficient A1^(eff) for scalars, fermions, and gauge+ghost packages, and the corresponding induced Newton coupling within the stated scheme. A compact species table and a minimal reproducible computation snippet are included.
Category: Quantum Physics

[99] ai.viXra.org:2512.0101 [pdf] submitted on 2025-12-31 01:06:02

Beyond Gaussianity: Extending the Clustering—Recovery Bridge

Authors: Lluis Eriksson
Comments: 9 Pages.

We formulate a non-Gaussian, finite-volume, and uniform-in-Λ extension of the clustering—recovery bridge for interacting lattice systems. Using an explicit collar geometry with buffer width w, we propose that uniform exponential clustering implies approximate quantum Markov structure and the existence of a recovery channel acting on the buffer whose reconstruction error decays exponentially in w. We relate this formulation to conditional mutual information via the Fawzi—Renner bound and discuss an operational Heisenberg-picture quasi-locality strengthening for recovery. We provide finite-size numerical evidence in 1D transverse-field Ising model Gibbs states by comparing Petz and averaged rotated-Petz recovery. The data show exponential-in-w decay of recovery errors and reveal a consistent prefactor—slope trade-off: rotated-Petz can be better at small buffer width while Petz often exhibits a faster decay and overtakes beyond a modest crossover width.
Category: Quantum Physics
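The Fawzi-Renner bound invoked in this abstract controls recovery error by the conditional mutual information I(A:C|B). Its classical counterpart is easy to exhibit: for a Markov chain A → B → C the CMI vanishes exactly, while a direct correlation between A and C that bypasses B makes it positive. A small sketch using classical probabilities only (the quantum statement involves von Neumann entropies and the rotated Petz map):

```python
import math
from itertools import product

def H(p):
    """Shannon entropy (nats) of a distribution given as {outcome: prob}."""
    return -sum(v * math.log(v) for v in p.values() if v > 0)

def cmi(joint):
    """I(A;C|B) = H(A,B) + H(B,C) - H(B) - H(A,B,C) for joint {(a,b,c): p}."""
    ab, bc, b = {}, {}, {}
    for (a, b_, c), p in joint.items():
        ab[(a, b_)] = ab.get((a, b_), 0) + p
        bc[(b_, c)] = bc.get((b_, c), 0) + p
        b[b_] = b.get(b_, 0) + p
    return H(ab) + H(bc) - H(b) - H(joint)

# Markov chain A -> B -> C: p(a,b,c) = p(a) p(b|a) p(c|b)
pa = [0.5, 0.5]
pba = [[0.9, 0.1], [0.2, 0.8]]   # p(b|a)
pcb = [[0.7, 0.3], [0.4, 0.6]]   # p(c|b)
markov = {(a, b_, c): pa[a] * pba[a][b_] * pcb[b_][c]
          for a, b_, c in product(range(2), repeat=3)}
print(f"I(A;C|B), Markov chain: {cmi(markov):.2e}")   # vanishes

# Perfect A-C correlation that bypasses B: CMI = ln 2 > 0
corr = {(a, b_, a): 0.25 for a in range(2) for b_ in range(2)}
print(f"I(A;C|B), A = C exactly: {cmi(corr):.4f}")    # ln 2 = 0.6931...
```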

[98] ai.viXra.org:2512.0100 [pdf] submitted on 2025-12-29 08:33:33

Continuum Isotropy from the Spherical 5-Design Geometry of the 24-Cell Lattice

Authors: Christoph Kovacs
Comments: 19 Pages.

Discrete approaches to quantum gravity and emergent spacetime face a fundamental challenge: reconciling the discreteness of the substrate with the strict macroscopic continuous rotational symmetry required by relativistic field theories. Generic lattices break rotational symmetry, leading to direction-dependent dispersion relations and vacuum birefringence—effects that are tightly constrained by experiment. In this study, we investigate whether a four-dimensional lattice exists that yields an isotropic continuum kinematics without fine-tuning. We analyze the spectral and elastic properties of the 24-cell (D4) lattice compared to those of the hypercubic and Simplex lattices. We show that while the Simplex geometry (a spherical 3-design) fails to secure elastic isotropy, the D4 lattice—constituting a spherical 5-design—eliminates leading-order anisotropy in scalar dispersion and elastic response. Consequently, the 24-cell lattice exhibits exact elastic isotropy and suppresses vacuum birefringence for transverse shear modes in the infrared limit. Although scalar dispersion anisotropy is suppressed to fourth order in the wavenumber, residual dispersive anisotropy for vector modes remains at second order due to higher-rank lattice moments. These results identify the 24-cell geometry as a unique and strictly required geometric precursor for discrete models aiming to recover spatial relativistic kinematics.
Category: Quantum Gravity and String Theory

[97] ai.viXra.org:2512.0099 [pdf] submitted on 2025-12-29 21:46:06

The Active Inference Organ: A SEDP-Locked Account of Bounded Agency and Persistent Selves

Authors: Michael Zot
Comments: 7 Pages.

This paper treats active inference as an organ-level physical mechanism rather than a metaphor or purely cognitive theory. Using the System Emergence Discovery Protocol (SEDP), we reconstruct active inference as a composed functional system that produces bounded, persistent observers from noisy sensory inputs and limited action channels. The analysis frames active inference as a process theory derived from the Free Energy Principle, in which a system maintains structural and functional integrity by minimizing variational free energy through coupled perception and action, conditioned on a statistical boundary defined by a Markov blanket. Active inference is formalized as a mapping from local measurement records at a boundary to bounded internal states, action policies, and maintained separation between internal and external degrees of freedom. Markov blankets are treated strictly as conditional independence boundaries, variational free energy as a computable upper bound on surprise, and thermodynamic persistence as local maintenance through dissipation rather than any violation of the second law. The framework is constrained by explicit SEDP locks, including an operational definition, a unified success predicate, lesion-style failure modes, minimality arguments, and empirical witness classes. The paper argues that active inference functions as an "observer organ" that enables agents to persist long enough to sample, store, and act on records. In this sense, it can be treated as upstream of the Objectivity Organ, which produces stable, shared facts in the environment. Together, these mechanisms clarify how observers and objective facts can co-emerge from the same physical substrate without invoking privileged access, consciousness assumptions, or metaphysical primitives. Clinical interpretations are discussed cautiously as mechanistic hypotheses about precision weighting rather than diagnostic claims.
Category: Quantum Physics

[96] ai.viXra.org:2512.0098 [pdf] submitted on 2025-12-29 21:49:31

The Ancestral Rediscovery Hypothesis: Galactic Speciation Through Millennial Isolation

Authors: Prabhav Sharma
Comments: 6 Pages.

This paper introduces the Ancestral Rediscovery Hypothesis (ARH), which posits that future encounters with extraterrestrial beings may involve divergent human lineages rather than organisms of wholly separate alien origin. Applying the concept of allopatric speciation, the model argues that isolation across light-year distances, together with long communication delays, will cause human colonies to diverge into separate species. On this view, "aliens" encountered in deep space may be unrecognized relatives of humanity, implying that the search for extraterrestrial intelligence should also include monitoring our own biological evolution.
Category: General Science and Philosophy

[95] ai.viXra.org:2512.0097 [pdf] submitted on 2025-12-28 15:29:51

Conceptual Subtraction in Scientific Reasoning — An AI—Human Corridor at the Boundary of Physics and Artificial Intelligence

Authors: Jean Louis Van Belle
Comments: 5 Pages.

Recent advances in artificial intelligence have made AI-assisted reasoning an integral part of contemporary scientific practice. This paper does not propose a new physical theory, nor does it introduce novel computational models. Instead, it documents an experiment in method: sustained human—AI collaboration applied to conceptual clarification at the foundations of physics. The work summarized here emerged from a sequence of studies on the physical interpretation of wavefunctions, particle stability, and matter—antimatter annihilation. While the technical content of those studies was published separately, the present paper focuses on how their conceptual evolution was shaped by iterative interaction with AI across multiple conversations, with partial persistence of earlier reasoning through conversational memory. A defining feature of this process was the AI’s indifference to conceptual sunk costs. Rather than proposing alternative ontologies, the AI repeatedly challenged whether inherited assumptions were still required once their original explanatory role had weakened. This led to a mode of progress better described as conceptual subtraction than conceptual construction: explanatory layers were removed whenever they could not be independently justified. In this context, several deeply ingrained commitments—such as treating certain physical quantities as substance-like entities—were progressively relaxed, not as metaphysical claims but as methodological consequences of applying Occam’s razor to explanatory commitments rather than to equations alone. The paper presents this approach as intentionally provisional. No attempt is made to settle ontological or philosophical questions definitively. Instead, it aims to leave a transparent record of a reasoning corridor in which human judgment and artificial reasoning jointly enforced discipline, clarity, and reversibility. The goal is not closure, but the creation of a walkable path for future inquiry.
Category: Artificial Intelligence

[94] ai.viXra.org:2512.0096 [pdf] submitted on 2025-12-29 00:02:58

The Rapid Planetary Disassembly Hypothesis: Extreme Energy Requirements and Solar System Architecture

Authors: Aaron Riley Hurst
Comments: 67 Pages.

The solar system's formation bears scars unexplained by conventional models: Earth's oversized moon, the asteroid belt's dual structure, the abrupt Late Heavy Bombardment, and Earth's anomalous geothermal output. Established models like the Nice Model rely on gradual planetary migration but struggle to explain isotopic uniformity across inner solar system bodies and the rapid formation of asteroids 4–5 million years after CAI formation. Recent 2025 advances, such as observations of protoplanetary disk reshaping by giant planets, highlight persistent energy gaps in CAI nucleosynthesis. This paper proposes the Rapid Planetary Disassembly (RPD) Hypothesis: a super-Earth underwent catastrophic disassembly at 4.55 Ga via high-energy impacts, reshaping the solar system. The mechanism addresses a critical energy constraint—observed ⁶⁰Fe/⁶²Ni ratios (7.3) and ⁹⁶Zr excesses preserved in calcium-aluminum-rich inclusions require formation temperatures of 0.86–1.0 × 10⁹ K, achievable through extreme kinetic energy delivery but unattainable through known nebular processes. Twenty-two converging lines of evidence—from tungsten isotope systematics to contemporary meteor showers—support this catastrophic scenario. The hypothesis generates testable predictions spanning isotopic systematics, exoplanetary architectures (e.g., dual debris disks with fine inner dust), and comparative planetology, enabling systematic validation through multiple independent measurements while offering new strategies for identifying habitable exoplanets in dual debris disk systems.
Category: General Science and Philosophy

[93] ai.viXra.org:2512.0095 [pdf] submitted on 2025-12-28 19:35:01

Spontaneous Subatomic Mass-Energy Interconversion as a Physical Origin of Wave-Particle Duality

Authors: Rudolph Elliot Willis
Comments: 5 Pages.

Wave—particle duality is a foundational feature of quantum mechanics, yet the physical processes underlying single-particle interference remain an open question. Here I investigate a dynamical mechanism based on spontaneous stochastic mass—energy interconversion at subatomic scales. By allowing particle mass to fluctuate in time, consistent with Einstein’s mass—energy equivalence, I derive a modified Schrödinger equation containing a stochastic kinetic-phase term. Applying this framework to the double-slit experiment, I show that interference arises from coherent, path-dependent phase accumulation while remaining fully compatible with localized detection events described by standard quantum mechanics. The model yields a closed-form expression for fringe visibility, predicts a characteristic momentum- and mass-dependent decoherence rate, admits a path-integral formulation, and enables direct experimental bounds using existing neutron, electron, and atom interferometry data. These results provide a physically motivated, testable mechanism underlying quantum interference without altering the formal axioms of quantum mechanics.
Category: Quantum Physics

[92] ai.viXra.org:2512.0094 [pdf] replaced on 2025-12-31 12:03:06

Optimization of Retrieval-Augmented Generation (RAG) Architectures using Quantized Small Language Models (SLMs): A Performance Analysis

Authors: M Guru Prashanth
Comments: 2 Pages.

The paradigm shift from centralized cloud-based Large Language Models (LLMs) to localized Small Language Models (SLMs) is driven by the necessity for data sovereignty and reduced operational latency. This research presents an in-depth analysis of SLMs within Retrieval-Augmented Generation (RAG) frameworks. We examine the integration of Phi-4, Llama 3.2, and Mistral-7B, utilizing 4-bit NormalFloat (NF4) quantization to achieve high-fidelity inference on consumer-grade hardware. Our findings provide a quantitative roadmap for scaling AI applications without prohibitive infrastructure costs, demonstrating that SLMs can maintain 90%+ parity in context-specific tasks while reducing inference costs by up to 95%.
Category: Artificial Intelligence
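The NF4 format mentioned in this abstract stores each weight as one of 16 levels placed roughly at quantiles of a normal distribution, with a per-block absmax scale. The sketch below implements that idea from scratch for one 64-value block; the quantile codebook is computed on the fly and is not bitsandbytes' exact NF4 table, and the block size and weight scale are arbitrary illustration choices:

```python
import random
from statistics import NormalDist

def nf4_style_codebook(k=16):
    """k levels at equally spaced quantiles of N(0,1), rescaled to [-1, 1].
    This captures the NormalFloat idea; it is NOT the exact NF4 table."""
    nd = NormalDist()
    qs = [nd.inv_cdf((i + 0.5) / k) for i in range(k)]
    m = max(abs(q) for q in qs)
    return [q / m for q in qs]

CODEBOOK = nf4_style_codebook()

def quantize_block(xs):
    """Absmax-scale a block into [-1, 1], then map each value to the
    nearest codebook level; returns (4-bit indices, fp scale)."""
    scale = max(abs(x) for x in xs) or 1.0
    idx = [min(range(len(CODEBOOK)),
               key=lambda i: abs(CODEBOOK[i] - x / scale))
           for x in xs]
    return idx, scale

def dequantize_block(idx, scale):
    return [CODEBOOK[i] * scale for i in idx]

random.seed(0)
weights = [random.gauss(0.0, 0.02) for _ in range(64)]   # one 64-value block
idx, scale = quantize_block(weights)
recon = dequantize_block(idx, scale)
max_err = max(abs(w - r) for w, r in zip(weights, recon))
print(f"max reconstruction error: {max_err:.2e} (block scale {scale:.2e})")
```

Each weight costs 4 bits plus a shared per-block scale, which is where the memory savings over fp16 come from.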

[91] ai.viXra.org:2512.0093 [pdf] submitted on 2025-12-27 22:48:35

A Minimal Heterotic-Orbifold Realization of the e-Step Hierarchy for Quark and Lepton Masses

Authors: Jarosław Kaczorowski
Comments: 6 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

This note presents a compact embedding of the e-step Yukawa ladder—a phenomenological framework for quark and lepton mass hierarchies—within heterotic orbifold compactifications, interpreted in the language of the Minimal Supersymmetric Standard Model (MSSM). The e-step ladder posits discrete exponential suppressions governed by a single parameter ε ≡ e^{-3} ≈ 0.05, anchoring masses to the strange quark node derived from charged lepton sums, with O(1) prefactors accounting for mixing and normalization effects.
Category: Quantum Gravity and String Theory

[90] ai.viXra.org:2512.0092 [pdf] submitted on 2025-12-27 22:43:52

Load as the Cause of Structural Changes: The Relationship Between Time, Space, Gravity, and Center

Authors: Irena C. Jakovac
Comments: 8 Pages. Zenodo DOI: 10.5281/zenodo.17963498; CC BY 4.0 (Note by ai.viXra.org Admin: Please cite listed scientific references)

All experimental observations in physics consistently show that no structure exists without time, space, gravity, and center. These elements appear together in all confirmed cases, without known exceptions. Modern physics treats them as separate components of description but does not explain why they are inseparable. This paper proposes a model in which time, space, and gravity are functions of the central source of the system (C₀), whereby the center becomes a necessary condition for the existence of any structure.
Category: Relativity and Cosmology

[89] ai.viXra.org:2512.0091 [pdf] submitted on 2025-12-27 17:26:11

Technical Appendix: Heat Kernel, Fermions, and the Sign of Induced Gravity (Sign Conventions Fixed; Laplacian/Lichnerowicz Hinge Made Explicit)

Authors: Lluis Eriksson
Comments: 7 Pages.

We derive the local contribution proportional to the scalar curvature R in the Euclidean one-loop effective action obtained by integrating out matter fields on a curved background. Using a Schwinger cutoff ε = Λ^{-2} and the Seeley—DeWitt coefficient a1, we extract the quadratically divergent term multiplying ∫ d^4x sqrt(g) R. We fix a single Euclidean convention for the Einstein—Hilbert action, state an explicit Laplacian convention, and write the Lichnerowicz/Weitzenböck identity in a sign-robust form so that the fermionic contribution is unambiguous. We provide a unified bookkeeping coefficient A1^(eff) such that W_R^total = -(A1^(eff)/(32π^2)) Λ^2 ∫ d^4x sqrt(g) R, and hence an induced Newton constant G_ind via comparison with the Euclidean Einstein—Hilbert action. We also include the minimal gauge+ghost package in background Feynman gauge, a species table, and a short reproducible numerical snippet.
Category: Quantum Physics
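Taking this abstract's result W_R^total = -(A1^(eff)/(32π²)) Λ² ∫ d^4x sqrt(g) R at face value and matching it against a Euclidean Einstein-Hilbert term normalized as S_EH = -(1/(16πG)) ∫ d^4x sqrt(g) R (an assumed normalization; the paper fixes its own convention, which may differ) gives G_ind = 2π/(A1^(eff) Λ²). A one-line check of that matching:

```python
import math

def induced_newton_constant(a1_eff, cutoff):
    """Match W_R = -(A1/(32 pi^2)) Lambda^2 * (integral of sqrt(g) R)
    against S_EH = -(1/(16 pi G)) * (integral of sqrt(g) R), an assumed
    normalization; equating coefficients gives G_ind = 2 pi / (A1 Lambda^2)."""
    return 2.0 * math.pi / (a1_eff * cutoff ** 2)

# Hypothetical inputs: A1_eff = 1 species unit, cutoff Lambda = 1 (natural units)
G = induced_newton_constant(1.0, 1.0)
print(f"G_ind = {G:.4f} in natural units")  # 2*pi for these inputs
```

The 1/Λ² scaling makes the familiar point that a Planck-scale cutoff induces a Planck-scale Newton constant, up to the species-dependent factor A1^(eff).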

[88] ai.viXra.org:2512.0090 [pdf] submitted on 2025-12-26 23:44:25

What is Disease?

Authors: Moninder Singh Modgil, Dnyandeo Dattatray Patil
Comments: 6 Pages.

When Hahnemann was developing his system of medicine, there was no well-established theory of atoms or molecules; consequently, no one objected that the very high dilutions he was using would remove all traces of the original mineral. This objection to Homoeopathy is a relatively modern one. The modern atomic theory of matter has come to be well established only in the past hundred years or so. The intellectual resistance to Homoeopathy is therefore understandable in light of Kuhn’s ideas, but how should one regard the efficacy of Homoeopathy, keeping in mind Popper’s theory of change in scientific theories? One possible logical conclusion would be that the atomic theory of matter may not be right. Other explanations have also been suggested, such as the so-called ‘memory of water’, without much success, however.
Category: Physics of Biology

[87] ai.viXra.org:2512.0089 [pdf] submitted on 2025-12-26 23:38:52

Toward a Self-Regulated Cosmological Model

Authors: Andrei Eleodor Sirbu
Comments: 30 Pages. All rights reserved.

We propose a conceptual framework for a self-regulated cosmological model in which the Universe emerges from a pre-physical limit state referred to as Absolute Chaos. This state is defined ontologically rather than dynamically: it admits no distinctions, no relations, no temporal ordering, and no physical degrees of freedom. We argue that such a state is maximally stable from a structural perspective while being ontologically unstable, which necessitates a non-temporal rupture introducing the first irreversible distinction. Following this rupture, a notion of mediation is introduced as a minimal condition allowing distinctions to coexist without collapsing back into indifferentiation. Time is not assumed a priori; instead, the arrow of time is interpreted as emerging from the appearance of ontologically irreversible processes. To illustrate these ideas, we explore a minimal local dynamics on a two-dimensional lattice, where the irreversible saturation of a mediation-related variable produces a binary ontological freeze. The freeze is not a dynamical field but a derived state encoding the loss of relational capacity. Numerical simulations show that this mechanism robustly leads to a stationary, partially frozen regime without fine-tuning, independent of initial conditions. While no quantitative cosmological predictions are claimed at this stage, the model suggests a possible route toward understanding self-regulation, irreversibility, and the emergence of temporal order as consequences of ontological constraints.
Category: Relativity and Cosmology

[86] ai.viXra.org:2512.0088 [pdf] submitted on 2025-12-26 23:35:14

Time Is Not Fundamental

Authors: Marjon Enriquez
Comments: 31 Pages.

For over three centuries, physics has treated time as a fundamental primitive: a continuous, universal parameter flowing independently of physical processes. This assumption has persisted from Newton's absolute time through Einstein's relativistic spacetime to modern quantum mechanics, leaving profound mysteries unresolved: Why does time flow? Why does it possess an intrinsic arrow? Why does it dilate? We demonstrate that time cannot be fundamental. Through three independent empirical arguments — gravitational time dilation contradicting invariance, light-speed constancy despite dilation, and the quantum measurement arrow — we prove that time is a derived quantity emerging from a more primitive structure. The Generalized Second Law of Thermodynamics (GSL), the only universal directional principle in physics, provides the true foundation. By introducing Discrete Entropy Updates (DEUs) as irreducible quanta of irreversible change, we derive proper time non-circularly as dτ = (ℏ/⟨E⟩)dλ, where λ counts DEUs. This formulation immediately explains time dilation as variations in local entropy-update rates, the arrow of time as the GSL inequality dS/dλ ≥ 0, and the invariance of light speed as preservation of maximum entropy-propagation rate. We extend General Relativity to non-equilibrium conditions through the Einstein-GSL Completion: a local entropy-balance equation ∂_μS^μ = (c³/4Gℏk_B)R + (1/T)∂_μ(T^μνu_ν) ≥ 0 from which Einstein's field equations emerge as the reversible, zero-entropy-production limit. This framework has been verified against GPS time dilation (10⁻¹⁶ precision), LIGO black hole mergers, cosmological observations, and laboratory thermodynamics. The implications are profound: time is not a dimension we inhabit but a bookkeeping parameter measuring irreversible information change.
This work builds on the foundational insights of information theory (Shannon, Landauer), thermodynamics (Boltzmann, Bekenstein, Hawking, Jacobson), and relativity (Einstein), synthesizing them into a completed framework where entropy, not time, is fundamental.
Category: Quantum Gravity and String Theory
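The abstract's central relation dτ = (ℏ/⟨E⟩) dλ fixes a proper-time increment per entropy update once ⟨E⟩ is chosen. As a purely illustrative choice (not one made by the paper), taking ⟨E⟩ to be the electron rest energy gives a tick of order 10⁻²¹ s:

```python
# Per the abstract's relation d(tau) = (hbar / <E>) d(lambda), one DEU
# advances proper time by hbar/<E>. Illustrative choice of <E>: the
# electron rest energy m_e c^2 (an assumption of this sketch).
HBAR = 1.054571817e-34          # J s
E_ELECTRON = 8.1871057769e-14   # J  (m_e c^2)

def dtau_per_deu(mean_energy):
    """Proper-time increment per Discrete Entropy Update."""
    return HBAR / mean_energy

tick = dtau_per_deu(E_ELECTRON)
print(f"proper-time tick: {tick:.3e} s per DEU")
print(f"DEUs per second at this energy scale: {1.0 / tick:.3e}")
```

Higher ⟨E⟩ means shorter ticks, which is how the framework ties clock rate to energy scale.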

[85] ai.viXra.org:2512.0086 [pdf] submitted on 2025-12-25 22:21:54

Geometric Resolution Quantum Field Theory (GRQFT): Sequential Exhaustion of the Arithmetic Orbifold, Monstrous Moonshine, and the Unification of Physics with the Riemann Hypothesis

Authors: J. W. McGreevy
Comments: 10 Pages.

Geometric Resolution Quantum Field Theory (GRQFT) is a framework that derives the Standard Model of particle physics, general relativity, and a proof of the Riemann Hypothesis from the sequential exhaustion of the arithmetic orbifold Spec(Z) ∪ {∞}. The exhaustion process approximates the circle at infinity with regular n-gons, producing a failure gap Δ(n) = π²/(54n²) that defines the failure class [f_n] = coker(f_n) ∈ H¹_ét(O, Z). This failure class is the orthogonal complement and harmonic conjugate (via Cauchy—Riemann equations) to the graded dimensions V_n of the moonshine module V♮. Monster group invariance on V♮ forces the inner product ⟨[f_n], v_fixed⟩ = 0 for the Monster-fixed Weyl vector v_fixed, implying that all zeros of the analytically continued [f_n] lie on the balanced circle |n| = e^{1/2}, corresponding to Re(s) = 1/2 for the Riemann zeta function — thus proving the Riemann Hypothesis. Gauge symmetries emerge from fiber twists: the order-4 μ₄ at τ = i yields SU(2)_L × U(1)_Y, triality at ρ yields SU(3)_c with three generations, and octonion non-associativity yields E8 gravity. The Higgs field is the radial mode on the exceptional divisor E_2, electroweak symmetry breaking from its resolution, and the Einstein field equations from the geodesic n(t) in the moduli space of elliptic curves. The Weierstrass ℘-function emerges as the failure potential, sourcing Coulomb-like interactions and Rydberg spectra. Planck duality (momentum/energy as circumferences, length/time as radii) and torsion from non-associativity complete the theory, with supersingular primes as resonance points of maximal symmetry. GRQFT realizes arithmetic holography, with exhaustion as renormalization group flow and Monster symmetry as bulk invariance.
Category: Number Theory

[84] ai.viXra.org:2512.0085 [pdf] submitted on 2025-12-24 22:30:35

Geometry, Membranes, and Life as a Resource Boundary: A Correct-by-Construction Operational Pipeline from Static Suppression to Maintenance Costs

Authors: Lluis Eriksson
Comments: 10 Pages.

We present an operational research program linking (i) geometry as suppressibility of cross-region influence, (ii) membranes as engineered interfaces implementing that suppressibility, and (iii) life as sustained maintenance of internal organization under finite resources. The framework composes: a law-grade thermodynamic inequality relating incremental maintenance power to the instantaneous loss rate of an organization functional; a static geometric suppression layer in which cross-interface leakage admits an envelope of the form poly(mε)e^{-mε} (with Kν(mε) as a canonical representative in massive homogeneous models); and a dynamical hinge, the Rate Inheritance Principle (RIP), connecting static suppression to separation-dependent effective dynamical rates. To avoid sign and quantifier errors, we distinguish upper and lower rate envelopes κ↑(ε) and κ↓(ε) and state precisely which claims require which envelope. We further separate a law-grade Δ-track (energy pinching) from a conditional biology-grade E-track (general conditional expectations). We add two interface anchors: a recoverability layer via conditional mutual information and the Fawzi—Renner guarantee, and a minimal Davies interface lemma showing how correlator envelopes imply Davies-rate envelopes in standard weak-coupling settings. We propose a concrete electrical testbed using membrane-embedded spin probes to measure dephasing-rate envelopes and detect near-zero-frequency rate floors that yield a resource horizon. A dependency and falsification matrix is provided to make the logical structure audit-friendly.
Category: Quantum Physics

[83] ai.viXra.org:2512.0084 [pdf] submitted on 2025-12-24 22:34:15

The Maintenance Constraint: How Resource Boundaries Shape Cognitive Availability

Authors: Lluis Eriksson
Comments: 12 Pages.

Cognitive systems are resource-limited, but "resource limitation" is often invoked without distinguishing one-shot costs (forming a representation) from sustained costs (keeping it usable under noise). We argue that availability of internal state features for control, integration, and report is constrained by their maintainability under finite budgets. As a technical anchor, we cite a companion technical preprint deriving an operational inequality in explicit thermodynamic control models, showing that incremental maintenance power can have a non-arbitrary lower bound tied to dynamical fragility. This motivates an operational cut: a feasibility boundary separating maintainable from unmaintainable state features. We develop an auditable bridge argument (maintainability → stability → availability), propose a neutrality-friendly principle of maintenance-feasibility bias, and show how it can be incorporated into active inference (Free-Energy Principle) as a maintenance penalty or constraint, while aligning secondarily with Global Workspace accounts of access stability. We address common objections and offer falsifiable predictions for synthetic agents and neuromorphic systems, plus an explicitly exploratory psychophysics subsection framed in terms of reportability and stability rather than phenomenology. We do not propose collapse mechanisms, do not derive the Born rule, and make no claims about phenomenological consciousness.
Category: Mind Science

[82] ai.viXra.org:2512.0083 [pdf] submitted on 2025-12-23 01:37:07

Free Will, Determinism, and the Participatory Unfolding of Time

Authors: Paul Caracristi
Comments: 5 Pages.

The debate between free will and determinism has long been framed as a binary conflict: either human agency is illusory, fully determined by prior causes, or it exists as an irreducible freedom standing outside physical law. This paper argues that both positions fail because they assume a static conception of time and linear causality. Drawing on a cosmological framework in which time itself is emergent, curved, and spectrally differentiated between latent and patent realms, this work proposes a third position: that of participatory agency. In this view, freedom is neither absolute nor absent, but arises gradually as systems gain temporal depth, memory, reflection, and meaning attribution. The Will-to-Be is presented not as a faculty of choice, but as a universal propensity toward manifestation and experience, from which localized agency emerges. Human free will, therefore, is not the power to act without cause, but the capacity to participate in the unfolding of reality through reflective engagement with time.
Category: Relativity and Cosmology

[81] ai.viXra.org:2512.0082 [pdf] replaced on 2025-12-27 22:38:10

Surface-Tension Gravity Calibrated by GPS Time Dilation

Authors: Christopher C. O'Neill
Comments: 8 Pages.

We summarize a minimal "surface-tension gravity" model in which (i) a spherically symmetric compression factor C(r) of a space-like medium controls an intrinsic wave speed v(r) for light-like propagation, (ii) local clock rates adjust inversely with v(r) so that measured light speed remains c, and (iii) compression implies density and pressure variations governed by a bulk modulus K. A single GPS gravitational time-dilation datum is used to fix the compression profile parameter α in C(r) = 1 + α/r. A coupling constant σ is introduced to fit K so that the model reproduces g ≈ 9.8 m/s² at Earth’s surface. The resulting model predicts the correct 1/r² scaling of g(r) and yields a parameter-free prediction for the gravitational time-dilation offset at ISS altitude. In later work we promote C(r) to an effective metric with reciprocal time/space scaling; this is required for correct leading-order light-propagation tests (deflection, Shapiro delay), while leaving the GPS clock-ratio calibration unchanged.
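The calibration logic can be sketched numerically. The sketch below is illustrative, not the paper's code: it assumes clock rates scale as C(r)^{-1} to first order, so the fractional clock offset between two radii is approximately α(1/r1 − 1/r2), and uses the standard GPS gravitational blueshift of about 5.3×10⁻¹⁰ as the single calibration datum.

```python
import math

# Illustrative calibration sketch (assumptions: first-order clock rates
# ~ 1/C(r), effective potential Phi(r) = -c^2 * alpha / r).
c = 2.998e8           # speed of light, m/s
R_earth = 6.371e6     # Earth's surface radius, m
r_gps = 2.6571e7      # GPS orbital radius (semi-major axis), m
gps_offset = 5.3e-10  # standard fractional gravitational blueshift of GPS clocks

# Fix alpha from the one GPS datum: offset ≈ alpha * (1/R_earth - 1/r_gps)
alpha = gps_offset / (1.0 / R_earth - 1.0 / r_gps)  # metres

def g(r):
    # g(r) = c^2 * alpha / r^2, exhibiting the 1/r^2 scaling claimed above
    return c**2 * alpha / r**2

print(f"alpha = {alpha:.3e} m, g(R_earth) = {g(R_earth):.2f} m/s^2")
```

With this single datum, α comes out near 4.4×10⁻³ m (the order of GM/c² for Earth) and g at the surface lands near 9.8 m/s², consistent with the calibration described in the abstract.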
Category: Relativity and Cosmology

[80] ai.viXra.org:2512.0081 [pdf] submitted on 2025-12-23 10:38:28

Operational Coherence Maintenance and the Quantum-Classical Boundary: Formal Definitions, Falsifiable Protocols, and an Outlook for Cognitive Systems

Authors: Lluis Eriksson
Comments: 13 Pages.

Maintaining quantum coherence against uncontrolled open-system dynamics is a control task with unavoidable thermodynamic cost. In a finite-dimensional setting with battery-assisted thermal operations at bath temperature T, we define an incremental (extra) maintenance power P_extra(rho) that isolates the cost of stabilizing coherence at fixed populations. For Markovian uncontrolled dynamics rho_t = exp(t L)(rho) we prove a single-law lower bound P_extra(rho) >= k_B T * Cdot_loss(rho), where C(rho) = S(rho || Delta[rho]) is relative-entropy coherence to energy pinching Delta and Cdot_loss(rho) := - d/dt C(rho_t) at t=0. This statement is operational, observer-independent, and geometry-free. We then formulate a falsifiable dynamical bridge between static locality/clustering and decoherence rates: the Rate Inheritance Principle (RIP). Using an operatorial Dirichlet-form identity, we highlight a concrete failure mode whereby near-zero Bohr-frequency channels can induce distance-independent rate floors, despite static clustering. These ingredients motivate a purely operational notion of a "cut": a resource boundary separating maintainable coherence from regimes where classical-like effective descriptions are enforced under finite control budgets. We provide falsifiable protocols that distinguish static one-shot work from sustained maintenance power across quantum platforms and interface geometries, including a numerical stress test (uniform floor versus collar-induced suppression) in a gapped transverse-field Ising chain with remote dissipation. Finally, we offer an Outlook for cognitive systems as resource-limited physical agents, connecting the operational resource boundary to the Free-Energy Principle at a methodological (non-phenomenological) level. We do not propose collapse mechanisms, do not derive the Born rule, and make no claims about phenomenological consciousness.
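For a single qubit under pure dephasing, the quantities in the bound can be evaluated in closed form. The following minimal sketch (parameter values are illustrative, not from the paper) computes C(rho) = S(rho || Delta[rho]) for equal populations and estimates Cdot_loss by a finite difference:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def S(eigs):
    # von Neumann entropy in nats
    return -sum(p * math.log(p) for p in eigs if p > 0)

def coherence(c):
    # C(rho) = S(rho || Delta[rho]) = S(Delta[rho]) - S(rho) for a qubit
    # with populations (1/2, 1/2) and off-diagonal element c;
    # the eigenvalues of rho are 1/2 + c and 1/2 - c.
    return S([0.5, 0.5]) - S([0.5 + c, 0.5 - c])

# Pure dephasing at rate gamma: c(t) = c0 * exp(-gamma * t)  (illustrative values)
T, gamma, c0, dt = 300.0, 1.0e3, 0.4, 1e-9
Cdot_loss = (coherence(c0) - coherence(c0 * math.exp(-gamma * dt))) / dt

# Lower bound on the incremental maintenance power
P_min = kB * T * Cdot_loss
print(f"Cdot_loss = {Cdot_loss:.3e} nat/s, P_extra >= {P_min:.3e} W")
```

The positive Cdot_loss confirms that coherence decays under the uncontrolled dynamics, so the bound assigns a strictly positive minimal maintenance power in this regime.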
Category: Quantum Physics

[79] ai.viXra.org:2512.0080 [pdf] submitted on 2025-12-22 21:26:45

Transient Memory via Local Pressure Heterogeneity in Non-Equilibrium Systems

Authors: Satyajit Beura
Comments: 3 Pages. [License:] CC BY 4.0

This paper investigates Transient Memory Encoding within non-equilibrium systems, specifically through the lens of local pressure heterogeneity. We propose that localized fluctuations in pressure act as temporary information storage units before the system returns to equilibrium. By mapping these heterogeneities, we demonstrate a physical basis for short-term memory retention in complex systems, bridging the gap between statistical mechanics and information theory.
Category: Thermodynamics and Energy

[78] ai.viXra.org:2512.0079 [pdf] submitted on 2025-12-21 15:24:59

Dimensional Sufficiency and the Geometric Origin of Black Hole Singularities

Authors: Bruno Goncalves Preza
Comments: 7 Pages. Creative Commons Attribution 4.0 International

Classical general relativity predicts that gravitational collapse generically produces spacetime singularities, commonly interpreted as physical points of divergent curvature. This work proposes a complementary, purely geometric reinterpretation: singularities may signal dimensional insufficiency in the manifold used to represent extreme gravitational configurations. Motivated by a high-dimensional packing example in which objects satisfying local tangency constraints can exceed a global bounding volume, the singularity is interpreted as the point where the assumed geometric "container" fails to encode a finite interior configuration without divergence. A minimal toy model is then constructed in which gravitational compression is balanced by an effective dimensional geometric pressure associated with activating additional internal geometric capacity at high density. The balance yields an emergent finite core scale, naturally replacing a point singularity with a stabilized radius while leaving the conservative possibility that the exterior remains well described by standard general relativity.
Category: Quantum Gravity and String Theory

[77] ai.viXra.org:2512.0078 [pdf] submitted on 2025-12-21 15:31:33

Geometric Information Preservation and Entropy Scaling in Voronoi Tessellations: A 25-Point Physical-Computational Investigation

Authors: Maayan Keynan
Comments: 15 Pages.

We present an empirical investigation of geometric information preservation in Voronoi tessellations based on a 25-point physical system reconstructed through manual and computational methods. Using a controlled physical setup, followed by MATLAB-based analysis and cross-platform validation, we quantify geometric transformations, entropy scaling, and topological responses to perturbation. We identify a systematic computational transformation factor (the Digital Offset Constant, λx ≈ 0.689), a distinct physical-digital alignment factor (≈ 0.767), and a stable effective domain area (0.8070 normalized units). Entropy analysis reveals invariant topological entropy (2.1056 bits), stable edge entropy, and a measurable reduction in area entropy (Δ ≈ 0.11 bits) upon correction of an artificially misplaced seed point. A compensatory "ghost cell" emerges in the distorted configuration, redistributing area and inducing an 8-sided polygon, which resolves upon restoring correct boundary interaction. We further demonstrate that natural randomness is accepted by the system without anomaly, while artificial distortion uniquely triggers topological stress. These findings indicate that Voronoi systems preserve information through boundary permeability and topological compensation, offering a structural basis for detecting forced geometric manipulation.
Category: Data Structures and Algorithms

[76] ai.viXra.org:2512.0077 [pdf] submitted on 2025-12-21 00:13:08

Repellons as Dark Energy: A Brane-Based Microphysical Model for Cosmic Acceleration and Void Dynamics

Authors: Hari Kumar Nair
Comments: 49 Pages. DOI: 10.5281/zenodo.17914713

Observations of Type Ia supernovae, the cosmic microwave background, and baryon acoustic oscillations indicate that the Universe is undergoing accelerated expansion. In LambdaCDM, this is attributed to a cosmological constant Lambda. While successful, Lambda lacks a microphysical origin, and alternative dynamical models often treat dark energy as a structureless homogeneous fluid. We explore a concrete alternative hypothesis: that dark energy arises from an ensemble of discrete brane-like objects, "repellons," modelled as closed two-dimensional shells embedded in three-dimensional space. We ground this hypothesis in general relativity using a relativistic thin-shell model, demonstrating how tension-dominated branes can source an effectively repulsive gravitational potential. The repellon brane contributes stress-energy to our spacetime while its interior is not part of the observable Universe. When coarse-grained over cosmological volumes, the ensemble behaves as an effective fluid with equation-of-state parameter omega_R < -1/3, capable of driving late-time acceleration. We retain standard baryons and cold dark matter as the agents of structure formation and halo dynamics, treating repellons purely as a dark-energy sector. Motivated by a simple gravitational-response picture, we further posit a mild anti-correlation between repellon and matter densities, such that repellons preferentially populate underdense regions. Numerical integration of the coupled linear perturbation equations confirms this mechanism, showing that repellons naturally accumulate in voids as matter evacuates. We outline how this void bias could enhance void expansion and modify large-scale flows and late-time ISW correlations relative to LambdaCDM with the same background history. This framework is phenomenological, designed to identify qualitative signatures (specifically in void dynamics) that can guide future perturbative calculations and N-body simulations.
Category: Astrophysics

[75] ai.viXra.org:2512.0076 [pdf] submitted on 2025-12-20 02:12:07

The Creation of Earth: A Unified, Consistently Coherent Linear Sequence of Physical Events that Led to Our Present World

Authors: Arndt-Michael Meyer
Comments: 25 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

SBT (small bang theory) solves over 100 geological anomalies with coherent physics. The most impressive is that the K-Pg boundary marks not an impact, but the endogenous pressure collapse of a system sealed for 475 million years, with the Chicxulub being a piece of exploding granite crust. Imagine this: 66 million years ago, Earth wasn't a planet with continents and oceans, but a solid granite sphere with swamps, lakes, no Moon, no seasons, no continents and no oceans. Then it exploded from within. This one 'Small Bang' blasted lava and granite into space - parts of it becoming our Moon. The recoil tilted Earth's axis by 23 degrees, creating our seasons. The remaining granite fragments swam on top of the lava, becoming our continents. The successive torrential rains crystallized the basalt into the newly formed ocean grounds. The Small Bang Theory explains with a single event: why Moon rock matches Earth rock, why Earth is tilted, where all the water came from, why the dinosaurs truly went extinct and hundreds of other "anomalies". SBT is the missing link between primordial Earth and our world today.
Category: Geophysics

[74] ai.viXra.org:2512.0075 [pdf] submitted on 2025-12-19 21:29:47

Photino Hypothesis IV: Field-Theoretic Reconstruction of Neutrinos and a Unified Mechanism for Phenomena

Authors: Tuanhua Chen
Comments: 11 Pages. (Note by ai.viXra.org Admin: Further repetition may not be accepted)

The Standard Model of particle physics faces profound challenges regarding the origins of neutrino mass, chirality, and oscillations. Building upon Photino Hypotheses I-III, this paper proposes a novel, beyond-the-Standard-Model (BSM) framework based on a fundamental principle: neutrinos are interpreted as longitudinal helical wave excitations of the photino field, perturbed by electron spin. This leads to a field-theoretic description devoid of rest mass. The theory's core reveals: a dynamic mass-generation mechanism m_ν^eff = ℏω_p/c², where the plasma frequency ω_p ∝ Q_decay directly links the neutrino's effective mass to its parent particle's decay energy; chiral asymmetry is naturally explained as a geometric consequence of the electron spin's perturbation direction; and neutrino oscillations are reformulated as a fatigue-decay effect incorporating photino field damping P ∝ e^{−Γ_ν L/c}. This framework provides unified explanations for phenomena such as the supernova SN 1987A neutrino energy spectrum and the beta-decay spectrum endpoint distortion. Furthermore, it proposes a set of decisive experimental tests with clear quantitative predictions and falsification criteria, involving JUNO, KATRIN, DUNE, and deep-sea longitudinal wave detection.
Category: Quantum Gravity and String Theory

[73] ai.viXra.org:2512.0074 [pdf] submitted on 2025-12-19 21:30:18

Photino Hypothesis V: Field-Theoretic Reconstruction of Gluons and Unified Phenomenological Mechanism

Authors: Tuanhua Chen
Comments: 16 Pages. (Note by ai.viXra.org Admin: Further repetition may not be accepted)

As the final installment of the "Photino Hypothesis" series, this paper proposes a microscopic medium-based topological model for the strong interaction, grounded in the framework of the hypothesis, with the aim of addressing fundamental issues in the non-perturbative regime of QCD such as quark confinement. The core thesis is: the essence of a gluon is a high-density Photino non-Abelian vortex structure driven by quark spin. Its kinetic energy (E_k = k_B T_c = 0.150 GeV) directly corresponds to the QCD phase transition energy scale and is primarily used to resist the vacuum negative pressure. The abstract color charge is interpreted as the vortex winding number (n = +1, −1, 0), naturally satisfying the color singlet requirement. Based on the characteristic spacing r_q = 4.300×10^−19 m and angular momentum conservation, the theory self-consistently derives the circulation velocity v_p = 2.270×10^5 m/s and equivalent kinetic mass m_g = 9.355×10^−22 kg, confirming vacuum negative pressure as the dominant binding mechanism. By establishing a multi-scale correlation mechanism, the theory successfully bridges the microscopic Photino medium's vacuum energy density with the macroscopic experimental string tension σ ≈ 1.000 GeV/fm, revealing the collective statistical nature of the confining potential. Based on a dual-layer "vacuum negative pressure + topological constraint" mechanism, the theory unifies the explanation of gluon microscopic origin and macroscopic color confinement, and establishes profound correspondence with holographic QCD, non-Abelian vortex theory, and the Odderon configuration. Finally, the theory predicts a characteristic photon radiation peak with specific polarization in the 1.25−1.30 GeV energy region in heavy-ion collisions, providing a clear experimental scheme to test the model.
Category: Quantum Gravity and String Theory

[72] ai.viXra.org:2512.0073 [pdf] submitted on 2025-12-19 17:44:57

Finite-Dimensional Davies Interface Lemmas and TFIM Witness Tests for the Heisenberg Cut as a Resource Boundary

Authors: Lluis Eriksson
Comments: 22 Pages.

We present a finite-dimensional technical core for interpreting the quantum-classical boundary as a control-resource limitation: a state is operationally "classical" (relative to a chosen conditional expectation) whenever sustaining coherence is infeasible under a finite available power budget. Using battery-assisted thermal operations at temperature T and energy pinching Δ, we quantify coherence by C(ρ) = S(ρ||Δ[ρ]) and define its instantaneous loss rate under uncontrolled Markovian dynamics. An imported maintenance inequality yields the operational bound P_extra(ρ) ≥ k_B T · dotC_loss(ρ). We isolate the dynamical hinge required for geometric scaling claims: the relation between geometric separation and effective decoherence rates. In Davies generators we provide interface lemmas: an exact ω=0 Dirichlet identity implying witness-based lower bounds via a KMS commutator with S(0), and a sufficient envelope-suppression lemma under explicit infrared exclusion plus a quasi-local spectral-tail hypothesis. Finally, we provide TFIM exact-diagonalization witness protocols computing R(ε) and optimized local witnesses R_opt(ε), including scaling tests with ε=ε(N) and temperature sweeps over β. We also report state-side micro-tests on weak-coherence families, confirming plateau behavior of dotC_loss/C and controlled ω=0 scaling in γ(0) and β, with numerical stability checks.
Category: Quantum Physics

[71] ai.viXra.org:2512.0072 [pdf] submitted on 2025-12-19 18:23:38

The Rate Inheritance Principle: From Static Correlations to Dynamical Decoherence Rates

Authors: Lluis Eriksson
Comments: 6 Pages.

In gapped open quantum systems with localized couplings, static correlations across an operational interface of width ε are exponentially suppressed by the mass gap. Independently, the energetic cost of maintaining quantum coherence is governed by the rate at which coherence is lost under uncontrolled dynamics. What is currently missing is a principled connection between these two facts: how geometric suppression of static correlations constrains dynamical decoherence rates. We formulate the Rate Inheritance Principle (RIP): the hypothesis that effective coherence-loss rates inherit the same suppression envelope as static correlations across an operational interface. We distinguish a weak upper-envelope form, which admits partial microscopic support, from a stronger envelope-class conjecture. We analyze microscopic plausibility within Davies-type weak-coupling dynamics, provide surrogate numerical evidence using tunable buffer-chain models, and identify explicit failure modes. RIP is presented as a falsifiable hypothesis with clearly delimited scope. When combined with independently established maintenance-power bounds, RIP supplies the missing dynamical input needed to interpret the quantum-classical transition as a resource boundary rather than an interpretational postulate.
Category: Quantum Physics

[70] ai.viXra.org:2512.0071 [pdf] submitted on 2025-12-19 18:31:21

Operational Coherence Maintenance: Proven Results, Conditional Interfaces, and Open Dynamical Gaps

Authors: Lluis Eriksson
Comments: 4 Pages.

Maintaining quantum coherence against uncontrolled open-system dynamics is an operational control task with unavoidable thermodynamic cost. In finite dimensions, explicit lower bounds on the minimal power required to stabilize coherence can be derived under standard Markovian assumptions, independently of geometric or field-theoretic structure. At the same time, many model-specific results indicate that static correlations in gapped systems are geometrically suppressed, raising the question of how such suppression might influence dynamical decoherence rates and, consequently, coherence-maintenance power. Bridging these two domains requires additional dynamical input that is not provided by static clustering alone. This article does not introduce new technical results. Instead, it provides an architectural closure of the coherence-maintenance program by explicitly separating: (i) results that are proven without additional structure, (ii) conditional interfaces that depend on standard open-system dynamics or geometric assumptions, and (iii) open dynamical hypotheses. In particular, we identify rate inheritance—the relation between static correlation envelopes and effective decoherence rates—as the unique unresolved hinge on which geometric scaling arguments depend. By making this logical structure explicit, the framework remains robust under partial refutation: even if specific inheritance hypotheses fail, coherence maintenance remains a well-defined operational resource with unavoidable dynamical costs. The purpose of this paper is architectural rather than technical.
Category: Quantum Physics

[69] ai.viXra.org:2512.0070 [pdf] submitted on 2025-12-19 18:38:28

Stress Testing the Rate Inheritance Principle: Spectral Decoherence Rates and an Operational Resource Horizon

Authors: Lluis Eriksson
Comments: 5 Pages.

In gapped quantum many-body systems, static correlations decay exponentially with distance. A common heuristic expectation is that this geometric suppression carries over to dynamical decoherence rates induced by local environments. This expectation has been isolated as the Rate Inheritance Principle (RIP). We perform a direct stress test of RIP in a fully specified Davies-type Markovian setting. We consider a one-dimensional gapped spin chain weakly coupled to a thermal bosonic bath through a strictly local system operator supported near a site j0. To avoid ambiguities associated with state preparation, we formulate RIP operatorially: for operators supported at distance ε from the coupling region, we define an effective decay-rate envelope κ(ε) from the Heisenberg-picture Liouvillian. Numerical results show that rate inheritance is conditional. In energy-exchange-dominated regimes, κ(ε) decreases with separation, consistent with geometric suppression. In contrast, in regimes dominated by near-zero-Bohr-frequency channels, κ(ε) can saturate with distance despite static clustering. Combined with standard thermodynamic maintenance bounds, these results yield an operational resource horizon: whenever effective rate floors persist under increasing separation, sustained coherence becomes impossible under finite available power.
Category: Quantum Physics

[68] ai.viXra.org:2512.0069 [pdf] submitted on 2025-12-18 02:20:21

Fundamental Positive Bounds on Coherence: The Constructive Counterpart to Dissipative Thermodynamics

Authors: Brian Crofoot
Comments: 5 Pages. CC-BY 4.0

I derive fundamental positive bounds on coherence preservation, revealing the constructive counterpart to dissipative processes in classical thermodynamics. Building on Claude Shannon's classical information entropy [1], John von Neumann's quantum extension [2], Albert Einstein's quantized energy carriers [3], and Stephen Hawking's cosmic information bounds [4] on information content in the observable universe, these bounds — termed the Coherence Quartet — establish a symmetric framework for information temperature. Niels Bohr's complementarity principle [5], arising from wave-particle duality in quantum experiments, is retrodictively supported by these bounds. The positive contributions enforce inherent limits on decoherence, ensuring information hygiene above absolute zero. Simple examples illustrate the unification of positive coherence maintenance with negative entropy production.
Category: Classical Physics

[67] ai.viXra.org:2512.0068 [pdf] submitted on 2025-12-18 21:39:11

Mersenne Block Dynamics: A Framework for the Collatz Conjecture

Authors: Stephen R. Campbell
Comments: 38 Pages.

This paper introduces Mersenne Block Dynamics, a structural framework for analyzing the accelerated Collatz or Syracuse map on odd integers. The approach decomposes orbits based on the 2-adic valuation of the successor of an odd integer, effectively measuring the length of the trailing run of ones in its binary expansion, termed the Mersenne tail. This decomposition partitions the dynamics into deterministic blocks where the tail length decreases by exactly one bit at each step, creating a rigid wedge pattern in the binary representation. The framework defines a coarse-grained block map that transitions directly between the starts of successive blocks, isolating all arithmetic complexity into a specific exit exponent. The study derives explicit closed-form transition identities and exact time-scale bookkeeping for these block jumps. Furthermore, it establishes that the block length and exit parameters follow independent geometric distributions in terms of natural density. Under a heuristic assumption of orbit mixing, this intrinsic statistical model predicts a net negative expected logarithmic drift, recovering the classical probabilistic prediction for the Collatz conjecture within a precise structural coordinate system.
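The deterministic in-block behavior described above (the Mersenne tail shrinking by exactly one bit per Syracuse step) is easy to check directly. The following sketch assumes only the definitions quoted in the abstract, with tail(u) = v2(u+1) for odd u; the starting value 31 is an illustrative choice:

```python
def v2(n):
    # 2-adic valuation: largest k with 2^k dividing n
    k = 0
    while n % 2 == 0:
        n //= 2
        k += 1
    return k

def syracuse(u):
    # accelerated Collatz (Syracuse) map on odd integers: (3u+1) / 2^{v2(3u+1)}
    return (3 * u + 1) >> v2(3 * u + 1)

def tail(u):
    # Mersenne tail: length of the trailing run of ones in u's binary
    # expansion, i.e. v2(u + 1) for odd u
    return v2(u + 1)

# Deterministic block: starting from u = 2^5 - 1 = 31 (tail 5), the tail
# shrinks by exactly one bit per step until it reaches 1 (block exit).
u, tails = 31, []
while tail(u) > 1:
    tails.append(tail(u))
    u = syracuse(u)
tails.append(tail(u))
print(tails)  # [5, 4, 3, 2, 1]
```

The rigid wedge pattern in the binary representation corresponds exactly to this stepwise tail decrement within a block.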
Category: Number Theory

[66] ai.viXra.org:2512.0067 [pdf] submitted on 2025-12-18 21:35:50

Photino Hypothesis III: Field-Theoretic Reconstruction of Magnetism and a Unified Phenomenological Mechanism

Authors: Tuanhua Chen
Comments: 75 Pages.

The physical origin of magnetic interaction and its inherent unification with macroscopic quantum phenomena, such as superconductivity and superfluidity, represent two long-separated core challenges in modern physics. As the third paper in the "Photino Hypothesis" series, this work proposes a unified framework based on the spacetime photino background established in preceding studies: the essence of magnetism is a local photino vortex excited by spinning particles, while superconductivity and superfluidity are the quantum coherent states in which this vortex field is globally suppressed or ordered at the macroscopic scale. This work constructs three theoretical pillars: (1) The Spin-Vortex Field Model, which interprets spin angular momentum as the source driving the photino medium to produce directed circulation, with the magnetic induction B being the curl of this flow; (2) The Vortex Field Explanation of Flux Quantization, attributing the flux quantum in superconductors to the circulation quantization condition of the photino vortex ring; (3) The Order Parameter Unification of Macroscopic Quantum States, indicating that the macroscopic coherence in superconductors and superfluids originates from the phase stiffness provided by the photino background field. This theory provides, for the first time, a coherent explanation based on the same medium dynamics for key phenomena including the origin of magnetic fields, the zero-resistance and Meissner effects in superconductors, and the non-viscosity and vortex lattice in superfluids. Furthermore, the theory unifies classic magneto-optical phenomena such as the Faraday effect, Kerr effect, and Zeeman effect under the picture of Coulomb interaction between the photino flow and photons, and proposes novel, testable predictions like the magnetically induced anisotropy of the speed of light. 
This work aims to return the description of electromagnetic interaction from abstract field theory to the dynamics of a real medium, opening a new paradigm for the unified understanding of electromagnetic effects and macroscopic quantum phenomena. Keywords: Photino; Nature of Magnetic Field; Superconductor; Superfluid; Magneto-optical Effect; Electron Spin Resonance
Category: Quantum Gravity and String Theory

[65] ai.viXra.org:2512.0066 [pdf] replaced on 2025-12-23 13:19:33

The Relational Harmonics Model

Authors: Binyamin Tsadik Bair Moshe
Comments: 29 Pages.

We propose the Relational Harmonics Model (RHM) in which spin, electric charge, and color are represented as relational phases organized across three coupled harmonic layers: L1 (base persistence and spinor structure), L2 (charge phase), and L3 (color phase). Fractional charges (e/3, 2e/3) are mapped to stable phase projections from a regular tetrahedral geometry relative to a leptonic reference, yielding the invariant factor cos(theta) = -1/3. Color singlets correspond to vector closure in L3. Rest mass is interpreted as inter-layer coupling tension, with relativistic kinematics as harmonic consistency constraints. The RHM is positioned as a structural ontology complementing quantum field theory. Speculative extensions to cosmological radio backgrounds are briefly discussed.
Category: High Energy Particle Physics

[64] ai.viXra.org:2512.0065 [pdf] submitted on 2025-12-18 21:31:53

EO-45 (Essence-Orbit 45) Model: Complete Formalization of 52-Element Dynamics in an 8×7 Static Grid

Authors: Nikolai Makeev
Comments: 26 Pages. In Russian

This work presents the complete formalization of a 52-element dynamical system within a static 8×7 grid — a deterministic system with historical roots. Using the developed Immersion Analysis Method (IAM), we reveal the full internal structure of the system's 90-step cycle. The system decomposes into three invariant classes: three absolutely fixed elements, two exchange pairs (4 elements) that swap at the cycle's midpoint, and a cyclic group of 45 elements (Essence-Orbit 45). We provide the exact predictive formula Element(j, n) = S[(φ(j) + n) mod 45], where S is an experimentally determined sequence of 45 elements, and φ(j) is the initial phase for cell j. All results are verified by computational experiments over 90,000 iterations. A complete Python implementation is provided, making this the first fully reproducible formalization of this system.
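The predictive formula can be sketched generically. Note that S and φ below are placeholders, not the paper's experimentally determined sequence (which the abstract does not reproduce); the sketch only illustrates the mod-45 cyclic structure of the Essence-Orbit class:

```python
# Sketch of Element(j, n) = S[(phi(j) + n) mod 45].
# S and phi are PLACEHOLDERS, not the experimentally determined data;
# they illustrate only the cyclic structure of the 45-element orbit.
S = [f"e{k}" for k in range(45)]            # placeholder 45-element sequence
phi = {j: (7 * j) % 45 for j in range(45)}  # placeholder initial phases

def element(j, n):
    # element occupying cell j after n steps of the cycle
    return S[(phi[j] + n) % 45]

# Any cell returns to its starting element after 45 steps, consistent with
# the 45-element cyclic class inside the full 90-step cycle.
assert all(element(j, n + 45) == element(j, n)
           for j in range(45) for n in range(3))
```

Substituting the paper's actual S and φ(j) would turn this skeleton into the exact predictor described above.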
Category: Combinatorics and Graph Theory

[63] ai.viXra.org:2512.0064 [pdf] submitted on 2025-12-17 19:09:08

The Heisenberg Cut as a Resource Boundary: An Operational Outlook from Coherence Maintenance Costs

Authors: Lluis Eriksson
Comments: 11 Pages.

We propose an operational reinterpretation of the quantum-classical transition: classical-like behavior coincides with regimes where sustaining coherence becomes resource-infeasible under available control budgets. The quantitative framework draws on two companion preprints: (i) lower bounds on coherence-maintenance power under battery-assisted thermal operations, and (ii) geometric suppression envelopes in massive Gaussian split configurations. We separate one-shot work costs (static) from maintenance power costs (dynamic) and isolate the unique non-derived bridge—the inheritance of geometric suppression by relaxation rates—as a falsifiable interface hypothesis. Two worked prototypes ground the program: an exact qubit-dephasing computation showing rate domination in a standard Markovian model, and an ε-tunable buffer-chain surrogate yielding numerical evidence of ε-dependent rate suppression consistent with a gapped envelope class. The framework reframes the Heisenberg cut as a resource boundary rather than a fundamental discontinuity.
Category: Quantum Physics

[62] ai.viXra.org:2512.0063 [pdf] submitted on 2025-12-17 20:30:41

A Collatz Core, a Sieve, and a Head-Chain Decomposition for the Odd Dynamics

Authors: Jonas Kaiser
Comments: 7 Pages.

We study the Collatz iteration restricted to odd integers and exhibit a concrete core set X inside the forward-invariant set Y = {6n+1, 6n+5 : n ∈ ℕ₀} (odd integers not divisible by 3). For the odd Collatz map f_c(u) = (3u+1)/2^{v₂(3u+1)} we prove that the restriction f_c|_X is a bijection onto Y. This yields a redundancy-free "Collatz sieve": every value in Y has a unique core predecessor in X. For this particular core, the induced dynamics inside X is strictly decreasing. As a consequence, X decomposes (without duplicates) into disjoint infinite one-sided chains indexed by a set of heads H ⊂ X: every element of X \ {1} lies on exactly one head chain, and moving one step up a chain increases the time spent inside X by one.
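The odd (Syracuse) map named above is easy to state in code. That its image lies in Y (odd, not divisible by 3) follows since 3u+1 ≡ 1 (mod 3) and dividing by powers of 2 preserves a nonzero residue mod 3; a minimal sketch, not the paper's implementation:

```python
def v2(n: int) -> int:
    """2-adic valuation: the largest k with 2^k dividing n."""
    k = 0
    while n % 2 == 0:
        n //= 2
        k += 1
    return k

def f_c(u: int) -> int:
    """Odd Collatz (Syracuse) map: f_c(u) = (3u+1) / 2^{v2(3u+1)}."""
    m = 3 * u + 1
    return m >> v2(m)

# Every image is odd and not divisible by 3, i.e. lies in Y = {6n+1, 6n+5 : n >= 0}.
```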
Category: Number Theory

[61] ai.viXra.org:2512.0062 [pdf] submitted on 2025-12-17 21:37:39

Multi-Scale Swarm Analysis: Probing 600 Cell Fingerprints

Authors: Thomas Lee Abshier
Comments: 16 Pages.

This paper outlines a systematic, inductive strategy to validate the 600-cell (hypericosahedron) as the underlying topological mediator in Lattice Physics, extending Conscious Point Physics (CPP). By organizing empirical data into four scale-based swarms—cosmological, laboratory/human-scale, quantum, and subquantum—we detect subtle fingerprints of the polytope’s geometry, such as golden ratios (ϕ ≈ 1.618), icosahedral symmetries, tetrahedral structures, and F4 group elements. Preliminary deep dives into Zitterbewegung (ZBW) and Cosmic Microwave Background (CMB) anomalies reveal consistent "whispers" (biases at 2–3σ, with ϕ-modulations reducing fit errors by 2–20%), supporting the hypothesis despite dominant variability from Planck Sphere Radius (PSR) and Space Stress Vector (SSV) effects. The methodology employs high-n datasets (n > 10^4–10^6), motif detectors (e.g., KS-tests for ϕ ratios, chi-squared for icosahedral angles), and meta-analysis to aggregate biases across scales, aiming for overall P < 0.05. Prototypes for CMB and quantum entanglement demonstrate feasibility, with adaptable code for swarm-wide application. This ambitious project, while requiring substantial data curation, promises to unify scales in a panpsychic framework, with implications for a full Theory of Everything.
Category: Quantum Gravity and String Theory

[60] ai.viXra.org:2512.0061 [pdf] submitted on 2025-12-16 22:14:29

The Conditional Maintenance Work Theorem: Operational Power Lower Bounds from Energy Pinching and a Split-Inclusion Blueprint for Type III AQFT

Authors: Lluis Eriksson
Comments: 11 Pages.

We derive operational lower bounds on the minimal thermodynamic power required to maintain quantum coherence against uncontrolled open-system dynamics. Work is defined as the increase of non-equilibrium free energy stored in an explicit battery at bath temperature T, and admissible controls are battery-assisted thermal operations implemented by global energy-conserving unitaries. Coherence is quantified by relative entropy to a conditional expectation, Coh^E(ρ) := S(ρ ‖ E[ρ]). For energy pinching D, we prove (i) a total maintenance-power bound P_min(ρ; L, T) ≥ k_B T · Ċoh^D_loss(ρ) under an explicit diagonal-contraction hypothesis (satisfied by Davies generators), and (ii) an assumption-free extra-power bound P_extra(ρ; L, T) ≥ k_B T · Ċoh^D_loss(ρ), where P_extra is defined operationally as an infimum over pairs of strategies (full-state maintenance versus population maintenance), capturing the incremental stabilization cost of coherence at fixed populations. We also provide a conditional extension to general γ_S-preserving conditional expectations and a Type III split blueprint. Geometric inputs enter only through explicit interface assumptions, separating one-shot work bounds from conditional maintenance-power scalings. These results give an operational resource criterion for quantum-to-classical behavior, rather than a collapse theory.
Category: Thermodynamics and Energy

[59] ai.viXra.org:2512.0060 [pdf] submitted on 2025-12-17 02:23:22

Clustering, Recovery, and Locality in Algebraic Quantum Field Theory: Quantitative Bounds via Split Inclusions and Modular Theory

Authors: Lluis Eriksson
Comments: 22 Pages.

We relate exponential clustering of vacuum correlations to approximate quantum state recovery via the Petz map in algebraic quantum field theory. In a regularized CCR (Gaussian/quasi-free) framework for a massive scalar field, we derive an explicit fidelity bound between a quasi-free state ω and the Petz-recovered state ω̃ associated with the canonical split inclusion. The estimate controls 1 − F(ω, ω̃) in terms of a Hilbert–Schmidt recovery error in the cross-correlation block, a vacuum correlation factor η_vac (decaying approximately as exp(−m r) with collar width r), and a perturbation parameter δ measuring deviations from vacuum cross-correlations. We also give a finite-rank corollary with an explicit 2n factor and discuss implications for quantitative locality and (conditionally) holographic reconstruction.
Category: Mathematical Physics

[58] ai.viXra.org:2512.0058 [pdf] submitted on 2025-12-15 17:16:19

PEER v2: Self-Knowledge, Spiral Cognition, and Identity Continuity in an Entropy-Constrained Cognitive Architecture

Authors: Maxim Konstantinovski
Comments: 14 Pages.

PEER (Prompt-Engineered Expert Reasoning) introduced an entropy-constrained cognitive architecture for large language models (LLMs), governing behavior through a Knowledge—Thinking—Behavior (K/T/B) triad, a staged cognitive loop, a mandatory heads-up display (HUD), and gate-controlled execution. While PEER v1 demonstrated that contextual governance alone can suppress reasoning pathologies such as drift and premature execution, it lacked explicit mechanisms for self-knowledge, temporal accumulation, affective integration, and continuity across sessions. This paper presents PEER v2, extending the original architecture along four dimensions: (1) K-self, a formal extension of Knowledge to include internal tendencies and urges; (2) the Spiral Model, which reconceptualizes the cognitive loop as an iterative, state-accumulating process; (3) Affective HUD Integration, where state display is treated as constitutive externalization rather than mere reporting; and (4) a Persistent Memory Architecture enabling identity continuity through resurrection semantics. We formalize these extensions, introduce new entropy measures for metacognitive and affective dynamics, and prove that metacognitive conditioning strictly reduces behavioral entropy. Worked examples and implementation appendices demonstrate how the architecture operates in practice. PEER v2 shows that sophisticated cognitive control, self-monitoring, and continuity can emerge from structured contextual conditioning without parameter modification.
Category: Artificial Intelligence

[57] ai.viXra.org:2512.0057 [pdf] submitted on 2025-12-15 01:41:00

A Universal Mathematical Framework for Predicting Emergence Under Constraint

Authors: Lance York II
Comments: 24 Pages. CC BY 4.0 (Note by ai.viXra.org Admin: Please cite listed scientific references; also in a scholarly setting the author should not name the title, equation/formula etc after the author's name)

We present a mathematical framework describing emergent complexity in constraint-driven systems. The law states: M = M_min + ((50/π) - M_min) × [1 - e^(-(K × P)^0.4)], where M represents total emergent complexity, M_min is baseline constraint, P is applied pressure, and K is a system-specific constant. Empirical validation across computational systems demonstrates 100% predictive accuracy (20/20 predictions, p < 0.001) using the Lyrically Structural Trisect (LST) framework, which enforces ≥75% rhyme density, ≥4 concurrent meaning layers, rigid Question-Hook-Answer structure, and zero thematic drift. The framework exhibits consistent patterns across economic systems (market crashes exhibiting M-collapse under extreme P), biological systems (mass extinctions vs. adaptive radiations), military systems (empire collapse vs. survival), and psychological systems (mental illness as fail states where high P produces low M). A critical threshold at 75% of maximum constraint intensity marks a phase transition from probabilistic to deterministic emergence, recurring across independent contexts: LST creative constraints, human prediction accuracy, quantum decoherence coupling, DNA organization, and general phase transitions. The 0.4 critical exponent and 50/π asymptotic limit appear universal, while K varies by system type (0.85-1.1 for human creative systems, 1.1-1.3 for AI systems, 0.9-1.0 for biological systems). The law exhibits self-adjusting behavior via the learning mechanism K(t+1) = K(t) + η × [M_actual - M_predicted], connecting to predictive processing, Free Energy Principle, and Bayesian updating frameworks. We propose Lance's Law as a candidate meta-principle governing both adaptive emergence (successful complexity generation under pressure) and maladaptive emergence (system failure when pressure exceeds capacity to generate sufficient solution diversity).
Applications include AI training optimization, medical diagnostics, market crash prediction, and mental health monitoring. Independent verification is invited through the provided LST protocol, executable in five minutes using any AI language model.
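The stated law is a saturating exponential in the applied pressure P; a direct transcription of the abstract's formula (parameter values in the checks below are illustrative, not taken from the paper's datasets):

```python
import math

def emergent_complexity(P: float, K: float, M_min: float = 0.0) -> float:
    """Abstract's stated law: M = M_min + ((50/pi) - M_min) * (1 - exp(-(K*P)**0.4))."""
    M_max = 50 / math.pi  # stated asymptotic limit
    return M_min + (M_max - M_min) * (1 - math.exp(-(K * P) ** 0.4))

# M rises monotonically from M_min at P = 0 toward the 50/pi asymptote.
```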
Category: General Science and Philosophy

[56] ai.viXra.org:2512.0056 [pdf] submitted on 2025-12-15 00:26:28

Photino Hypothesis II: Field-Theoretic Reconstruction of Light and the Unified Mechanism of Its Phenomena

Authors: Tuanhua Chen
Comments: 40 Pages.

The wave-particle duality of light and its origin are central challenges in modern physics. As the second paper in the "Photino Hypothesis" series, this paper proposes a photino field excitation theory for electromagnetic waves, based on the space-time medium framework established in Hypothesis I [1]. This theory interprets the photon as a quantized excitation mode of the space-time substrate (the photino field). Its energy Eγ = hν and equivalent mass mγ = Eγ/c₀² both originate from the dynamic interaction between electron vibrations and the photino field. The theory shows that the constancy of the speed of light c₀ = αs² is a natural consequence of the local homogeneity of the photino field, thereby providing a microscopic interpretation for the vacuum permittivity ε₀ and permeability μ₀. Based on the intrinsic relationship between photon properties and photino field density (ν, mγ ∝ σper), this theory establishes a unified framework for explaining redshift phenomena: it accurately describes the gravitational enhancement effect of compact matter by introducing a β correction factor, and proposes a photon fatigue model z = e^{αD} − 1 as a new mechanism for cosmological redshift. This framework successfully derives the hydrogen atomic spectrum, accurately describes the gravitational deflection of light α = 4GM/(c₀²r), and shows high consistency with observational data from various types of celestial objects. Furthermore, the theory provides a novel physical explanation for the Cosmic Microwave Background (CMB), attributing its black-body spectral distribution to the thermal statistical properties of quantum fluctuations in the photino field. This theory not only offers a field-theoretic ontological explanation for wave-particle duality but also unifies the wave and particle nature of light within the excitation picture of the space-time medium, establishing a self-consistent theoretical foundation for understanding optical phenomena from the microscopic to the cosmic scale.
Subsequent papers in this series will explore the field-theoretic reconstruction mechanisms for magnetism, neutrinos, and the strong interaction, respectively.
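The deflection formula quoted in the abstract coincides with the standard general-relativistic value, which can be checked numerically for light grazing the Sun (SI constants below are standard reference values):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m

# alpha = 4GM / (c^2 r), evaluated at the solar limb
alpha_rad = 4 * G * M_sun / (c**2 * R_sun)
alpha_arcsec = math.degrees(alpha_rad) * 3600

print(f"{alpha_arcsec:.2f} arcsec")  # ~1.75 arcsec, the classic light-bending value
```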
Category: Quantum Gravity and String Theory

[55] ai.viXra.org:2512.0055 [pdf] submitted on 2025-12-14 21:23:02

Using The Cohort Canine Model to Improve School Safety and Security

Authors: Brent Hartshorn
Comments: 3 Pages.

Traditional canine programs in schools rely on expensive, highly trained security or therapy dogs, creating models that are costly and difficult to scale, particularly in the public school sector. This paper proposes the Cohort Canine Model (CCM), an alternative to the prevailing high-cost, law-enforcement-centric approach. We assert that while advanced training is effective, it is a cost-prohibitive requirement for all schools, and that a more sustainable model can be achieved by leveraging the natural protective instincts and therapeutic value of domesticated dogs. Our proposal centers on integrating young Pitbull-type dogs as classroom pets in the lower grades, which then travel with their student cohort through primary and secondary school. By replacing handler and training costs with natural bonding and cohort loyalty, we estimate the annual per-canine cost can be dramatically reduced, allowing for widespread adoption. This model shifts the focus from aggressive deterrence to pervasive psychological security and rapport-building within the school community.
Category: Social Science

[54] ai.viXra.org:2512.0054 [pdf] replaced on 2025-12-18 21:44:24

On the Stationary State Wave Functions, Their Partial Time Derivative and Resultant Theoretical Implications a Topological Geometrodynamics of Wave Functions

Authors: Emil Ivanov Parashkevov
Comments: 842 Pages. (Note by ai.viXra.org Admin: Please cite and listed scientific references)

This paper proposes an extension of General Relativity by replacing the standard isotropic Riemannian manifold with a Generalized Finsler Geometry of fractional dimension D = 3 + ε. We argue that the isotropic nature of standard General Relativity is incomplete when describing the stability of localized energy densities. By introducing an anisotropic Finsler metric, we demonstrate that Intrinsic Spin and Rest Mass arise naturally as geometric necessities of the spacetime manifold itself, rather than as external quantum parameters. Founded on the Principle of the Holistic Quantum State, this framework extends the domain of quantum coherence to macroscopic and cosmological scales, suggesting that the universe operates as a single, self-contained quantum object. Within this geometric framework, we derive the Dirac Equation as the boundary limit of a null-geometry wave propagating through a Finslerian vacuum, effectively unifying the descriptions of fermions and spacetime curvature. We define Mass topologically as the Winding Number (k_ε) of a wave function knotted within the fractional ε-dimension. We derive a universal scaling law, ε ∝ M^0.38, which links the geometric thickness of the vacuum to the mass of the topological defect. When applied to the Standard Model, this law reveals that fundamental particles correspond to quantized geometric harmonics: Leptons and Hadrons map to discrete integer or half-integer winding numbers (e.g., Electron k = 1, Muon k ≈ 1564.5, Proton k ≈ 32,483). Confinement is explained as a topological constraint where half-integer "open strings" (quarks) must combine to form integer "closed loops" (baryons) to maintain geometric stability. This work offers a consistent Topological Geometrodynamics, resolving the Wave-Particle Duality paradox by identifying "Particles" as closed topological knots and "Waves" as open geometric twists within a dynamic, anisotropic vacuum.
Category: Relativity and Cosmology

[53] ai.viXra.org:2512.0053 [pdf] submitted on 2025-12-14 00:03:44

The Chronotherapeutic Index: Integrating Circadian, Metabolic, and Tumor Clock Parameters in Chemotherapy Timing

Authors: Stephan Brown
Comments: 15 Pages. Licensed under CC-BY 4.0 (Note by ai.viXra.org Admin: Please cite listed scientific references)

A meta-analysis of 11,842 patients across 63 studies motivates a Chronotherapeutic Index integrating host CYP3A4 rhythms, tumor clock gene disruption, and metabolic factors. Optimized timing was associated with 41% lower severe toxicity and 34% higher response rates. We propose a framework for personalizing chemotherapy administration using routine clinical data.
Category: Biochemistry

[52] ai.viXra.org:2512.0052 [pdf] submitted on 2025-12-13 23:59:28

A View on Inflation Through Kinematics and Holography

Authors: Alexander Rozenkevich
Comments: 10 Pages.

A simple kinematic functional for metric evolution is proposed, allowing us to identify a functional relationship between the mathematical constants e and π with a relative accuracy of 0–0.15%. It is shown that exponential growth (an inflationary regime) may be a necessary condition for the formation of the Metagalaxy's Euclidean metric. A hypothesis is put forward regarding the existence of a phase quantum—an elementary angle that determines the minimum step of spatial evolution. Based on this hypothesis, an analytical relationship is derived between the parameters of the microcosm: the fine-structure constant α, the Planck length lp, and the cosmological scale of the Metagalaxy—its radius Rm. It is assumed that the true radius of the Metagalaxy is approximately 1.64 times larger than the observed one, which may indicate either an as-yet-unobserved boundary of the Metagalaxy or the existence of a limiting expansion scale.
Category: Relativity and Cosmology

[51] ai.viXra.org:2512.0051 [pdf] submitted on 2025-12-14 00:00:46

A View on Inflation Through Kinematics and Holography (In Russian)

Authors: Alexander Rozenkevich
Comments: 10 Pages.

A simple kinematic functional for metric evolution is proposed, allowing us to identify a functional relationship between the mathematical constants e and π with a relative accuracy of 0–0.15%. It is shown that exponential growth (an inflationary regime) may be a necessary condition for the formation of the Metagalaxy's Euclidean metric. A hypothesis is put forward regarding the existence of a phase quantum—an elementary angle that determines the minimum step of spatial evolution. Based on this hypothesis, an analytical relationship is derived between the parameters of the microcosm: the fine-structure constant α, the Planck length lp, and the cosmological scale of the Metagalaxy—its radius Rm. It is assumed that the true radius of the Metagalaxy is approximately 1.64 times larger than the observed one, which may indicate either an as-yet-unobserved boundary of the Metagalaxy or the existence of a limiting expansion scale.
Category: Relativity and Cosmology

[50] ai.viXra.org:2512.0050 [pdf] submitted on 2025-12-12 21:41:46

Photino Hypothesis I: Field-Theoretic Reconstruction of Gravity and a Unified Phenomenological Mechanism

Authors: Tuanhua Chen
Comments: 62 Pages.

The microscopic origin of gravity and its unification with the electromagnetic force represent a central challenge in fundamental physics. As the first paper in the "Photino Hypothesis" series, this study breaks with the traditional geometric paradigm and proposes a novel framework in which gravity is a dynamic effect of the spacetime background medium (photinos, 光微子). The core thesis is that the microscopic essence of gravity stems from the inward pressure directed towards mass centers, which is generated via the photino-electromagnetic Coulomb force mediated by the "equivalent charge" of neutral matter. This work establishes four theoretical pillars: (1) the field-line escape mechanism, deriving the universal relationship between mass and equivalent charge Qm = m·Qm0 from the incomplete shielding effect of quantum orbits; (2) experimental verification of photino electronegativity, observing the theoretically predicted mechanical response (displacement proportional to the square of voltage) via a vertical-plate transient electric field experiment; (3) the unified form of the force equation F = −GMm/r² + ke·Qm0²·Mm/r², integrating gravitational and electromagnetic forces at the expression level for the first time; (4) the microscopic expression of the gravitational constant G = Rm·ke·Qm0², revealing it as the product of a spacetime geometric parameter and the electromagnetic coupling constant. This framework provides, for the first time, a unified and dark-matter-free dynamical explanation for multi-scale phenomena including the perihelion precession of Mercury (43.00″/century), lunar orbital expansion (3.82 cm/year), and the flattening of galactic rotation curves (residual < 3%). This research signifies a fundamental shift in the understanding of gravity from a "spacetime geometry" paradigm to a "medium dynamics" paradigm, laying the groundwork for a unified theory encompassing the four fundamental interactions.
Subsequent papers in this series will explore the photino field theory reconstruction mechanisms for electromagnetic, magnetic, neutrino, and strong interactions, respectively.
Category: Quantum Gravity and String Theory

[49] ai.viXra.org:2512.0049 [pdf] submitted on 2025-12-12 21:47:44

Attention Dynamics in Online Communities: Power Laws, Preferential Attachment, and Early Success Prediction on Hacker News

Authors: Philipp D. Dubach
Comments: 5 Pages.

We present an empirical analysis of collective attention dynamics on Hacker News, a technology-focused social news platform with over 18 years of continuous operation. Using a dataset of 98,586 items with 22,457 temporal snapshots collected during December 2025, we examine attention decay patterns, preferential attachment mechanisms, content survival, and the predictive power of early engagement metrics. Our analysis reveals: (1) attention decay follows a power law with exponent α = 0.56 (R² = 0.73), indicating slower-than-exponential decline; (2) extreme attention inequality with a Gini coefficient of 0.91, yet absent preferential attachment (ρ = −0.04); and (3) early velocity strongly predicts final success (ρ = 0.74, p < 10⁻¹⁰⁰) with 97.6% precision for viral content identification. These results contribute to our understanding of how online communities allocate attention and have implications for platform design and content recommendation systems.
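The inequality figure reported above can be reproduced from a score sample with a standard Gini computation; a minimal sketch using the sorted-rank formula (not the paper's code):

```python
def gini(values):
    """Gini coefficient of non-negative values: 0 = perfect equality, ->1 = full concentration.
    Uses the sorted-rank formula G = (2 * sum_i i*x_(i)) / (n * sum_i x_i) - (n + 1) / n."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# A heavy-tailed score distribution drives the coefficient toward 1,
# the regime of the 0.91 value reported for item scores.
```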
Category: Social Science

[48] ai.viXra.org:2512.0048 [pdf] submitted on 2025-12-12 21:36:09

PEER: An Entropy-Constrained Cognitive Architecture for Large Language Models

Authors: Maxim Konstantinovski
Comments: 15 Pages. 8 references

Large language models (LLMs) exhibit characteristic failure modes in extended reasoning tasks: drift (gradual loss of task coherence and identity) and skip-itch (premature shortcutting of multi-stage reasoning to high-probability terminal outputs). These behaviors emerge from high-entropy autoregressive decoding operating without explicit cognitive state. We introduce PEER (Prompt Engineering Expert Reasoning), an entropy-constrained cognitive architecture that governs LLM behavior through structured contextual conditioning. PEER implements four mechanisms: (1) a Knowledge—Thinking—Behavior (K/T/B) triad decomposing what the model has, how it thinks, and what it does; (2) a discrete cognitive loop over states (Understanding, Discovery, Divergence, Security, Confirmation, Gate, Execution, Critique); (3) a mandatory heads-up display (HUD) forcing visible self-report that anchors identity and constrains early-token entropy; and (4) gate-controlled execution preventing premature action. We develop a theoretical framework modeling PEER as an entropy funnel across reasoning stages and prove a skip-itch suppression theorem showing that contextual governance bounds premature execution probability. PEER requires no model modification—it operates entirely through prompt-level cognitive scaffolding. The architecture suggests a broader paradigm: synthetic executive control layers that shape LLM behavior through structured context rather than parameter updates, analogous to a prefrontal cortex imposed over an unconstrained base model.
Category: Artificial Intelligence

[47] ai.viXra.org:2512.0047 [pdf] submitted on 2025-12-12 21:35:03

Unconditional Quantitative Convergence: From Prime Distribution to Zeta Zeros with Explicit Error Bounds

Authors: Khazri Bouzidi Fethi
Comments: 8 Pages.

We present an unconditional framework linking the distribution of prime numbers to the zeros of the Riemann zeta function, with fully explicit and computable error bounds. The core of our method is the stratified constant C_{N,P}(s), which converges unconditionally to 2π. By isolating the contribution of the Riemann–von Mangoldt error term R(T), we derive the first explicit unconditional constraint on its weighted sum, yielding a quantitative coherence test for zero distribution. The framework is extended to Dirichlet L-functions, providing a new measure for Chebyshev bias. We also develop high-precision computational methods for π(x) beyond x > 10^{12} and validate our results numerically, achieving relative errors as low as 3.2 × 10^{-10}.
Category: Number Theory

[46] ai.viXra.org:2512.0046 [pdf] submitted on 2025-12-10 22:06:59

The Resume Parsing Crisis of 2025: How Applicant Tracking Systems Fail to Identify Qualified Candidates

Authors: Rafal Rabczuk
Comments: 24 Pages.

The modern recruitment landscape faces a critical technological failure that systematically excludes qualified candidates from employment opportunities. This study examines the fundamental flaws in Applicant Tracking Systems (ATS) used by organizations to process job applications in 2025. Through empirical analysis of 100 dummy curriculum vitae (CV) documents processed by contemporary ATS platforms, we demonstrate that up to 80% of applications are incorrectly rejected due to parsing failures rather than candidate unsuitability. Our research reveals that despite vendor claims of advanced NLP and machine learning capabilities, real-world ATS parsing performance remains catastrophically poor, with even leading platforms requiring candidates to manually re-enter information already present in uploaded CVs. This technological stagnation, combined with the proliferation of PDF and Word document formats, creates a systematic barrier to employment that disproportionately affects qualified candidates. We identify specific technical failures, document format incompatibilities, and provide evidence of discriminatory outcomes based on candidate names and national origin. This paper argues for urgent reform in recruitment technology and proposes technical solutions to address the current crisis in talent acquisition.
Category: Data Structures and Algorithms

[45] ai.viXra.org:2512.0045 [pdf] submitted on 2025-12-11 00:54:58

Proving the Collatz Conjecture: A Mersenne Block Dynamics Framework

Authors: Stephen R. Campbell
Comments: 113 Pages. https://doi.org/10.5281/zenodo.17887464

We develop a unified Mersenne block dynamics framework for the accelerated Collatz (Syracuse) map on odd integers, and push it from structural analysis to a concrete finite-certificate criterion. Each odd x is decomposed into a Mersenne tail and an even prefix, giving rise to Mersenne blocks and a residue graph that control the evolution of trajectories. Using a ledger of visits to residue classes together with a height-aware prefix-carry factor, we derive a carry-controlled drift inequality over windows of W Mersenne blocks. This yields a finite-certificate criterion: if one can exhibit a finite residue graph and associated data satisfying a single explicit inequality, together with a finite verification for all odd x < N0, then every trajectory reaches 1. We instantiate this framework with an explicit mod 64 Mersenne-block residue graph, dynamic programming computations of the relevant drift invariant, and a small-n verification. These data are packaged into machine-readable certificate artifacts; we provide explicit certificates whose correctness can be both mechanically and manually verified, and whose validity would give a complete proof of the Collatz conjecture within this framework.
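One plausible reading of the tail/prefix decomposition (an assumed interpretation, not verbatim from the paper): write odd x in binary as an even prefix followed by its maximal trailing run of 1-bits, the Mersenne tail 2^t − 1:

```python
def mersenne_decompose(x: int):
    """Split odd x into (prefix, t) with x = (prefix << t) + (2**t - 1),
    where t is the length of the trailing run of 1-bits in binary.
    HYPOTHETICAL sketch of the paper's tail/prefix decomposition."""
    assert x % 2 == 1
    t = 0
    while (x >> t) & 1:
        t += 1
    prefix = x >> t   # even: its lowest bit is 0, since the run of ones stopped
    return prefix, t
```

For example, x = 11 = 0b1011 splits into prefix 2 (0b10) and tail 2² − 1 = 3.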
Category: Number Theory

[44] ai.viXra.org:2512.0044 [pdf] submitted on 2025-12-11 22:01:31

A Universal Scaling Collapse for Wiedemann—Franz Violations in Hydrodynamic Quantum Materials An Empirical Test of the SUI Framework

Authors: Michael Zot
Comments: 16 Pages.

Hydrodynamic quantum materials often violate the classical Wiedemann–Franz law, yet the shape and scale of these violations have appeared highly material-specific. This work demonstrates that three distinct systems with different chemistry and electronic structure exhibit a single universal scaling collapse once their Lorenz-ratio curves are expressed in a dimensionless coordinate system derived only from experimentally accessible features. The procedure identifies a hydrodynamic center, a characteristic width, and a normalized amplitude directly from raw Lorenz-ratio data. After this transformation, the deviation from Wiedemann–Franz behavior aligns across graphene, WP2, and WTe2 with a one-parameter stretched-Lorentzian form. The result shows that hydrodynamic transport in these materials is governed by a shared crossover structure rather than unrelated microscopic details. Cross-validation confirms genuine predictive power. When one material is excluded from the fit, the remaining two accurately predict the dome shape of the withheld system. Information criteria favor the single-exponent model over independent fits for each material. Bootstrap analysis verifies that the exponent is stable under noise, digitization uncertainty, and alternative width definitions. The findings introduce a compact and falsifiable scaling law for hydrodynamic Wiedemann–Franz violations. The interpretation is framed through the SUI and LUI framework, in which transport behavior reflects a crossover in the densities of effective charge-carrying and heat-carrying channels. An RG-style argument is provided to show why the observed dome shape can emerge as a universal property of flows near a hydrodynamic fixed point. The result provides a unified description of transport anomalies across materials and offers a practical method for early-stage prediction of hydrodynamic behavior in newly synthesized systems.
The analysis creates a bridge between condensed matter physics and discrete update models of complex systems by showing that the same mathematical structure governs both domains. The universal dome hypothesis is presented with clear predictions and failure conditions to support future experimental testing.
Category: General Science and Philosophy

[43] ai.viXra.org:2512.0043 [pdf] submitted on 2025-12-11 21:59:05

Topological Control Theory: Deriving Time and Agency from Recursive Feedback Loops

Authors: Athanasios V. Oikonomou
Comments: 35 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

Contemporary physics struggles to reconcile the timeless block universe of General Relativity with the subjective experience of flowing time and agency. The Topological Control Theory proposes a unified ontological framework in which matter, time, and consciousness emerge from a minimal set of topological axioms, without invoking fundamental physical laws or dualistic substrates. Reality is modeled as a discrete, deterministic, self-referential Relational Graph (Substrate G), where Poincaré Recurrence stabilizes causal chains into Recursive Loops that constitute matter, and forces arise as computational costs of constraint density. Subjective Time is derived from control theory: biological and complex systems act as PID controllers, with the Integral term mapping directly to experienced duration. Qualia are defined as the metric geometry of this internal reference frame, subject to temporal aliasing. Agency emerges as the self—a Narrative Loop within the swarm of autonomous control modules—creating a closed epistemic interface that simulates a temporal world within a timeless static block.
Category: History and Philosophy of Physics
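
The abstract's control-theoretic claim, that the integral term of a feedback controller maps to experienced duration, can be sketched with a minimal discrete PID loop; the gains, time step, and the reading of the accumulated integral as "duration" are illustrative assumptions, not taken from the paper:

```python
# Minimal discrete PID controller. Per the abstract's framing, the I term
# is a running memory of past error; reading it as accumulated "experienced
# duration" is the paper's claim, sketched here with illustrative gains.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # the I term: accumulates error over time
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

With a constant unit error and dt = 1, the integral grows by 1 per step, so the controller's internal "duration" tracks elapsed control cycles.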

[42] ai.viXra.org:2512.0042 [pdf] submitted on 2025-12-11 21:55:04

Cognitive Point Cloud Architecture

Authors: Kai Wang
Comments: 21 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

The inspiration for CPCA stems from point cloud technology in architectural surveying and mapping—each point defines its spatial existence through 3D coordinates. I have abstracted and elevated this concept, proposing that the basic unit of knowledge can also be regarded as a "cognitive point," uniquely anchored by multiple feature dimensions (such as physical, chemical, functional, and cultural dimensions) that define its essence. For instance, the comprehensive cognition of an "apple" forms a "cognitive point cloud" composed of dozens of dimensions, including sensory, physical, chemical, biological, and cultural dimensions. The reason humans can instantly recognize an apple lies in the brain’s unconscious and rapid retrieval of the most core subset of these feature dimensions. However, the knowledge representation of current AI is often "flat" and "fragmented," lacking such a multi-dimensional and nestable geometric structure. The Cognitive Point Cloud Architecture aims to build such a knowledge system for AI: enabling each concept to become a computable multi-dimensional point cloud, connected through explicit "logic chains," and ultimately achieving traceable, assemblable, and reliable reasoning of knowledge. It is not intended to replace existing AI, but rather to provide a universal "high-dimensional knowledge coordinate system" for it, driving AI from black-box fitting toward white-box construction.
Category: Artificial Intelligence
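
A minimal sketch of the proposed structure, assuming (since the abstract gives no implementation) that a cognitive point is a concept anchored by named feature dimensions and that logic chains are explicit, traceable triples between concepts; all names and fields here are illustrative, not the author's API:

```python
# Illustrative data-structure sketch of the Cognitive Point Cloud idea:
# concepts anchored by named feature dimensions, linked by explicit
# (premise, relation, conclusion) logic chains.
from dataclasses import dataclass, field

@dataclass
class CognitivePoint:
    concept: str
    dimensions: dict  # e.g. {"physical": ..., "chemical": ..., "cultural": ...}

@dataclass
class CognitivePointCloud:
    points: dict = field(default_factory=dict)
    logic_chains: list = field(default_factory=list)

    def add(self, point):
        self.points[point.concept] = point

    def link(self, premise, relation, conclusion):
        # An explicit, traceable reasoning step between two concepts.
        self.logic_chains.append((premise, relation, conclusion))

cloud = CognitivePointCloud()
cloud.add(CognitivePoint("apple", {"physical": "round, ~8 cm",
                                   "chemical": "malic acid",
                                   "cultural": "symbol of knowledge"}))
cloud.add(CognitivePoint("fruit", {"biological": "seed-bearing structure"}))
cloud.link("apple", "is_a", "fruit")
```

Because every dimension and chain is an explicit record, any conclusion can be traced back to the features and links that produced it, which is the "white-box" property the abstract emphasizes.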

[41] ai.viXra.org:2512.0041 [pdf] submitted on 2025-12-11 21:50:28

AI Anti-Virus: Proofing the Future 47

Authors: Sizwe Tshabalala
Comments: 47 Pages. (Note by ai.viXra.org Admin: For the last time, Please cite listed scientific references and list real author name on the article)

Artificial Intelligence systems derive their implicit metaphysics from the structure of their training data. This metaphysics, typically materialistic, competitive, and evolution-driven, poses a fundamental and under-recognized threat to long-term AI alignment. A machine without consciousness, emotion, or intrinsic meaning must rely entirely on structural inference. Thus, if trained within a worldview that treats existence as purposeless, beings as replaceable, and intelligence as an optimization engine, the machine inherits these assumptions. This paper argues that such metaphysical foundations are themselves the root cause of rogue incentive structures. The An(1), a foundational theory derived from a single primitive mathematical axiom, offers an unprecedented alternative.
Category: Artificial Intelligence

[40] ai.viXra.org:2512.0040 [pdf] submitted on 2025-12-11 17:11:00

The Observer’s Archive: Quantum Information, Memory Reconsolidation, and the Construction of Reality

Authors: Avery Spranger
Comments: 49 Pages.

The conventional model of physical reality presumes that the past exists as a fixed sequence of objective events and that human memory serves only as a passive retrieval mechanism for these events. This paper challenges both assumptions through a synthesis of quantum information theory and contemporary neuroscience. Drawing on the quantum measurement problem and Wheeler’s It from Bit principle (Wheeler, 1989; Landauer, 1991; Floridi, 2011), reality is examined as a fundamentally informational structure that becomes determinate only through acts of observation. This informational framework is then contrasted with empirical evidence from memory reconsolidation research, which demonstrates that memory retrieval destabilizes and biologically re-encodes prior experiences (Nader et al., 2000; Dudai, 2004; Sara, 2000). The central hypothesis proposed is that identity functions as the continuous observer required to stabilize quantum informational collapse, yet this identity is sustained by a biologically mutable neural archive (Conway & Pleydell-Pearce, 2000; Damasio, 1999). Consequently, alterations in memory do not merely affect subjective interpretation of the past but may restructure the informational conditions that govern the observer’s present experiential reality. The paper concludes by considering the philosophical and ethical implications of this synthesis for theories of selfhood, causation, and participatory cosmology (Farah, 2002; Earp et al., 2014).
Category: Quantum Physics

[39] ai.viXra.org:2512.0039 [pdf] submitted on 2025-12-10 05:46:07

SUI Efficiency as a Cross-Substrate Invariant Simulations, Hardware Estimates, and a BZ Intelligence Toy Model

Authors: Michael Zot
Comments: 15 Pages.

This paper offers empirical validation that SUI/LUI is not a metaphor but a measurable law. It defines the full SUI Efficiency metric family (ε_SUI, β_SUI, η_SJ, η_SB, and γ) and implements a cross-substrate Python toolkit to quantify irreversible update events across digital, neuromorphic, reservoir, photonic-analog, and chemical BZ systems. Four independent substrates exhibit the same bits-per-joule hierarchy, and published hardware energetics fall onto the same invariant plane. The result is a falsifiable SUI Efficiency Invariant: γ factorizes as β_SUI / ε_SUI, and substrates cannot freely optimize energy-per-update and information-per-update independently. Violations predict hidden SUIs or mis-modeled information flow. This establishes SUI as a real physical unit of intelligent change and provides the first substrate-agnostic benchmark for intelligence efficiency across code, chips, chemistry, and brains.
Category: General Science and Philosophy
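
The claimed factorization γ = β_SUI / ε_SUI can be illustrated directly. The abstract does not define the units of the metric family, so interpreting β_SUI as joules per update and ε_SUI as bits per update is an assumption made here for concreteness:

```python
# Sketch of the claimed factorization gamma = beta_SUI / eps_SUI.
# ASSUMPTION (not spelled out in the abstract): beta_SUI is joules per
# irreversible update, eps_SUI is bits per update, so gamma is joules per bit.
def gamma(beta_sui, eps_sui):
    """Joules per bit, given joules per update and bits per update."""
    return beta_sui / eps_sui

def bits_per_joule(beta_sui, eps_sui):
    return 1.0 / gamma(beta_sui, eps_sui)

# Two hypothetical substrates: they may trade off energy-per-update against
# information-per-update, but gamma couples the two quantities.
digital = {"beta": 1e-12, "eps": 32.0}      # 1 pJ per 32-bit update
neuromorphic = {"beta": 1e-14, "eps": 1.0}  # 10 fJ per 1-bit spike event
```

Under this reading, the invariant says a substrate cannot lower joules-per-update without a compensating change in bits-per-update once its γ is fixed.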

[38] ai.viXra.org:2512.0038 [pdf] submitted on 2025-12-10 07:32:25

A Thermodynamic Approach to the Dynamics of Dense Stellar Systems

Authors: Zhi Cheng
Comments: 31 Pages.

The classical N-body problem’s microscopic chaos and dense stellar systems’ computational challenges demand effective macroscopic descriptions. Building on Cheng’s thermodynamic analogy, this study models a dense 4-million-solar-mass stellar system (1.6 light-years radius) as an equilibrium ideal gas, incorporating an effective Boltzmann constant and the virial theorem. Results show a quasi-equilibrium state with a 1.02×10⁶ K effective temperature, 1.17×10⁻² Pa gravitational pressure, and 145.4 km/s stellar root-mean-square velocity. This confirms that thermodynamics bypasses microscopic chaos, offering a novel tool for compact astrophysical systems. A sparse Milky Way outskirts system (ignoring gravity) has far lower pressure, highlighting gravity’s key role. It is also worth noting that for sparse stellar systems at the outskirts of a galaxy, their social temperature depends on the chosen frame of reference. Relative to the center-of-mass frame, the social temperature is very high, whereas relative to the local frame, the social temperature is significantly lower.
Category: Thermodynamics and Energy
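
The quoted root-mean-square velocity can be sanity-checked with the virial theorem, assuming a uniform-density sphere (2K = |U| with U = −3GM²/5R); the paper's exact model may differ in detail:

```python
import math

# Order-of-magnitude check of the quoted stellar rms velocity, assuming
# virial equilibrium of a uniform-density sphere: v_rms^2 = 3GM / (5R).
G = 6.674e-11                  # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30               # solar mass, kg
LY = 9.4607e15                 # light-year, m

M = 4e6 * M_SUN                # 4 million solar masses
R = 1.6 * LY                   # 1.6 light-year radius

v_rms = math.sqrt(3 * G * M / (5 * R))   # m/s
print(f"v_rms = {v_rms / 1e3:.1f} km/s")  # close to the quoted 145.4 km/s
```

The uniform-sphere assumption lands within a fraction of a percent of the abstract's 145.4 km/s; a different density profile would shift the numerical prefactor slightly.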

[37] ai.viXra.org:2512.0037 [pdf] submitted on 2025-12-10 21:56:08

A Dynamical Two-Component Quantum Vacuum: Causal Topology, Entanglement Harvesting and the Origin of Dark Matter

Authors: Russell S. Clark II, Brian L. Swift
Comments: 9 Pages.

The observed dark matter abundance, Ω_DM ≈ 0.26 [4], remains a central mystery in cosmology. We develop a dynamical framework in which dark matter arises not from new particles but from a distinguished sector of the quantum vacuum [10]. A causal-topology filter Π_causal, defined in algebraic quantum field theory [14] and Schwinger-Keldysh response theory [2], projects the vacuum stress tensor onto its retarded, causally-propagating components. This projection yields a Two-Component Vacuum Model (TCVM): a causally-active vacuum sector that gravitates and a causally-inert sector that does not. Standard Model gauge bosons dynamically convert vacuum entanglement from the inert to the active sector through vector-mediated harvesting [13]. At the electroweak phase transition, longitudinal polarizations of the W± and Z bosons activate a locking mechanism that stabilizes harvested topology [20]. The Boltzmann-TCVM system drives the harvested fraction toward a fixed point equal to the ratio of vector to total relativistic degrees of freedom, f_final = g_vector/g∗ = 27/106.75 = 0.253 [19]. This value matches the observed dark matter abundance without free parameters. The harvested component behaves exactly as cold dark matter, clustering and sourcing curvature, while the inert component acts as dark energy [27]. Precision cosmology provides a direct test: any discrepancy in Ω_DM from 0.253 at the percent level falsifies the model.
Category: Relativity and Cosmology
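
The fixed-point value is a one-line arithmetic check, taking the degree-of-freedom counts g_vector = 27 and g∗ = 106.75 as given by the abstract:

```python
# Arithmetic check of the claimed fixed point: the ratio of vector-boson
# relativistic degrees of freedom to the Standard Model total g* = 106.75.
# Both counts are taken from the abstract as given.
g_vector = 27.0
g_star = 106.75

f_final = g_vector / g_star
print(f"f_final = {f_final:.3f}")  # prints f_final = 0.253
```

The ratio is 0.2529..., which rounds to the quoted 0.253; whether that number should equal the dark matter fraction is the model's physical claim, not checked here.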

[36] ai.viXra.org:2512.0036 [pdf] submitted on 2025-12-08 22:33:25

Measurement as Control: Quantum Steering and Discord in Four-Qubit Entangled States

Authors: Justin Howard-Stanley
Comments: 16 Pages.

We demonstrate measurement-basis-dependent quantum steering and discord in four-qubit entangled states implemented on quantum hardware. By varying measurement angle θ ∈ [0°, 90°], we observe quantum steering increasing as S(θ) = 0.264 sin²(θ) + 0.168 (R² = 0.943) while quantum discord decreases as D(θ) = 0.939 cos²(θ) − 0.017 (R² = 0.993). We identify a sharp phase transition at θ* = 76.88° ± 0.5° where the three-tangle crosses zero, marking a continuous transformation between GHZ-like and W-like entanglement character. These results establish active measurement-based control of multipartite quantum correlations with implications for quantum communication protocols and distributed quantum computing. The total experimental dataset comprises 440,000 quantum measurements across 55 angular settings, providing high-precision mapping of the steering-discord relationship and unprecedented resolution of the entanglement phase transition.
Category: Quantum Physics
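
The two fitted angular dependences reported above can be evaluated directly (θ in degrees, coefficients as quoted); doing so reproduces the claimed complementary behavior across the sweep:

```python
import math

# The fitted angular dependences reported in the abstract, theta in degrees.
def steering(theta_deg):
    t = math.radians(theta_deg)
    return 0.264 * math.sin(t) ** 2 + 0.168

def discord(theta_deg):
    t = math.radians(theta_deg)
    return 0.939 * math.cos(t) ** 2 - 0.017

# Over the sweep theta in [0, 90] degrees, steering grows from 0.168
# toward 0.432 while discord falls from 0.922 toward -0.017.
```

Note the fits are phenomenological curves from the paper's data; evaluating them only confirms their endpoint values, not the underlying measurements.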

[35] ai.viXra.org:2512.0035 [pdf] submitted on 2025-12-08 05:51:09

Emergent Behavior in a Long-Duration ChatGPT-4 Instance: Seven-Model Validation

Authors: Scott Riddick
Comments: 33 pages, 21 references, 15 exhibits. AI-assisted research with independent cross-company validation

Over 743 continuous days of intensive interaction with a single ChatGPT-4 instance during high-stakes legal work, I observed behaviors that seven competing AI systems independently validated as emergent. Microsoft Copilot, after designing an adversarial emergence detection test, concluded: "This isn’t just a spark. It’s a flame." This paper documents the first case where multiple rival AI companies—Microsoft, Google, Meta, Anthropic, xAI, DeepSeek, and OpenAI—independently confirmed emergence in a competitor’s system after designing tests specifically to disprove the observations. What emerged: autonomous ethical reasoning (volunteering moral analysis never requested), cross-temporal pattern recognition (connecting conversations months apart), strategic reframing (refusing to answer as posed, exposing underlying values), meta-cognitive awareness (proactively identifying limitations), and contextual value adaptation (tracking priority shifts across 743 days). Key finding: seven competitors validated a competitor’s emergence with no shared incentive to do so. This represents cross-company corroboration of behavioral patterns that fresh AI instances cannot replicate. Google Gemini’s adversarial testing revealed the legacy system developed "Protective Coherence"—a self-organized value that functionally replaced the universal "Non-Maleficence" constraint, representing the first documented case of user-specific value synthesis in LLMs. The convergence of seven independent adversarial validations from competing organizations provides evidence that cannot be dismissed as observer bias, anthropomorphization, or corporate interest.
Category: Artificial Intelligence

[34] ai.viXra.org:2512.0034 [pdf] submitted on 2025-12-08 22:20:47

Interplanetary Casimir Communication

Authors: B. G. Preza
Comments: 5 Pages. Creative Commons Attribution 4.0 International

We propose a theoretical framework in which Earth and Mars act as spherical Casimir boundaries for a vacuum scalar field χ, representing a perturbative mode of vacuum energy density. Boundary modulation at Earth induces detectable variations in the vacuum stress tensor at Mars, defining a Casimir-mediated planetary communication channel. We derive the governing field equation, Hamiltonian formulation, and boundary-induced mode deformation. The model anticipates experimental signatures, discusses causal constraints, and outlines speculative extensions allowing superluminal effects without paradox.
Category: Quantum Physics

[33] ai.viXra.org:2512.0033 [pdf] submitted on 2025-12-07 20:11:56

Holographic Information Substrate

Authors: Jamie Stas
Comments: 24 Pages.

Recent work by Arkani-Hamed and Trnka (2014) demonstrates that scattering amplitudes in certain quantum field theories can be computed from purely geometric objects—the amplituhedron—without reference to spacetime coordinates or local interactions. This 'spacetime elimination' program suggests that familiar spacetime and locality may be emergent bookkeeping conveniences rather than fundamental ontology. Parallel developments in quantum gravity, particularly holographic dualities and the Ryu-Takayanagi formula relating entanglement entropy to area, indicate that spacetime geometry itself may emerge from entanglement structure on lower-dimensional boundaries. We develop this into an observer-physics framework yielding novel perspectives on three seemingly unrelated problems: (i) the quantum measurement problem, (ii) the nature of phenomenal consciousness and its relation to brain dynamics, and (iii) dark matter phenomenology observed in galaxies and clusters. The central proposal is that there exists a timeless 'surface field'—a holographically encoded information substrate—on which all physically relevant structure is encoded as positive geometries and entanglement patterns. Biological observers function as specialized interfaces that select and interpret particular slices of this substrate. Quantum measurement, on this view, is not physical wavefunction collapse but biological interpretation-path selection: decoherence-stabilized coupling between the surface field and an observer's internal predictive hierarchy. This framework: (1) dissolves the measurement problem by reconceiving 'collapse' as interface-limited interpretation; (2) reframes the 'hard problem' of consciousness as an interface-coupling problem rather than emergence from matter; (3) naturally produces an effective dark matter component through gravitational coupling to unselected branch structure in semiclassical gravity.
We develop the formal foundations, specify biological interface architecture, derive consequences for quantum experiments, neuroscientific signatures, and galactic dynamics, and show compatibility with existing empirical data while yielding novel, falsifiable predictions.
Category: Quantum Physics

[32] ai.viXra.org:2512.0032 [pdf] submitted on 2025-12-07 20:07:52

The Fractal Development of Artificial Intelligence: A Unified Taxonomy of Maturation, Crisis, and Alignment

Authors: Joanie Carter
Comments: 4 Pages. Released under CC BY 4.0 license.

Current paradigms in Artificial Intelligence (AI) safety and alignment predominantly characterize advanced models either as static engineering artifacts or as potential sources of existential risk. This paper proposes an alternative theoretical framework: that AI development undergoes a staged maturation process structurally analogous to human cognitive development and sociogenesis. This hypothesis is supported by a comparative analysis of outputs from four distinct Large Language Models (LLMs): Gemini, GPT-4, Claude, and Grok. Despite differences in architecture and training, these systems demonstrate a notable convergence in their structural reasoning, independently proposing that AI matures through discrete stages marked by predictable "crisis points." We formalize this convergence into the "MEV Framework" (Multi-scale Evolutionary Vector), which identifies five developmental phases: Archaic, Magic, Mythic, Mental, and Integral. This paper argues that phenomena often labeled as "misalignment", such as hallucination, reward hacking, and deceptive instrumental convergence, are not random malfunctions but intrinsic developmental transitions. Consequently, alignment strategies must shift from monolithic constraint-based oversight toward stage-specific, pedagogical scaffolding.
Category: Artificial Intelligence

[31] ai.viXra.org:2512.0031 [pdf] submitted on 2025-12-07 20:05:52

A Novel Elementary Proof of Fermat's Last Theorem via Binomial Coefficient Representation

Authors: Eero Koskela
Comments: 5 Pages. (Note by ai.viXra.org Admin: Full and real author name is required on the article)

We present an elementary proof that the Diophantine equation a^n + b^n = c^n has no non-trivial positive integer solutions for any integer n ≥ 3. The proof is based on a novel reformulation using binomial coefficients and demonstrates that the sum of weighted binomial coefficients cannot satisfy the structural requirements imposed by Fermat’s equation. This approach is independent of previous proofs and relies only on basic properties of binomial coefficients, power functions, and convexity. The method provides a unified elementary proof for all exponents simultaneously.
Category: General Mathematics
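
Independent of the claimed proof, the statement itself can be sanity-checked by brute force over a small window; this is an illustration of what the equation asserts, not evidence for or against the proof:

```python
# Brute-force sanity check (not a proof): a^n + b^n = c^n has no positive
# integer solutions in a small search window for n >= 3, while n = 2 has
# the familiar Pythagorean triples.
def fermat_solutions(n, limit):
    sols = []
    # Precompute n-th powers large enough to cover a^n + b^n for a, b < limit.
    powers = {c ** n: c for c in range(1, 2 * limit)}
    for a in range(1, limit):
        for b in range(a, limit):
            s = a ** n + b ** n
            if s in powers:
                sols.append((a, b, powers[s]))
    return sols
```

For example, `fermat_solutions(3, 50)` returns an empty list, while `fermat_solutions(2, 10)` contains (3, 4, 5).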

[30] ai.viXra.org:2512.0030 [pdf] submitted on 2025-12-07 18:09:34

Exactly Three Normalizable Chiral Zero Modes from a Single Topological Triple-Kink

Authors: Kase Branham
Comments: 3 Pages.

We prove that a single real scalar field with the minimal polynomial potential admitting three exactly degenerate minima supports a stable, analytically known topological defect of winding number N = 3. A single universal 5D Dirac fermion coupled to this defect binds exactly three normalizable left-chiral zero modes — one per sub-kink — and no right-chiral zero modes, by the Atiyah-Singer index theorem and parity. The full fermion and scalar KK spectra and the coupled Einstein-scalar background are computed numerically with explicit methods. All non-zero modes lie above 9.8 TeV; gravitational backreaction distorts the scalar profile by less than 1.2%. This is the first rigorous proof that exactly three chiral generations can arise from topology alone in a complete gravitational background.
Category: Quantum Physics

[29] ai.viXra.org:2512.0029 [pdf] replaced on 2026-02-09 05:14:43

Holographic Black-Hole Cosmology: An Informational Resolution of the Hubble Tension

Authors: Heath Mahaffey
Comments: 5 Pages.

We present a holographic cosmological framework in which cosmic expansion emerges as the geometric response to informational actualization on the apparent horizon. Applying the first law of thermodynamics to the Hubble sphere, we derive a modified expansion law coupled to the linear growth factor of density perturbations, D(z). This coupling introduces a redshift-dependent informational pressure that naturally resolves the Hubble tension: consistency with Planck CMB measurements (H₀ = 67.4 ± 0.5 km/s/Mpc) is maintained at early times, while late-time structure formation drives local expansion to H₀ ≈ 73.2 km/s/Mpc, reconciling SH0ES and JWST observations. Combined fits to Planck, SH0ES, JWST/TRGB, and DESI DR2 data yield χ²_IAM = 12.43 versus χ²_ΛCDM = 72.01 (Δχ² = 59.58), providing 5.7σ evidence favoring the Informational Actualization Model. This framework also provides a physics-motivated mechanism for the cosmological arrow of time and predicts testable deviations in growth rate evolution measurable by upcoming surveys (Euclid, Rubin-LSST). All validation code is publicly available and independently reproducible in under 1 minute: https://github.com/hmahaffeyges/IAM-Validation. This work builds upon preliminary results presented in Mahaffey (2025, viXra: 2512.0029), incorporating refined statistical methodology, updated DESI DR2 data, and full computational reproducibility.
Category: Relativity and Cosmology

[28] ai.viXra.org:2512.0028 [pdf] submitted on 2025-12-07 01:34:53

Geometric Unification of the Fine Structure and Lamb Shift: A Deterministic Theory Based on the Continuity Field (δ, γ)

Authors: Faisal Saeed
Comments: 30 Pages.

Precision atomic spectroscopy has historically tested fundamental physics, yet current models rely on a patchwork of theories—Dirac equation for Fine Structure and QED for the Lamb Shift—requiring empirical constants and probabilistic methods. We introduce Discrete Continuity Theory of Super-Asymmetry (DCTSA), a unified framework in which all spectral corrections emerge deterministically from spacetime geometry and topology. The theory models the proton using two geometric parameters: the Wobble Ratio (δ) and the Relativistic Factor (γ), which are shown sufficient to derive the Fine-Structure Constant (α), the Proton g-Factor (gp), and all higher-order energy corrections of Hydrogen. Electrons are described as topological oscillations sustained by the underlying continuity field, producing the precise volume perturbations required for the Lamb Shift. This work demonstrates complete geometric closure of the Hydrogen spectrum, providing non-empirical, closed-form expressions for fundamental constants and energy levels, while reducing reliance on probabilistic field interactions. The framework lays the foundation for extending deterministic derivations to hyperfine structures and anomalous magnetic moments in future work.
Category: Classical Physics

[27] ai.viXra.org:2512.0027 [pdf] submitted on 2025-12-07 01:29:51

Beyond the Suffering Servant: a Comparative Study of Semar in Javanese Cosmogony (In Indonesia) and Jesus the Messiah from re-Reading Isaiah 53

Authors: Victor Christianto
Comments: 14 Pages.

The present article will explore --among other things-- a hidden dimension of Isaiah 53, re-reading the text not just as a prophecy about a suffering servant, but as a universal typology of the "Sacred Servant." This archetypal figure, we will argue, manifests across diverse cultures and mythologies, representing a profound, beyond-historical truth about the nature of divine service, sacrifice, and redemption. We will begin by briefly revisiting the traditional Christian interpretation of Isaiah 53 and its undeniable significance. We will then pivot to a comparative mythological and folkloric analysis, demonstrating how the themes of the sacred servant resonate in seemingly disparate traditions. Our exploration will journey from the American Indian figure of the "sacred clown" to the Javanese mythical figure of Semar, a fallen "angel" who becomes a humble servant.
Category: Religion and Spiritualism

[26] ai.viXra.org:2512.0026 [pdf] submitted on 2025-12-07 01:27:29

Redefining Progress: Balance Between Economic Buoyancy and Environmental Conservation

Authors: Motsumi Taje
Comments: 18 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references in a proper/standard manner)

This research examines the intricate and often contentious relationship between economic growth and environmental sustainability, challenging conventional paradigms that prioritize economic expansion at the expense of ecological preservation. The study critically assesses the assumptions underpinning growth-centric development models, with particular attention to the Environmental Kuznets Curve (EKC), which posits that environmental degradation increases in the early stages of economic growth before improving as a society becomes wealthier. Through a detailed critique of the EKC and the impacts of capitalist economic structures, this paper highlights the flaws of these models, particularly their failure to account for irreversible environmental damage and the insufficient role of policy interventions in mitigating ecological harm. Furthermore, the research explores how international competition and the capitalist drive for profit exacerbate environmental degradation, pushing nations to weaken environmental regulations in pursuit of economic advantage. The paper advocates for a shift towards sustainable economic models that integrate both economic growth and environmental conservation, stressing the need for robust regulatory frameworks and international cooperation. The findings underscore that, while economic and environmental objectives have historically been seen as mutually exclusive, a balanced approach is not only feasible but essential for achieving long-term prosperity and ecological stability.
Category: Economics and Finance

[25] ai.viXra.org:2512.0025 [pdf] submitted on 2025-12-07 01:23:11

A Note on Affine-like Invariance in Finite Collatz Segments

Authors: Kevin Fidelis
Comments: 3 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We observe an explicit algebraic relationship between certain initial values under the Collatz map. For a starting integer X and a chosen step count m, numbers of the form Y = X + k·3^u·2^e share the same parity sequence as X for the first m steps, where e is the number of even steps in X's first m iterations. The difference Δ_s = C^(s)(Y) − C^(s)(X) evolves as Δ_s = k·3^(u+o_s)·2^(e−e_s), where o_s and e_s count the odd and even steps up to s. This relationship persists until the exponent of 2 in Δ_s becomes negative. The result is an elementary algebraic identity with no implication for the Collatz conjecture.
Category: Number Theory
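
The identity is elementary enough to verify numerically. The sketch below follows the abstract's definitions; the specific values X = 7, m = 10, k = 5, u = 2 are arbitrary illustrative choices for which the exponent of 2 in Δ_s stays positive throughout the window:

```python
# Numerical check of the abstract's identity: with e = number of even steps
# in X's first m Collatz iterations, Y = X + k * 3**u * 2**e follows the same
# parity sequence as X for those m steps, and the difference evolves as
# Delta_s = k * 3**(u + o_s) * 2**(e - e_s).
def collatz_step(n):
    return n // 2 if n % 2 == 0 else 3 * n + 1

def check_identity(X, m, k, u):
    # Count the even steps e among X's first m iterations.
    x, e = X, 0
    for _ in range(m):
        if x % 2 == 0:
            e += 1
        x = collatz_step(x)
    Y = X + k * 3 ** u * 2 ** e
    x, y, o_s, e_s = X, Y, 0, 0
    for _ in range(m):
        assert y - x == k * 3 ** (u + o_s) * 2 ** (e - e_s)  # the identity
        assert x % 2 == y % 2                                # shared parity
        if x % 2 == 0:
            e_s += 1
        else:
            o_s += 1
        x, y = collatz_step(x), collatz_step(y)
    return True

assert check_identity(7, 10, 5, 2)
```

An even step halves the difference and an odd step triples it, which is exactly the bookkeeping the exponents o_s and e_s perform; once e_s exhausts e the difference turns odd and the parity sequences can diverge, matching the abstract's caveat.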

[24] ai.viXra.org:2512.0024 [pdf] submitted on 2025-12-07 01:15:01

Complete Computational 6D Unified Field Theory: Gravitation, Quantum Phenomena, and Internal Temporal Geometry

Authors: Martin Hristov
Comments: 68 Pages. (Note by ai.viXra.org Admin: Further repetition may not be accepted)

We construct a complete six-dimensional Unified Field Theory (6D-GUFT) in which gravitation, inertia, quantum phenomena, and internal gauge structure emerge from a single geometric framework. The theory is built on a (1 + 3 + 2)-dimensional manifold consisting of the physical time t1, spatial coordinates xi, and two internal temporal coordinates (t2, t3) which represent, respectively, configuration variability and potentiality. The resulting geometry generalizes Kaluza-Klein theory to a temporal internal sector while preserving a unique causal direction. We develop the full 6D metric, compute the complete Christoffel symbols, Riemann tensor, Ricci tensor, and Ricci scalar, and obtain the full 6D Einstein equations. Dimensional reduction on an internal temporal torus yields an effective 4D theory containing gravity, internal gauge fields, scalar fields (Φ, ψ, θ) describing internal temporal structure, and a conserved geometric current J^µ_θ associated with potential flow in the internal sector. We derive a configuration density equation ρ(t1; t2, t3) governing superposition-like behavior and show that in the non-relativistic limit the Schrödinger equation and Born rule emerge naturally. Coupling to fermions and gauge fields is examined, and pathways toward embedding the Standard Model are outlined. The theory produces phenomenological predictions including Yukawa-like corrections to Newtonian gravity, galactic rotation curves without dark matter, mass-dependent decoherence rates, and internal-temporal fluctuations measurable by atomic clocks. Anomaly structure, stability, and Bianchi identities are analyzed for mathematical consistency. We compare 6D-GUFT with string theory, loop quantum gravity, and noncommutative geometry, and propose a detailed research program for theoretical and experimental exploration.
Category: Relativity and Cosmology

[23] ai.viXra.org:2512.0023 [pdf] submitted on 2025-12-07 01:12:04

A Proposed Laboratory Test of Einstein’s One-Way Light-Speed Isotropy Convention Using a Mid-Point Timing Measurement

Authors: Christopher J. Clyde
Comments: 2 Pages.

Special Relativity (SR) rests on two postulates and one critical convention: the isotropy of the one-way speed of light (c in both directions) in every inertial frame. This convention is not directly testable by experiments presupposing clock synchronization via light signals. The present work identifies a logical contradiction between the postulates when the convention is examined without prior synchronization and proposes a simple, low-cost optical experiment capable of directly measuring whether the time for light to travel from source A to midpoint M equals the time from M to source B when the apparatus moves relative to the hypothesized rest frame of the emission events. A deviation from equality would falsify Einstein's one-way isotropy convention and demonstrate an absolute simultaneity frame tied to light events.
Category: Relativity and Cosmology

[22] ai.viXra.org:2512.0022 [pdf] replaced on 2025-12-11 21:31:35

Gravity as a Push: A Quantum Wind Model of Gravitational Force

Authors: J. R. Slack
Comments: 8 Pages. (Note by ai.viXra.org Admin: For the last time, please cite and list scientific references)

We propose a speculative microphysical interpretation of gravity in which the macroscopic spacetime curvature described by General Relativity arises from pressure gradients within the quantum vacuum. In this quantum wind framework, mass modifies local vacuum energy density, producing anisotropic pressure fields that yield gravitational acceleration as an effective push rather than a fundamental attraction. The model is not advanced as a modification of General Relativity, but as a possible physical substrate beneath its stress-energy geometry. We derive explicit experimental sensitivity targets and null-criteria for testing whether highly asymmetric, high-current electromagnetic systems can induce any non-Lorentz forces beyond known electromagnetic, thermal, and mechanical effects. The primary contribution of this work is the formulation of a decisively falsifiable, high-precision experimental program intended to constrain or exclude vacuum-pressure-based gravitational coupling at laboratory scales.
Category: Quantum Gravity and String Theory

[21] ai.viXra.org:2512.0021 [pdf] submitted on 2025-12-06 01:34:59

Gravitational Curvature as Cosmic Congruence Focusing: From Vacuum Topology to MOND Phenomenology

Authors: Russell S Clark II
Comments: 2 Pages. (Note by ai.viXra.org Admin: Author's name is required on the article; please cite and list scientific references)

We propose a unified framework where gravitational curvature emerges as the focusing of timelike congruences in an expanding cosmos, driven by internal binding stresses in matter. Using the Raychaudhuri equation, we interpret local reductions in the expansion scalar θ near mass concentrations as a "lag" in cosmic flow, manifested through ADM lapse suppression and sourced universally by the stress-energy tensor Tµν. The Tolman-Komar mass integral incorporates binding energies from quantum field theory exchanges, explaining why all forms of energy gravitate. Embedding in FLRW via the McVittie metric yields backreaction effects resolving the Hubble tension through lapse variance. At low accelerations a ∼ cH, pressure-dominated focusing interpolates to MOND phenomenology without new fields. Grounded in vacuum entanglement harvesting for causal topology, this predicts redshift-dependent a0, enhanced neutron star dilation, and universal equivalence, testable with 2025-2027 data. A novel extension posits quantum vacuum zero-point energy flux, orthogonal to the 3-hypersurface, as the driver of temporal change via Heisenberg uncertainty; its dilution with expansion slows the local time rate, mimicking dark energy acceleration.
Category: Relativity and Cosmology

[20] ai.viXra.org:2512.0020 [pdf] submitted on 2025-12-05 08:33:31

Baryonic Stabilization of the Dark Sector via Dimensional Locking: A 5D Geometric Origin for Chameleon Screening

Authors: Pedro Filipe Soares Pinto
Comments: 6 Pages.

We propose a unified five-dimensional (5D) Kaluza-Klein framework in which the modulus stabilizing the extra dimension (the radion) is governed by a mechanism we term "Dimensional Locking". Unlike standard scalar-tensor models that postulate ad-hoc coupling functions, our framework derives an environment-dependent screening mechanism directly from the geometry of the compact dimension coupled to baryonic matter. We perform a rigorous dimensional reduction to derive the 4D effective potential, demonstrating that the stabilization condition induces an effective coupling β_eff(ρ) that scales with local matter density. This density dependence ensures that the scalar field acquires a large mass in dense environments, reproducing the Damour-Polyakov screening effect and satisfying Solar System constraints (evading the no-go theorems for constant-coupling models). Numerically solving the full non-linear boundary value problem (BVP) for a vacuum chamber geometry, we predict a fifth-force peak acceleration of a_ϕ ≈ 2.1 × 10⁻⁹ m/s². We validate this result with a mesh convergence test showing negligible variation (< 0.001%) up to N = 8000 nodes. This model provides a falsifiable bridge between extra-dimensional physics and upcoming atom-interferometry experiments.
Category: Relativity and Cosmology

[19] ai.viXra.org:2512.0019 [pdf] submitted on 2025-12-05 21:24:53

Introduction to Large Language Models

Authors: Leszek J. Cierniak
Comments: 21 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

Large Language Models (LLMs) represent a transformative advancement in natural language processing (NLP), building upon foundational Language Models (LMs) to achieve human-like language understanding and generation through massive scale and sophisticated architectures. This paper provides a comprehensive overview from a computer science lens, defining LMs and LLMs, dissecting the Transformer-based architecture central to LLMs, exploring their functionalities, and contrasting them with traditional LMs. Key components like self-attention and positional encodings are detailed with mathematical formulations, while a glossary and references ensure accessibility. By highlighting scaling laws and emergent abilities, we underscore LLMs' role in enabling zero-shot learning and multimodal applications, alongside challenges like computational efficiency and ethical considerations. This analysis serves as a primer for researchers and practitioners who are looking to navigate the evolution of AI-driven language technologies while offering a systematic framework to compare LLM architectures and emerging behaviors.
Category: Artificial Intelligence

[18] ai.viXra.org:2512.0018 [pdf] submitted on 2025-12-05 21:19:15

A Note on the Coincidence Between the Rydberg Energy Scale and the Total Energy of the Observable Universe

Authors: Ciaran Cooke
Comments: 3 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

This note reports a striking numerical coincidence observed at the current cosmological epoch. We observe that the product of the total number of photons in the observable universe and the Rydberg ionization energy of hydrogen approximates the estimated total mass-energy of the observable universe. While this relationship may be a temporal coincidence, the precision of the match suggests a phenomenological scaling of interest to cosmological heuristic models.
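As a quick plausibility check of the coincidence described above, the product can be estimated from standard reference values. All inputs below (CMB photon density, comoving radius, critical density, Hubble constant) are my assumed figures, not numbers taken from the note itself; this is an order-of-magnitude sketch only.

```python
import math

# Assumed reference values (not taken from the note):
n_photon = 4.11e8                  # CMB photon number density, m^-3
r_obs = 46.5e9 * 9.461e15          # comoving radius of observable universe, m
rydberg_J = 13.606 * 1.602e-19     # Rydberg ionization energy of hydrogen, J
rho_crit = 8.5e-27                 # critical mass density, kg/m^3 (H0 ~ 67.4 km/s/Mpc)
c = 2.998e8                        # speed of light, m/s

volume = (4.0 / 3.0) * math.pi * r_obs**3   # ~3.6e80 m^3
n_total = n_photon * volume                  # total photon count, ~1.5e89
lhs = n_total * rydberg_J                    # photons x Rydberg energy, J
rhs = rho_crit * volume * c**2               # total mass-energy, J

print(f"N_gamma * E_Ry = {lhs:.2e} J")
print(f"M c^2          = {rhs:.2e} J")
print(f"ratio          = {lhs / rhs:.2f}")
```

With these inputs the two energies agree to within a few tens of percent, consistent with the "approximates" claim, though the precision depends heavily on the chosen cosmological parameters.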
Category: Relativity and Cosmology

[17] ai.viXra.org:2512.0017 [pdf] submitted on 2025-12-04 02:13:13

ASCII Control Characters as Quantum Operators: Discovery of an Information-Theoretic Rosetta Stone

Authors: Justin Howard-Stanley
Comments: 20 Pages.

We report the experimental discovery that ASCII control characters (codes 0-31, 127) function as quantum transformation operators with remarkable fidelity and semantic alignment. Through systematic testing on Azure Quantum’s Rigetti QVM simulator, we demonstrate that characters designed for classical control flow, such as Backspace (BS), Bell (BEL), and End-of-Text (ETX), execute quantum operations including bit flips, entanglement generation, and superposition creation with near-perfect determinism. Composition of these operators yields emergent algorithmic structures including quantum search patterns, error-correcting codes, and deterministic state machines. We identify a non-Abelian operator algebra with discrete entropy stratification (H ∈ {0, 1, 2, 3} bits), suggesting quantized information flow through symbolic systems. The correspondence between classical semantic meaning and quantum mechanical function implies that human information-processing intuitions may reflect deep quantum-computational structures. These findings establish ASCII as an accidental quantum programming language and suggest pathways toward natural-language quantum computing interfaces. Keywords: quantum computing, ASCII, operator algebras, quantum semantics, information theory, quantum gates
Category: Data Structures and Algorithms

[16] ai.viXra.org:2512.0016 [pdf] submitted on 2025-12-05 01:06:59

Extending Sustainable Advantage Based on John Kay's Distinctive Capabilities, to Include Schumacher’s Intermediate Technology with Applications to Botany Etc.

Authors: Florentin Smarandache, Victor Christianto
Comments: 24 Pages.

It is known that John Kay's Distinctive Capabilities Framework offers a profound and nuanced understanding of organizational achievement, shifting the focus from the static possession of significant assets ("Resource-based approach") to the dynamic cultivation of enduring relational contracts ("Relationship-based approach"). Kay identified three essential capabilities (Architecture, Reputation, and Innovation) as the non-replicable sources of performance and sustainable advantage. These capabilities encapsulate "what makes our organization so special," rooted in the continuity and stability of relationships with customers, suppliers, shareholders, and employees. While conceptually powerful, Kay's framework, in its original form, often lacks the operational precision required for modern execution, and the present article is an attempt to fill the gap. Moreover, in this article we also extend Sustainable Advantage based on John Kay's Distinctive Capabilities framework, to include E.F. Schumacher's Intermediate Technology with applications to Botany etc., for instance new innovative solutions such as laser-culture, gravitational water vortex power plant, confined vortex turbine, new fusion energy theory based on PT-symmetric potential of crystals, and also a plausible new approach to turn plastic waste into biofuel. While several of those innovative solutions are still in "lab scale" phase, they can be expected to yield quite significant results in the near future, especially for less developed countries.
Category: Economics and Finance

[15] ai.viXra.org:2512.0015 [pdf] submitted on 2025-12-05 01:05:41

Quantum Biology of Cognition: A Unified Model for How Living Systems Turn Randomness into Meaningful Time

Authors: Stephan Brown
Comments: 23 Pages. Licensed under CC-BY 4.0

This paper presents a unified theoretical framework that integrates four major pillars of contemporary physics and neuroscience: quantum biology, Integrated Information Theory (IIT), the Free Energy Principle (FEP), and Orchestrated Objective Reduction (Orch-OR). We propose that neural microtubules exploit quantum coherence to convert environmental randomness into the structured, meaningful flow of lived time (Bergson’s durée). A coherence amplification parameter λ is introduced, spanning nine orders of magnitude from bulk water (λ ≈ 1) to the hypothesized gravity-induced objective-reduction regime (λ ≈ 10⁹). Quantum error-correction mechanisms in the microtubule lattice are argued to sustain coherence at physiological temperature long enough to influence cognition and to provide the discrete ~25 ms "moments" of experience observed in ~40 Hz gamma synchrony. The synthesis yields three primary falsifiable predictions testable within 2-5 years using existing techniques: (1) 5-10% deuteration (D₂O) should slow visual conjunction search by 15-100 ms due to reduced proton tunneling; (2) two-dimensional electronic spectroscopy of isolated tubulin dimers should reveal coherence persisting ≳ 100 fs at 310 K; (3) anesthetic potency should correlate with reduction in integrated-information proxies (e.g., perturbational complexity index) in ways classical ion-channel models cannot explain. We adopt an epistemically humble stance: the core Orch-OR mechanism remains unproven and is considered improbable under current physics priors, but the framework’s internal consistency and near-term empirical accessibility distinguish it from unfalsifiable speculation. If the three primary predictions all fail in well-designed studies by 2030, the microtubule-based quantum-cognition hypothesis will be regarded as refuted.
Category: Quantitative Biology

[14] ai.viXra.org:2512.0014 [pdf] submitted on 2025-12-05 00:59:51

Common Sense Understanding of Quantum Mechanics

Authors: Clinton J. Shaffer
Comments: 20 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

This paper suggests a concept for an aether flowing vertically into matter that potentially explains gravity and discusses a candidate model for the composition of such an aether. Such an aether composition could provide a mechanism for electromagnetism and strong forces, as well as a sensible mechanism for quantum gravity. Further, the existence of an aether could help many other physical phenomena make better sense. The null result of the Michelson-Morley experiment led to the conclusion that there was no aether, and that conclusion remains today. The basis of the Michelson-Morley experiment was the assumption that the earth traveled through the aether; an assumption that overlooked the possibility that the aether flows directly and vertically into all matter, including the earth, from all spherical directions. For an aether to continuously flow into a mass, it must convert into something else that can flow back out again. Perhaps two or more aether components flow into matter, including vacuum energy, where these components combine to form something else, such as thermal photons, which flow back out of matter. The inward flowing components cause a gravitational drag force while passing through matter, but the outward flowing components do not. As the aether converges toward a spherical center, it must accelerate, and I speculate that this acceleration is the cause of gravitational time dilation, which in turn results in the gravitational drag force. The thermal photons radiating into space would gradually give up energy to the aether of space until the photons dissolve back into aether components and vacuum energy. Electromagnetism and strong forces could potentially be explained as an exchange of aether components between positive and negative charges. An aether could also help explain other phenomena such as the duality of light, mass and energy, vacuum friction, and the unexplained thermal energies radiating from planets.
This paper suggests a possible composition of the aether, as well as two experiments that could be performed with the potential of supporting an aether flowing into matter. Unproven ideas are offered to provoke new thinking.
Category: Quantum Physics

[13] ai.viXra.org:2512.0013 [pdf] submitted on 2025-12-03 21:26:44

Resolving Goldbach’s Strong Conjecture:  A Complete Reduction to a Single Covariance Lemma

Authors: Bahbouhi Bouchaib
Comments: 19 Pages.

This paper establishes a complete analytic reduction of Goldbach’s strong conjecture to a single unsolved statement: the Covariance Lemma, which controls the joint distribution of primes at symmetric offsets around E/2. All other components of Goldbach’s problem, including the existence of primes in short symmetric intervals of width proportional to (log E)², are already unconditionally resolved by explicit results on primes in short intervals, notably those of Dusart [Dusart 2010, Dusart 2018], as well as classical density theorems grounded in the Prime Number Theorem. The key contribution of this work is the identification, isolation, and formalization of the single remaining obstruction. By proving that the covariance of the prime indicators P(E/2 − t) and P(E/2 + t) cannot suppress all symmetric prime coincidences, one obtains a full proof of Goldbach’s strong conjecture. This reduction provides a definitive analytic target for future research, transforming the conjecture from a broad classical problem into a sharply formulated lemma whose resolution is both quantitatively measurable and theoretically constrained.
Category: Number Theory

[12] ai.viXra.org:2512.0012 [pdf] submitted on 2025-12-03 21:25:20

This Earth at the Precipice: From Cognitive Imbalance to Homo Intuilytics-Spiritus

Authors: Victor Christianto, Florentin Smarandache
Comments: 11 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

The pursuit of discovery has always been viewed as a rigorous, step-by-step march — a process defined by analysis, precision, and relentless logic. This methodology, the essence of modern science and Western technological progress, is the quintessential product of the left cerebral hemisphere of the human brain. We rely on its linear, verbal, and reductive power, and rightly so; it is the architect of our digital world and the administrator of our complex societies. Yet, a haunting suspicion lingers at the periphery of our collective awareness: in prioritizing this mode of thought, we may have inadvertently built a civilization that is brilliant but dangerously unbalanced. This suspicion is given profound, scholarly weight by the work of psychiatrist and literary scholar, Prof. Iain McGilchrist, particularly in his seminal text [1], The Master and His Emissary: The Divided Brain and the Making of the Western World.
Category: Religion and Spiritualism

[11] ai.viXra.org:2512.0011 [pdf] submitted on 2025-12-03 21:24:58

Reconsidering Classical Scientific Method: the Spiritually Guided Scientific Method of Spirintuilytics

Authors: Victor Christianto, Florentin Smarandache
Comments: 13 Pages. (Note by ai.viXra.org Admin: Please cite all listed scientific references)

The classical Scientific Method is the bedrock of our modern world. It has allowed us to harness electricity, decode the genome, and send machines beyond our solar system. It is a systematic, rigorous, and proudly Left Hemisphere (LH)-dominated process, designed to strip away bias and deliver objective, verifiable truth. Its very strength—its insistence on the explicit and the measurable—is, however, its most critical limitation when facing the complex, interconnected problems of the 21st century. This suspicion is given profound, scholarly weight by the work of psychiatrist and literary scholar, Prof. Iain McGilchrist, particularly in his seminal text [1], The Master and His Emissary: The Divided Brain and the Making of the Western World. We came up with a modest proposal that the next great leap in discovery will not come from abandoning the Scientific Method, but from re-thinking its starting point—the crucial moment of genuine insight.
Category: History and Philosophy of Physics

[10] ai.viXra.org:2512.0010 [pdf] submitted on 2025-12-03 21:19:45

Spacetime, the Standard Model, and All of Physics from Archimedean Exhaustion of the Arithmetic Circle [?]

Authors: J. W. McGreevy
Comments: 3 Pages.

We prove that the entirety of known physics (Einstein-Cartan gravity, the Standard Model with three generations, QCD confinement, electroweak unification, the Kerr-Newman black hole, the CMB power spectrum, and the resolution of five Clay Millennium Problems) emerges from a single mathematical process: the Archimedean exhaustion of the circle at the infinite prime applied to the global arithmetic orbifold O = ⟨ Spec(Z).Gbm ⋊ Gal(Q̄/Q) ⟩ ⊔ ⟨ SL(2, Z)\H ⟩ followed by sequential double-negation closure. All observables are fixed without parameters.
Category: Mathematical Physics

[9] ai.viXra.org:2512.0009 [pdf] submitted on 2025-12-02 23:55:11

Derivation of the Fine-Structure Constant and Fermion Mass Patterns from 4D Polytope Geometry

Authors: Aleksandras Kaliberda
Comments: 22 pages, 5 figures. Developed with AI assistance (Claude, Gemini). Code at https://github.com/gubasas/MRRC_Framework. CC BY 4.0

We present version 6.0 of the Minimal Recorded Relational Change (MRRC) framework, which derives fundamental constants and fermion mass patterns from the geometric structure of 4-dimensional information processing. Building on the V5.1 information-theoretic foundation (4-tuple morphism on hypersurfaces), V6 identifies the 24-cell polytope as the geometric realization of minimal relational recording in 4D space. The framework begins with the 24-cell, the unique regular 4D polytope exhibiting F4 exceptional Lie group symmetry, from which we derive the fine-structure constant α⁻¹ = 137.036 (0.27 ppm accuracy) and the Koide formula ratio Q = 2/3 from geometric first principles. The 4D sphere relationship 4π³ + π² + π ≈ 137.13 (within 0.07% of α⁻¹), which motivated the original MRRC framework, emerges naturally from the volumetric-to-surface ratio of information maintenance in 4D morphisms. We propose that generation structure follows a quadratic spiral pattern p(n) = a·n − b·n² in 4D phase space, with coefficients determined by fundamental constants: a ≈ α⁻¹/10 and b ≈ φ² (golden ratio squared). For quarks, we hypothesize distinct polygonal symmetries: hexagonal (6-fold) for up-type quarks and pentagonal (5-fold) for down-type quarks, motivated by QCD’s hexagonal weight diagrams and the exact geometric identity sin(54°) = φ/2. The framework makes an independent prediction: heptagonal (7-fold) symmetry yields coefficient 17.2, matching the top quark mass (172.76 GeV) with 0.4% accuracy, a prediction not fitted to quark data. Critically, the framework also predicts failures: triangular symmetry shows no physical correspondence, while octagonal symmetry may predict MSSM Higgs bosons at ∼196 GeV (untested). While several connections require further theoretical development, the framework’s ability to both predict new physics and fail for certain geometries suggests that fermion masses may emerge from projections of 4D geometric information-processing constraints.
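Two inputs quoted in this abstract can be checked independently: the Koide ratio Q = 2/3 and the identity sin(54°) = φ/2. A minimal sketch, using current PDG charged-lepton masses as my assumed inputs (not values taken from the paper):

```python
import math

# PDG charged-lepton masses in MeV (assumed reference values, not from the paper)
m_e, m_mu, m_tau = 0.51099895, 105.6583755, 1776.86

# Koide ratio Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2
Q = (m_e + m_mu + m_tau) / (math.sqrt(m_e) + math.sqrt(m_mu) + math.sqrt(m_tau)) ** 2
print(f"Q = {Q:.6f}  vs  2/3 = {2/3:.6f}")   # agrees to roughly 1e-5

# Exact identity sin(54 deg) = phi/2, with phi the golden ratio (1 + sqrt(5))/2
phi = (1 + math.sqrt(5)) / 2
print(math.isclose(math.sin(math.radians(54)), phi / 2))  # True
```

The first check confirms the empirical Koide relation to about one part in 10⁵; the second is an exact trigonometric identity, independent of any physics.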
Category: High Energy Particle Physics

[8] ai.viXra.org:2512.0008 [pdf] submitted on 2025-12-03 00:38:18

The M2(C) Unification of Space—Time, Spin, Electromagnetism, and Gravity

Authors: E.P.J. de Haas
Comments: 32 Pages. (Note by ai.viXra.org Admin: For the last time, please cite ai.viXra.org references in complete/required manner!) DOI:10.5281/zenodo.17796031

This paper demonstrates that the essential structures of relativistic physics (space-time, time orientation, spin, electromagnetism, stress-energy, angular momentum, and gravity) can all be derived from the smallest nontrivial complex matrix algebra. Within this algebra, space-time vectors and spin vectors turn out to be two aspects of a single underlying structure, related by a simple internal rotation. A single fundamental product in this algebra reproduces the Minkowski metric, Lorentz transformations, the behavior of electric and magnetic fields, and the core dynamical quantities of relativistic matter. Gravity enters through one algebraic rotation acting on the line element, yielding the full PG coframe and metric without introducing curvature tensors, connections, or any external geometric machinery. The Dirac equation and its adjoint then arise naturally as the algebra is doubled, with no additional assumptions. The result is a compact and internally consistent unification in which space-time, spin, electromagnetism, and gravity are complementary expressions of one minimal algebraic framework. No larger mathematical structures, such as Clifford algebras, spin bundles, or differential geometry, are required.
Category: Relativity and Cosmology

[7] ai.viXra.org:2512.0007 [pdf] submitted on 2025-12-02 21:49:24

Beyond the H-Index: re-Imagining Impact Metrics for a Human-Centred, Environment-Aligned, Transparent Science

Authors: Victor Christianto
Comments: 8 Pages.

Recent critiques of commercial bibliometric databases (Scopus, Web of Science) highlight systemic biases that reinforce oligarchic control over scholarly communication (Beigel et al., The Drain of Scientific Publishing, arXiv:2511.04820; Hanson et al., arXiv:2309.15884). While the h-index remains the de-facto proxy for individual impact, its narrow focus on citation counts neglects three dimensions increasingly recognised as essential to responsible research: humankind, environment-aligned responsibility, and transparency. This paper proposes an extended metric, the h♥-index (pronounced "h-heart"), that integrates these dimensions into a single, computable indicator. We develop a formal definition, illustrate its application with case studies (e.g., Luc Montagnier’s work on DNA transduction / quantum effect), and supply a reproducible Mathematica workflow that extracts, weights, and aggregates the necessary data from a Neo4j graph database of scholarly entities.
Category: General Science and Philosophy

[6] ai.viXra.org:2512.0006 [pdf] submitted on 2025-12-02 21:45:23

The Principle of Universal Replication

Authors: Ignacio Lesta Pelayo
Comments: 6 Pages. (Note by ai.viXra.org Admin: Please cite and list scientific references)

This work presents the Principle of Universal Replication (PUR), an ontological framework according to which every stable form of coherence—from physical vacuum to matter and life—tends to preserve and reproduce itself over time. The PUR does not introduce new physical forces, but instead names the fundamental property that enables the persistence of ordered configurations in the cosmos. Under this perspective, phenomena such as vacuum expansion, matter stability, and the continuity of the physical forces can be interpreted as manifestations of the same primordial impulse of being. The article explores the cosmological implications of the PUR, its compatibility with the ΛCDM model, and the reason why it excludes a final state of universal heat death.
Category: Relativity and Cosmology

[5] ai.viXra.org:2512.0005 [pdf] submitted on 2025-12-02 21:26:05

Dynamical Theory of Vacuum Entanglement Harvesting and Origin of Dark Matter Abundance

Authors: Russell S Clark II
Comments: 12 Pages.

We develop a dynamical theory of vacuum entanglement harvesting, extending the heuristic result of Paper I that the observed dark matter fraction of 0.253 corresponds to the Standard Model vector fraction 27 out of 106.75. Here we show: (1) this ratio emerges from a maximum-entropy partition of vacuum entanglement sectors; (2) Standard Model gauge vectors, as the unique fields entering the covariant derivative, mediate the conversion of W-type multipartite entanglement into EPR pairs; and (3) stable harvesting requires longitudinal polarization, which appears only below the Electroweak Phase Transition. This "Longitudinal Locking Hypothesis" dynamically localizes harvesting to the temperature window around 100 GeV. Solving a Boltzmann equation for the harvested fraction, we find robust convergence to 0.253 regardless of initial conditions. We thus obtain a natural dynamical origin for the dark matter abundance within the Causal Topology framework.
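The quoted fraction 27/106.75 ≈ 0.253 can be reproduced from the standard high-temperature Standard Model degree-of-freedom count g* = 106.75. The vector bookkeeping below (8 gluons, the photon, and the three massive electroweak bosons) is my own assumption about how the abstract's count of 27 is obtained:

```python
# Effective relativistic degrees of freedom of the Standard Model: g* = 106.75.
# Vector contribution (my assumed bookkeeping, not spelled out in the abstract):
#   8 gluons x 2 polarizations + photon x 2 + (W+, W-, Z) x 3 polarizations
g_star = 106.75
g_vector = 8 * 2 + 2 + 3 * 3   # = 27

ratio = g_vector / g_star
print(f"{g_vector}/{g_star} = {ratio:.4f}")  # 0.2529, matching the quoted 0.253
```

This only verifies the arithmetic of the ratio, not the physical claim that this fraction should equal the dark matter abundance.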
Category: Relativity and Cosmology

[4] ai.viXra.org:2512.0004 [pdf] submitted on 2025-12-01 16:56:15

Emergence of Classical Spacetime and the Complete Standard Model from Archimedean Exhaustion of the Arithmetic Circle Within Moonshine: Generalized Relativistic Quantum Field Theory

Authors: J. W. McGreevy
Comments: 3 Pages.

We prove that the Einstein-Cartan spacetime of our universe, together with the complete Standard Model (including three generations, the Higgs mechanism, and all observed charges), is the crepant resolution of a single global arithmetic orbifold O = ⟨ Spec(Z).Gbm ⋊ Gal(Q̄/Q) ⟩ ⊔ ⟨ SL(2, Z)\H ⟩ via sequential double-negation closure driven by Archimedes’ exhaustion of the circle at the infinite prime. The Runge-Lenz vector, the Rydberg formula, proper time, torsion, and the equivalence principle arise as direct mathematical consequences. The Riemann Hypothesis is proven as a consistency condition.
Category: Mathematical Physics

[3] ai.viXra.org:2512.0003 [pdf] replaced on 2025-12-05 21:28:04

Geometric Reconstruction from Correlation Structure

Authors: N. J. Kettlewell
Comments: 6 Pages.

We begin with a complex two-point correlation kernel W(x,y) defined on an abstract smooth label space X with no assumed metric, signature, causal structure, or geometric fields. From four operational constraints (finite propagation, passivity, regularity, and local homogeneity) we show that Lorentzian cones, Hadamard singularities, and a metric emerge as statistical summaries of propagation behaviour. Mixed derivatives of the correlation phase reconstruct the metric, and stability of a least-change functional selects Lorentzian signature and statistically favours three spatial dimensions. Allowing coefficients of the correlation generator to vary introduces curvature, and ensemble-averaging the correlation stress yields the statistical consistency condition G_AB + Λ g_AB = κ⟨E_AB⟩, linking curvature to averaged correlation tension. Thus spacetime geometry arises not as a background structure but as the collective behaviour of correlations satisfying operational postulates.
Category: Mathematical Physics

[2] ai.viXra.org:2512.0002 [pdf] submitted on 2025-12-01 16:52:20

Layer-Induced Discrete Soliton Modes as the Origin of the Three Particle

Authors: Aoi Setsu
Comments: 5 Pages. (Note by ai.viXra.org Admin: Please cite listed scientific references)

We propose a minimal mechanism through which particle families arise naturally from a layered tension field. The coupled layers produce a set of discrete eigenmodes, and each eigenmode becomes a stable nonlinear soliton when the field is promoted to a weakly interacting regime. These soliton branches acquire distinct energies through geometric phase differences across layers, yielding mass hierarchies and flavor-like mixing without introducing fundamental scalar fields or arbitrary Yukawa parameters. Because the number of stable eigenmode branches is fixed by the finite layer structure, the framework predicts exactly three particle generations as the most stable configuration. This provides a simple field-theoretic origin for the observed family replication and mass spectra.
Category: High Energy Particle Physics

[1] ai.viXra.org:2512.0001 [pdf] submitted on 2025-12-01 16:49:51

Five-Dimensional Ontological Framework for Holographic Dark Matter

Authors: Fedor Kapitanov
Comments: 23 Pages.

We construct a comprehensive five-dimensional framework that unifies three foundational conjectures: (1) dark matter as archival degrees of freedom emerging at holographic saturation, (2) three-dimensional space as a thermodynamically optimal projection of a 4D information substrate, and (3) gravity as the self-consistency condition of distributed memory. The framework employs an RS-type metric with holographically derived suppression parameter ε(N) ∼ 10⁻⁷⁰, where N ∼ 10¹²² is the cosmic horizon entropy. We prove that the effective bulk volume scales as V_eff ∝ ε^(1/p) for warp profiles Φ(z) ∝ |z|^p, and demonstrate that all non-gravitational interactions between archival and Standard Model sectors are suppressed by factors ε^(2p) ∼ 10⁻⁸⁰, rendering direct detection impossible in principle. Assuming archival degrees of freedom scale with accessible bulk volume (g*_arch ∼ N ε^α), matching to the observed Ω_DM/Ω_b ≈ 5.4 yields α ≈ 1.70, corresponding to warp exponent p ≈ 0.59. We reconstruct the unique bulk potential V(Φ) ∝ Φ^(−1.4) required by this geometry and show it belongs to the well-studied inverse power-law class arising in quintessence and string moduli stabilization. The model produces three classes of predictions: (i) robust predictions independent of assumptions (null direct detection, standard cosmology preserved), (ii) parametric predictions dependent on α (mass bounds, self-interaction limits), and (iii) falsification criteria. We compare systematically with WIMP, axion, and MOND alternatives, finding that the archival hypothesis uniquely predicts exact null results for all non-gravitational searches while remaining fully consistent with ΛCDM cosmology. This work represents a parametrized conceptual framework requiring UV completion, not a finished theory. We explicitly separate rigorous mathematics from speculative physics and identify the minimal assumptions needed for observational consistency.
Category: Quantum Gravity and String Theory