User:2A02:2149:8A27:B100:B96A:F9FA:C69D:2DF1/sandbox

create paragraph: Hubble volume function
According to the "Hubble volume function", a more complex behaviour exists, but standard astrophysics still lacks:
1. The function of the evolution of the percentages of matter, dark matter and dark energy, treating flatness as a given (a constant property of the definition of the universe, holding across huge orders of magnitude; without that given, the function does not work).
2. The definition of maximal degeneracy. Maximal degeneracy of spacetime is tautological to maximal degeneracy of matter, because the overall maximally degenerate region acts like a wavefunction with all its uncollapsed Everettian branches. Nowadays we think the densest part of a black hole is maximally degenerate spacetime (or probabilistic matter; the two cannot be differentiated). The densest parts of black holes are not necessarily everything behind the inner event horizon, but lie even closer to the Kerr ringularity, which is actually a degenerate torus and not a volumeless ring singularity. We do not have the mathematical formulas of the degenerate torus. An astrophysical black hole has spin, but volumeless matter would have no probabilistic spatiotemporal range in which to exhibit it; thus the degenerate torus is closer to the truth. Micro black holes evaporate immediately, because their maximally degenerate region has no corrective headroom behind the micro event horizon to twirl back and reabsorb the maximally degenerate spacetime that wants to tunnel out: all its degrees of freedom are outside the micro black hole.
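The claim that micro black holes evaporate almost immediately agrees with the standard Hawking picture. A minimal sketch of the textbook evaporation-time estimate, t ≈ 5120 π G² M³ / (ħ c⁴); the constants are standard, and the two comparison masses are purely illustrative:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s

def hawking_evaporation_time(mass_kg: float) -> float:
    """Textbook Hawking evaporation time, t ~ 5120 * pi * G^2 * M^3 / (hbar * c^4)."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

micro_bh = 1e6      # a 1000-tonne micro black hole (illustrative)
stellar_bh = 2e31   # roughly 10 solar masses (illustrative)

print(f"micro BH   ({micro_bh:.0e} kg): {hawking_evaporation_time(micro_bh):.2e} s")
print(f"stellar BH ({stellar_bh:.0e} kg): {hawking_evaporation_time(stellar_bh):.2e} s")
```

The cubic mass dependence is the whole story here: a 10^6 kg hole lasts on the order of a minute, while a stellar-mass hole outlives the current age of the universe by dozens of orders of magnitude.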
Astrophysical black holes do have a corrective headroom behind the inner event horizon to reabsorb the maximally degenerate spacetime that NECESSARILY tries to escape (it has no other degree of freedom, and its ground state gets immediately disturbed by the slightest quantum jitter). Quantum jitters are preinformational: statistical subparticles totally necessary to describe particles. Particles are informational, i.e. they can be measured directly; quantum jitters are subinformational, i.e. they exist along with many alternative wavefunctional variations and cannot be measured directly, but we can probabilistically describe how they behave. Subinformation is totally necessary in foundational physics, because information cannot be recorded on nothing(ness): a preinformational, wavefunctional connectome is necessary so that some subinformational variations become informational, i.e. particle matter, in an Everettian universe with many alternatives. Sabine Hossenfelder is wrong to claim that the role of science is to hide problems in the ontological axiomatics, the foundations of substantiality and particularly of quantum mechanics (for our universe; in other universes, different foundations of substantiality are mathematically possible). The axiomatics of substantiality are not tautological to common mathematical axiomatics: ontophysics (the true physics, as opposed to science/scientophysics) has to be an "ontological Turing machine", a computer which is its own axiomatic algorithm, software and hardware, all made of the same Deutschian constructor. Common mathematics does not have a causally coherent axiomatics; substantial physical axiomatics require an axiomatic algorithm which works around the problems of axiomaticity (incalculability, incompleteness and inconsistency) without cancelling any of them, but avoiding them. Quantum uncertainty and the accelerating expansion of the Universe (= ontological entropy) are some such ontoaxiomatic tricks; other universes would have different ones.
Allomathematics means mathematics based on different axiomatics. (The capitalized "Universe" is our own. Claiming that other universes don't exist is wrong, because any foundational theory has consequences. People like Sabine Hossenfelder, who hate the consequences created by theories, still hold theories which cause further ontoaxiomatic monstrosities, like the unmathematical hidden variables, because superdeterminists do not care at all about ontoaxiomatics: the connectome-like axiomatics of physical systems, as opposed to the axiomatics of common mathematics, which were initially created to be calculational but without a common algorithm producing the axioms. Sabine Hossenfelder and the superdeterminists STILL cause ontoaxiomatic consequences with their theories, i.e. unmeasurable theories, far worse than people who act consciously. Sabine Hossenfelder is trying to overscience science, being afraid to say too much, but ANY THEORY WHICH DESCRIBES FUNDAMENTAL PHENOMENA HAS ONTOAXIOMATIC CONSEQUENCES. Sabine Hossenfelder doesn't know the proper definition of science. Science is NOT to shift problems into the deeper axiomatics by falsely claiming you don't care about deep physical foundations (= ontoaxiomatics) while actually creating deep ontoaxiomatics you never examine and which are by no means generic, neutral or all-compatible.) Our Universe is actually allomathematical in its foundations, thus common mathematics can only describe it in a fragmented manner (unless we find its specific allomathematics). Deutsch's constructor theory of the universe has many subconstructors at all orders of magnitude, but causopermeability (causal permeability between the constructors of different orders of magnitude) always exists.

3. Some cyclic cosmology (too many candidates exist) is necessary in order to understand the Hubble volume function and the "Hubble volume field": a block universe which does not get a full definition if only the origin/center of a single Hubble volume is taken into account. Hubble volumes are infinite and overlapping, thus the Hubble volume field holds together a bigger universe, and the Everettian Hubble volume field even includes infinitely many different initial Hubble volumes. Infinite, because quantum field theory, which is closer to the ontophysical truth, isn't discrete. Quantum mechanics is ontologically continuous; only specific state changes are discretized, but gravity and many other phenomena are mostly based on intermediary transitions of states.

Keep the main idea of the Penrosean cyclic universe, but the phase transition can be described based on the ER = EPR formulas.

The pre-big-bang state was an omniblackhole: a maximally degenerate full universe without any free spacetime outside it. The only degree of freedom of an omniblackhole universe is to expand according to official Big Bang theory (any perturbation disturbs its ground state, and that happens immediately). Information (particle properties we can directly detect) is not the deepest foundation of our Universe's substantiality. We need the Everettian preinformational quantum jitters in order to have an "Everettian constructor" that creates a better construction, though one fundamentally without causal closure (see David Tong: the deeper we look into quantum field theory, the more complex the patterns we get). Without causal openness (= ontoaxiomatic Everettism) we don't have self-causation, and no ontoaxiomatic system (physical world) can exist without self-causation. (God would still require self-causation, but also the self-causation of personhood and the involvement of personhood in cosmogony; that is wrong, because personhood is the result of thinking, and even idealized topological personhood-yielding computers (brains which yield minds) would require spacetime: time to think (otherwise we have a permanent structure which doesn't meet the criteria of substantiality, because it cannot work around the problems of axiomaticity) and volume for their connectome. Topology doesn't have a proper measure, but for specific proceduralities (see topological quantum field theory) subroutines are created in topological systems, and comparison among them within a common system is possible; thus size gradually and procedurally makes sense even in specific topological systems which exhibit specific proceduralities that give limits and ranges to their subroutines.)

The ER = EPR correspondence can be used for the preinformational correspondence of a parental omniblackhole universe, which expands into infinitely many Everettian indiscrete offspring universes.

The initial Hubble volume of our local universe-strain (Everettian subuniverse) wasn't a singularity, because singularities don't meet the criteria of substantiality: they have no spatiotemporal probabilistic range in which to be exhibited. The omniblackhole started as an "omnisingularity" only in the sense that the overall universe was maximally degenerate spacetime; a true singularity or omnisingularity is physically impossible, being anti-Everettian and anti-ontoaxiomatic (without a subinformational connectome; incontextual). Thus the homogeneity, the isotropy and the slight energy variations (due to subinformational quantum jitters) can be explained. That omniblackhole had only the maximally degenerate region of a black hole, without the other components of common astrophysical black holes. Black holes have a pace threshold of enlargement due to "mass enlargement inertia" per size; modern astrophysics doesn't have the particular formula. A faster rate of enlargement is impossible for a given black hole size (due to: a. black hole enlargement inertia and b. maximal degeneracy pressure, the maximal degeneracy pressure of spacetime being a final threshold which only tunnels matter outside the event horizon), and the extra mass is jetted and also radiated out omnidirectionally. The omniblackhole had no external spacetime, thus it big-banged: standard Big Bang cosmology (with all its Everettian alternative universe-strains) starts.
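On the "pace threshold of enlargement": standard astrophysics does have a closely related bound, the Eddington limit, at which radiation pressure on infalling plasma balances gravity and excess material is driven off rather than accreted. A minimal sketch under the conventional assumptions of hydrogen plasma and roughly 10% radiative efficiency (the 10^8-solar-mass example is illustrative):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_P = 1.673e-27      # proton mass, kg
SIGMA_T = 6.652e-29  # Thomson scattering cross-section, m^2
M_SUN = 1.989e30     # solar mass, kg

def eddington_luminosity(mass_kg: float) -> float:
    """L_Edd = 4 * pi * G * M * m_p * c / sigma_T, for hydrogen plasma."""
    return 4 * math.pi * G * mass_kg * M_P * C / SIGMA_T

def eddington_accretion_rate(mass_kg: float, efficiency: float = 0.1) -> float:
    """Mass growth rate at the Eddington limit, assuming ~10% radiative efficiency."""
    return eddington_luminosity(mass_kg) / (efficiency * C**2)

m = 1e8 * M_SUN  # an illustrative supermassive black hole
print(f"L_Edd = {eddington_luminosity(m):.2e} W")
print(f"dM/dt = {eddington_accretion_rate(m):.2e} kg/s")
```

Because the limit scales linearly with mass, Eddington-limited growth is exponential with an e-folding (Salpeter) time of tens of millions of years, which is why the formation speed of the earliest supermassive black holes is a genuine open problem.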

After the Big Bang the Hubble volume grows, but not permanently at the maximal rate of expansion, because newly generated spacetime = field energy (initially preinformational jitters, and then some informational particles originating from them) consumes expansionary energy: it uses that energy in order to be created.

The universe continues to expand due to the ontoaxiomatic workaround: it has to avoid the problems of axiomaticity but cannot, thus its definition (= ontological size) is always entropic/nonstationary. We don't have the formula which treats energy, matter, dark energy and dark matter as variables and the large-scale flatness of the universe as a given (the core of the definition of the universe). We absolutely need that formula, otherwise we cannot understand many things in cosmology (here: the phase transition between a parental dying universe and the Everettian offspring big bangs; only the Big Bang of our local universe is capitalized).
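The nearest thing standard cosmology offers to this formula is the flat-universe Friedmann machinery: with Ω_r + Ω_m + Ω_Λ = 1 today, E²(a) = Ω_r a⁻⁴ + Ω_m a⁻³ + Ω_Λ gives the density fractions at any scale factor a, and flatness guarantees they sum to one at every epoch. A minimal sketch; the Planck-like parameter values are illustrative assumptions:

```python
def omega_fractions(a: float,
                    omega_m0: float = 0.315,
                    omega_r0: float = 9.2e-5):
    """Density fractions of radiation, matter and dark energy at scale factor a,
    assuming exact spatial flatness (so the fractions sum to 1 at every epoch)."""
    omega_l0 = 1.0 - omega_m0 - omega_r0               # flatness fixes dark energy today
    e2 = omega_r0 / a**4 + omega_m0 / a**3 + omega_l0  # E(a)^2 = (H/H0)^2
    return (omega_r0 / a**4 / e2,   # radiation fraction
            omega_m0 / a**3 / e2,   # matter (incl. dark matter) fraction
            omega_l0 / e2)          # dark-energy fraction

# Radiation era -> matter era -> dark-energy era, from one constraint:
for a in (1e-4, 1e-2, 1.0, 100.0):
    r, m, l = omega_fractions(a)
    print(f"a={a:g}: Omega_r={r:.3f} Omega_m={m:.3f} Omega_L={l:.3f}")
```

The design point is that flatness acts exactly as the text demands, as a given rather than a variable: once it is imposed, the whole history of the percentages follows from the present-day values.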

After the Big Bang the local universe initially has an expanding central Hubble volume (because the fields consume expansionary energy, according to the aforementioned formula and other field formulas we don't have nowadays). But when the universe gets diffuse enough (we need a function to describe the expansionary changes), the fields consume less and less dark energy. Then either the universe immediately big-bangs (which is wrong, because that theory doesn't create an isotropic, homogeneous universe with the exact energy variations), or, correctly (this is the correct version), the Unruh radiation of the initial Hubble volume field and the Big Rip impact on the fields create enough energy per volume to create the omniblackhole = omniblackhole universe. (The Hubble volume field's Unruh radiation always exists; nowadays it is negligible. When the Hubble volumes become marble-sized, they produce omniblackhole-threshold energy and an omniblackhole is formed, which big-bangs, because big-banging is its only degree of freedom: perturbations/jitters are immediate and it cannot stay at ground level.) The omniblackhole phase isn't immediate, because a very old universe isn't very homogeneous. Enormous black holes and, by comparison, regions deemed gaps are formed; the compressed black holes tunnel energy to the less dense regions, and that turbulent pre-big-bang era prepares the perfectly homogeneous, isotropic omniblackhole with the correct jitter variations, which Everettianly (creating all its wavefunctional offspring) big-bangs.

create page: Hubble volume field = the infinite universe defined via (mutually) overlapping Hubble volumes (Everettian alternatives are introduced with increasing distance)
Causality is very important in physics. The present "sole Hubble volume" article doesn't cover the topic of the Hubble volume field. The main view is that infinitely many intermediary Hubble volumes exist (partial causal link + decoherence), but more and more variation is permitted with distance. That generates an observer-centric bulk universe with fewer Everettian alternative arrangements close to the arbitrary observer (spacetime can have many alternative arrangements before something interacts with it; that something doesn't have to be a thinker/person; it works the same with particles), arrangements which increase in number the further we diverge from the arbitrary Hubble volume origin.
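For the page's central quantity: a single Hubble volume is conventionally the sphere of Hubble radius R_H = c/H₀, with volume V = (4π/3) R_H³. A minimal sketch; the H₀ value is an illustrative assumption:

```python
import math

C = 2.998e8   # speed of light, m/s
H0 = 2.2e-18  # present Hubble rate, s^-1 (~68 km/s/Mpc), illustrative

hubble_radius = C / H0                            # ~1.4e26 m, about 14.4 billion light-years
hubble_volume = 4 / 3 * math.pi * hubble_radius**3

print(f"Hubble radius: {hubble_radius:.2e} m")
print(f"Hubble volume: {hubble_volume:.2e} m^3")
```

This gives a volume of order 10^79 m³ per Hubble volume; the "field" described above is then the tiling of an infinite space by overlapping spheres of this size, one centered on every point.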