
Original Article

J. Phys. Astron., Volume 11(8), DOI: 10.37532/2320-6756.2023.11(8).373

The Importance of Pre-Material Quantum Information in Addressing Physics' Limitations: Uncovering the Numerous Manifestations in Our Material Universe of a Neglected Quantum Scenario

*Correspondence:
Carles Udina i Cobo
Independent Researcher, Spain,
E-mail: cucobo@gmail.com

Received date: 18-August-2023, Manuscript No. tspa-23-110606; Editor assigned: 20-August-2023, Pre-QC No. tspa-23-110606 (PQ); Reviewed: 28-August-2023, QC No. tspa-23-110606 (Q); Revised: 30-August-2023, Manuscript No. tspa-23-110606 (R); Published: 31-August-2023, DOI: 10.37532/2320-6756.2023.11(8).373

Citation: UDINA C. The Importance of Pre-Material Quantum Information in Addressing Physics' Limitations: Uncovering the Numerous Manifestations in Our Material Universe of a Neglected Quantum Scenario. J. Phys. Astron. 2023;11(8):373.

Abstract

Information technology has revolutionized our civilization, demonstrating that information management allows for the simulation and control of any imaginable process, whether mechanical (automatisms), biological (genetic information), or behavioral-cognitive (information and psychic faculties). Yet physics continues to ignore its own underlying information. We will enumerate the multiple physical phenomena that require such pre-material underlying information, its full empirical justification, its complete consistency, and the paradoxes and contradictions that it resolves, such as those of Relativity and the Big Bang (BB). This implies that quantum physics is a scenario whose only components are the underlying Information and Energy (photons), entirely independent of all the material physics of the Universe, since it predates it. The same applies to Space (delocalization) and Time, as both are emergent characteristics of the materialization of the Universe.

Keywords

Quantum physics; Big bang; Relativity

Introduction

This article is based on [1], which presents a new testable and consistent hypothesis. This hypothesis reinterprets relativistic and non-Newtonian phenomena by relying on the necessary underlying quantum information, thereby implying the existence of an empirical and informational-virtual fifth dimension. It also seeks to redefine the concept of the photon and offers an explanation for the "c" limit.

Additionally, this article incorporates summaries from three sources: [2], which provides a summary and recension of [1]; [3], which offers definitions for delocalization ("non-locality"), immediacy ("action-at-a-distance"), "massless energy" (the photon), and "singularity," discusses the necessity of pre-material quantum information, and redefines the controversial concept of the ether; and [4], which discusses the necessity of information and its transfer between different systems for any "unification" in physics, suggests the extension of these principles to the biological and psychic realms, and addresses common misunderstandings in physics concerning time.

All of these sources derive from [1], which the author recently published as a preprint on viXra [5]. As a result, this article will refer to them repeatedly, given the complexity of explaining a new and different paradigm. The aim is to present the information concisely, minimizing the use of parentheses whenever possible.

First ignored manifestations of underlying or quantum or pre-material information in physics

The first need for underlying information arose from "MAXWELL's Demon" (1867) [6], which was finally solved by SZILÁRD (1929) [7]. Yet even today it is not recognized as the informational need it is [3].
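As a minimal numeric illustration of SZILÁRD's resolution (standard physics, added here as an example; it is not part of the original argument): acquiring or erasing one bit of information carries a minimum thermodynamic cost, which is what exorcises the Demon.

```python
import math

# Szilard (1929): processing one bit of information costs at least
# k_B * T * ln(2) of energy, restoring the Second Law against
# Maxwell's Demon.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def szilard_cost_per_bit(temperature_kelvin: float) -> float:
    """Minimum energy (in joules) to process one bit at temperature T."""
    return K_B * temperature_kelvin * math.log(2)

if __name__ == "__main__":
    # At room temperature (300 K) the cost is ~2.87e-21 J per bit.
    print(f"Cost per bit at 300 K: {szilard_cost_per_bit(300):.3e} J")
```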

The next need appears with radioactive decay, discovered by BECQUEREL (1896) [8] even before relativity. The phenomenon was given priority study at the beginning of the 20th century and earned several Nobel Prizes.

But surprisingly, no one ever asked how such a simple law as that of decay could hold across so many millions upon millions of radionuclides. Even at the end of the 1970s, when I worked in nuclear medicine with regional Cerebral Blood Flow (rCBF), and in the 1980s, when I made important contributions in radiation protection [9,10], nobody could ever answer my basic question: "How is it possible that a radionuclide, among so many, knows how to wait until it must transmute?" The only answer I received was, "Calculate and shut up." Any hypothesis based on physical parameters, such as temperature, mass, density, dilution, or pressure, is useless. Later, in "Personal Heuristics," I explain that information is the only satisfactory answer.
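A minimal sketch (added here, not from the original article) of the regularity the question points at: if each nuclide transmutes independently under a fixed, memoryless per-step probability, the simple exponential law N(t) = N0·exp(-λt) emerges for the whole population.

```python
import math
import random

# If each of N radionuclides transmutes independently with a fixed
# probability per time step (a memoryless rule), exponential decay
# emerges for the population as a whole.
def simulate_decay(n0: int, lam: float, dt: float, steps: int) -> list[int]:
    n = n0
    history = [n]
    p = 1 - math.exp(-lam * dt)  # per-step transmutation probability
    for _ in range(steps):
        n -= sum(1 for _ in range(n) if random.random() < p)
        history.append(n)
    return history

if __name__ == "__main__":
    lam = 0.1  # decay constant, 1/s
    hist = simulate_decay(n0=100_000, lam=lam, dt=1.0, steps=20)
    # Compare simulated survivors with the analytic N0 * exp(-lam * t).
    for t in (0, 10, 20):
        print(t, hist[t], round(100_000 * math.exp(-lam * t)))
```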

The next quantum manifestation appeared at the end of the 19th century with the work of POINCARÉ and LORENTZ [11] on what was to become known as "Relativity." Because of their seriousness and honesty, and above all humility, they accepted that they could not give a reasonable explanation. It must also be understood that at that time, information as we know and apply it today was totally unknown.

As explained summarily in [2] and extensively in [1], what the LORENTZ Transformation derived from the MAXWELL Equations shows is not a "dilatation" at all, but the intervention of an underlying, pre-material or quantum Information Processing Time, superimposed on and inserted into the material Time measured by clocks, much like the usual and natural alternation of sleep and wakefulness.

Its pre-material character makes it undetectable to material phenomenology, clocks included, so that this new Time, variable according to the speed of each system, can only be deduced by comparing the times of the clocks in two systems at different speeds.

The intermittent actions of this Processing Time are like unconsciousness while we sleep: infinitesimal fractions of "sleep" during which the "control" of the dynamics can intervene, and which are logically more frequent the faster the system moves. This is why the faster a system goes, the more Processing Time intervenes, and the further its clocks fall behind those of a slower system with fewer quantum control requirements.
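As a purely illustrative sketch of this hypothesis (the "frozen fraction" formula below is my assumption, chosen so the example reproduces the observed lag; it is not a formula stated in [1]):

```python
import math

# Illustrative sketch of the Processing Time hypothesis (assumed
# formulation): suppose a system moving at speed v spends a fraction
# (1 - 1/gamma) of every external interval in pre-material
# "processing", during which its material dynamics (clocks included)
# are frozen. Its clock then logs only 1/gamma of the interval --
# the same lag usually read as "dilation".
C = 299_792_458.0  # speed of light, m/s

def gamma(v: float) -> float:
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def clock_reading(external_seconds: float, v: float) -> float:
    g = gamma(v)
    processing = external_seconds * (1 - 1 / g)  # frozen fraction
    return external_seconds - processing         # material time logged

if __name__ == "__main__":
    v = 0.6 * C
    print(clock_reading(100.0, v))  # ~80 s: the moving clock lags
```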

As previously mentioned, it was entirely impossible for such reasoning to be presented at the outset of the 20th century. It is therefore understandable that neither POINCARÉ nor LORENTZ nor the physicists of that era could conceive an explanation based on information, especially given that other non-Newtonian phenomena (such as nuclear and atomic phenomena) were still 20 years away.

Furthermore, they were associated with a new "mechanics," namely "quantum mechanics," a name that would prove to be significantly misleading, as we will discuss, since information is virtual, whereas mechanics deals with the tangible. Quantum has no connection to mechanics.

In the face of any new and unexplained phenomenon, generating hypotheses or conjectures to explain it is easy. What proves challenging is ensuring that these hypotheses do not contradict existing knowledge and that they remain consistent, or even more challenging, that they align with new discoveries. Consequently, faced with the absence of hypotheses, Einstein introduced his well-known hypothesis (his "Postulates") in 1905.

Surprisingly, despite its numerous contradictions, as we will explore, this hypothesis managed to establish itself, especially leveraging EDDINGTON's contentious experiment in 1919. It did so without the openness to consider alternative hypotheses, which should be inherent in scientific research.

By imposing itself, it obscured the most profound quantum manifestation in the realm of material physics: the Processing of quantum information, which introduces a new necessity for controlling material dynamics, represented by the aforementioned LORENTZ Transformation. So, where do we find the intrinsic Laws of physics that we strive to model with mathematical formulas as accurately as possible? Where is the causality of physics?

Even during my school days in the 1950s, I recall the emphasis placed on the imperative fidelity of measurement standards, specifically the constancy of their units, like the "Mètre" bar in Paris, whose alloy exhibited minimal temperature-induced expansion. The Cesium atomic clocks, introduced by Louis ESSEN in 1955 (ironically, one of the staunchest opponents of Einstein's postulates), further exemplify this notion: a Unit must, by definition, remain invariant.

However, with Relativity and its alleged "dilations" and "contractions," we encounter a concept that stands in stark contrast to the traditional understanding of a Unit, as it becomes subject to modification by the velocity of the system. In coherence, one might expect a new name to be assigned to such a concept, but never that of the unchanging "Unit."

To obfuscate this perplexing induced polysemy, it is incorrectly asserted that "Time dilates" with increasing velocity, omitting the fact that, on the contrary, it is the Unit of time that dilates, causing the measured time to contract or expand (a fundamental tensor characteristic underpinning the covariance vs. contravariance dichotomy between base and coordinates). From this point onward, tensorial "tricks" become commonplace in the theory of Relativity.
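A tiny numeric illustration of this covariance/contravariance point (standard tensor arithmetic, added here as an example):

```python
# If the *unit* dilates by a factor gamma, the *count* measured with
# that unit contracts by 1/gamma for the same underlying interval.
interval = 120.0          # some fixed physical interval, in seconds
unit = 1.0                # original unit of time
gamma = 1.25              # the unit "dilates" by this factor

count_before = interval / unit            # 120.0 units measured
count_after = interval / (unit * gamma)   # 96.0 units measured

print(count_before, count_after)  # the measure contracts as the unit dilates
```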

Such gibberish leads us to recall Heinrich Rudolf Hertz's wise criterion: "When conceptual contradictions are eliminated in science, our minds, free of bewilderment, will cease to pose spurious questions." As long as this condition remains unmet, the current state of affairs in physics is comprehensible. Hertz's statement, which is entirely psychological and linguistic in nature, as discussed in the "Abstract" of [3], is extensively explored in the "Information Systems and Theory of Knowledge" section of my website [12].

The fallacy (ad populum, ad verecundiam) of the alleged "dilatation"

Given all of the above, the following aside is necessary [1].

It is an empirical fact, proven since HAFELE-KEATING in 1971 [13], that in a faster system time passes more slowly, and its clocks are slowed down with respect to those of a less fast system. This phenomenon is correctly represented by the LORENTZ Transformation. However, its justification, "since its time (I insist, strictly its Unit of Time) dilates," is only one possible hypothesis, never proven, and one that presents contradictions both with many other phenomena and within itself.
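For scale, a back-of-envelope figure for the empirical lag (standard arithmetic, added here for illustration; it quantifies the observed delay itself, independently of which hypothesis explains it, and omits the gravitational contribution):

```python
# Order of magnitude of the Hafele-Keating velocity effect alone.
C = 299_792_458.0

def velocity_lag_ns(v: float, flight_seconds: float) -> float:
    """Clock lag in nanoseconds from the velocity term alone."""
    gamma_minus_1 = v**2 / (2 * C**2)  # first-order expansion of gamma
    return gamma_minus_1 * flight_seconds * 1e9

if __name__ == "__main__":
    # A jet at ~250 m/s flying for ~48 hours yields a lag of ~60 ns.
    print(f"{velocity_lag_ns(250.0, 48 * 3600):.1f} ns")
```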

The term "within itself" refers to the "symmetrical" behavior proposed by Relativity when comparing time between two systems at different speeds. This includes the paradoxical notion that if a clock of system A is slowed down with respect to B, the clocks of B are also slowed down with respect to those of A. On this topic, please refer to [1]: page 5 for the argumentation of Ian McCausland and page 11 for that of Miguel Alcubierre. The Addendum on page 17 details the inadmissible logical contradiction implied by this "symmetry." This is why GÖDEL sent Einstein the poisoned gift of his "rotating universes," which physics has since reinterpreted as a simple curiosity, unlike the disaster that GÖDEL's Theorems meant for mathematical formalism.

By asserting this hypothesis perhaps millions of times, it has become a fallacious truth (Goebbels [14]). What's worse, I insist, it also censors the proposal of any other more accurate hypothesis that could be argued in its place.

Strictly speaking, while clocks do fall behind, it has never been confirmed that this delay is due to the alleged dilation. Every time a new experiment observes such a delay, it is automatically, but erroneously, claimed that dilation is confirmed. No: only the delay is confirmed, never the dilation.

The first quantum recognitions in physics: the "wave-corpuscle duality", entropy, the quarks

Parallel to the development of the Theory of Relativity, the study of atomic structure in the 1920s gave rise to what we now refer to as "Quantum Mechanics." Despite today's exclusive use of the term "Quantum," as previously mentioned, the inclusion of "Mechanics" was an unfortunate choice, contrary to HERTZ's criteria. This choice has hindered the incorporation of fundamental information into physics and made it challenging to detach quantum from the realm of material mechanics. This independence, at the very least, will intrigue the reader and will be explored further below.

This phenomenon is known as "wave-corpuscle (or particle) duality" (DE BROGLIE, 1924) [15]. Here, information was once again overlooked due to the exclusively materialist-realist context of that era, disregarding the informational-virtual aspect. On one hand, it is not immediately apparent that the "wave" should be associated with information, representing the manifestation of underlying quantum pre-material information ("the electron does not vibrate; the electron is the vibration"). On the other hand, there is confusion between "being" and "seeming" concerning "particles" or "corpuscles," a confusion that is detailed in [3]. Even more concerning, as explained on page 4 of [3]:

• Contrary to HERTZ's criterion, modern science often confuses the real with the virtual, even though the distinction was clear in the 19th century.

• There is a frequent mix-up between reality and the models used to represent it.

Therefore, it is understandable that during the same period, SZILÁRD's solution to "Maxwell's Demon" through the relationship between entropy and information was completely disregarded.

In the same vein, underestimating the power of information processing and, contrary to HERTZ's criterion, confusing the virtual with the real, were the attempts to "confine" the supposed "Quarks" as genuine components of Hadrons, which preoccupied researchers during my years of study at the University. They are not "confinable" because they are not "matter"; they are virtual, not real, and they are the effects of information, as discussed in [3], with Georges SARDIN's strong final remark.

Personal Heuristics

In addition to my work in 1979, involving the trivial simulation of Xenon 133 dilution in regional Cerebral Blood Flow (rCBF), my professional dedication shifted away from physics and radiation protection. This change was prompted by my successful resolution of the existing deficiencies in Catalonia and Spain, where international legislation had been ignored [9] and [10].

Therefore, starting from the mid-1980s, I focused on various powerful computer simulations, including the UNESCO MAB Programme Scenarios [16], environmental pollution, and cognitive processes simulations (which encompassed the resolution of LEIBNIZ's "Characteristica Universalis" and "Mathesis Universalis") [12].

In light of the aforementioned developments, I quickly realized that the answers to the increasingly complex phenomena in physics lay within the overlooked underlying information. Specifically, my inquiry revolved around how nuclides determine their transmutations, a knowledge that must be shared by all of them through a common information system. This realization prompted my departure from the field of physics to dispel preconceived notions and gain a more accurate informational perspective on physics.

Finally, as the new century began, I understood that processing the essential underlying information, necessary for controlling material dynamics and adhering to the intrinsic Laws of physics, could dispense with the problematic "Dilation." If the processing time of this information were interleaved with the fixed material time of clocks, it would momentarily "freeze" material dynamics while it acts. This effect is what leads to misattributing the clock discrepancies of the fastest systems to the supposed "Dilation." Moreover, this hypothesis aligned with the observed "hemi-symmetry" of time in systems moving at different speeds, contradicting the principles of Relativity: if the clock of system A advances relative to the clock of system B, then the clock of system B lags behind that of system A, leaving no room for the symmetric claim.

These initial breakthroughs in my work, which resolved multiple issues simultaneously (such as the previously absurd and unreal "symmetry" in relativistic comparisons, the fixed, non-variable unit, and the restoration of causality through informational control), motivated me to seek out new phenomena that could only be explained by the informational hypothesis. The reader can undoubtedly imagine these phenomena, which we will explore shortly.

Determinism

Perhaps Einstein was at his best when he employed his common sense rather than his imagination. This is evident in his concept of "Determinism," famously expressed as "God does not play dice," although the reference to God's involvement in science is entirely inappropriate.

Quantum indeterminism is not absolute. Lacking humility, we justify our current lack of understanding of quantum phenomena by labeling it "indeterminism." For instance, HEISENBERG's "Principle of Indeterminacy" (more incorrectly described as "uncertainty," as explained in [1]) is not truly indeterminate. It merely reflects infinitesimal fluctuations between the fixed material time of clocks and the overlapping processing time periods that control dynamics. Within each infinitesimal period, there is no inherent "indeterminacy" in its associated variables. Another paradox resolved.

Precisely because of these control periods devoid of material consciousness, the trajectory and velocity of any particle are perfectly determined. Without this determination, it would be impossible for something as minuscule as an atmospheric muon to ascertain its trajectory, speed, or its inevitable "death" [1, 2, 4].

However, the concept of indeterminism primarily stems from our macroscopic perspective, which struggles to discern individual microscopic physical particles, especially within time frames approaching the Planck Unit. Quantum computing promises to expedite research in this direction much more efficiently than our current capabilities allow.

The intractability of the "n-body problem" has persisted for many years, and even with our powerful computers today, it remains unsolved. As a result, representing the macroscopic outcomes, which result from complex interactions at the microscopic level, through mathematics is an arduous task. Attempting to explain something as simple as an infection solely based on particle interactions is unrealistic. Therefore, we should not assume these interactions to be "indeterminate."
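An illustration of the combinatorial growth behind this intractability (standard arithmetic, added here as an example):

```python
# A naive n-body force evaluation requires n*(n-1)/2 pairwise terms
# per time step, which grows quadratically with the particle count;
# this is why deriving macroscopic outcomes from microscopic
# interactions is so arduous.
def pairwise_terms(n: int) -> int:
    return n * (n - 1) // 2

for n in (10, 1_000, 1_000_000):
    print(f"{n:>9} bodies -> {pairwise_terms(n):>15} pair interactions per step")
```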

The most recent phenomena: "Verschränkung" or "Entanglement", quantum computing, etc.

Almost 90 years have passed since the concept of "Verschränkung" or "Entanglement" was introduced. Almost 60 years have passed since Bell's work in 1964, and articles continue to proliferate about his inequalities, hidden variables, and the claim that, despite his theorem, Einstein was not wrong. However, the most crucial aspect of Bell's Theorem is that it necessitates some exchange of unknown information, neither material nor temporal. Experiments since 1972, conducted by Clauser, Aspect, Zeilinger, Gisin, Kaiser, Jacques, and many more, have repeatedly confirmed this. This is the only thing that matters. Therefore, I refer to the detailed explanations in [3], emphasizing that the visceral rejection of information in physics has made us wait 50 years, half a century, to award a Nobel Prize for such a transcendental verification.
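The quantitative core of Bell's result can be shown in a few lines (standard CHSH arithmetic, added here as an illustration; the reading that the gap demands an exchange of non-material information is the article's):

```python
import math

# CHSH combination for the singlet state, where E(a, b) = -cos(a - b).
# Any local, pre-agreed scheme is bounded by |S| <= 2, but quantum
# correlations reach 2*sqrt(2) at the optimal analyzer angles.
def E(a: float, b: float) -> float:
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S, 2 * math.sqrt(2))  # ~2.828 in both cases, above the local bound of 2
```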

Even with Quantum Computing, we often fail to consider the role of information. In the same article [3], I must emphasize my significant reflection:

"A capricious and pre-scientific God is assumed to have placed these impressive calculation capacities in particles, rendering them utterly useless for 14 billion years until humanity, made in their image and likeness, discovered them and began to utilize them."

Despite:

• the "tiny" Polariton;

• the significant work of B. R. Frieden;

• the notion that "Information is the basis of Physics" (Wheeler);

• intrinsic neutrino changes ("oscillations") without external interaction;

physics "censors" information, even though physicists have been using computers for as long as computers have existed.

Demonstration of the Processing Time Hypothesis

Is anything more needed than what has been mentioned above? Well, as quoted in [1] on page 11, one encounters demands such as: "If you can design an experiment where your model correctly predicts one outcome and the SR predicts a different and incorrect outcome, it would be worth studying."

The aforementioned fallacies of dilation predispose the unconscious mind to misunderstanding while reading. In no case is it a question of modifying the experimental data, i.e., the millions and millions of times observed since HAFELE-KEATING. It is simply that the observed times are not exclusively material times whose unit dilates; rather, the observed times are the result of two components:

• Clocks whose unit does not dilate,

• Plus the emergence/contribution of a variable time, depending on the speed of each system, due to the processing of the underlying pre-material or quantum information, which is misinterpreted as a dilation of the unit of material time.

But this, which contradicts nothing empirically proven, implies a transcendental theoretical change: the use of "dt" as the integrating differential is wrong; it invalidates all current equations, since integration must be carried out with respect to two variables, material time and processing time. It also implies the derived inconsistency of the 4-dimensional geometric structure of "Space-Time," since reality responds to a geometry of five dimensions: three of Space and two of Time.
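One possible formalization of this two-variable integration (my sketch, not taken from [1]; the proportionality of the processing contribution to the Lorentz factor is an assumption chosen to reproduce the observed clock lag):

```latex
% Sketch of the two-component integration (assumed formalization):
% the observed time splits into a material part and a processing part.
\begin{align}
  t_{\mathrm{observed}} &= \int dt_{m} \;+\; \int dt_{p}
  \intertext{and if, for example, the processing contribution grows with the Lorentz factor,}
  dt_{p} &= \bigl(\gamma(v) - 1\bigr)\, dt_{m},
  \qquad \gamma(v) = \frac{1}{\sqrt{1 - v^{2}/c^{2}}},\\
  t_{\mathrm{observed}} &= \int \gamma(v)\, dt_{m}.
\end{align}
```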

What certainly cannot be asked of me, much less demanded, is that I rewrite all the physical theory of the last 100 years, starting from Maxwell's Equations and setting aside the four dimensions of Minkowski's "Space-Time," which inevitably affects all of relativity, special and general. That task falls to mathematicians, as Minkowski or Hilbert did in their time, with the ease of knowing what to do and what not to do.

It is clear that no specific experiment is needed to justify the validity of the Processing Time hypothesis, since all the data compatible with relativity are also compatible with this informational hypothesis. I insist that this approach avoids numerous contradictions that arise with the relativistic hypothesis, including those related to the Big Bang.

But even after clarifying the above, there is a proof that is easy to understand, if one wishes to explore it: placing an atomic clock on the Moon. With airplanes, due to simple Ptolemaic reasoning, the experiment was restricted to treating the Earth exclusively as the "static" system. On the Moon, such reasoning can no longer be justified; the Moon serves as a non-negligible "static" reference, just as valid as the Earth. Therefore, from the Moon, a reversed relativity will be observed: instead of "dilating," the unit of time will contract with respect to the Earth's, and the times observed on the Moon will be delayed compared to the "mobile" clocks on Earth.

This will also provide a second confirmation: the Lorentz gamma factor predicted by the current theory, besides being inverted as mentioned earlier (something unassumable by the theory of relativity), will be reduced by 1/81 due to what has been stated in [1], [2], and [4]. This factor is not a scalar but a function, inversely related to interacting masses.

It is also evident that in a binary star system, despite their respective rotation velocities, there would be no Lorentz factor differentiating between the two stars; it would be 1 in both cases.

We cannot continue to deny velocities greater than "c" by claiming that "annihilation photons are an exception." This applies to "Stückelberg Diagrams (misnamed Feynman's)," "the Polariton," etc. Additionally, particles, regardless of their observed external speed, have their proper-real speed associated only with real time, excluding their processing time. This supposes real-internal-proper speeds much higher than "c," which can modify the concept of relativistic mass.

For example, consider an atmospheric Muon traveling at a velocity close to "c" with a LORENTZ gamma factor of 20: its material time would be only one-twentieth of the total, the rest being the processing time required to control its high-speed interactions. Therefore, referred to its own material time, its own velocity would be close to 20c (twenty times "c"). Just as when we fall asleep on a train and the journey seems to have been shortened, the Muon will "perceive" that the space it travelled has contracted 20 times, allowing it to reach sea level without altering the duration of its material-real life.
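Worked numbers for this example (the lifetime and gamma = 20 are standard figures; the "proper-real speed" reading follows the article's hypothesis):

```python
# Worked numbers for the atmospheric muon example.
C = 299_792_458.0
LIFETIME = 2.2e-6          # muon mean lifetime, s
GAMMA = 20.0

range_naive = C * LIFETIME            # ~660 m without any correction
range_observed = GAMMA * range_naive  # ~13.2 km: enough to reach sea level

# Under the Processing Time reading: only 1/20 of external time is
# material, so referred to its own material time the muon covers the
# same path at an effective "proper-real" speed close to 20c.
proper_speed = range_observed / LIFETIME
print(round(range_naive), round(range_observed), proper_speed / C)
```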

What about kinetic energy? It is a good question to investigate, because if, instead of being referred to the observer's external velocity, it were referred to the particle's own velocity, there would be no increase of mass with velocity; kinetic energy would simply increase. It is significant that mass is not given in units of mass but in units of energy. Perhaps the mathematical formulation confuses magnitudes from different systems, mixing measurements specific to one system, such as own-real Time, with Time observed externally from another system.

What has just been explained for the Muon is valid for any other physical particle.

In the face of what is not understood, we can no longer resort to new hypothetical fundamental forces or new arbitrary dimensions, such as the recently proposed "fifth force" motivated by the Muon, the implicit force of the HIGGS field, the dimensions imposed by the "Theory" of strings, or the conjecture of the Field associated with the "Inflaton."

Everything can be explained by the existence of a single body of underlying information, represented analytically by the necessary fifth dimension. It is an informational manifestation without direct material support and, therefore, versatile, allowing it to encompass any representation, just as the psyche or a CPU can manage entirely different processes: psychomotor skills, emotions, knowledge, or automatisms in one case; graphics, texts, calculations, and many more in the other.

Finally, the substitution of Processing Time for Dilation and the use of the Lorentz gamma Function instead of the Factor are what allow the harmonization of the two Relativities, Special (SR) and General (GR), since it is known that they are incompatible with each other.

Conclusions on the Quantum Scenario

It is not necessary to seek a unification of Relativity and Quantum (the so-called Quantum Gravity), since Relativity in the material Universe is the most direct and clear consequence of Quantum, as set out in [3] and [4]. It is not necessary to "unify" a mother with her son.

The relation in our Universe between information and Electromagnetic (EM) Waves is discussed in [3]. Thus, the aforementioned waves of the "wave-particle duality," like the unconfinable Quarks, like the sought gravitational waves (always assuming their detection is beyond doubt), are all merely manifestations in our Universe of quantum Information. So, we have a single informational process which, through methodological transfers of information, originates the four possible scenarios in which we are immersed: the quantum, the physical-material, the genetic, and the psychic (what LEIBNIZ called "Mathesis Universalis"; see [4]).

In [4], the last transfer, between genetics and the psyche, is explained via the well-known instincts, as is the presumed appearance of life, the result of the transfer between physics and genetics.

The Quantum Scenario is characterized by two dimensions, Energy and Quantum or pre-material Information (-M, +S), which is the underlying Information for our material Universe. The Genetic information (+M, +S) is the underlying Information for life; and the Psyche (-M, +S) is the underlying Information of the behavior/mind. The parentheses (-M, +S) and (+M, +S) are explained in the introduction to [1].

In the cases of physics and the psyche (-M, +S), their informational symbolisms are not directly supported by anything material (hence the -M), but there is an important difference between them: the symbolisms/signals of the psyche end up being supported indirectly in matter, in the neurons of the nervous system, while the quantum symbolisms/signals have strictly no material support, neither direct nor indirect; let us say they are "purely" symbolic.

It does not matter how long the quantum scenario may have lasted before the appearance of the material Universe because, as has been said, in that quantum scenario our Time did not yet exist; there was only information and energy. Therefore, it can equally be said that:

• The Quantum Scenario existed only immediately prior to the beginning of the generation of the Universe (the alleged Big Bang); or that

• This scenario has always existed.


In either of the two cases, this implies neither creationism nor a-causality, and it requires no question of what came before. With this, the domain of physics has just been delimited.

The "Arrow of Time" only appears with the emergence of the material Universe, but I insist, with the two components already mentioned (Unraveling Time):

• The traditional Time of material dynamics (that of the clocks), and

• The still ignored Processing Time to control material dynamics.

To propose a backward movement in time is absurd, mere science fiction. Little could GÖDEL have imagined that such an absurdity would be turned around and presented as something conceivable.

Some Contradictions of the Big Bang

We are now going to explore the contradictions that the Big Bang theory poses and how they can be resolved with the Quantum Scenario and Processing Time hypothesis. For brevity, the reader is referred to the short addendum on page 18 of [1], which addresses the tautological-circular nature of quantum fluctuation, its creationism, and its a-scientific a-causality.

Perhaps the most absurd aspect of this "explosion" is the explosion itself, a subject I addressed in 2009 in a registered but unpublished document titled "The Holistic Theory (Physics Perspective)," spanning 153 pages in Catalan. In the section "An Explosion That Could Never Occur Due to Lack of Explosive," I argue that the sudden emergence of the mass of the Universe, if it adheres to the equivalence E = mc², requires an utterly unimaginable contribution of energy (as noted by DE PRETTO in 1903 and MEITNER in 1938).
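For scale (standard arithmetic added here for illustration; the ~1e53 kg figure for the ordinary matter of the observable Universe is an external, order-of-magnitude estimate, not a figure from the article):

```python
# Order-of-magnitude illustration of the mass-energy equivalence
# invoked above: even one kilogram corresponds to ~9e16 J.
C = 299_792_458.0

def rest_energy_joules(mass_kg: float) -> float:
    return mass_kg * C**2

print(f"{rest_energy_joules(1.0):.2e} J per kg")
# A commonly quoted order of magnitude for the observable Universe's
# ordinary matter is ~1e53 kg (external estimate, used only for scale):
print(f"{rest_energy_joules(1e53):.2e} J for ~1e53 kg")
```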

Furthermore, the extremely rapid inflation needed to encompass the entire observable Universe (and possibly the infinite total Universe) necessitates additional energies for practically infinite accelerations and velocities, far exceeding "c." After 14 billion years, we have still not received a significant portion of the photons traveling at "c" from the primordial Universe, and yet it supposedly expanded almost instantaneously, as if by magic. With delocalization, as explained in [3], we can examine why all of the above might be unnecessary.

How do we account for the infinite entropy associated with such an explosion? Proposing yet another "Principle" (as suggested by PENROSE [17]) to set entropy to zero is inadmissible. Can we simply manipulate the odometer? Where do these immense energies originate? From nothingness? Are they bestowed by a generous God? In comparison, the "Vacuum Catastrophe" would be inconsequential; whether we consider the "dark" aspects of the Universe or not, only two significant numbers matter. Only a preceding quantum pre-material scenario, as described earlier, with available energy, can answer these questions, and not even with as much energy as one might think since, as we will see, nothing truly "explodes."

It's as straightforward as Information (pre-material or quantum information) generating structures (materials) through energy, inherently implying zero entropy. Entropy manifests in irreversible processes, like an explosion, not in an economical construction design, which is the opposite of an "explosion."

No "dogma" is needed to manipulate the entropy or energy odometer; we just need to fundamentally shift the current erroneous paradigm. If we once moved away from anthropocentrism and later from Ptolemaism, now is the time to transition from "materialism-realism" to informational virtuality.

It's that simple. If delocalization, as explained in [3], is correctly understood, the concept of "inflation" becomes unnecessary since the quantum scenario, without spatial dimensions, can instantly generate (act-at-a-distance) any space of desired magnitude. Haven't we already spent 50 years with "Verschränkung" or "Entanglement" as a fully empirical fact?

Lastly, discussing the initial moments of the Big Bang (10⁻³³ seconds), when atoms have not yet formed (they appear after 3 minutes), in terms of material Time is an unjustifiable extrapolation (trillions of times before its time of appearance). However, this is resolved and comprehended through quantum immediacy, in a Quantum Scenario where the Time we experience does not yet exist, as also explained in [3], along with three other significant misconceptions about Time exposed in [4].

It is not enough to dress up inflation with the depiction of a bell and its harmonious roundness, or to speculate with the likes of the Inflaton. So, which is the correct hypothesis (based on reality and common sense, and devoid of contradictions), and which is the incorrect one that implies an erroneous theory and should be discarded? Without delving into the specifics of the Big Bang, even considering solely our knowledge of quantum physics, LEGGETT (Nobel 2003) [18] remarked: "...quantum mechanics vanishes and must be supplanted by another type of theory that we cannot even envision." (And to begin with, as already mentioned, this has no connection to mechanics.)

As DIRAC previously stated: "...Renormalizations are a mathematical construct, with no connection to physical reality."

All these contradictions cannot be resolved with mere touch-ups or cosmetics. They can only be addressed by a radically different theory from the current paradigms – one that is "intriguing," "revolutionary," or any other term you prefer. Undoubtedly, this theory of Processing Time in our Universe and the information from the preceding quantum scenario is that theory.

However, when you examine it closely, contrary to what may initially seem due to long-standing biases for over a century, it aligns perfectly with the principle of "OCCAM's Razor" like no other hypothesis. Information, because of its virtuality, versatility, and methodological applicability, elegantly resolves all these problems.

So, where lies the issue? As Francesco di CASTRI, an ecologist, creator of the MAB Program, and Vice-President of UNESCO, remarked to me in 1980, referring to our exceedingly grave environmental concerns back then:

"The real and definitive solution, the action that must penetrate to the very core of the problem, should pass through a comprehensive understanding (interdisciplinarity, holism) of what is occurring and should occur within our own coastlines and cities. This requires a deep and profoundly sincere comprehension of the obstacles and resistances that exist, not only around us but, most importantly, within ourselves."

References