The Dark Sector
The central part of the Coma Cluster of galaxies covers a roughly circular area about a degree and a half across (nine times the area of a full moon). The full cluster may in fact extend further, and numerous other galaxy clusters lie in the same area of the sky.
Dark matter allegedly works as a kind of "super glue" that keeps structures within and between galaxies intact.
The Coma Cluster is one of the richest galaxy clusters known. It contains as many as 10,000 galaxies, each housing billions of stars.
In the 1930s the Swiss astronomer Fritz Zwicky discovered that the galaxies in this cluster move too fast in relation to its known content. He was able to establish this because he captured a large number of galaxies in a single wide-angle photograph.
This discovery of major deviations in the behaviour of celestial bodies relative to gravity and Newton's second law - made by applying simple Newtonian equations (which assume the net force to be constant) to the force relationships of one of the universe's densest galactic populations - led to the suggestion that something had to be wrong with the galaxy cluster. With no questions asked about whether the method was sufficient for such a seriously complex context, it has left us with the idea of "dark matter" in a previously unknown abundance - on a scale out of all proportion to known matter - constituting the greater part of all matter in the universe.
Ever since this postulate, the brightest minds have tried to figure out what that matter could be.
It is a bit problematic to suggest the whereabouts of such a substantial mass of matter. Science has shown us that dark matter/dark energy cannot be caused by self-inflicted changes in the local matter - nor can it be regional transfers from different areas of space, as that would cause a collapse of space. And since it is not matter as we know it - with nuclei and electrons, protons and neutrons, and the electromagnetic, weak and strong nuclear forces - it is something else.
Numerous flaws have led to this (mis)understanding(?)
One consideration is whether the natural forces can support other atomic structures than those currently known - or whether the basic problem is in fact a misinterpretation of information derived from outdated scientific methods, because the actual universe they are explaining has fundamentally scaled out of all comparison with the tiny solar systems the theories were initially intended for.
The fundamental forces of the universe - gravity, the electromagnetic force, and the weak and strong nuclear forces - are the bonds that provide the universe with the power of expansion and viability, from the tiniest invisible relationships between matter and energy to the unimaginable cosmological contexts that we may or may not yet have understood or discovered.
We are struggling to get this general world view - a scientific heritage from classical science, more than 100 years old - to play along with what we can actually see has played out in space:
- We find that the energy balance does not add up
- nor is there enough matter in the world to account for what can be confirmed.
The question, though, is whether there actually is a lack of mass, a lot of other strange kinds of matter, or whether gravity simply works differently over large galactic distances than the applied theory assumes.
Two things can happen when we discover that reality is acting strangely according to what we know: we can change the way we think of reality and call for new stuff or new understandings - or we can call for new science.
One of the most interesting properties within a galaxy is the way sustainability is achieved. Depending on Newton's and Einstein's theories leaves us with no other option than to invent some new stuff. Dark matter.
Or we can take the dynamical view. Orbital cycles, angular momenta, and morphological and kinematical equilibria - like Schwarzschild at a higher scale - are more likely the kind of dynamics that straps down an initial part of matter to the highest efficiency of self-consistency over the aeons.
Distribution of the (primordial) dark matter and its role in matter formation
Within the physical standard model we can only think back as far as a point we call the singularity. The dominant natural force there was gravity, and it kept everything in that moment. But since it was before time, it was not part of this universe.
No other natural force apart from gravity could play any role - until the moment that this universe became, which we call the Big Bang.
But still, this is not how everything was made - only the event that defines the start of this universe. In fact, no one really has any good answer to how the singularity went through a momentary Big Bang and became this universe.
But if we fast-forward time to just past the Big Bang, a lot happens. Astrophysicists believe that a very dense mixture of protons, neutrons, photons, electrons - and other subatomic particles - filled the universe. The temperature by then was so extremely high that the electrons were unable to bind with the protons to form atoms.
Because the temperature is so high, all the particles are very energy-rich, and most photons at that time are gamma rays. The universe is still only 1/100 billionth of a trillionth of a second old and 10 trillion degrees hot - big as a house and lit up with extremely bright sapphire-blue light, but with no visibility whatsoever, because of all the electrons roaming freely in space, expelled from ions.
Instead, all the particles scattered off each other at high rates, keeping all the different species at the same temperature - in thermal equilibrium - with each other. The photons also scattered off the electrically charged protons and electrons to such a degree that they could not travel very far.
As the universe expanded, the temperature dropped to about one billion degrees Kelvin (K). At that point, the protons and neutrons began to bind together to form atomic nuclei. At roughly 390,000 years after the Big Bang, the continued expansion of the universe and subsequent cooling had dropped the temperature of the universe to about 3000 K.
By that point, all the electrons and protons had bound to form electrically neutral hydrogen atoms, and all the other charged particles had decayed. As the primordial hydrogen formed, the universe became transparent to photons, so much so that they have been travelling throughout it ever since.
These relic photons from the early universe have microwave wavelengths and are known as the cosmic microwave background, or CMB. Given that space has expanded enormously since then - starting with an exponential inflation within the first fraction of a second - the electromagnetic radiation we monitor today is stretched along with space, in what we see as the cosmic background radiation.
Density fluctuations, dark matter infrastructure and the cosmic microwave background
Before the neutral hydrogen formed, matter was distributed almost uniformly in space - although small variations occurred in the density of both normal and dark matter because of quantum mechanical fluctuations. Gravity pulled the normal and dark matter in toward the centre of each fluctuation, thus founding the cosmic web that matter has been following ever since.
While the dark matter continued to move inward, the normal matter fell in only until the pressure of photons pushed it back, causing it to flow outward until the gravitational pressure overcame the photon pressure and the matter began to fall in once more.
Each fluctuation "buzzed" in this way with a frequency depending on its size. This constant pull in and out had a strong influence on the temperature of the normal matter: it heated up when it fell in and cooled off when it flowed out.
The dark matter, which does not interact with photons, remained unaffected by this effect. When the neutral hydrogen formed, areas into which the matter had fallen were hotter than the surroundings. Areas from which matter had streamed out, by contrast, were cooler.
The temperature of the matter in different regions of the sky - and the photons in thermal equilibrium with it - reflected the distribution of dark matter in the initial density fluctuations and the "buzzy" normal matter.
This pattern of temperature variations was frozen into the cosmic microwave background when the electrons and protons formed neutral hydrogen. So a map of the temperature variations in the CMB traces out the location and amount of different types of matter 390,000 years after the Big Bang.
The American physicists Ralph Alpher, Robert Herman, and George Gamow predicted the existence of the CMB in 1948. Seventeen years later, Bell Labs scientists Arno Penzias and Robert Wilson detected it.
Initial measurements showed the intensity of the relic photons to be constant across the sky to a fraction of 1 percent. In the early 1990s, however, NASA's Cosmic Background Explorer (COBE) spacecraft used differential microwave radiometers to measure differences between relic photons from two points in the sky to about one part in 100,000.
A subsequent spacecraft, the Wilkinson Microwave Anisotropy Probe (WMAP), made an even more precise map. This revealed hot and cold spots about 1.8 degrees in size across the sky that vary in intensity by a few parts per million.
The angular size and the extent of variation indicate that the universe contained about five times as much dark matter as normal matter when the neutral hydrogen formed. Combined with measurements of supernovae and the clustering of galaxies, this indicates that dark energy comprises 73 percent of the universe, dark matter 23 percent, and normal matter just 4 percent.
Mathematical formalisms or calculations?
The world becomes in our understanding: when Newton's gravitational force makes sense, it is because we can prove mathematically that events in the gravitational field are not only probable but also universal, thus qualifying gravity for the class of universal fundamental forces.
But with the same kind of mathematical skills - and other precautions - one can also prove that gravity in Newton's mechanical universe is not the only plausible explanation for a body following a particular path when it "falls": the principle of least action, where a body follows the path that is most energy-stable, applying the same physical constants.
Instead of introducing a gravitational constant, the effect is calculated as the integral of the kinetic energy minus the potential energy over the time interval - and the results are equivalent to Newton's method of calculation.
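The method described here is, in standard terms, the principle of stationary action; a minimal sketch of the equivalence for a single body in a potential V:

```latex
% Action: the integral of kinetic minus potential energy over time
S[x] = \int_{t_1}^{t_2} \Big( \tfrac{1}{2} m \dot{x}^2 - V(x) \Big)\, dt
% Requiring the action to be stationary, \delta S = 0, yields the
% Euler--Lagrange equation
m \ddot{x} = -\frac{dV}{dx}
% i.e. Newton's second law with force F = -dV/dx, so both methods agree.
```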
Although Einstein's theories of relativity present explanatory models for phenomena that would not otherwise make sense - since his theories bridge reality in a way that lets one count on quantitative universal connections in his four-dimensional space-time - the model does not need to be anything but an advanced calculational trick.
After all, general relativity is based on the simple idea of the equivalence principle, which states that there is no difference between gravitational mass (the mass that causes the gravitational force) and inertial mass (the mass that resists acceleration).
There is no fundamental reason to expect these two masses to be the same, nor is there any reason to expect them to be different. Yet their equivalence forms the cornerstone of Einstein's general theory.
Coherent explanations
We are inclined to see the world in the light of Newton's and Einstein's discoveries - two scientists who have essentially defined how the world is linked and evolves into a plausible explanatory model, and who have contributed to the understanding of how things in this place we call the universe play out in an interaction between mass and gravity, time and space, and space-time.
Most of all, perhaps, because it is convenient to assume that the world is simple and predictable - and perhaps also because it is otherwise problematic to reconcile theory and practice with what we can actually see unfolding in modern cosmology, even though we can see that something is in fact missing.
In practice, it will never be possible to prove or disprove any theory which claims to explain the universe as it is, because it is beyond our ability to verify events at the astronomical scales that apply to the universe.
What You See Is (Not) What You Get
We cannot even prove that there actually is a universe out there, since we can only observe analogue electrodynamic radiation contexts - in different resolutions - which do not reflect reality as it is, but only how it once was. A very, very long time ago.
To establish order and predictability in the universe, we have invented the natural constants - fundamental constants, as well as constants that occur when the basic laws of nature are described as quantitative connections - which enable us to generalise the conditions under which the universe unfolds over time.
Historically we have benefited from the invention of natural constants: they make it possible not only to explain coherence over time, but also to establish important improvements to the laws of physics along the way.
Gravity and light speed are fairly precise inventions
E.g., the gravitational constant G is included in Newton's law of gravity, and the speed of light c in Rømer's description of the finite speed of light (the "hesitation" of light) and later in Einstein's special theory of relativity.
The universal constants define our perception of which relationships are fundamental, and thus reflect the accepted theory of the evolution of the universe.
When we consider space as a definitive result - where the universal constants of nature enter into the world and unite the forces that lead to this universe - it is actually because the universe is created in our imagination.
Theoretical physics, applied physics, mathematical models and philosophical considerations all serve the same purpose.
To confirm the world as we see it.
To understand extreme density phases - as when the universe was very small, very large, or even expanding ever faster - you need a theory that combines general relativity with the uncertainty principle: one connecting space and energy, rather than energy and matter and their relationships in space(-time).
The universe is not acting out as we think it should
Basically, we currently know of two long-range natural forces: gravity and electromagnetism. The reason we can't explain dark matter is that it does not interact with either of them, yet affects matter in peculiar ways. The behaviour, on the galactic scale:
Matter can, on a very large scale - galactic-wise, sort of - be attracted and stabilised, as if in a fixed way, by dark matter. Gravity is stronger than it should be, and the galactic rotation is fixed: everything rotates at the same speed.
If you observe all the light in the sky, you can - by assuming that the rest of the galaxy resembles our own part of it - estimate how much visible matter there is altogether. And since matter (and energy) affects bodies with gravitational forces, it is possible to calculate what the rotational speed of the disc stars should be.
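As a sketch of that calculation (the galaxy mass and the radii below are illustrative assumptions, not measurements), the simplest Keplerian estimate treats the visible mass as if concentrated inside the orbit:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
PC = 3.086e16          # one parsec in metres

def keplerian_speed(mass_kg, radius_m):
    """Circular-orbit speed if all mass sits inside the orbit: v = sqrt(GM/r)."""
    return math.sqrt(G * mass_kg / radius_m)

# Illustrative (assumed) visible mass of a spiral galaxy: ~1e11 solar masses.
M_visible = 1e11 * M_SUN

for r_kpc in (10, 20, 40, 80):
    v = keplerian_speed(M_visible, r_kpc * 1000 * PC)
    print(f"r = {r_kpc:3d} kpc -> predicted v = {v / 1000:6.1f} km/s")
# The predicted speeds fall off as 1/sqrt(r); observed rotation curves stay flat.
```

This is the prediction the text refers to: outside the visible disc the speed should decline with distance, which is exactly what the measurements contradict.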
Subsequent measurements of the rotational speed of the stars show that inconsistent amounts of matter are missing (we see only about 10% of the required mass). This is attributed to dark matter (it doesn't light up, otherwise we could see it), which is impossible to measure - as we can only observe what we see - and thus it is difficult to find out what it consists of.
Theories have pointed out that it could be neutrinos (which have a vanishingly small mass), brown dwarfs, planets, WIMPs or ordinary neutral hydrogen gas.
That is: dark matter is already necessary to explain the fact that the galaxies do not seem to follow the basic laws of physics.
In cosmology there has been "dark matter" before - within our solar system, in the form of missing planets invoked to account for the peculiar behaviour of celestial bodies.
Two things can happen when we discover that reality is acting strangely according to what we know: we can change the way we think of reality and call for new stuff or new understandings - or we can call for new science.
The big question is also whether there actually is a lack of mass, a whole new kind of matter, or whether gravity simply works differently over large galactic distances than the applied theory predicts.
Undoubtedly there is more in space than meets the eye
In many clever ways we utilise maths and physics to establish empirical facts aimed at evolving our knowledge of the universe so we can understand what we see, and for a long time we have built large facilities to capture a dark matter particle. Without any success.
All attempts so far to confirm new small strange particles and weird energies have led into a dead end. We have, though, found neutrinos of all flavours.
Atoms are the powerful universal transportation system of electromagnetic energy - beautifully suited to explain everything we see, exchanging energy across the extent of the universe. But they still represent less than 3% of everything.
Atoms are simply our world machine of tiny, tiny spheres bound by forces, running on the electric charges jumping off and being caught again. If the balance is upset - a surplus or deficit of protons or neutrons - the weak nuclear force can cause nuclei to transform into different kinds of particles and energy. An atomic nucleus can, for example, decay when the nuclear forces do not balance the charges accurately enough, and a proton loses its charge
- and out of the atom radiate an ion, a positron and a neutrino, shedding just enough that the strong nuclear force can again act as a "glue" between the nucleons. Once again the electrons act as a burst of tiny lightnings sparkling through the matter - a super cloud of lightning-fast energy bundles that surround the molecules and stick to the content, being the electromagnetic force that causes the world machine to spin.
Both new science - and stuff - is needed
By using light and electromagnetic radiation as the "detector" we can only detect matter with corresponding properties. (That less than 3% of space is explainable within the standard model then becomes a disturbing fact.) If we want "proof" of dark matter - or dark energy - we need to challenge how the natural forces could support atomic structures other than the way we currently understand nuclei and electrons to be the foundation of the world.
The dark sector
Gravitational lensing by dark matter makes perfect sense
The phenomenon occurs when foreground matter and dark matter contained in galaxy clusters bend the light from background galaxies — sort of like looking through a magnifying glass. Measuring the amount of the distortion of the background galaxies indirectly reveals the amount of dark matter that has clumped together in the foreground object.
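A rough sketch of how such a lensing measurement constrains mass: for a point-mass lens, the stronger the distortion (the larger the Einstein radius), the more mass is in the foreground. The formula below is the standard point-mass simplification, and all numbers are illustrative assumptions, not data from the text:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
MPC = 3.086e22     # one megaparsec in metres

def einstein_radius(mass_kg, d_lens, d_source):
    """Einstein radius (radians) of a point-mass lens:
    theta_E = sqrt(4GM/c^2 * D_ls / (D_l * D_s)), with D_ls = D_s - D_l
    (a flat-geometry simplification of the distance terms)."""
    d_ls = d_source - d_lens
    return math.sqrt(4 * G * mass_kg / C**2 * d_ls / (d_lens * d_source))

# Assumed, illustrative numbers: a 1e15-solar-mass cluster at 1000 Mpc
# lensing a background galaxy at 2000 Mpc.
theta = einstein_radius(1e15 * M_SUN, 1000 * MPC, 2000 * MPC)
arcsec = math.degrees(theta) * 3600
print(f"Einstein radius ~ {arcsec:.0f} arcsec")
```

Inverting this relation - measuring the distortion and solving for the mass - is the sense in which lensing "weighs" the dark matter in the foreground object.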
In the 1930s the Swiss astronomer Fritz Zwicky discovered that galaxies move too fast in relation to the known content of their clusters. He was able to establish this because he captured a large number of galaxies in a single wide-angle photograph.
Zwicky found that the spatial distribution of galaxies in the Coma cluster resembles what one would expect theoretically for a bound set of bodies moving in the collective gravitational field of the system.
Measuring the dispersion of the random velocities of the Coma galaxies, it turns out that they move with velocities of nearly 900 km/s. For a galaxy possessing this random velocity along a typical line of sight to be gravitationally bound within the known dimensions of the cluster requires Coma to have a total mass of about 5 × 10^15 solar masses.
The total luminosity of the Coma cluster (how much it shines compared to the Sun: the amount of light emitted by an object per unit time) is measured to be about 3 × 10^13 solar luminosities; therefore, the mass-to-light ratio in solar units required to explain Coma as a bound system exceeds, by orders of magnitude, what can reasonably be ascribed to the known stellar populations.
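An order-of-magnitude check of the quoted numbers can be sketched as follows. The cluster radius of ~3 Mpc is an assumption, and the factor 5 is a conventional virial coefficient absorbing geometry and line-of-sight projection, not Zwicky's exact formula:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
MPC = 3.086e22     # one megaparsec in metres

# Quoted in the text: line-of-sight velocity dispersion ~900 km/s.
sigma = 900e3      # m/s
# Assumed (not from the text): an effective cluster radius of ~3 Mpc.
R = 3 * MPC

# A common virial-style estimate: M ~ 5 * sigma^2 * R / G.
M_virial = 5 * sigma**2 * R / G
print(f"Virial mass ~ {M_virial / M_SUN:.1e} solar masses")

# Mass-to-light ratio using the quoted luminosity of ~3e13 L_sun:
L = 3e13
print(f"M/L ratio   ~ {M_virial / M_SUN / L:.0f} solar units")
```

The result lands at a few times 10^15 solar masses and an M/L of roughly a hundred in solar units - the same order as the figures in the text, and far above the M/L of a few expected from stars alone.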
Zwicky made a survey of all the galaxies in the cluster, used measurements of the Doppler shift of their spectra to determine their velocities, and then applied the virial theorem.
The wavelength of light emitted by an object moving toward an observer is shortened, and the observer will see a shift to blue.
If the light-emitting object is moving away from the observer, the light will have a longer wavelength and the observer will see a shift to red.
By observing this shift to red or blue, astronomers can determine the velocity of distant stars and galaxies relative to the Earth.
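A minimal sketch of that conversion, using the non-relativistic Doppler formula (the wavelengths below are illustrative values, not real measurements):

```python
C = 2.998e8  # speed of light, m/s

def radial_velocity(lambda_observed, lambda_rest):
    """Non-relativistic Doppler shift: v = c * (obs - rest) / rest.
    Positive = receding (redshift), negative = approaching (blueshift)."""
    return C * (lambda_observed - lambda_rest) / lambda_rest

# H-alpha line, rest wavelength 656.28 nm, observed redshifted to 661.0 nm
# (assumed numbers for illustration):
v = radial_velocity(661.0, 656.28)
print(f"recession velocity ~ {v / 1000:.0f} km/s")
```

Applying this line by line across a cluster's galaxies yields the velocity catalogue that a virial analysis then consumes.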
The virial of a particle is defined as the product of the particle's momentum, p, and its position, x. The virial theorem states that if the time average of a particle's virial is zero, then the particle's kinetic energy, T, is related to the product of the net force, F, acting on the particle and the particle's position: 2⟨T⟩ = −⟨F · x⟩.
For particles - or galaxies - moving under the influence of a gravitational force, ⟨F · x⟩ is equal to the particle's gravitational potential energy, which depends on the total mass inside the particle's orbit.
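In standard textbook notation (not Zwicky's original derivation), the chain of reasoning can be summarised as:

```latex
% Virial theorem: the time-averaged virial is constant, so
2\langle T \rangle = -\langle \mathbf{F} \cdot \mathbf{x} \rangle = -\langle U \rangle
% For a self-gravitating system with velocity dispersion \sigma and size R,
% T \sim \tfrac{1}{2} M \sigma^2 and U \sim -\frac{G M^2}{R}, which gives the
% order-of-magnitude mass estimate
M \sim \frac{\sigma^2 R}{G}
```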
Zwicky used the virial theorem to relate the total average kinetic energy and total average potential energy of the galaxies of the Coma cluster. He argued that the virial for a pair of orbiting masses is zero, and used the principle of superposition to extend the argument to a system of interacting mass points. This allowed him to use the position and velocity measurements he carried out to find the mass of the galaxy cluster.
A straightforward application of classical mechanics, the virial theorem relates the velocity of orbiting objects to the amount of gravitational force acting on them. Isaac Newton's theory tells us that gravitational force is proportional to the masses of the objects involved, so Zwicky was able to calculate the total mass of the Coma Cluster from his measured galactic velocities.
Zwicky measured the combined light output of all the cluster's galaxies, which contain about a trillion stars altogether. He then compared the ratio of the total light output to the mass of the Coma Cluster with a similar ratio for the nearby Kapteyn stellar system, and found that the light output per unit mass for the cluster fell short of that from a single Kapteyn star by a factor of over 100. He then reasoned that the Coma Cluster must contain a large amount of matter not accounted for by the light of the stars.
He called it "dark matter."
Critique of the calculation
The bodies circling within galaxy discs - and within a galaxy cluster, galactic-wise - are not all travelling at the same speed. Some local systems of galaxies will be moving extremely fast; some will be moving at moderate speeds according to the mass of the system. The magnitudes will show great variance - and thereby the luminosities will indicate different velocities.
By aggregating the peak value of the cross-section, Zwicky built in a trend towards showing too high a magnitude. The correct approach would be to aggregate the mean of the squared values (not use the most probable highest velocity), since the representation will tend to overrepresent the directions pointing directly along our viewing angle.
By this choice of calculation, Zwicky's cross-section tends to use "over-exposed" values, and thereby produces evidence of missing matter.
Hence no dark matter is needed.
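The statistical point can be illustrated with a toy Monte Carlo model (the fixed 900 km/s speed and the isotropy assumption are mine, not Zwicky's data): for isotropic velocities only the line-of-sight component is observed, its RMS is a factor sqrt(3) below the 3D speed, and substituting a peak value for an RMS inflates any velocity-squared-based mass estimate:

```python
import math
import random

random.seed(1)

# Toy model: N galaxies with isotropic 3D velocities of fixed magnitude v;
# we only observe the line-of-sight component.
N = 100_000
v = 900.0  # km/s, assumed 3D speed of each galaxy

los = []
for _ in range(N):
    # Random direction uniform on the sphere: cos(theta) uniform in [-1, 1].
    cos_t = random.uniform(-1.0, 1.0)
    los.append(v * cos_t)

rms_los = math.sqrt(sum(x * x for x in los) / N)
peak_los = max(abs(x) for x in los)

print(f"RMS line-of-sight speed : {rms_los:6.1f} km/s "
      f"(theory: v/sqrt(3) ~ {v / math.sqrt(3):.1f})")
print(f"Peak line-of-sight speed: {peak_los:6.1f} km/s")
# A virial mass scales as velocity^2, so using the peak instead of the RMS
# inflates the estimate by roughly (peak/rms)^2 ~ 3 in this toy model.
```

Whether this particular bias actually afflicts Zwicky's published numbers is the author's claim, not something the toy model demonstrates; the model only shows how peak and RMS statistics diverge.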
But several other astronomers have since described a similar phenomenon with the same method; e.g., Vera Rubin and Kent Ford as they studied the movements of stars at the outer edge of the Andromeda galaxy.
Rubin and Ford measured the velocity of hydrogen gas clouds in and near the Andromeda galaxy. These hydrogen clouds orbit the galaxy much as stars orbit within the galaxy. Rubin and Ford expected to find that the hydrogen gas outside the visible edge of the galaxy would be moving slower than gas at the edge of the galaxy.
This is what the virial theorem predicts if the mass in the galaxy is concentrated where the galaxy emits light. Instead, they found the opposite: the orbital velocity of the hydrogen clouds remained constant outside the visible edge of the galaxy.
If the virial theorem is to be believed, there must be additional dark matter outside the visible edge of the galaxy. If Andromeda obeyed Newton's laws, Rubin reasoned, the galaxy must contain dark matter in quantities that increase with distance from the galactic centre.
Observations consistently show a "flattening" of the rotation curves, in which speeds remain essentially constant beyond a certain distance from the galactic centre.
Rather than challenge the theories of Newton and Einstein, it has been accepted that each galaxy must be surrounded by a vast halo consisting of dark matter (termed so because it cannot be directly observed).
According to theory, the speed of stars far from the centre of the galaxy should slow down, because the strength of gravity diminishes with distance: by Newton's second law, the gravitational force on an encircling mass equals the product of its mass and its (centripetal) acceleration, which relates to its speed.
The measurements, however, showed that the speed did not decrease with distance. This made scientists believe that there must be an invisible mass that holds the whole together and causes the pace to be faster.
Rotation curve of the Messier 33 (M33) galaxy, also called the Triangulum Galaxy
In recent decades, numerous observations of gravitational systems on a very large scale have indicated the same problem.
We need to be able to explain these observable peculiarities within galaxies - and since we also see the same peculiarities in galaxies of fundamentally lower density, where no dark matter is needed to maintain structural integrity on the galactic scale,
it becomes a hard question:
- Is it a structural problem (we need some unknown stuff to account for the missing mass)?
- Or is it a technical problem (we need some new science to explain what we see)?
- Or do the known natural forces have properties or implications currently unknown?
It is very likely that we need another perspective - like a third kind of natural force, not yet discovered - to "turn on the light" in the "dark sector", apply some well-known force to nature, and make space explainable again.
In the realm of nature's forces it is all about distance and strength, short or long range. Gravity and electromagnetism.
I also suspect that the density of the galactic environment and the overall average density are critical for the effect of the dark matter cloud.
What we need could be a third, very long-range - slow - natural force that applies within a range of half a billion light years, and it must be immune to interference from incoming electromagnetic and gravitational effects of normal matter.
- As it seems possible that a galaxy can consist of dark matter alone, the combination of known forces is only relevant in a "mixed" environment. It is possible that it is both a technical and a structural problem.
- Looking at the critical path of matter distribution in a "universal" grid, it indicates that both dark matter and normal matter are affected by this force, which thus becomes fundamental for the evolution of the cosmos.
An alternative explanation may also be that Newton's law of gravity is not universal - that it only works on a limited scale, and that mass attraction at the galactic scale is beyond the scope where its use with a single gravitational constant can meaningfully be justified:
- A new gravity variable could reflect distance in light years, particle density, and galactic rotational velocity measured as a full revolution, over the difference in velocity related to the mass
- pointing out the possibility that, with modified gravitation variables, we can identify a critical mass distribution and density within the observable ranges that eliminates the need for dark matter to explain the deviations.
One of the most interesting properties within a galaxy is the way sustainability is achieved:
- Orbital cycles, angular momenta, and morphological and kinematical equilibria - like Schwarzschild at a higher scale - are more likely the sort of dynamics that straps down an initial part of matter to the highest efficiency of self-consistency over the aeons.
In any case dark matter - apart from its role in setting a primordial path and foundation for galactic evolution - is a dead end. It's there, but burnt in at such a vast scale that you can't detect it - like looking for H2O in the sea.
It's everywhere...
Superstructures and beyond
What you see when you look out is not actually "out", but back - and even "back" is not an adequate description, because if your stationary spot in the universe is moving faster than the observables, space will tend to show expansion.
In my opinion superstructures are not of this world, but rather a fairly good bet on how this world became - and in that case we are living within a bubble of dark matter, looking "out" from a void...
Asymmetric time and the principle of least energy
Differences in the Maxwell-Boltzmann distribution of matter affect the inflationary potential of space through variations in the electrodynamic radiation pattern - due to variance in particle dispersion and local temperatures, mirrored by the CMB - and indicate that local differences in space give rise to an inhibited inflation that causes regional differences.
The invariance within the redshift indicates that the speed of inflation also affects the timeline: since light speed remains constant while velocity is a vector with a direction, different redshifts of the same light observed from different stationary points - and conformal time - show deviations in redshift. (Viewpoint dependency.)
The discussion about dark matter/dark energy is interesting in the sense that it explains space by virtue of its descriptive property as a container of matter, while at the same time proposing super-properties for dark matter and energy.
But when we decide that space is expanding because the particle density is decreasing, it is a mistake comparable to not taking curvature into account - when in fact the cosmos is unlikely even to exist in other representations than as the "theatre" where we see everything happening. It doesn't mean that the natural processes are actually happening at the theatre.
It is the ability to cluster that slowly decreases, as the mass can't maintain clusters in attraction over time. The limit for attraction follows an effect function similar to the Lagrange function, and will at all times result in a development that requires the least energy to maintain - regardless of whether it is about balancing a smaller body relative to a large mass or the spin of energy in the contained field of space trapped between particles and nuclei.
The matter-centric considerations will follow Euler and Lagrange
And it is alluring to swirl into Euler's fluid-dynamical, mathematical dream of a mechanical-dynamic interpretation of natural physical events.
For a functional J[f] we are especially interested in determining the zeros and turning points - where the functional derivative δJ/δf(x) vanishes - of the mechanical equations that explain a differential development, and as in many other aspects of phenomena, What You See Is What You Get.
Over time, any attraction will be eliminated from the gravitational field, and all we can say with certainty is that the distance between two points in space will increase if it isn't balanced by an effect function:
S = ∫ from t1 to t2 of L dt, where t is the time and the integral is defined by the start and end times of the process. L is the Lagrange function, defined as the kinetic minus the potential energy. Thus the Lagrangian is allowed to be a function of time, and one integrates it over the relevant time interval.
Mass/gravitation/matter thus plays a decreasing role, as the gravitational force declines until it no longer affects clustering to the same degree as the spin of energy in the contained field between particles. This means that particular microscopic matter will effectively last longer than macroscopic clusters of mass.
But that is not necessarily the same as there should be a force in space that cause an expansion.
The substantial complexity of matter evolves from one extreme to another over time, in a space that is unbalanced in size and is not retained by any natural forces.
The unique property of space is that it is not limited to a particular place but is always present everywhere - both within and between matter, particles and energy - while at the same time making up most of the composition of energy, particles and space.
Even though the distance between clusters increases over time, it does not mean that there will necessarily be more or less of anything over time.
Space spreads almost like an ocean
There is a reason that we cannot figure out exactly how space acts out in all its wildness - it simply vanishes. It doesn't extend into depth; it just propagates over a larger area, and astronomy is actually a pretty big surface phenomenon.
The universe has no depth other than space; all astronomical events and cosmological infrastructure are from the surface and "out". Spent time is a void inhabited only by specks of light. Space is (almost only) 2-D.
Just as air - under atmospheric pressure - is the medium that enables flames to defy gravity and dance effortlessly in the glow of fire, space-time is the medium that gives substance an opportunity to pop into existence - encapsulating the energy and interlocking the particles in an existential dance, exchanging and jumping by charges.
The idea of adding light perspective as the universe's scale and meter, and the profound space-time as the universal construct is brilliant.
The theories of dark matter and dark energy provide good explanations - but they are from a bit of the same drawer as the singularity and the Big Bang: a bit contrived and without proof. And confirming the theories through mathematical systematics and logic does not necessarily make them anything other than merely most likely.
Hydrogen is fuelling the expansion (dark energy is related to fusion)
The same phenomenon may have different explanations. Consider that the dominant entity of the universe is energy and the most abundant matter is hydrogen, and bring already-known knowledge of special relativity to use in establishing the potential for expansion.
The fusion of the primordial hydrogen delivered the potential for expansion in the first place, and hydrogen fusion in the stars that formed has driven it ever since.
It's as simple as that:
If 1 kg of hydrogen is converted to helium, about 7 grams is emitted as energy. Putting this into Einstein's relation E = mc², we find that the conversion of 1 kg of hydrogen frees about 6 × 10^14 J. Photons can be "charged" to carry that energy: for energy E and momentum p, Einstein's special relativity says E² = p²c² + m²c⁴, which for massless photons reduces to E = pc.
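A quick check of that arithmetic (the 7 g mass-deficit figure is the text's; everything else is just E = mc² and E = pc):

```python
C = 2.998e8  # speed of light, m/s

# Fusing 1 kg of hydrogen to helium converts about 0.7% of the mass
# (7 g) into energy, per the figure quoted in the text.
mass_converted = 0.007           # kg
energy = mass_converted * C**2   # E = m c^2
print(f"Energy released per kg of hydrogen fused: {energy:.2e} J")

# For a massless photon, E^2 = (pc)^2 + (m c^2)^2 reduces to E = p c,
# so the momentum carried away by photons with that total energy is:
momentum = energy / C
print(f"Photon momentum carrying that energy    : {momentum:.2e} kg m/s")
```

The result is about 6.3 × 10^14 J per kilogram, consistent with the 6 × 10^14 J quoted above.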
It is sufficient to drive the expansion, and it is this fossil reservoir of charged particles that became critical in momentum and is still driving the expansion today, emitted as negative pressure.