In space - by structural design - only what has already been is illuminated.
Building an understanding on the speed of light - by design - the future is hidden, because with this delay of information only the past is observable.
The detail is in the eye of the beholder - the illusive fact
And a broader perspective has never led to poorer results on the detail. When looking from a distant point of view, both what was and what probably will become are imaginable. Right?
On the basis of introspective analyses of the structure and nature of our own galaxy - with the help of instruments and equipment sensitive to high-energy radiation - it is today possible to pursue observations that provide insight into events and processes taking place outside the spectrum of visible light.
In a sense we might look at a galaxy like ours as "one big accretion disk" (dis)organized by currents and fields gathering matter and spreading radiation.
It has previously been suggested that the vast bubbles of radiation were caused by dramatic events some 50,000 years ago.
Radiation is emitted from charged particles forming an accretion disk that rotates in the proximity of the black hole.
Because of friction, matter heats up and emits X-rays while slowing down, causing the innermost parts of the disc to fall toward the black hole.
X-ray emission decreases with the particle density in the surrounding space - hence the “bubble structure”. The emitted photons can also be absorbed by the electrons of an atom and knock electrons free - a phenomenon we may know as the photoelectric effect (explaining the double bubble).
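As a reminder of the mechanism just mentioned (a textbook relation, not a figure from the observations above), the photoelectric effect ejects an electron when the photon energy hν exceeds the electron's binding energy:

$$E_{\text{kinetic}} = h\nu - E_{\text{binding}}$$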
Gamma rays are due to the nucleus itself being excited and emitting the energy.
The gamma radiation is emitted from the accretion disk - hence the "azimuthal structure". Being in the focus of observation will also enhance the projection of the disc's structural organization onto the gamma rays as they are emitted from matter, and over time the extent and pattern will distort the original sources.
Extrapolating the events
Observing the space around us, we see our solar system, our galaxy, and our local group of galaxies first. We then see significant numbers of large well-formed galaxies in our local supercluster and nearby filaments of superclusters.
The farther out we look, the further back in the past we see. And the longer duration we follow, the more we notice a reduction in the size and structure of the galaxies. Eventually, we reach as far back as the first galaxies to ever form, from the first stars that started to shine. Before that, there was just hydrogen and dark matter. No light was yet created for us to see.
As we look back, we are also observing an ever-shrinking volume, because the Universe was getting smaller. And the temperature was getting hotter. Eventually, it reached 3000 K. At that point, hydrogen atoms began to dissociate into protons and electrons, and space became opaque.
Coming back the other way, the surface where the transition from opaque to transparent occurred is called the Surface of Last Scattering. At that time, all the photons in the Universe were released.
Those photons are still with us today. We see them all across the sky in tremendous numbers. They are the Cosmic Microwave Background (CMB) photons. And they tell us a great deal about the past, present and future of the Universe.
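As a short worked example of why those photons point back to the 3000 K transition: the radiation temperature scales with redshift as T(z) = T0(1+z), and with today's measured CMB temperature of about 2.7 K this places the Surface of Last Scattering at a redshift of roughly

$$1 + z_{\text{dec}} \approx \frac{T_{\text{dec}}}{T_0} \approx \frac{3000\ \mathrm{K}}{2.7\ \mathrm{K}} \approx 1100 .$$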
The problem with nature is that it is not accurate, and that our perception is largely subjective. It is possible to set up abstract algorithms and produce sophisticated measuring equipment that can serve the purpose of man and provide us with precise tools that enable us to make use of time as a common denominator.
Time is today, more than ever, a criterion for being able to delineate precise timeframes rather than a unique absolute in itself.
Time can in this role - as a common denominator - explain the organisation and development of the Universe down to the smallest fraction of duration. But nothing - in this universe of events in an untimely fashion - can explain time.
This is because time is in the scale of events, not the line of events.
The theory of general relativity is not able to deal with the origin and the end of the universe. All the observations we include in confirming the theory of general relativity are made over very long distances.
But the origin of the universe is beyond our ability to obtain knowledge about, and if we look closer at what we know of conditions resembling a minuscule universe, we encounter Heisenberg's uncertainty principle knocking at our forehead.
It is a given that the theory of general relativity cannot be quite accurate at very small distances, since it is a classical theory (it does not consider the uncertainty principle of quantum mechanics, which states that an object cannot have both a well-defined position and a precise speed: the more accurately you measure the position, the less precisely you can determine the speed, and vice versa).
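In its standard quantitative form, the uncertainty principle referred to here bounds the product of the position and momentum uncertainties:

$$\Delta x \,\Delta p \ge \frac{\hbar}{2}$$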
To understand extreme density phases, as when the universe was very small - or very large - you need a theory that combines general relativity with the principle of uncertainty.
One based on space and energy, rather than on energy and matter and their relationships in space(-time).
The ΛCDM Model does that.
The supernovae observations that discovered the acceleration of our Universe’s expansion also provided key missing information for a benchmark model. What astronomers do is plot a diagram of the expected luminosity distances for a variety of scenarios concerning the contents and curvature of the Universe, and then lay the actual observed luminosity distances over the graph to see which scenario is the best fit. The lambda cold dark matter scenario, with matter accounting for 30% and vacuum energy accounting for 70%, is the current best fit. This is the current ‘Benchmark Model’.
And here’s how it works. If the expansion rate is constant, the relationship between the luminosity distance and the redshift will be constant. Given a redshift, we can compute the expected luminosity distance and therefore the expected observed luminosity. Comparing this to the actual observed luminosity, we would find them equal.
But if the expansion is slowing down, the expansion rate in the past would have been greater than what we see now, which means it took a shorter time to expand from its size at light emission time to its present size compared to a universe expanding at a constant rate. This results in a shorter light-travel time, a shorter distance and brighter supernovae.
By the same token, if the expansion is speeding up, the universe was expanding more slowly in the past than it is today, which means it took a longer time to expand from its size at light emission time to its present size compared to a universe expanding at a constant rate. This results in a longer light-travel time, a larger distance and fainter supernovae.
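A minimal sketch of this comparison (an illustration, not the original analysis; the Hubble constant of 70 km/s/Mpc and the two scenarios below are assumed for the example) computes the luminosity distance for the benchmark mix against a matter-only, decelerating universe:

```python
import numpy as np

C_KM_S = 299_792.458           # speed of light [km/s]
H0 = 70.0                      # assumed Hubble constant [km/s/Mpc]
HUBBLE_DISTANCE = C_KM_S / H0  # [Mpc]

def luminosity_distance(z, omega_m, omega_lambda, steps=10_000):
    """Luminosity distance in Mpc for a spatially flat universe: D_L = (1+z) * D_C."""
    zs = np.linspace(0.0, z, steps)
    e_of_z = np.sqrt(omega_m * (1 + zs) ** 3 + omega_lambda)   # H(z) / H0
    comoving = HUBBLE_DISTANCE * np.trapz(1.0 / e_of_z, zs)    # comoving distance
    return (1 + z) * comoving

for z in (0.5, 1.0):
    d_benchmark = luminosity_distance(z, 0.3, 0.7)   # 30% matter, 70% vacuum energy
    d_matter    = luminosity_distance(z, 1.0, 0.0)   # matter-only, decelerating
    print(f"z = {z}: D_L = {d_benchmark:.0f} Mpc (benchmark) vs {d_matter:.0f} Mpc (matter-only)")
```

At a given redshift the accelerating benchmark mix yields the larger luminosity distance, and therefore the fainter supernova - which is the signature the observations favour.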
By this model, the early universe was dominated by matter whose gravity was slowing down the universe's expansion rate. Hubble observations confirm that the expansion rate began speeding up about five to six billion years ago. This is when vacuum energy began to overtake gravity's attractive grip.
It’s important to note that actually calculating the time for a given scale factor depends on the model of expansion used; a graph of scale factor against time can be used to find the time for any given scale factor, or the scale factor for any given time. With this Benchmark Model, we can map the expansion history of the Universe from decoupling to the present and on into the future.
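A rough sketch of that calculation, under the same assumed parameters as above (radiation is ignored, so very early times are only approximate): the time for a given scale factor follows from integrating t(a) = ∫ da' / (a' H(a')), with H(a) = H0 √(Ωm/a³ + ΩΛ) for a flat universe.

```python
import numpy as np

H0_PER_GYR = 70.0 * 1.0227e-3   # assumed H0 = 70 km/s/Mpc, converted to 1/Gyr

def age_at_scale_factor(a, omega_m=0.3, omega_lambda=0.7, steps=100_000):
    """Cosmic time (in Gyr) at which the scale factor reaches `a`, flat universe."""
    a_grid = np.linspace(1e-8, a, steps)
    hubble = H0_PER_GYR * np.sqrt(omega_m / a_grid**3 + omega_lambda)   # H(a)
    return np.trapz(1.0 / (a_grid * hubble), a_grid)

print(f"age at a = 1.0 (today): {age_at_scale_factor(1.0):.1f} Gyr")
print(f"age at a = 0.5:         {age_at_scale_factor(0.5):.1f} Gyr")
```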
Nature is in all respects arranged so that nothing lasts forever. Matter decays into other elements, and everything we can see has, in the universe's process of creation, a well-defined, explainable origin, and eventually becomes obsolete and disappears into other form(s).
Space and time are among these forms. The vacuum energy dissolves like salt in water and can only be condensed when the water evaporates - and so do the Euclidean 3-dimensional Cartesian space and Einstein's two cones (light in, light out) of time, with the imaginary (space-)time, real time and our habitable everyday Cartesian space.
Euler, Maxwell and Minkowski let us deal with energy, relativity, gravity and multidimensional quantum-physical spaces alongside Einstein's and Newton's fundamental theories, but none of them are able to explain what time is, or make space anything but a property related to mass and movement.
And even though we can understand the world from the perspective of a photon - where time and space are unravelled as the perfect environment for light and other radiation (and all the other stuff we don't know anything about yet) - no answers will be able to show what becomes of this universe before we find a way to fully understand the time-space continuum of a Euclidean Cartesian multi-dimensional Minkowski space as anything other than a calculation trick that twists time together with distance to approximate zeros.
Considering "field-changes" a source of error, the fact that we today are left with obtaining data from a few locations in the proximity of our planet - a vanishing small spot - in a vast space - makes the resulting observations questionable.
The photon can act as a particle one moment, following a well-defined path like a small projectile, and as a wave the next - a duality demonstrated by interference patterns as it changes state, like a ripple on the water.
Wave-particle duality is one of the key features of quantum mechanics. Experiments show that photons not only shift from wave to particle and back again, but they actually hold both wave and particle capabilities (and demonstrate both properties) at the same time.
The wave-particle duality keeps the photon viable on its journey through a universe whose properties have changed greatly since the point of origination.
Dark entities vs. fine structures in the cosmos
Interpretations of the observed results enable us to explain the data - assuming, then, that the causality is covered by the cosmological standard model.
But the fact that space is expanding faster and faster (and that most of the content of space is invisible) is explained by the “ghost” effect of dark energy (and “ghostly” dark matter); what is demanded of the model is that it delivers accurate results.
Einstein's theories insist that light always travels at the same speed, and since speed is distance divided by time, for the rate to remain the same when the distance increases, the time must also increase.
Then time is stretched. (Or duration is compressed at light speed).
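The stretching described here has a standard quantitative form in cosmology (quoted as context, not derived from the argument above): an interval of duration emitted at redshift z is observed dilated by the same factor that stretches the light,

$$\Delta t_{\text{observed}} = (1+z)\,\Delta t_{\text{emitted}} .$$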
Scale on the other hand runs freely:
As space literally emerged from the Big Bang - according to the standard model - there was a time when space unfolded from the singularity, suggesting that the expansion is closely tied to events taking place in the very first tick of time, and has ever since pushed the boundaries away from what can be registered within the spectrum of light (outside the 4.0 × 10^32 cubic light year time bubble we can theoretically measure).
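That figure is consistent with the usual estimate of the observable volume; assuming a comoving radius of roughly 46.5 billion light years,

$$V = \tfrac{4}{3}\pi R^{3} \approx \tfrac{4}{3}\pi\,(4.65\times10^{10}\ \text{ly})^{3} \approx 4.2\times10^{32}\ \text{cubic light years}.$$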
In space - by structural design - only what has already been is illuminated.
This doesn't mean that futures are more or less unreal than presents. This is because the universal order is not arranged in a timely fashion - but in arrays of facts.
And in fact, any scientific theory explaining the state of the Universe becomes meaningless without involving a dimensional time.
On a universal scale, these frames of observable duration do not state anything about an initial frame.
In fact, duration does not state anything about the Universe, but about how you spend the facts about it - and any theory about moving objects or expanding spaces becomes meaningless without factual knowledge about the initial state.
Any formula that claims the ability to calculate the development from one state to another is just expressing a way of conceiving it.
By using symbolic thinking - and experience - you can claim that any theory of time is true, as the structural design explaining time as a dimension is based on symbolic substitution and conventional thinking.
That the universe is timeless does not mean that duration cannot be experienced.
This leads us to conclude that the universe evolved by changes of conditions, like everything else in this world.
The standard cosmological model dictates that everything originates from a starting point: the singularity, in a process called the Big Bang.
To analyse this we need to assume the initial state of the singularity, and reverse the line of probable events.
We also have to decide whether the singularity started out cold, warm or hot. Obviously, these are logically contradictory states.
But a natural explanation (one explainable within the physical standard model) is that, to produce a hot Big Bang, the singularity went from cold to hot during a collapse.
Extrapolating these assumptions backwards in sequence from the cosmic background radiation, to trace the origin from which the Universe originated, we find (a rough temperature scale for these epochs is sketched after the list):
- A state where it was still too hot to form atomic nuclei, where the radiation was so hot that any bound protons and neutrons were blasted apart.
- A state where matter and antimatter pairs spontaneously form, as the Universe is so energetic that particle/antiparticle pairs are spontaneously created.
- A state where individual protons and neutrons break down into a quark-gluon plasma, as the temperatures and densities are so high that the Universe becomes denser than the inside of an atomic nucleus.
- A state where the density and temperature rise to infinite values
- A state where all the energy that would go into the matter and radiation present today was instead bound up in the fabric of space itself
- And finally, its initial super-low-entropy state of primordial elements in negative equilibrium - below absolute zero - as all the matter and energy in the Universe are contained within a single point: the singularity
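A rough sketch of the temperature scale behind the first three of these states (standard order-of-magnitude thresholds, not figures from the text above), converting a characteristic energy into a temperature via T = E / k_B:

```python
# Convert characteristic energy thresholds (in eV) to temperatures.
K_B_EV_PER_K = 8.617e-5   # Boltzmann constant [eV/K]

epochs_ev = {
    "nuclei blasted apart (before nucleosynthesis ends)": 0.1e6,  # ~0.1 MeV
    "electron-positron pairs spontaneously created": 1.0e6,       # ~ m_e c^2 scale
    "quark-gluon plasma": 200e6,                                   # ~200 MeV
}

for name, energy_ev in epochs_ev.items():
    print(f"{name}: ~{energy_ev / K_B_EV_PER_K:.0e} K")
```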
Quarks precede subatomic particles and are indeed very special ingredients - the three-dimensionality within the very core of subatomic particles, bound at such an energy level, makes them beyond rare. Yet they are abundant, and are - by the peculiar force whose strength grows with distance - distinct parts of subatomic particles (LHC-kind of rare, and even there only to instantaneously re-form bosons or fermions).
The triple-state, paradoxical properties of quarks allow the very ingredients of subatomic particles to preserve the initial state and avoid annihilation, and this also nominates them to be linked to the decay from the annihilation of the singularity, thereby allowing the infrastructural building blocks of the singularity to determine how matter behaves.
This leads us to conclude that space didn't emerge - it evolved by changes of conditions, like everything else in this world.
The oldest observable light in the universe has traversed a path in a scale-changing universe, and thus set out in - and expanded along with - a much denser universe, with matter and objects in space significantly closer than today.
Higher density, temperature and proximity influence the path with a much more significant gravity effect - more curved before, and much straighter today at the expanded scale (fewer galaxies and less dense space).
All in all, a significant “field” difference that can lead to the misinterpretation that the expansion rate has changed, when in fact the change of conditions is what leads to the difference in the observables.
Since our understanding of the Cosmos is based on the standard cosmological model (the Lambda-CDM model, built on redshift, the Hubble constant/cosmological constant, dark energy and a modification of Einstein's field equations), the field change from dense to sparse and hot to cold can explain the irregularities in results to a certain extent, in combination with far more advanced technology and ever more advanced methods for utilizing a growing data set.
But certainty is still based on a variety of ways to obtain similar results, or confirmation by mass of evidence.
Consider again that the oldest light has traversed a path in a scale-changing universe, and thus expanded along with a much denser universe - with matter and objects in space significantly closer than today - influencing the path with a much more significant gravity effect: more curved before, and much straighter today at the expanded scale (fewer galaxies and less dense space).
This did not apply in the gravity field from which the light originates, since everything happened more slowly there - because of the stronger gravity fields.
Since light always follows straight lines in space, if space has a steeper curvature - as a result of a denser topology within a smaller space - the path of light, seen from here in cosmological time, will be perceived as curved (hence gravitational lenses). Distances near large mass bodies such as black holes, galaxies and clusters become significantly longer, and the same applies to time intervals: time simply runs slower, the stronger the gravity field.
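The last statement has a standard quantitative form; for a non-rotating mass M (the Schwarzschild case, quoted here as context), a clock at radial distance r runs slow relative to a distant observer by

$$\mathrm{d}\tau = \mathrm{d}t\,\sqrt{1 - \frac{2GM}{rc^{2}}}\,.$$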
All in all, a significant “field” difference that can lead to the misinterpretation that the expansion rate has changed, when in fact it is a change of conditions that led to the difference in observables.
But even if we don't consider "field-changes" a source of error, the fact that we are left with data from a few locations in a vanishingly small spot - in a vast space - makes the results questionable.
Obtaining observations from other locations in space than just locally around this planet can lead to greater certainty.
If we still consider the expansion rate to be increasing - using the current cosmological model to explain the calculated figures - and since the expansion is caused by dark energy (vacuum energy) repelling gravity while the energy stays constant, there are several options:
1. Gravity works differently from what we expect
2. The expansion is not caused by vacuum energy alone
3. The presumption that the effect of zero-point energy cancels out symmetrically at a subatomic scale is wrong (causing the universe to expand ever faster)
4. The Cosmological "constant" is not a constant but a variable containing both inhibitors and accelerants
5. Scalarity is a self-influencing factor by size and force
6. Redshift can vary by point of view (e.g. caused by different - unseeable - gravity).
All of those hypotheses could be true - or wrong - as long as we rely on the same sources (of light) observed from the same locations, and have no verifiable method of control.
Using a satellite orbiting a neighbouring planet would make comparison and triangulation possible, allowing correlation of the results with greater certainty in the details.
A broader perspective has never led to poorer results on the detail.
The Cosmic evolution
There is no liberalism or democracy in particle physics, and space maintains a strict embargo between galaxies and clusters. It is simply not possible to transport star-forming material from one galaxy to another by natural processes.
Between galaxies - and clusters of galaxies - the intergalactic space acts as an effective barrier blocking star-forming substances like oxygen, hydrogen and dust. The way it works is a matter of chemistry and an abundance of ions and positrons, well suited to strip any galaxy of its hydrogen clouds and dust as it is drawn into another galaxy cluster by gravity.
This limitation supports the build-up of extra-large black holes and the aggregation of immense clouds of positrons and neutralised star-forming material in the intergalactic medium.
It acts as a kind of cosmic fire extinguisher that shuts down and extinguishes active star formation in a merging galaxy - the cosmic way - by stripping star-forming materials into large clouds of non-co-moving, neutralised material in the intergalactic medium.
By accumulating the larger part of any newly merged galaxy outside the cluster, the intercluster medium over time builds up an ever steadier metallic compound unbound from the galaxies, seeding space for new stellar systems and giving birth to more advanced stellar formation, as space is stretched and galaxies are sparsely deported by the expansion.
Looking back at this violent past - the extinguishing and shredding of galaxies, black holes forming at the extremes of scale, super-dense internal galactic environments, free positrons and neutral atomic elements ready to neutralize star-forming matter and separate active processes from resources.
The build-up of these galactic gravity mousetraps is literally closing and shutting down regions of space. It’s hard not to think - this was billions of years ago - what will happen to the night sky in the time to come?
When we look carefully - and in the right spectrum - we see this story all over the sky.
This is not just extinction of light by expansion - this is so much more - this is cosmic evolution.