Imbalance is the driving factor in the development of Space
Nothing indicates that Cosmos necessarily needs to be synchronised and knowable in all directions. The Universe can develop and organize itself in various regions of ever-growing space, containing deep pockets of the past as the meaningful picture of supersized black hole singularities.
Break up space cluster-wise, tear apart galaxies, and redistribute energy and matter to other regions of space.
The whole point of “dark energy and dark matter” is that by virtue of knowing how well-known light behaves and relates to its time of origin, the evidence is already there:
Something messes with space - and has been doing so for approximately 5 billion years, since it is even possible to establish that the expansion of space is accelerating. The question is not whether “something” is forcing space to expand ever faster and tearing matter apart, but rather what it is, and how.
But we can only observe analogue electrodynamic radiation - at different resolutions - which does not reflect reality as it is now, only how it once was, a very, very long time ago.
To establish order and predictability in the universe, we have invented the natural constants - fundamental constants, as well as constants that occur when the basic laws of nature are described as quantitative connections - which enable us to generalize the conditions in which the universe unfolds over time.
Historically we have benefited from the invention of natural constants, which makes it possible not only to explain coherence over time, but also to establish important improvements to the laws of physics along the way, raising new questions about what we are observing.
The Dark Sector
The central part of Coma Cluster of galaxies covers a roughly circular area about a degree and a half across (9 times the area of a full moon). The full cluster may in fact extend further, and numerous other galaxy clusters are in the same area of the sky.
Dark matter allegedly works as a kind of "super glue" that keeps structures within and between galaxies stable.
The Coma Cluster is one of the richest galaxy clusters known. It contains as many as 10,000 galaxies, each housing billions of stars.
In the 1930s the Swiss astronomer Fritz Zwicky discovered that the galaxies are moving too fast in relation to the known content of this cluster. He was able to establish this because he captured a large number of galaxies in a single wide-angle photograph.
Zwicky then figured out that matter on a very large scale - galactic-wise, sort of - can be attracted and stabilized in a fixed way by dark matter, because gravity appears stronger than it should be and the galactic rotation appears fixed, so that everything rotates with the same speed.
Zwicky found that the spatial distribution of galaxies in the Coma cluster resembles what one would expect theoretically for a bound set of bodies moving in the collective gravitational field of the system.
Measuring the dispersion of random velocities of the Coma galaxies, it turns out that they move with velocities of nearly 900 km/s. For a galaxy possessing this random velocity along a typical line of sight to be gravitationally bound within the known dimensions of the cluster requires Coma to have a total mass of about 5 × 10^15 solar masses.
The total luminosity (how much it shines compared to the Sun: the amount of light emitted by an object in a unit of time) of the Coma cluster is measured to be about 3 × 10^13 solar luminosities; therefore, the mass-to-light ratio in solar units required to explain Coma as a bound system exceeds by an order of magnitude what can reasonably be ascribed to the known stellar populations.
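The arithmetic behind that ratio can be checked directly from the two figures just quoted; a quick sketch in solar units:

```python
# Order-of-magnitude check of the Coma figures quoted above.
# Both input values are the ones given in the text, in solar units.
virial_mass = 5e15      # total mass required to bind the cluster, solar masses
luminosity = 3e13       # total light output, solar luminosities

mass_to_light = virial_mass / luminosity   # Sun = 1 by definition
print(mass_to_light)    # ~167, far above what stellar populations can supply
```

A stellar population has a mass-to-light ratio of order a few in these units, so the quoted numbers imply most of the binding mass is not in stars.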
Zwicky made a survey of all the galaxies in the cluster, used measurements of the Doppler shift of their spectra to determine their velocities, and then applied the virial theorem.
The wavelength of light emitted by an object moving toward an observer is shortened, and the observer will see a shift to blue.
If the light-emitting object is moving away from the observer, the light will have a longer wavelength and the observer will see a shift to red.
By observing this shift to red or blue, astronomers can determine the velocity of distant stars and galaxies relative to the Earth.
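The conversion from a measured shift to a velocity can be sketched with the classical small-shift approximation v = c·z (the emission line and shift below are illustrative values, not Zwicky's actual data):

```python
# Convert a spectral shift into a line-of-sight velocity, the way the galaxy
# velocities described above are obtained. For small shifts the classical
# approximation v = c * z suffices, with z = (observed - emitted) / emitted.
C = 299_792.458  # speed of light, km/s

def doppler_velocity(lambda_emitted, lambda_observed):
    """Positive result: receding (redshift); negative: approaching (blueshift)."""
    z = (lambda_observed - lambda_emitted) / lambda_emitted
    return C * z

# Illustrative: H-alpha (656.3 nm at rest) observed at 658.3 nm -> receding
v = doppler_velocity(656.3, 658.3)
print(round(v), "km/s")   # roughly 900 km/s, comparable to the Coma galaxies
```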
The virial of a particle is defined as the product of the particle's momentum, p, and its position, x. The virial theorem states that if the time average of a particle's virial is zero, then the particle's time-averaged kinetic energy, T, is related to the product of the net force, F, acting on the particle and the particle's position: ⟨T⟩ = −½ ⟨F · x⟩.
For particles—or galaxies—moving under the influence of a gravitational force, ⟨F · x⟩ is equal to the particle's gravitational potential energy, which depends on the total mass inside the particle's orbit.
Zwicky used the virial theorem to relate the total average kinetic energy and total average potential energy of the galaxies of the Coma cluster. He argued that the virial for a pair of orbiting masses is zero, and used the principle of superposition to extend the argument to a system of interacting mass points. This allowed him to use the position and velocity measurements he carried out to find the mass of the galaxy cluster.
A straightforward application of classical mechanics, the virial theorem relates the velocity of orbiting objects to the amount of gravitational force acting on them. Isaac Newton's theory tells us that gravitational force is proportional to the masses of the objects involved, so Zwicky was able to calculate the total mass of the Coma Cluster from his measured galactic velocities.
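The practical payoff of the theorem can be sketched numerically. The prefactor of 5 and the ~3 Mpc radius below are illustrative assumptions, not Zwicky's actual bookkeeping; the point is only the scaling M ~ σ²R/G that links measured velocities to total mass:

```python
# Hedged sketch of a virial mass estimate for a galaxy cluster.
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30        # solar mass, kg
MPC = 3.086e22          # megaparsec, m

sigma = 900e3           # line-of-sight velocity dispersion, m/s (from the text)
radius = 3 * MPC        # assumed effective cluster radius (illustrative)

mass = 5 * sigma**2 * radius / G      # virial estimate, kg
print(mass / M_SUN)     # a few times 10^15 solar masses, as quoted above
```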
Zwicky measured the combined light output of all the cluster's galaxies, which contain about a trillion stars altogether. He then compared the ratio of the total light output to the mass of the Coma Cluster with a similar ratio for the nearby Kapteyn stellar system, and found that the light output per unit mass for the cluster fell short of that from a single Kapteyn star by a factor of over 100.
He then reasoned that the Coma Cluster must contain a large amount of matter not accounted for by the light of the stars.
This he called "dark matter."
Critique of the calculation
The bodies circling within galaxy discs - in a galaxy cluster, galactic-wise - are not all travelling at the same speed. Some local systems of galaxies will be moving extremely fast, others at moderate speeds according to the mass of the system. The magnitudes will show great variance - and thereby the luminosities will indicate different velocities.
By aggregating the peak value of the cross-section, Zwicky built in a bias that tends to show too high a magnitude. The correct approach would be to average the squared values across the section (not using the most probable highest velocity), since the representation will tend to overrepresent the directions pointing directly along our viewing angle.
By this choice of calculation, Zwicky's cross-section tends to use "over-exposed" values, and thereby produces evidence of missing matter.
Hence no dark matter is needed.
But, by going with the average velocity:
(If you observe all the light that is in the sky, you can, by assuming that the rest of the galaxy resembles our own part, estimate how much visible matter there is altogether; and since matter (and energy) affects bodies with gravitational forces, it is possible to calculate what the rotational speed of the disc stars should be.)
Subsequent measurements of the rotational speed of the stars show that inconsistent amounts of matter are missing (we see only about 10% of the required mass). This is attributed to dark matter (it does not light up, otherwise we could see it) and is impossible to measure directly, as we can only observe what we see, and thus it is difficult to find out what it consists of.
Theories have pointed out that it could be neutrinos (which have a vanishingly small mass), brown dwarfs, planets, WIMPs or ordinary neutral hydrogen gas.
I.e., dark matter is already necessary to explain the fact that the galaxies do not seem to follow the basic laws of physics.
The discovery of major deviations in the behavior of celestial bodies in relation to gravity and Newton's second law - using simple Newtonian equations (in which the net force is assumed to be constant) to describe the force relationships of one of the universe's densest galactic populations - led to the suggestion that something had to be wrong with the galactic cluster. With no questions asked regarding the method or its sufficiency for calculating this seriously complex context, we have been left with the idea of "dark matter" in an abundance previously unknown - on a scale out of all proportion to known matter - constituting the greater part of all matter in the universe.
Ever since this postulate, the brightest minds have tried to figure out what that matter could be.
It's a bit problematic to suggest the whereabouts of such a substantial mass of matter. Science has shown us that dark matter/dark energy cannot be caused by self-inflicted changes in the local matter - nor can it be regional transfers from different areas of space, as that would cause a collapse of space - and since it is not matter as we know it, with nuclei and electrons, protons and neutrons and electromagnetism, weak and strong nuclear forces, it's something else.
Theories - Out of order
Considerations arise as to whether the natural forces can support other atomic structures than those currently known - or whether the basic problem is in fact a misinterpretation of information derived from outdated scientific methods, since the actual universe they are explaining has fundamentally scaled out of comparison with the tiny solar systems the theories were initially intended for.
The fundamental forces of the universe - gravity, the electromagnetic force, and the weak and strong nuclear forces - are the bonds that provide the universe with the power of expansion and viability, from the tiniest invisible relationships between matter and energy to the unimaginable cosmological contexts that we may or may not yet have understood or discovered.
We are struggling to get this general world view - a scientific heritage from classical science, more than 100 years old - to play along with what we can actually see has played out in space:
- We find that the energy balance does not add up
- nor is there enough matter in the world to account for what can be confirmed.
The question, though, is whether there actually is a lack of mass, a lot of other strange kinds of matter, or whether gravity simply works differently over large-scale galactic distances than the applied theory assumes.
Two things can happen when we discover that reality is acting strangely relative to what we know - we can change the way we think of reality and call for new stuff or new understandings, or we can call for new science.
One of the most interesting properties within a galaxy is the way sustainability is achieved. Depending on Newton's and Einstein's theories leaves us with no other option than to invent some new stuff: dark matter.
Or we can go by the dynamical view. Orbital cycles, angular momenta, and morphological and kinematical equilibria - like Schwarzschild on a higher scale - are more likely the kind of dynamics that straps down an initial part of matter to the highest efficiency of self-consistency over the aeons.
Photons are the hammer that nails the primordial path of relic dark matter
Distribution of the (primordial) dark matter and its role in matter formation
Within the physical standard model we can only think back as far as a point we call the singularity. The dominant natural force there was gravity, which kept everything in that moment. But since it was before time, it was not part of this universe.
No other natural force apart from gravity could play any role until the moment this universe came into being - what we call the big bang.
But still - this is not how everything was made, only the event that defines the start of this universe. In fact, no one really has any good answer to how the singularity went through a momentary big bang and became this universe.
But if we fast-forward time to just past the big bang, a lot of stuff happens. Astrophysicists believe that a very dense mixture of protons, neutrons, photons, electrons - and other subatomic particles - filled the universe. The temperature by then was so extremely high that the electrons were unable to bind with the protons to form atoms.
Because the temperature is so high, all the particles are very energy-rich, and most photons at that time are gamma rays. The universe is still only 1/100 billionth of a trillionth of a second old, 10 trillion degrees hot - big as a house and lit up with extremely bright sapphire-blue light, but with no visibility whatsoever, because of all the electrons roaming freely in space, expelled from ions.
Instead, all the particles scattered off each other at high rates, keeping all the different species at the same temperature - in thermal equilibrium - with each other. The photons also scattered off the electrically charged protons and electrons to such a degree that they could not travel very far.
As the universe expanded, the temperature dropped to about one billion kelvin (K). At that point, the protons and neutrons began to bind together to form atomic nuclei. At roughly 390,000 years after the Big Bang, the continued expansion of the universe and subsequent cooling had dropped the temperature of the universe to about 3000 K.
By that point, all the electrons and protons had bound to form electrically neutral hydrogen atoms, and all the other charged particles had decayed. As the primordial hydrogen formed, the universe became so transparent to photons that they have been traveling through it ever since.
These relic photons from the early universe have a microwave wavelength and are known as the cosmic microwave background, or CMB. Given that space began expanding - at first exponentially, within the first fraction of a second - the electromagnetic radiation we monitor today has been stretched along with space, in what we see as the cosmic background radiation.
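How much those relic photons have been stretched follows from the temperatures alone, since the radiation temperature scales as 1/(1 + z); a quick check using the 3000 K quoted above and today's measured CMB temperature of 2.725 K:

```python
# The CMB cools as space stretches: T scales as 1/(1+z).
T_RECOMBINATION = 3000.0   # K, temperature when neutral hydrogen formed (text)
T_TODAY = 2.725            # K, measured CMB temperature today

z = T_RECOMBINATION / T_TODAY - 1
print(round(z))            # ~1100: wavelengths stretched about 1100-fold
```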
Density fluctuations, dark matter infrastructure and the cosmic microwave background
It is assumed that before the neutral hydrogen formed, matter was distributed almost uniformly in space - although small variations occurred in the density of both normal and dark matter because of quantum mechanical fluctuations. Gravity pulled the normal and dark matter in toward the center of each fluctuation, thus founding the cosmic web that matter has been following ever since.
While the dark matter continued to move inward, the normal matter fell in only until the pressure of photons pushed it back, causing it to flow outward until the gravitational pressure overcame the photon pressure and the matter began to fall in once more.
Each fluctuation "buzzed" in this way with a frequency depending on its size. This constant motion in and out had a strong influence on the temperature of the normal matter: it heated up when it fell in and cooled off when it flowed out.
The dark matter, which does not interact with photons, remained unaffected by this effect. When the neutral hydrogen formed, areas into which the matter had fallen were hotter than the surroundings. Areas from which matter had streamed out, by contrast, were cooler.
The temperature of the matter in different regions of the sky - and the photons in thermal equilibrium with it - reflected the distribution of dark matter in the initial density fluctuations and the "buzzy" normal matter.
This pattern of temperature variations was frozen into the cosmic microwave background when the electrons and protons formed neutral hydrogen. So a map of the temperature variations in the CMB traces out the location and amount of different types of matter 390,000 years after the Big Bang.
The American physicists Ralph Alpher, Robert Herman, and George Gamow predicted the existence of the CMB in 1948. Seventeen years later, Bell Labs scientists Arno Penzias and Robert Wilson detected it.
Initial measurements showed the intensity of the relic photons to be constant across the sky to a fraction of 1 percent. In the early 1990s, however, NASA's Cosmic Background Explorer (COBE) spacecraft used a pair of radio telescopes to measure differences among relic photons to one part per million between two points in the sky.
A subsequent spacecraft, the Wilkinson Microwave Anisotropy Probe (WMAP), made an even more precise map. This revealed hot and cold spots about 1.8 degrees in size across the sky that vary in intensity by a few parts per million.
The angular size and the extent of variation indicate that the universe contained about five times as much dark matter as normal matter when the neutral hydrogen formed. Combined with measurements of supernovae and the clustering of galaxies, this indicates that dark energy comprises 73 percent of the universe, dark matter 23 percent, and normal matter just 4 percent.
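A trivial check of the quoted budget - the three components should close to 100 percent, and the dark-to-normal matter ratio should come out near the "five times as much" stated above:

```python
# Sanity check on the energy budget percentages quoted in the text.
dark_energy, dark_matter, normal_matter = 73, 23, 4

total = dark_energy + dark_matter + normal_matter
ratio = dark_matter / normal_matter
print(total, ratio)   # 100 percent in total; ratio 5.75, i.e. "about five"
```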
Thinking through the box
In a classical scientific world view, everything drafts an explanation - harmonic forces, equilibria, orderly and well known, depending on nothing but expectable facts.
From Newton's precise clockmaker theories to Einstein's wild scientific ride with light speed and relativity, the view has widened broadly. As ingenious as it might seem - still being confirmed by observations after more than 100 years - we are still struggling with the tough heritage of the harmonic, eternal and timelessly predictable universe.
We are inclined to see the world in the light of Newton's and Einstein's discoveries - two scientists who have basically defined how the world is linked and evolved into a plausible explanatory model reflecting what we can see.
Both have contributed to the understanding of how things - in what we call the universe - play in an interaction between mass and gravity - and time (and space), and space time.
Most of all, perhaps, because it is obvious to assume that the world is simple and predictable, and maybe also because it is so difficult to reconcile theory and practice with what we can actually see unfold in modern cosmology.
Basically, the cosmological standard model is the story of how the universe evolves - within the framework of natural constants and their interrelationship - over time - from a starting point to the universe as we know it today.
Indeed, the world becomes what it is through the way we understand it - when Newton's gravitational force makes sense, it is because we can mathematically prove that events in the gravitational field are not only probable but also universal, thus qualifying gravity for the class of universal fundamental forces.
But with the same mathematical skills - and other premises - one can prove that gravity in Newton's mechanical universe is not the only plausible explanation for a body following a particular path when it "falls": the principle of least action, where a body follows the path that is most energy-stable, applying the same physical constants.
Instead of introducing a gravitational constant, the action is then calculated as the integral of the kinetic energy minus the potential energy over the time interval - and this is equivalent to Newton's calculation method.
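The equivalence can be sketched numerically for simple free fall (the mass, duration, and discretization below are illustrative): the path Newton's F = ma picks out is exactly the one that minimizes the discretized action, and any deviation from it raises the action.

```python
import numpy as np

# Compare Newton's answer for free fall with the least-action principle:
# the action S = integral of (kinetic - potential) energy over time.
g, m, T, N = 9.81, 1.0, 1.0, 201     # illustrative values
t = np.linspace(0.0, T, N)
dt = t[1] - t[0]

x_newton = -0.5 * g * t**2           # Newton: falling from rest at x = 0

def action(x):
    """Discretized action for a particle of mass m in the potential V = m*g*x."""
    v = np.diff(x) / dt                          # velocity on each interval
    kinetic = 0.5 * m * v**2
    potential = m * g * 0.5 * (x[:-1] + x[1:])   # midpoint potential energy
    return float(np.sum((kinetic - potential) * dt))

S_newton = action(x_newton)

# Any endpoint-preserving deviation from the Newtonian path raises the action,
# so the Newtonian trajectory is the least-action path.
bump = np.sin(np.pi * t / T)                     # vanishes at both endpoints
assert all(action(x_newton + eps * bump) > S_newton for eps in (0.1, 0.2))
```

For a linear potential the perturbed action exceeds the Newtonian one by exactly the positive kinetic term of the perturbation, which is why the assertion holds for any bump size.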
The natural constants are fundamental constants - as well as constants that occur when the basic laws of nature are described as quantitative connections.
For example, the gravitational constant G is included in Newton's law of gravity, and the speed of light c in Rømer's description of the finite speed of light and later in Einstein's special theory of relativity.
The universal constants define our perception of which relationships are fundamental, and thus reflect the accepted theory of the evolution of the universe.
When we consider space as a definitive result - where the universal constants of nature enter into the world and unite the forces that lead to this universe - it is actually because the universe is created in our imagination.
Theoretical physics, applied physics, mathematical models and philosophical considerations all serve the same purpose.
To confirm the world as we see it.
According to Einstein, time is found only in the relation between observer and object, and any perception of time is therefore particular to this relationship.
The world view of classical physics was marked by Newton's and Maxwell's theories at the beginning of the last century: classical mechanics and electrodynamics. It is also the story of how our understanding of space and the bodies found within it can be considered. That was a world view about to be changed.
Newton's understanding was that space was absolute, and that it was determinable whether a body was at rest or in motion through a mechanical view whose principles could be applied, as a kind of law, to everything that happened in space.
Newton's conception of space also included the assumption that space was filled with a world aether, a perfect medium spread evenly throughout the universe, in which light and all forms of radiation would propagate. The aether was also assumed to be at complete rest and constituted a stationary and privileged system.
Scientific theories often originate from experiments and hypotheses about what is observed, and new knowledge is formed from results that cannot otherwise be explained on the basis of existing assumptions.
Einstein's theory of relativity - or rather, the thesis “Zur Elektrodynamik bewegter Körper” (“On the Electrodynamics of Moving Bodies”) - dealt with the perception of the internal structures of the physical laws. Einstein was of the opinion that physics was nothing less than due to be reconsidered on the basis of a completely new formal and general principle.
As the physical standard model is relativistic by nature, holding ground from presumed constants and predictable contexts, the outcome of events will often appear to be inevitable.
But just as observations are likely to confirm the hypotheses that underlie the model - in theory - practice will often show the given answers.
Science is predictable - this is exactly what characterizes science. Observations will confirm hypotheses and data confirm the models that are prerequisites for our view of reality.
The most important hypothesis that relates to time is that simultaneity in general must be perceived in relations.
Einstein postulates that the physical laws must have the same form in any movement system.
He also postulates that the speed of light (in vacuum) is always the same, regardless of whether the light source is moving or at rest in relation to the observer.
This has widespread and surprising consequences. Not only is simultaneity relative in this sense, but measurable physical quantities such as length, volume, mass, and time are relative as well.
The speed of light, though, is invariant - fixed and always the same - not a relative speed.
Spatial dimensions change with the movement, as does the temporal dimension in terms of duration of events.
For Einstein, space and time were not fundamental, or absolute, but linked together in a four-dimensional 'space-time', and although space and time appear integrated in space-time according to the theory of relativity, they are nevertheless different.
Einstein also concludes that the mass of a body varies with its speed, getting heavier at higher speeds, and that the total energy of an object is calculated by multiplying its mass by the speed of light squared.
The new thing was not the formula E = mc^2 itself, but that it was a consequence of a whole new theory of space and time that overruled the fundamentals of the classical physics.
Einstein's theory was in many ways distinct and seemed to lead to absurd results, e.g., that simultaneity is relative, that a body's mass depends on its speed, and that a clock in motion shows time differently than a clock at rest.
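These "absurd" results all flow from a single quantity, the Lorentz factor; a minimal numerical sketch (the speeds chosen are illustrative):

```python
import math

# The Lorentz factor gamma governs how moving clocks slow, lengths contract,
# and energy grows with speed.
C = 299_792_458.0  # speed of light, m/s

def gamma(v):
    """Lorentz factor for speed v (requires v < c)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def total_energy(mass_kg, v):
    """Relativistic total energy gamma * m * c^2; at v = 0 this is E = m*c^2."""
    return gamma(v) * mass_kg * C**2

# At rest, one kilogram holds m*c^2 of energy...
print(total_energy(1.0, 0.0))   # ~9e16 joules
# ...and a clock moving at 80% of c ticks gamma = 5/3 times slower.
print(gamma(0.8 * C))
```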
Galactic filaments form along and follow web-like strings of dark matter
According to the standard model of the evolution of the universe, galactic filaments form along and follow web-like strings of dark matter. It is thought that this dark matter dictates the structure of the Universe on the grandest of scales.
Dark matter gravitationally attracts baryonic matter, and it is this "normal" matter that astronomers see forming long, thin walls of super-galactic clusters, supercluster complexes, galaxy walls, and galaxy sheets which are the largest known structures in the universe.
These are massive, thread-like formations, on the order of 200 to 500 million light-years, forming the boundaries between large voids in the universe. The filaments consist of gravitationally bound galaxies. Regions in which many galaxies are very close to one another (in cosmic terms) are called superclusters.
Indications of these large-scale cosmic structures come from the discovery of the Lyman-alpha forest:
The Lyman-alpha forest is an important probe of the intergalactic medium
In astronomical spectroscopy, the Lyman-alpha forest is a series of absorption lines in the spectra of distant galaxies (and quasars) arising from the Lyman-alpha electron transition of the neutral hydrogen atom.
As the light travels through multiple gas clouds with different redshifts, multiple absorption lines are formed, and these can be probed to determine the frequency and density of the gas clouds, as well as their temperature.
A technique basically based on combining different parts of the spectrum, such as the infrared and the ultraviolet, with the redshift.
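How the forest encodes redshift can be sketched directly: each intervening hydrogen cloud absorbs at the Lyman-alpha rest wavelength in its own frame, and we observe that line stretched by (1 + z). The cloud redshifts below are illustrative values:

```python
# Each hydrogen cloud along the sightline imprints an absorption line at
# 121.567 nm in its own rest frame; we observe it stretched by (1 + z).
LYMAN_ALPHA_NM = 121.567   # rest wavelength of the Lyman-alpha transition

def observed_wavelength(z):
    """Wavelength at which a cloud at redshift z imprints its absorption line."""
    return LYMAN_ALPHA_NM * (1 + z)

for z in (0.5, 1.0, 2.0, 3.0):
    print(z, round(observed_wavelength(z), 1))
# A z = 2 cloud absorbs near 364.7 nm: far-ultraviolet light shifted toward
# the visible, so many clouds along one sightline pile up into a "forest".
```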
Combining the Lyman-alpha probing with the WMAP data reveals the full-scale infrastructure of space, and how primordial dark matter was dispersed in the cosmic "root-web", dictating the structure of the Universe on the grandest of scales.
Nothing, though, indicates that Cosmos necessarily needs to be synchronised and knowable in all directions. The Universe can develop and organize itself in various regions of ever-growing space, containing deep pockets of the past as the meaningful picture of supersized black hole singularities.
The whole point of “dark energy and dark matter” is that by virtue of knowing how well-known light behaves and relates to its time of origin, the evidence is already there:
Something messes with space - and has been doing so for approximately 5 billion years, since it is even possible to establish that the expansion of space is accelerating. It makes no sense to deny it.
The question is not whether “something” is forcing space to expand ever faster all the time, but rather what it is, and how.
Everything in this world has a natural explanation, and there is no hocus-pocus about space. Just possibilities.
As space is expanding, we need to find the natural explanation - or perhaps even a series of events, a process - that continuously makes an impact on the fabric of space (influencing the scalar field). The challenge is to point out the source.
Evolved to transit self sustainability
When you study the cosmos, you will find that planets and stars have emerged in large clouds of dust and gas, that size is essential for the development of differentiated celestial bodies, and that gravity is the determining factor in the local formation of both planets and stars.
There are two contradictory and fundamental principles at stake in the transformation of space into energy, energy into matter, and matter into space:
Expansion and contraction, cooling or heating
Scalarity: as space gets bigger or smaller, it turns colder or hotter.
In theory, the world has grown from an original super-dense state, to a space growing in size.
Growth is the fundamental factor that has led to the formation of Cosmic phenomena.
Change is the controlling factor in nature
Variations are the driving factor in the evolution of the universe.
One can therefore consider the Universe as a self-reproducing system. The thermodynamic forces subject to the natural forces that have led to this universe form, because of this interaction, an independent system resulting in nucleosynthesis, star formation and growth in galactic systems - and, due to increasingly advanced natural processes, also in the formation of autonomous processes that lead to life in suitable environments, thereby elevating the development to new levels.
Because the world began to grow, light has been able to propagate in the universe.
Because of the expansion of space, the temperature differences between the objects in space and space itself have become significant. Since the release of the cosmic background radiation, space has become a colder and colder place, and for the same reason - it has become much colder outside the stars than inside - light will emit from the stars. In a space in thermal equilibrium, photons would not leave the stars.
Stellar phenomena have three interesting basic properties in the evolution of the Cosmos; they:
- are growing
- can reproduce themselves
- exhibit nucleosynthesis - by fusing atoms into heavier elements.
As space expands, it simultaneously cools down for the same reason, and because the surface temperature of the stars is very high, photons will therefore stream out into space.
As light has the same properties today as it had when the first star began to glow, we can use that knowledge to determine how space has evolved while the light has spread through it.
Since 1925 we have been able to analyze distances in space using the properties of light - by comparing intrinsic brightness with the amount of light received to determine distance, and by determining how long the light has been traveling from how "stretched" the light waves have become due to the expansion of space, as a yardstick for the past.
These facts make the phenomena of the sky comparable and enable differentiation of velocities in relation to distances, and thus provide us with a tool to describe the topology of space.
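The brightness half of that toolkit is the inverse-square law: flux falls with the square of distance, so comparing received flux with a known intrinsic luminosity yields the distance. The luminosity and flux values below are made up for illustration:

```python
import math

def luminosity_distance(luminosity_w, flux_w_m2):
    """Distance from the inverse-square law: F = L / (4 * pi * d^2)."""
    return math.sqrt(luminosity_w / (4 * math.pi * flux_w_m2))

L_STAR = 3.828e26          # a Sun-like star's output, watts
flux = 1.0e-12             # hypothetical measured flux, W/m^2

d = luminosity_distance(L_STAR, flux)
print(d / 9.461e15, "light-years")   # distance converted to light-years
```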
The consequence of this is that we have today mapped out all the parts of the cosmos we can see - at all available wavelengths of radiation, and with all available methods - and created a true picture of how space is constructed according to our observations.
Galactic clusters are among the largest known structures in the universe, being connected in a cosmic web of vast filaments; the nodes look uncannily like the neuronal structure of a human brain or the root network of fungi, describing the largest process in the world:
Dark matter is not a peculiar particle, but ordinary matter running in a vast network of cosmic “fusion channels” - just like the root net of fungi disperses resources across a biotope. A web-like structure is supplying energy through the interconnected paths to maintain equilibrium across the entirety of Cosmos, balancing the growth of space.
We tend to think that there must be more that matters when it comes to establish how the known universe works.
As theory so far has been extremely light-fixated, we are looking everywhere to explain what we “see” - or do not see in the dark.
Overlooking the detail that nothing is what it seems, and that reading the stars' details from their light leaves everything that does not shine literally in the dark.
But our insight comes from light that has traveled through space for billions of years as the shine of a bygone era where space broke up and expansion tore the cosmos apart.
We don’t have a universe - but a “space” more like a container of many worlds, of which we can actually see 5 when we look long enough.
Observations show us that gravity declines with distance and that no forces work over astronomical distances.
That Cosmos is breaking up into superstructures, with galaxies floating unbound within clusters - yet the superstructures are drawn together and encapsulated in "big clumps" independently, but directionally and systematically, in a space expanding ever faster.
The largest known structures are the Sloan Great Wall, the Shapley Supercluster, the Horologium-Reticulum Supercluster, and the Pisces-Cetus Supercluster Complex.
Looking at the critical path of matter distribution in a “universal” grid indicates that both dark and normal matter are affected by an unknown force, which thus becomes fundamental to the wider evolution of the cosmos.
One of the most interesting properties is the way sustainability is achieved across space:
Interconnectivity, orbital cycles, angular momenta, and morphological and kinematic equilibria - like Schwarzschild dynamics on a higher scale - are the sort of dynamics that strap down an initial share of matter to uphold the highest efficiency of self-consistency over the aeons, by keeping the channels superheated and sustaining the energy level, preventing the expansion from draining Space of energy.
Within the nodes, similar regulating processes keep the intergalactic medium from freezing over:
The Coma cluster (A1656) is one of the most famous galaxy clusters of all, and has received an overwhelming amount of scientific attention - not only because it is an extremely rich cluster containing thousands of galaxies.
The spiral galaxy D100 is being stripped of its gas as it plunges toward the center of the giant Coma galaxy cluster (Hubblesite, 2019).
Observations show that a cosmological large-scale barrier exists that strips star-forming gas from infalling galaxies: as the D100 galaxy is drawn in by the Coma cluster's gravitational field, its gas and dust clouds are effectively shredded before they approach the proximity of the cluster galaxies.
If it weren't that we have a pretty good idea why this happens, it would be a perfect candidate for dark matter. But in fact it is a matter of chemistry and an abundance of ions and positrons, suited to strip any foreign galaxy of hydrogen clouds and dust as it is drawn into a galaxy cluster by gravity.
We can think of it as a safeguard of space, a kind of safety mechanism that prevents any part of Cosmos from pulling an unequalized abundance of mass - and energy - from other regions of Space. By allowing only mature stars to enter the proximity of the internal galaxies, only their mass can eventually end up supplying a local galaxy's black hole, thereby increasing the net return to space cluster-wise, enriching the intracluster medium, and redistributing most of the energy over the channels connecting the filament nodes.
In relation to the intergalactic blocking of active gases, and the filtration of star-forming gas away from new extragalactic star formation, a kind of dark medium is found in transgalactic space. But there is no mystic matter or force behind the process.
Image credit: Hubble image: NASA, ESA, M. Sun (University of Alabama), and W. Cramer and J. Kenney (Yale University); Subaru image: M. Yagi (National Astronomical Observatory of Japan)
The featured false-color picture is a digitally enhanced composite of images from Earth-orbiting Hubble and the ground-based Subaru telescope. Studying remarkable systems like this bolsters our understanding of how galaxies evolve in clusters.
This striking image combines data gathered with Hubble’s Advanced Camera for Surveys (ACS) and data from the Subaru Telescope in Hawaii. It shows just a part of the spectacular tail emerging from the spiral galaxy D100. Glowing blue clumps of young stars can be seen near the middle of the tail, where there is still enough hydrogen gas to fuel star formation.
The red path connecting to the center of D100 consists mostly of glowing hydrogen: the outer gas, held less strongly by gravity, has already been stripped away by ram pressure from the ambient hot intracluster medium as the galaxy plunges toward the center of the giant Coma cluster.
The extended gas tail is about 200,000 light-years long, contains about 400,000 times the mass of our Sun, and stars are forming within it. Galaxy D99, visible to D100's lower left, appears red because it glows primarily with the light of old red stars - young blue stars can no longer form, because D99 has been stripped of its star-forming gas.
The spiral arms disappear, and the galaxy is left with no gas and only old stars. This phenomenon has been known about for several decades, but Hubble provides the best imagery of galaxies undergoing this process.
The intracluster medium (ICM)
In astronomy, the intracluster medium (ICM) is the superheated plasma that permeates a galaxy cluster. The gas consists mainly of ionized hydrogen and helium and accounts for most of the baryonic material in galaxy clusters.
A hydrogen ion is created when a hydrogen atom loses or gains an electron. A positively charged hydrogen ion (or proton) can readily combine with other particles and is therefore only seen isolated in a gaseous state or a nearly particle-free space.
Positive charges in ions are achieved by stripping away electrons orbiting the atomic nuclei, where the total number of electrons removed is related to either increasing temperature or the local density of other ionized matter. This can also be accompanied by the dissociation of molecular bonds.
The ICM is composed primarily of ordinary baryons, mainly ionised hydrogen and helium. This plasma is enriched with heavier elements, including iron.
The average amount of heavier elements relative to hydrogen, known as metallicity in astronomy, ranges from a third to a half of the value in the sun.
Studying the chemical composition of the ICM as a function of radius has shown that the cores of galaxy clusters are more metal-rich than the regions at larger radii.
Image Credit: NASA/IoA/J.Sanders & A.Fabian
In some clusters (e.g. the Centaurus cluster) the metallicity of the gas can rise above that of the sun.
The Chandra image of the Centaurus galaxy cluster shows a long plume-like feature resembling a twisted sheet.
The plume is some 70,000 light years in length and has a temperature of about 10 million degrees Celsius.
It is several million degrees cooler than the hot gas around it, as seen in this temperature-coded image in which the sequence red, yellow, green, blue indicates increasing gas temperatures. The cluster is about 170 million light years from Earth.
The plume contains a mass comparable to 1 billion suns. It may have formed by gas cooling from the cluster onto the moving target of the central galaxy, as seen by Chandra in the Abell 1795 cluster.
Other possibilities are that the plume consists of debris stripped from a galaxy which fell into the cluster, or that it is gas pushed out of the center of the cluster by explosive activity in the central galaxy.
A problem with these ideas is that the plume has the same concentration of heavy elements such as oxygen, silicon, and iron as the surrounding hot gas.
Due to the gravitational field of clusters, metal-enriched gas ejected from supernovae remains gravitationally bound to the cluster as part of the ICM.
By looking at varying redshift, which corresponds to looking at different epochs of the evolution of the Universe, the ICM can provide a history record of element production in galaxies.
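The redshift-to-epoch correspondence can be made concrete with a small numerical sketch. This is a minimal illustration, assuming a flat ΛCDM model with commonly quoted parameters (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7) - these values are assumptions, not figures from the text:

```python
# Lookback time in a flat LambdaCDM model via simple numerical integration.
# Assumed illustrative parameters: H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_L = 0.7.
import math

H0_PER_GYR = 70.0 / 978.0  # H0 in 1/Gyr (1 km/s/Mpc is about 1/978 Gyr^-1)

def lookback_time_gyr(z: float, om: float = 0.3, ol: float = 0.7,
                      steps: int = 10000) -> float:
    """Integrate dt = dz / ((1+z) * H0 * E(z)) with the trapezoid rule."""
    def integrand(zp: float) -> float:
        e = math.sqrt(om * (1.0 + zp) ** 3 + ol)  # E(z) for a flat universe
        return 1.0 / ((1.0 + zp) * e)
    dz = z / steps
    total = 0.5 * (integrand(0.0) + integrand(z))
    for i in range(1, steps):
        total += integrand(i * dz)
    return total * dz / H0_PER_GYR

print(f"z = 0.5 -> lookback ~ {lookback_time_gyr(0.5):.1f} Gyr")
```

Under these assumed parameters, light observed at redshift z = 0.5 left its source roughly five billion years ago - illustrating how each redshift slice of the ICM samples a different epoch of element production.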
Roughly 10% of a galaxy cluster's mass resides in the ICM. The stars and galaxies contribute only 1% to the total mass. Most of the mass in a galaxy cluster consists of dark matter and not baryonic matter.
Although the ICM on the whole contains the bulk of a cluster's baryons, it is not very dense, with typical values of 10^-3 particles per cubic centimeter. The mean free path of the particles is roughly 10^16 m, or about one light-year.
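The quoted mean free path follows from the kinetic-theory relation λ = 1/(nσ). The sketch below is purely illustrative: the effective collision cross-section of ~1e-19 m² is an assumed value chosen to be of the right order for ICM protons, not a figure from the text.

```python
# Rough mean-free-path estimate for the intracluster medium (ICM).
# lambda = 1 / (n * sigma): average distance a particle travels between collisions.

LIGHT_YEAR_M = 9.4607e15  # metres per light-year

def mean_free_path_m(n_per_m3: float, sigma_m2: float) -> float:
    """Kinetic-theory mean free path in metres."""
    return 1.0 / (n_per_m3 * sigma_m2)

n = 1e-3 * 1e6   # 10^-3 particles/cm^3 converted to particles/m^3
sigma = 1e-19    # assumed effective cross-section in m^2 (illustrative)

lam = mean_free_path_m(n, sigma)
print(f"mean free path ~ {lam:.1e} m = {lam / LIGHT_YEAR_M:.2f} light-years")
```

With these assumed inputs the estimate reproduces the ~10^16 m (about one light-year) figure quoted above.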
The density of the ICM rises towards the centre of the cluster with a relatively strong peak. In addition, the temperature of the ICM typically drops to 1/2 or 1/3 of the outer value in the central regions. Once the density of the plasma reaches a critical value, interactions between the ions become frequent enough to ensure cooling via X-ray radiation.
Intermediate mass black holes
Intermediate-mass black holes are too massive to be formed by the collapse of a single star, which is how stellar black holes are thought to form. Their environments lack the extreme conditions—i.e., high density and velocities observed at the centers of galaxies—which seemingly lead to the formation of supermassive black holes.
There are three postulated formation scenarios for IMBHs. The first is the merging of stellar mass black holes and other compact objects by means of accretion. The second one is the runaway collision of massive stars in dense stellar clusters and the collapse of the collision product into an IMBH. The third is that they are primordial black holes formed in the Big Bang.
Astronomers believe that intermediate mass black holes may be the “seeds” that ultimately formed the supermassive black holes in the centers of galaxies like the Milky Way. Finding additional nearby examples should teach us about how these primordial galaxies from the early universe grew and evolved over cosmic time.
The Chandra X-ray Observatory is the world's largest X-ray telescope. It gives scientists access to X-ray images of exotic interstellar environments for further study and explanation, and helps astronomers understand the structure and evolution of the Universe.
Image credit: X-ray: NASA/CXC/Univ. of Alabama/W.P. Maksym et al & NASA/CXC/GSFC/UMD/D. Donato, et al; Optical: CFHT
A bright, long-duration flare may be the first recorded event of a black hole destroying a star in a dwarf galaxy. The dwarf galaxy is located in the galaxy cluster Abell 1795, about 800 million light-years from Earth.
A composite image of the cluster shows Chandra X-ray Observatory data in blue and optical data from the Canada-France-Hawaii Telescope in red, green and blue. An inset centered on the dwarf galaxy shows Chandra data taken between 1999 and 2005 on the left and Chandra data taken after 2005 on the right.
The X-ray flare in the inset provides the key evidence for stellar destruction. A star that wanders too close to a supermassive black hole should be ripped apart by extreme tidal forces. As the stellar debris falls toward the black hole, it should produce intense X-rays as it is heated to millions of degrees. The X-rays should fade as the hot gas spirals inward.
This discovery was part of an ongoing search of Chandra's archival data for such events. In the past few years, Chandra and other astronomical satellites have identified several suspected cases of a supermassive black hole ripping apart a nearby star. This newly discovered episode of cosmic, black-hole-induced violence is different because it has been associated with a much smaller galaxy than these other cases.
The black hole in this dwarf galaxy may be only a few hundred thousand times as massive as the Sun, making it roughly ten times less massive than the Galaxy's supermassive black hole. This places it in what astronomers call the “intermediate-mass black hole” category.
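The "ten times less massive" comparison can be checked with one line of arithmetic. Both inputs are assumptions: ~4e5 solar masses for "a few hundred thousand", and ~4e6 solar masses for the Milky Way's central black hole (a commonly quoted figure, not stated in the text).

```python
# Quick ratio check for the "ten times less massive" claim.
imbh_msun = 4e5        # assumed: "a few hundred thousand" solar masses
sgr_a_star_msun = 4e6  # assumed: Milky Way's central black hole (commonly quoted)

ratio = sgr_a_star_msun / imbh_msun
print(f"mass ratio ~ {ratio:.0f}x")
```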
Black Hole-Powered Jet of Electrons and Sub-Atomic Particles Streams From Center of Galaxy M87
The jet emerging from the galactic core of M87 (NGC 4486). The jet extends to about 20 arc seconds (absolute length ca. 5 kly). Composite image of Hubble Telescope observations. The galaxy is too distant for the Hubble Telescope to resolve individual stars; the bright dots in the image are star clusters, assumed to contain some hundreds of thousands of stars each. Original caption: "Black Hole-Powered Jet of Electrons and Sub-Atomic Particles Streams From Center of Galaxy M87" The data used in this image was collected with Hubble's Wide Field Planetary Camera 2 in 1998 by J.A. Biretta, W.B. Sparks, F.D. Macchetto, and E.S. Perlman (STScI). This composite image was compiled by the Hubble Heritage team based on these exposures of ultraviolet, blue, green, and infrared light.
Even though the light has taken 55 million years to reach your eye - travelling a staggering 500 million trillion km - you can actually check this spectacular phenomenon with an ordinary 8" telescope, as M87 rises in the eastern sky after dark and is visible for quite a large proportion of the night.
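The quoted travel distance is a straightforward unit conversion from 55 million light-years; the kilometres-per-light-year constant is a standard value, not from the text:

```python
# Convert M87's light-travel distance to kilometres.
LIGHT_YEAR_KM = 9.4607e12  # kilometres per light-year (standard value)

def ly_to_km(light_years: float) -> float:
    return light_years * LIGHT_YEAR_KM

d_km = ly_to_km(55e6)  # 55 million light-years
print(f"~{d_km:.1e} km, i.e. roughly {d_km / 1e18:.0f} million trillion km")
```

The result, about 5.2e20 km (some 520 million trillion km), agrees with the quoted "500 million trillion km" to within rounding.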
Between galaxies - and clusters of galaxies - intergalactic space acts as an effective blockade against star-forming substances such as oxygen, hydrogen and dust. The way it works is a matter of chemistry and an abundance of ions and positrons, well suited to strip any foreign galaxy of hydrogen clouds and dust as it is drawn into a galaxy cluster by gravity.
This limitation supports the build-up of extra-large black holes, and the aggregation of large clouds of positrons and neutralised star-forming material in the intergalactic medium.
It acts as a kind of cosmic fire extinguisher that shuts down and extinguishes active galactic star formation in a merger galaxy - the cosmic way - by aggregating large clouds of non-co-moving, neutralised star-forming material in the intergalactic medium.
By accumulating the larger part of any merged galaxy outside the cluster, the intercluster medium over time builds up an ever more stable metallic compound, unbound from the galaxies, seeding space for new stellar systems that can give birth to more advanced stellar formation. As space is being stretched and galaxies are sparsely deported by the expansion, the intracluster channels will redistribute compounds across the filaments and channels through differences in vacuum and temperature.
Looking back at this violent past, we see the extinguishing and shredding of galaxies, black holes forming at the extremes of scale, super-dense internal galactic environments, and free positrons and neutral atomic elements ready to neutralize star-forming matter and separate active processes from resources.
Building up these galactic gravity mousetraps literally closes and shuts down regions of space to reconstruction.
When we look carefully - and in the right spectrum - we see this story all over the sky.
If the Universe were a book, the grand-scale filaments represent the story that can be read, as the light is trapped in the cosmic maze it is bound to follow.
To describe the future, yet as uncertain as an empty page, space-time must find a way outside the critical path paved with relic dark matter.
One can only guess what will be on that page, and how the cosmological morphology that has evolved since the Big Bang will dominate the future.
As most of the universe's matter follows the large-scale structures, one can say that the type is set and the outline chosen, so the story will be about how matter propagates within an ever-expanding foam of filaments, surrounded by still more of nothing.
It can also be imagined that the nodes will, in the long run, be blocked by an ever-increasing number of black holes that vacuum the network of matter. If that happens, the baryonic pressure of electromagnetic particles in the network will eventually drop to the point where the root web shuts down - because it is powered by energy that must be constantly renewed - and space will consequently collapse around the black holes, which will in the end become one.
Singularity.
The most perfect synchrony must come from uncertainties and errors. Probably, total balance will not allow for evolution.
Dark entities vs. fine structures in The Cosmos
Interpretations of the observed results enable us to explain the data - assuming the causality is covered by the cosmological standard model. But the fact that space is expanding faster and faster, and that galaxies are rotating too fast (leaving most of the content of space invisible), as explained by the “ghost” effect of dark energy (and “ghostly” dark matter), is a "machine-dependent" effect introduced by a "broken" model in order to ensure it delivers accurate results.