Can reality be confirmed by theory?
The evolutionary track of the beginning of the Universe
When we choose to base the physical model on axioms, theorems and constants, it is because it is not possible to confirm every explanation by directly observing reality.
According to our standard model, the Universe has evolved from a singularity into the expanding space we know today.
Genesis, evolution and origin: how everything came to be, and where everything comes from. How on earth this world came about. Night and day, seasons, moon phases and the passing of time. These belong to a class of questions that man has been asking ever since we turned our eyes toward the overwhelming amount of glittering stars and wondered.
This is why it is necessary to limit the question of origin to a certain point, a sort of common ground, of this universe. From a scientific point of view we are left to relate to what we can see: what, where and how.
Could the singularity - or could it not - have complex physical laws?
Before we discuss origin, we need to choose how we think space is constituted.
As we live in the opposite reality to a singularity, we claim that the Big Bang was preceded by a singularity with contradictory conditions, since we have a special interest in what could lead to this universe. A rule set to sustain the singularity could be:
1. Stay put
2. Don't move
Negative gravity can do that in a vanishingly small context. Like in: everybody, get out! It will ensure that nothing happens there.
Everything observed today is then by definition derived from the Big Bang, which occurred everywhere, and which at the time was also the singularity.
The natural conditions are a result of the duration of events, and after the Big Bang the only condition was extreme heat. Kelvin would probably have called it E_k (as in total entropy: no higher available temperature is possible to achieve), and Einstein: E (as in mc²).
Then a simple leak can do the job...
Arguing that the singularity collapsed out of imbalance makes it a candidate for being the natural origin of this imperfect place we call the universe.
While incredibly small, and then all of a sudden expanded because of entropy, the board is wiped clean from the Big Bang onward.
Since this means that the laws of physics as we know them became imperative to the physical realm only after the Big Bang, it requires a strange set of conditions, existing in the time before the Big Bang, to establish this universe.
Since nothing escapes the laws of nature, it is relevant to argue that before any physical process can start, it needs a natural cause. Natural causes are only natural because they are limited by natural forces such as attraction, repulsion and gravity, without which any cause would be unnatural.
It is then most likely that the Big Bang had complex physical laws that did not differ significantly from the present ones, but simply worked differently: the physical laws governing the relationship between matter, space and energy in the singularity probably did not differ from the basic laws of nature that apply to the universe today, but under the odd conditions they had other implications in the context of the world.
They could be opposite, though, such as negative gravity and repulsive nuclear forces, since nature never lets anything stay put, but rather queues it up for change from the beginning.
In the singularity there was no space for changes. Even if we imagine the known natural forces in situ, only the strong nuclear force and gravity would really be able to govern the realm.
Even so, it seems that the world all of a sudden jumped from this super-dense state we call the singularity to a cosmic size, by a process we call the Big Bang. We don't know how for sure, or what exactly caused it, or indeed for whatever reason there was a Big Bang and this reality entered the scene; it just so happened that scalarity maxed out at full power.
It is, though, very likely that all the energy in the world really was crammed into this tiny and unimaginably dense spot, and went off in the Big Bang like a cork popping from a bottle of champagne, only in every direction all at once.
Energy was immediately flung in all directions at the speed of light, at a takeoff temperature of, let's just say, a thousand trillion degrees. Then it expanded and stretched; space emerged out of nothing and became our expanding universe in a titanic flood of quarks, leptons and, above all, photons. As the subsequent cooling set in just a short moment later, energy began solidifying into matter; within the same second protons and neutrons started forming, and within the minute they would stick together as the first hydrogen, and some helium. Busy, busy.
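The cooling timeline sketched here follows a standard radiation-era scaling, roughly T ≈ 1.5×10¹⁰ K / √t with t in seconds. The coefficient is a textbook approximation, not something the essay itself pins down; a minimal sketch:

```python
# Standard radiation-era scaling (a textbook approximation, not part of
# the essay's own model): photon temperature ~ 1.5e10 K / sqrt(t in s).
def temperature_kelvin(t_seconds: float) -> float:
    """Approximate temperature t seconds after the Big Bang."""
    return 1.5e10 / t_seconds ** 0.5

# One microsecond (quark era), one second, and three minutes:
for t in (1e-6, 1.0, 180.0):
    print(f"t = {t:g} s  ->  T ~ {temperature_kelvin(t):.1e} K")
```

Three minutes in, this scaling lands near the billion-kelvin mark at which, as described further below, nuclei can begin to form.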
Everything. Almost at once.
But from here it will take hundreds of thousands of years to form the rest of the atoms, as the environment is still far too hot to allow the electromagnetic force to bind the electrons, which were relentlessly working their way through this unimaginably hot medium as radiating waves.
Of course it is a hypothesis that cannot be proven. But it is rather probable, and it is very unlikely that it shouldn't have happened this way. All the brightest minds of the past century have given it a thought, and it is officially how we think it happened.
The singularity was not of this world in the first place - but by the Big Bang it became the origin of reality as we know it.
It may be the only option we have for including this necessary condition, as the nature of the conditions makes it impossible to obtain evidence, and it is very unlikely that the probable connection to the origin of everything will ever be demonstrable directly in the real world today anyway.
When we choose to base the physical model on axioms, theorems and constants, it is precisely because it is not possible to confirm the explanation directly by observing reality.
After all, science is about understanding, insight and probability, and evidence often grows out of reality confirming the theory.
To establish order and predictability in the universe, we have invented the natural constants: fundamental constants, as well as constants that occur when the basic laws of nature are described as quantitative connections. They enable us to generalize the conditions in which the universe unfolds over time.
Historically we have benefited from the invention of the natural constants, because they make it possible not only to explain coherence over time, but also to establish important improvements to the laws of physics along the way, raising new questions about what we are observing.
But some things cannot be explained by the nature of things.
Apart from those, we confirm theories, hypotheses, models and the underlying proof of the models by observations and empirical analyses from the real world.
By measuring the relative amounts of different elements in the universe, we have found that it contains roughly 74 percent hydrogen and 24 percent helium by mass, the two lightest elements.
All the other heavier elements - including elements common on earth, such as carbon and oxygen - make up just a fractional trace of all matter.
Of course this may not prove the Big Bang, but it makes the theory very probable: theoretical calculations show that these abundances could only have been made in a universe that began from an extremely hot and dense state and then immediately cooled down and expanded, and this is exactly the kind of universe that the Big Bang theory predicts.
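The roughly three-quarters/one-quarter split can be reached by the standard back-of-envelope argument: at freeze-out the neutron-to-proton ratio was about 1/7, and essentially every surviving neutron ended up in helium-4. Assuming that textbook ratio (not derived here):

```python
def helium_mass_fraction(n_over_p: float) -> float:
    """Primordial helium-4 mass fraction when every available neutron
    pairs up into He-4: Y = 2(n/p) / (1 + n/p)."""
    return 2 * n_over_p / (1 + n_over_p)

print(f"{helium_mass_fraction(1 / 7):.2f}")   # 0.25
```

A quarter of the mass in helium and the rest in hydrogen is exactly the ballpark the measurements above report.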
Density fluctuations, dark matter infrastructure and the cosmic microwave background
We have no reason to believe other than that the world originated from unbalanced powers and errors. In an ungraceful event.
There are two contradictory and fundamental principles at stake in both the transformation of space into energy, energy into matter and matter into space:
- Expansion and contraction, cooling or heating
- Scalarity: as space gets bigger or smaller, it turns colder or hotter.
Not so long ago we obtained evidence for the Higgs boson and the Higgs field, and this is great because they play a major role in how we think the formation of the cosmic infrastructure happened.
We would also like to be able to explain how space sprang out, in an instant, from no size at all to a million light years, just like that. By theory we blame a strange particle (the inflaton) and its natural field for causing this initial expansion. (But it is rather hard to prove.)
Yet, with the Higgs field in place we can let it go to work on discovering how it prepared the world for cosmology, in a quest that will lead to the homogeneity, isotropy and flatness of the Universe.
The secret tool was the quantum fluctuations that seed structure formation (the relentless big hammer).
This triggered the radiation-dominated era of the hot Big Bang, and the beginning of the processes of baryogenesis and dark matter production. The universe was extremely hot; we are talking about degrees rising to the billions of billions of billions for several minutes. This heat led particles of matter in the early universe to obtain an incredible amount of energy, causing them to behave very differently from particles of today. For example, particles would move super fast back then and consequently collide with much greater energy than today.
Without all of which, there would be no world of today.
We are talking about a really hot, dark and dense environment. We could say: by theory, 10 trillion degrees hot. Maybe a million more, or even less; but again, it is impossible to verify how hot it really was.
Because the temperature is so high, all the particles are extremely energy-rich, and most of the photons by that time are actually gamma rays. The universe is still only 1/100 billionth of a trillionth of a second old, 10 trillion (!) degrees hot, big as a house and lit up with an extremely bright sapphire-blue light, but with no visibility whatsoever, because of all the electrons roaming freely in space, exiled from ions.
We can thank the Higgs field for producing these fluctuations, because they led to structure formation, and it also enabled the radiation-dominated epoch of the hot Big Bang to occur (without which we couldn't be here today; the world would probably just have become a gigantic sponge filled with slowly evaporating cold gas, or a gigantic bright star of burning metals we can't even dream of).
As all the particles scattered off each other at high rates, keeping all the different species at the same temperature, space was kept in thermal equilibrium; and as the photons also scattered off the electrically charged protons and electrons, they could not travel very far either.
Moreover, in the modest extension of the Standard Model by three relatively light Majorana fermions, heavy neutral leptons (HNLs), the Higgs field is important for baryogenesis, leading to the charge-asymmetric Universe, and for that matter also for dark matter.
As the universe expanded, the temperature dropped to about one billion kelvin. At that point, protons and neutrons began to bind together to form atomic nuclei.
The primary conditions are all about energy: superheated pre-baryonic elements and the interaction between electromagnetic particles, dark matter and supercharged electromagnetic energy.
It makes perfect sense that matter solidifies in two flavours, first dark and then non-dark matter, where the non-dark matter is governed by the weak and strong nuclear forces, and the dark matter is governed by the force of gravity.
When the relic force begins to lose its role as the dominating force, dark matter is the only matter that relates to the new gravity, and thus starts compacting in the inflated new space-time, while the nuclear forces, governed by the electromagnetic force, start combining baryonic matter in the subsequent cooling, as the radiation loses charges and quickly forms ions, photons and electrons; a "space-quake" of oscillating charges and responding gravity waves.
By the time normal matter has begun forming, dark matter has already established pockets of matter, and thereby also voids roamed only by radiation.
The nucleosynthesis that follows will grow yet more heated matter. As protons are attracted to dark matter, and radiation and photons rapidly traverse the voids, drawn by the nuclear forces in the building of nuclei, the cores heat up and electrons can't bind; thus the protons lose their charges again and again as ions and protons, emitting a flood of neutrinos that heats the new space to the extremes.
It is assumed that before neutral hydrogen formed, matter was distributed almost uniformly in space, although small variations occurred in the density of both normal and dark matter because of quantum mechanical fluctuations. Gravity pulled the normal and dark matter in toward the center of each fluctuation, thus founding the cosmic web that matter has been following ever since.
While the dark matter continued to move inward, the normal matter fell in only until the pressure of photons pushed it back, causing it to flow outward until the gravitational pressure overcame the photon pressure and the matter began to fall in once more.
Each fluctuation "buzzed" in this way with a frequency depending on its size. This constant traction in and out had a strong influence on the temperature of the normal matter: it heated up when it fell in and cooled off when it flowed out.
The dark matter, which does not interact with photons, remained unaffected by this effect. When the neutral hydrogen formed, areas into which the matter had fallen were hotter than the surroundings. Areas from which matter had streamed out, by contrast, were cooler.
The temperature of the matter in different regions of the sky, and of the photons in thermal equilibrium with it, reflected the distribution of dark matter in the initial density fluctuations and the "buzzy" normal matter.
Figuring out how stars, galaxies and other large-scale astronomical arrangements throughout our universe have taken shape means focusing on the build-up of structure formation in space:
A pattern of temperature variations was frozen into the cosmic microwave background when the electrons and protons formed neutral hydrogen. So a map of the temperature variations in the CMB traces out the location and amount of different types of matter 390,000 years after the Big Bang.
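The link between the temperature at which that pattern froze in and the microwave background we measure today is a one-line calculation, assuming the commonly quoted values T ≈ 3000 K at recombination and a redshift of about 1100 (both are textbook figures, not taken from this essay):

```python
T_recombination = 3000.0   # K, roughly when neutral hydrogen formed
z_recombination = 1100     # approximate redshift of the CMB
T_today = T_recombination / (1 + z_recombination)   # expansion cools the photons
print(f"CMB temperature today ~ {T_today:.2f} K")   # ~2.72 K
```

The result matches the measured 2.7 K background, which is one reason the frozen-in pattern is read as a snapshot of that era.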
This made dense areas even denser and heated all particles up in a thermal equilibrium, mainly supporting dark matter in building up a vast filament: the premature "root" basis for gravity to compact all matter along.
We imagine the Higgs boson floating freely in this state, attracting subparticles en masse, flocking around the boson like bees around flowers, unable to bind, as the field is still too hot and not yet frozen into a state where the boson can anchor to the field.
In the early universe, about 400,000 years after the Big Bang, conditions cooled enough to allow the formation of hydrogen atoms from free protons and electrons.
Observations reveal that galaxies are not randomly distributed, but comprise a gigantic cosmic web. The foundation of dark matter derives from the alternating cooling and warming process, where large clouds of energy and matter have drawn slow, sticky threads of dark matter through the resulting space.
By outstretching the swirling clouds of extremely hot particles and energy, encircling and hammering away at the inactive and far more stationary strings of dark matter, almost like an electromagnetic whirlwind, long strands of matter are slowly extracted.
With the expansion of space, more and more threads are now twisting together, docking along an ever denser gravitation-borne stream of baryonic matter, and thereby channeling the early Universe into a series of backbones.
Forming the critical path for the propagation of matter by the decreasing thermodynamics, these precede the cosmic architecture we know today.
After this early phase, known as "recombination," the universe began to take shape as objects—galaxies, stars, planets—coalesced from the elemental raw material left behind after the Big Bang.
As the threads grow larger and denser, the whirling clouds entangle nearby threads by simple thermodynamic currents, resulting in the growth of larger stationary lumps of matter. And since the temperature is in thermal equilibrium, these big clouds cannot perform star formation, as the differences are still too small to ignite a synthesis of particles and energy into subparticles.
Voracious monsters of superdense elements, black-hole-like objects, just bigger, will begin to aggregate along these backbones, drawing cloud after cloud whirling around the core of dark matter, entangling yet more threads, like whips of thermodynamic currents encircling the superdense objects into gigantic knots of dark-matter webs surrounded by clouds of superheated gas and dust.
As the black holes build up, the temperature drops spot by spot, allowing local stellar production to go into a phase of forming monstrously large quasars that shred any nearby stars to atoms, enriching the surrounding space with still more hot gas clouds.
The cooling phase reaches a thermodynamic threshold. At this point, scattered around the network of channelled energy and particles, the clouds begin to flare up, as fusion can start from the surface of the backbone by ignition of super-dense hydrogen into stars. As the expansion whirls ahead, the temperature finally drops, and the surroundings of central knots of supermassive black holes gradually burst into fusion and slowly light up the sky, until all the electrons are locked onto the protons, letting the photons leave the stars when the temperature difference is large enough.
Business as usual for a vivid universe
Combining, tearing apart, flinging clouds around and recombining; producing globular clusters and forming galaxies per se in abundance; enriching the interstellar medium along the core backbone, which is gradually fixated by its own gravity in larger and larger clouds containing entangled collections of globular clusters and growing galaxies from the cooling intergalactic medium. Expanding.
Because dark energy/vacuum energy has inverted properties in reacting to gravity, the new gravitational force, as it became dominant, caused the inflation by releasing the energy field, correspondingly rolling out cosmic distances in the blink of an eye.
The vacuum energy is still causing space to expand to this day and is represented by the cosmological constant and dark energy. It is spread everywhere in the universe, representing the relic zero-point radiation, and is thereby a constant.
Vacuum energy is furthermore thought to have the odd property that no matter how much it is stretched, it remains constant.
Another way to describe dark energy (as "the scalar field for space") is that since nothing is stopping it (it runs away from gravity), it will continue to "expand space". Because of its properties, this actually represents the most cost-effective path for this field we call distance in space, since stopping, or contracting, would cost an immense amount of energy, at least enough to outbalance the complete universe.
This is an indication that gravity is increasing, or that the kinetic energy-momentum of "space" is increasing with the expansion.
The density of dark energy is very low (about 7×10⁻³⁰ g/cm³), much less than the density of ordinary matter or dark matter within galaxies.
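That quoted density can be checked against the critical density ρ_c = 3H²/8πG for a Hubble constant of roughly 70 km/s/Mpc (the Hubble value is an assumption of this sketch, not taken from the text):

```python
import math

G = 6.674e-8            # gravitational constant, cm^3 g^-1 s^-2 (CGS)
MPC_CM = 3.086e24       # centimetres per megaparsec
H0 = 70e5 / MPC_CM      # 70 km/s/Mpc converted to s^-1

rho_crit = 3 * H0 ** 2 / (8 * math.pi * G)   # critical density, g/cm^3
rho_de = 7e-30                               # the quoted dark-energy density
print(f"critical density  ~ {rho_crit:.1e} g/cm^3")    # ~9.2e-30
print(f"dark-energy share ~ {rho_de / rho_crit:.2f}")  # ~0.76
```

A share around 0.7 is the commonly quoted figure, so the density is "low" only in everyday terms; it still dominates the cosmic energy budget.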
Dark energy is a constant energy density filling space homogeneously, but observed as scalar fields it can represent dynamic quantities whose energy density varies in time and space. Scalar fields that change in space can, however, be difficult to distinguish from a cosmological constant, because the change is extremely slow.
Dark matter and dark energy are the structural foundation of this universe. But dark matter (DM) and dark energy (DE) are named "dark" for a reason: they can only be quantified by the effect they have on the realm, which is close to nil at the smallest of scales, but at the largest of scales they dominate by the force of abundance and can only be detected by their "footprint".
Objections to the model
The "ghost" effect from dark energy (and "ghostly" dark matter) is more likely a "machine-dependent" effect from a "broken" model.
No one has yet proven that either dark matter or dark energy is actually a real thing. They exist only as hypothetical explanations for why the standard model cannot explain reality.
- Dark energy allegedly works by exerting a negative, repulsive pressure, behaving like the opposite of gravity.
- Dark matter allegedly works as a kind of "super glue" that keeps structures within and between galaxies constant.
Cosmological observations tell us that the standard model explains about 5% of the energy present in the universe. From the behaviour of light we can deduce that about 26% should then be dark matter, which would behave just like other matter, but interacts only weakly (if at all) with the Standard Model fields.
As the Standard Model does not supply any fundamental particles that constitute good dark matter candidates, ever since this postulate the brightest minds have tried to figure out what that matter could be.
In physical cosmology and astronomy, dark energy is regarded as an unknown form of energy that affects the universe on the largest scales.
It has been hypothesised to account for the observational properties of distant type Ia supernovae, which show the universe going through an accelerated period of expansion; the force that is causing the light waves to stretch by inflating, yes, space.
The theory is that this is caused by the entire universe expanding, in the process stretching the light waves. It is a rather good explanation, a sort of Doppler effect, and even most likely to be true, since it is almost bulletproof by countless observations.
The first observational evidence for its existence came from supernova measurements, which showed that the universe does not expand at a constant rate; rather, the expansion of the universe is accelerating.
There is a general consensus that a hypothesis should be falsifiable in order to be scientific, and as the natural sciences are empirical, it is based on experiences and observations in the real world.
But since science is made by humans to understand the world, it is shaped by the spirit and culture from which it originated. While we try to be as objective and neutral as possible, we tend to value a theory by how well it predicts reality, and measure its quality by how well it describes the world.
Predictability and accuracy are a must if a theory is to be regarded as able to model what is happening in real life. That may be a premise for abstract models, but it does not make it less problematic that science does not necessarily need to be evident.
Considerable efforts have taken place in providing rule sets and mathematical explanations about invisible topology to establish what space is really about, and for what reasons it is probably growing at superscale in an unchained run toward eternity. We have also invented "mysterious" forces like dark energy to set space in motion, and hidden objects like dark matter, to explain the fact that space is set for a wild run in the form of stretching the fabric of space between the cosmological objects, and to explain why galaxies don't fling apart in the process.
The world becomes real through understanding: insight regarding natural contexts makes knowledge plausible, and it is by demonstrating coherence between assumptions, events and constraints that models are sprung.
The empirical layer of hypotheses makes it probable that reality has a character which follows the model, making it evident.
Evidence can only be regarded as general and universal without context-dependent conditions.
When evidence stands apart from everything else, the context is disconnected from the reality it must confirm, and it thus becomes true only because it can be considered probable. Absence of reality is then not a constraint.
There can be many reasons why we choose to make interrelationships more likely to be true.
It can be the only option for including a necessary condition that is impossible to obtain evidence about, given the nature of the conditions; or it may be unlikely that it will ever be possible to demonstrate the relationship directly.
When we choose to base models on axioms, theorems and constants, it is because it is not possible to confirm them directly:
- The speed of light is constant
- Before the Big Bang there was only a singularity
- The world was created by the Big Bang
After all, science is about understanding, insight and probability, and evidence then grows out of how well the theory is anchored in reality to confirm the hypotheses.
Theories, hypotheses, models and the underlying proof of the models are then confirmable by observations from the real world.
We tend to evaluate a theory by how well it predicts reality and measure its quality by how well it describes the world.
Predictability and accuracy are a must if a theory is to be regarded able to model what is happening in real life.
Theories - Out of order
Considerations of whether the natural forces can support other atomic structures than those currently known, or whether the basic problem is in fact a misinterpretation of information derived from outdated scientific methods, given that the actual universe they are explaining has fundamentally scaled out of comparison with the tiny solar systems the theories were initially intended for.
The fundamental forces of the universe (gravity, the electromagnetic force, and the weak and strong nuclear forces) are the bonds that provide the universe with the power of expansion, and viability, from the tiniest invisible relationships between matter and energy to the unimaginable cosmological contexts that we may or may not yet have fully understood, or even discovered.
But of course we are struggling to get this general world view, a scientific heritage from classical science more than 100 years old, to play along with what we can actually see has played out in space:
- We find that the energy balance does not add up
- Nor is there enough matter in the world to account for what can be confirmed.
The question, though, is whether there actually is a lot more mass, a lot of other strange kinds of matter, or whether gravity simply works differently over large galactic distances than the applied theory assumes.
Two things can happen when we discover that reality is acting strangely according to what we know: we can change the way we think of reality, or call for new stuff or new understandings. Either way, new science is needed.
The answer will either uphold the Standard Model, which defines all of the known subatomic particles and how they interact, or introduce the possibility of an entirely undiscovered physics.
One of the most interesting properties within a galaxy is the way sustainability is achieved. Depending on Newton's and Einstein's theories by method, we are left with no other option than to invent some new stuff: dark matter.
Well, what do we know so far?
E.g. we know 'everything' about the Coma cluster, where we found evidence for dark matter: a 'smoking gun'.
We squeezed these facts out of light that has been traversing space for over 300 million years, and out of the relative luminosity (measured by brightness relative to our sun) of the combined starlight of a trillion stars in more than 10,000 galaxies.
We didn't necessarily count them, though; we combined the brightness and regarded them as a single classical object.
This is what we reason:
Dark matter allegedly works as a kind of "super glue" that keeps structures within and between galaxies constant.
This is how we know:
The Coma Cluster is one of the richest galaxy clusters known. It contains as many as 10,000 galaxies, each housing billions of stars.
In the 1930s, the Swiss astronomer Fritz Zwicky discovered that the galaxies move too fast in relation to the known content of this cluster. This was possible to establish because he captured a large number of galaxies in a single wide-angle photograph.
Zwicky then figured out that matter on a very large scale (galactic, sort of) can be attracted and stabilized, as if fixed in place, by dark matter; because gravity appears stronger than it should be, and the galactic rotation is fixed so that everything rotates with the same speed.
He found that the spatial distribution of galaxies in the Coma cluster resembles what one would expect theoretically for a bound set of bodies moving in the collective gravitational field of the system.
When Zwicky measured the dispersion of random velocities of the Coma galaxies, it turned out that they move with velocities of nearly 900 km/s. For a galaxy possessing this random velocity along a typical line of sight to be gravitationally bound within the known dimensions of the cluster requires Coma to have a total mass of about 5 × 10^15 solar masses.
The total luminosity of the Coma cluster (how much it shines compared to the Sun: the amount of light emitted by an object in a unit of time) is measured to be about 3 × 10^13 solar luminosities; therefore, the mass-to-light ratio in solar units required to explain Coma as a bound system exceeds, by orders of magnitude, what can reasonably be ascribed to the known stellar populations.
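Taken at face value, the two quoted numbers imply a mass-to-light ratio far above the few solar units typical of stellar populations:

```python
virial_mass_solar = 5e15    # solar masses required to bind the cluster (quoted)
luminosity_solar = 3e13     # measured total, in solar luminosities (quoted)
print(f"M/L ~ {virial_mass_solar / luminosity_solar:.0f} solar units")   # ~167
```

Well over a hundred times more mass than light: this gap is the entire case for the cluster's unseen matter.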
Zwicky made a survey of all the galaxies in the cluster, used measurements of the Doppler shift of their spectra to determine their velocities, and then applied the virial theorem.
The wavelength of light emitted by an object moving toward an observer is shortened, and the observer will see a shift to blue.
If the light-emitting object is moving away from the observer, the light will have a longer wavelength and the observer will see a shift to red.
By observing this shift to red or blue, astronomers can determine the velocity of distant stars and galaxies relative to the Earth.
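To first order, the velocity estimate follows the non-relativistic relation v ≈ c·Δλ/λ. A small sketch (the observed wavelength below is a made-up illustration, not a measured Coma line):

```python
C_KM_S = 299_792.458   # speed of light, km/s

def radial_velocity(lambda_obs: float, lambda_rest: float) -> float:
    """Non-relativistic Doppler estimate: v = c * (obs - rest) / rest.
    Positive means receding (redshift), negative approaching (blueshift)."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest

# Hypothetical H-alpha line (rest 656.3 nm) observed at 658.3 nm:
print(f"{radial_velocity(658.3, 656.3):.0f} km/s")   # ~914 km/s, receding
```

A two-nanometre shift already corresponds to roughly the 900 km/s dispersion quoted for the Coma galaxies.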
The virial of a particle is defined as the product of the particle's momentum, p, and its position, x. The virial theorem states that if the time average of a particle's virial is zero, then the particle's average kinetic energy, T, is related to the time-averaged product of the net force, F, acting on the particle and the particle's position: 2⟨T⟩ = −⟨F·x⟩.
For particles, or galaxies, moving under the influence of a gravitational force, ⟨F·x⟩ is equal to the particle's gravitational potential energy, which depends on the total mass inside the particle's orbit.
Zwicky used the virial theorem to relate the total average kinetic energy and total average potential energy of the galaxies of the Coma cluster. He argued that the virial for a pair of orbiting masses is zero, and used the principle of superposition to extend the argument to a system of interacting mass points. This allowed him to use the position and velocity measurements he carried out to find the mass of the galaxy cluster.
A straightforward application of classical mechanics, the virial theorem relates the velocity of orbiting objects to the amount of gravitational force acting on them. Isaac Newton's theory tells us that gravitational force is proportional to the masses of the objects involved, so Zwicky was able to calculate the total mass of the Coma Cluster from his measured galactic velocities.
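An order-of-magnitude version of that calculation can be sketched as M ~ 5σ²R/G. Both the prefactor 5 (a common choice folding in the 3D velocity correction and an assumed mass profile) and the 3 Mpc radius are illustrative assumptions, not Zwicky's exact figures:

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # kg
MPC = 3.086e22      # metres per megaparsec

def virial_mass(sigma_km_s: float, radius_mpc: float) -> float:
    """Order-of-magnitude virial mass, M ~ 5 * sigma^2 * R / G,
    returned in solar masses."""
    sigma_m_s = sigma_km_s * 1e3
    return 5 * sigma_m_s ** 2 * radius_mpc * MPC / (G * M_SUN)

# Line-of-sight dispersion ~900 km/s, cluster radius ~3 Mpc (illustrative):
print(f"{virial_mass(900, 3):.1e} solar masses")
```

With these inputs the estimate lands around 3×10^15 solar masses, the same order as the 5×10^15 quoted above; the exact coefficient depends on the assumed mass distribution.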
Zwicky measured the combined light output of all the cluster's galaxies, which contain about a trillion stars altogether. He then compared the ratio of the total light output to the mass of the Coma Cluster with a similar ratio for the nearby Kapteyn stellar system, and found that the light output per unit mass for the cluster fell short of that from a single Kapteyn star by a factor of over 100.
He then reasoned that the Coma Cluster must contain a large amount of matter not accounted for by the light of the stars.
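The comparison Zwicky drew can be expressed as a mass-to-light ratio. The numbers below are purely illustrative placeholders, chosen only to show the shape of the argument, not his actual measurements.

```python
# Sketch of the mass-to-light comparison: if a cluster's dynamical mass per
# unit of emitted light far exceeds that of ordinary stellar systems, unseen
# matter is implied. All numbers are illustrative assumptions.

def mass_to_light(mass_solar: float, luminosity_solar: float) -> float:
    """Mass-to-light ratio in solar units (the Sun has ratio 1 by definition)."""
    return mass_solar / luminosity_solar

cluster_ratio = mass_to_light(mass_solar=7e14, luminosity_solar=1.4e12)
stellar_ratio = mass_to_light(mass_solar=1e11, luminosity_solar=5e10)

# The cluster emits far less light per unit of dynamical mass:
print(cluster_ratio / stellar_ratio)
```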
This he called "dark matter."
A broken model
The bodies orbiting within galaxy discs, and the galaxies within a cluster, are not all travelling at the same speed. Some local systems of galaxies will be moving extremely fast, while others move at moderate speeds according to the mass of the system. The magnitudes will show great variance, and the luminosities will therefore indicate different velocities.
By aggregating the peak value of the cross-section, Zwicky built in a trend toward too high a magnitude. The more appropriate approach would be to average the squared values across the whole sample rather than using the most probable highest velocity, since the measurements tend to overrepresent motion directed along our line of sight.
By this choice of calculation, Zwicky's cross-section tends to use "over-exposed" values, and thereby produces apparent evidence of missing matter.
Hence no dark matter is needed.
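To make the text's claim concrete, here is a toy Monte-Carlo comparison of two ways of summarizing line-of-sight velocities: taking the single peak value versus the root-mean-square. Whether this reflects Zwicky's actual procedure is the author's contention; the code only illustrates the statistical point that a peak overstates typical speeds.

```python
# Toy comparison of two velocity-dispersion summaries, illustrating the claim
# that a "peak value" overstates typical speeds relative to a root-mean-square
# average. A statistical toy model, not a reconstruction of Zwicky's work.
import math
import random

random.seed(42)

# Simulated line-of-sight velocities (km/s) drawn from a Gaussian with
# an assumed 1000 km/s dispersion.
velocities = [random.gauss(0, 1000) for _ in range(500)]

peak = max(abs(v) for v in velocities)                              # one extreme value
rms = math.sqrt(sum(v * v for v in velocities) / len(velocities))   # mean-square average

print(f"peak = {peak:.0f} km/s, rms = {rms:.0f} km/s")
# In any large Gaussian sample the peak exceeds the rms, and a mass
# estimate scaling with velocity^2 inherits that difference squared.
```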
But consider the argument based on average velocities:
If you observe all the light in the sky, you can, by assuming that the rest of the galaxy resembles our own neighbourhood, estimate how much visible matter there is altogether; and since matter (and energy) exerts gravitational force on bodies, it is possible to calculate what the rotational speed of the disc stars should be.
Subsequent measurements of the rotational speed of the stars reveal a large inconsistency: we see only about 10% of the required mass. The missing remainder is attributed to dark matter, which does not emit light (otherwise we could see it) and is impossible to observe directly, making it difficult to determine what it consists of.
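The expectation described above follows from Newtonian gravity: outside the visible mass, orbital speeds should fall off as 1/sqrt(r), whereas measured rotation curves stay roughly flat. The sketch below uses an assumed 10^11 solar masses of visible matter; the numbers are illustrative, not fitted to any galaxy.

```python
# Keplerian prediction for orbital speed from enclosed visible mass:
# v(r) = sqrt(G * M(<r) / r). Beyond the visible disc this falls as 1/sqrt(r);
# observed curves stay flat, the gap dark matter was invoked to fill.
# The enclosed mass below is an illustrative assumption.
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19      # metres per kiloparsec
M_SUN = 1.989e30    # kg

def keplerian_speed(enclosed_mass_solar: float, radius_kpc: float) -> float:
    """Predicted circular speed in km/s at a given radius, from enclosed mass."""
    return math.sqrt(G * enclosed_mass_solar * M_SUN / (radius_kpc * KPC)) / 1e3

# With ~1e11 solar masses of visible matter, speed should halve from 10 to 40 kpc:
for r in (10, 20, 40):
    print(f"r = {r:2d} kpc: v = {keplerian_speed(1e11, r):.0f} km/s")
```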
Theories have pointed out that it could be neutrinos (which have a vanishingly small mass), brown dwarfs, planets, WIMPs or ordinary neutral hydrogen gas.
In other words, dark matter is invoked to explain the fact that the galaxies do not seem to follow the basic laws of physics.
The discovery of major deviations in the behaviour of celestial bodies with respect to gravity and Newton's second law, using simple Newtonian equations (in which the net force is assumed to be constant) to describe the force relationships of one of the universe's densest galactic populations, led to the suggestion that something had to be wrong with the galactic cluster. With no questions asked about whether the method was sufficient for calculating this seriously complex context, we have been left with the idea of "dark matter" in a previously unknown abundance, on a scale off the charts relative to known matter, constituting the greater part of all matter in the universe.
It is problematic to suggest the whereabouts of such a substantial mass of matter. Science has shown us that dark matter/dark energy cannot be caused by self-inflicted changes in the local matter, nor can it be regional transfers from different areas of space, as that would cause a collapse of space; and since it is not matter as we know it, with nuclei and electrons, protons and neutrons, electromagnetism and the weak and strong nuclear forces, it must be something else.
It is likely that the Big Bang had physical laws that did not differ significantly from the present ones, but simply worked differently under the conditions of that early world.
Even if we project the natural forces back into the singularity, only the strong nuclear force and gravity would be able to govern that realm.
The Big Bang itself, though, is very probable.
Theory has it that energy was immediately flung in all directions at the speed of light, at a takeoff temperature of, say, a thousand trillion degrees, and expanded and stretched space itself out of nothing. Thus our expanding universe began in a titanic flood of quarks and leptons, mostly photons. As the subsequent cooling set in just a short while later, energy began solidifying into matter: within the same second protons and neutrons started forming, and within the first minutes they stuck together as the first hydrogen and some helium.
From there it would take hundreds of thousands of years for the rest of the atoms to form.
It is a hypothesis that cannot be proven, and it is very unlikely that a connection to the origin of everything will ever be demonstrated directly in the real world of today, other than through well-prepared theory and adequate mathematics.
The size of the universe corresponds to the age of the light we receive, plus the uncertainty of the darkness beyond this visible space, where space is expanding faster than light and is never to be seen again.
The Standard Cosmological model (the Lambda-CDM model) is based on the redshift of light, the Hubble constant, the cosmological constant, dark energy and a modification of Einstein's field equations.
Everything in the Cosmological model is built on assumptions. There is no safe ground here.
The true value of a model lies in the extent to which it reflects the reality the model claims to describe.
Backtracing conditions on the basis of greater accuracy in the details naturally leads to higher precision. The level of detail in the observational data grows with increasing precision, and methods providing more accurate results describe reality with growing quality through the better detail of what is observed.
The future will supply ever less light to obtain data from, as it fades out of sight by design of the model.
By then it will in fact be impossible to obtain any further evidence for the Lambda-CDM model, as it will have been left behind by reality.
Explainability, persistence and establishment are fundamental to reality, which is knowable because of these anchors.
Modelling phenomena mathematically treats reality as the ghost in the machine. It is no different from a qualified guess, since it represents the way we believe, or know, reality works according to the model.
Modelling a mathematical connection to reality has brought the little details into focus. The devil is in the detail.
Tailored methods provide more accurate results, describing reality with growing quality through the better detail of what is observed. It then becomes possible to model reality with increasing certainty.
But reality can confirm theories only to a certain point, as observations only allow us to theorize from calculations that those observations can confirm.
Modelled evidence is still based on qualified guesses and fantasy calculations.
Regarding the explanations in the physical standard model, it is up to scientists to deliver a reality check on the theories and models, and this allows us to change the way we think of reality (get smarter).
Just do it
No one has yet proven that either dark matter or dark energy is actually found in reality. They exist only as hypothetical explanations for why the standard model cannot explain reality. Dark energy allegedly works by exerting a negative, repulsive pressure, behaving like the opposite of gravity. Dark matter allegedly works as a kind of "super glue" that keeps structures within and between galaxies constant.
There is a general consensus that a hypothesis should be falsifiable in order to be scientific, and as the natural sciences are empirical, they are based on experiences and observations in the real world. As science is made by humans to understand the world, science is also shaped by the spirit and culture in which it originated. Even as we try to be as objective and neutral as possible, we tend to value a theory by how well it predicts reality and measure its quality by how well it describes the world.
Predictability and accuracy are a must if a theory is to be regarded as able to model what is happening in real life. That may be a premise for abstract models, but it does not make it less problematic that science does not necessarily need to be evident.