A Space & astronomy forum. SpaceBanter.com


The Failure of Mainstream Cosmology



 
 
#1
April 4th 17, 02:03 PM, posted to sci.astro
Pentcho Valev

"...somebody is going to have to take back Nobel prizes awarded in 2011 for the discovery of the accelerating expansion of the universe... [...] ...mainstream cosmology has done such a bad job of solving the dark energy problem that it will likely be some nonmainstream idea..." http://www.sciencemag.org/news/2017/...nergy-illusion

My comment in Science:


There is no expansion of the universe - it is STATIC.

Starlight slows down as it travels through the vacuum of space, an effect analogous to vacuum friction. For relatively nearby stars this appears as the Hubble redshift, but beyond a certain distance the starlight does not reach us at all (Olbers' paradox).

The idea that the vacuum can slow down light is widely discussed, but only in a quantum gravity context:

Sabine Hossenfelder: "It's an old story: Quantum fluctuations of space-time might change the travel-time of light. Light of higher frequencies would be a little faster than that of lower frequencies. Or slower, depending on the sign of an unknown constant. Either way, the spectral colors of light would run apart, or 'disperse' as they say if they don't want you to understand what they say. Such quantum gravitational effects are miniscule, but added up over long distances they can become observable. Gamma ray bursts are therefore ideal to search for evidence of such an energy-dependent speed of light."

I think it is time to start discussing the parallel idea: that slowing down light by vacuum produces the Hubble redshift (in a STATIC universe):

Paul Davies: "This leads to the prediction of vacuum friction: The quantum vacuum can act in a manner reminiscent of a viscous fluid."

New Scientist: "Vacuum has friction after all. In quantum mechanics, the uncertainty principle says we can never be sure that an apparent vacuum is truly empty. Instead, space is fizzing with photons that are constantly popping into and out of existence before they can be measured directly. Even though they appear only fleetingly, these "virtual" photons exert the same electromagnetic forces on the objects they encounter as normal photons do."

Nature: "As waves travel through a medium, they lose energy over time. This dampening effect would also happen to photons traveling through spacetime, the researchers found."

Wilfred Sorrell: "The cosmological redshift is caused by the tired-light phenomenon originally proposed by Zwicky."

Pentcho Valev
#2
April 5th 17, 12:13 PM, posted to sci.astro
Pentcho Valev

I made two more comments, but Adrian Cho rejected them:

HYPOTHESIS: As the photon travels through the viscous vacuum (in a STATIC universe), it loses speed analogously to a golf ball losing speed due to the resistance of the air.

On this hypothesis the resistive force (Fr) is proportional to the velocity of the photon (V):

Fr = - KV

That is, the speed of light decreases with time in accordance with the equation:

dV/dt = - K'V

Clearly, toward the end of a very long photon journey (light coming from a very distant object), the contribution to the redshift per unit time is much smaller than at the beginning of the journey. Light coming from nearer objects is less subject to this effect; that is, the increase of the redshift with distance is closer to LINEAR for short distances. For distant light sources we have:

f' = f(exp(-kt))

where f is the initial and f' the measured (redshifted) frequency. For short distances the following approximations can be made (using f = c/λ and d = ct):

f' = f(exp(-kt)) ~ f(1-kt) ~ f - kd/λ

where d is the distance between the light source and the observer and λ is the wavelength. The equation f'=f-kd/λ is only valid for short distances and corresponds to the Hubble law. The equation f'=f(exp(-kt)) shows that, at the end of a very long journey (in a STATIC universe), photons redshift much less vigorously than at the beginning of the journey. This provides an alternative explanation of the observations that brought the 2011 Nobel Prize for Physics to Saul Perlmutter, Adam Riess and Brian Schmidt.
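Taken on its own terms, the exponential law above and its short-distance linearization can be illustrated numerically. In the following Python sketch the constants C and K and the emitted frequency are arbitrary illustration values, not fitted physical parameters; it shows that z = f/f' - 1 grows linearly with distance for small kd/c (the Hubble-like regime) and superlinearly for large distances:

```python
import math

# Illustrative sketch of the exponential law f' = f * exp(-k t), with t = d / c.
# All constants are arbitrary illustration values, not fitted parameters.
C = 1.0   # speed of light, arbitrary units
K = 0.1   # hypothetical damping constant, arbitrary units

def observed_frequency(f_emit, d):
    """Frequency measured after the light travels distance d."""
    t = d / C
    return f_emit * math.exp(-K * t)

def redshift(d):
    """z = f / f' - 1 for the exponential law above."""
    f = 1.0
    return f / observed_frequency(f, d) - 1.0

# Small d: z ~ K * d / C (linear, Hubble-like).
# Large d: z = exp(K * d / C) - 1 grows faster than the linear law.
for d in (0.1, 1.0, 10.0):
    print(f"d = {d:5.1f}  z = {redshift(d):.4f}  linear approx = {K * d / C:.4f}")
```

For d = 0.1 the two columns nearly coincide, while for d = 10 the exponential z exceeds the linear extrapolation, which is the deviation from the linear Hubble law the post attributes to the supernova observations.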

Here is a similar interpretation:

http://arxiv.org/pdf/physics/9911069.pdf
Eugene I. Shtyrkov, The Evolved-Vacuum Model of Redshifts: "There are also alternative models of redshifts which obey the redshift-distance relation and based on an idea of gradual change of light parameters due to interaction between light and matter while the light is traveling gigantic distances through space for a very long time. There are two candidate ways for such interaction to cause redshifts: gradual energy loss by the photon due to absorption during propagation of light with a constant velocity (tired-light model, see, for instance, [8]) and propagation of light with the variable velocity and without absorption in free space (variable-light-velocity models). [...] Thus we come to a very important conclusion: the induction wave, and hence the light one, must travel in vacuum with conservation of wave length even when the parameters are time dependent. [...] ...we obtain a simple differential equation for the light velocity: dc(t)/dt = -Ho.c(t) (15) [...] Although reproducing the conclusions of the tired-light model, namely, about simultaneous decreasing the electric field strength and frequency, this model has a different physical interpretation. Instead of energy loss due to absorption at constant light velocity, this mechanism is based on gradual change of the vacuum parameters that results in declining of the electric field strength. The electromagnetic wave is gradually slowing down, with conservation of the initially shifted wavelength (lambda)_shift. The frequency perceived by observers at any point on the light path depends on the light velocity at the observation time."

Clearly variable-speed-of-light interpretations of the Hubble redshift (for a STATIC universe) converge to an equation of the type dc(t)/dt = -Ho.c(t) (I presented it as dV/dt = -K'V).
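Shtyrkov's equation dc(t)/dt = -Ho.c(t) has the closed-form solution c(t) = c(0).exp(-Ho.t). As a sanity check on that statement only, the sketch below integrates the equation with a simple forward-Euler step and compares the result with the closed form; HO and C0 are arbitrary stand-in values, not the measured Hubble constant:

```python
import math

HO = 0.05  # illustrative rate constant (stand-in, not the Hubble constant)
C0 = 1.0   # initial light speed, arbitrary units

def c_numeric(t_end, steps=100_000):
    """Forward-Euler integration of dc/dt = -HO * c."""
    dt = t_end / steps
    c = C0
    for _ in range(steps):
        c -= HO * c * dt
    return c

def c_exact(t):
    """Closed-form solution c(t) = C0 * exp(-HO * t)."""
    return C0 * math.exp(-HO * t)

print(c_numeric(10.0), c_exact(10.0))  # the two values should agree closely
```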

Dark energy is a fudge factor - now other fudge factors will replace it. Unlike special relativity, general relativity is empirical, not deductive. It was not deduced from postulates (it has no postulates). General relativity was the result of endlessly changing and fudging equations until some final version managed to match experimental results and pet assumptions that were known in advance. Such fudge factors belong to the essence of general relativity.

Can one introduce a fudge factor analogous to the cosmological constant in Lorentz transformation equations? One cannot, and the reason is simple: Special relativity is DEDUCTIVE (even though a false assumption and an invalid argument have spoiled it from the very beginning) and fudging is impossible by definition - one has no right to introduce anything that does not follow from the postulates.

The only alternative to a deductive theory is empirical concoction (a "theory" that is not even wrong) - Einstein clearly explains this here:

Albert Einstein: "From a systematic theoretical point of view, we may imagine the process of evolution of an empirical science to be a continuous process of induction. Theories are evolved and are expressed in short compass as statements of a large number of individual observations in the form of empirical laws, from which the general laws can be ascertained by comparison. Regarded in this way, the development of a science bears some resemblance to the compilation of a classified catalogue. It is, as it were, a purely empirical enterprise. But this point of view by no means embraces the whole of the actual process; for it slurs over the important part played by intuition and deductive thought in the development of an exact science. As soon as a science has emerged from its initial stages, theoretical advances are no longer achieved merely by a process of arrangement. Guided by empirical data, the investigator rather develops a system of thought which, in general, is built up logically from a small number of fundamental assumptions, the so-called axioms."

Special relativity was indeed "built up logically from a small number of fundamental assumptions" but general relativity was, to use Einstein's words, "a purely empirical enterprise". Einstein and his mathematical friends changed and fudged equations countless times until "a classified catalogue" was compiled in which results and pet assumptions known in advance (such as Mercury's precession, the equivalence principle and gravitational time dilation) coexisted in an apparently consistent manner. Being an empirical concoction, general relativity allows Einsteinians to introduce, change and withdraw fudge factors until the "theory" manages to predict anything Einsteinians want. Then the prediction turns out to be confirmed by observations (surprise, surprise).

The fudge-factor activity is inglorious and Einsteinians don't discuss it openly, but sometimes the truth comes out inadvertently. So conventional dark matter models based on general relativity "need four free parameters to be adjusted to explain the data" (how many fudge factors LIGO conspirators needed in order to model the nonexistent gravitational waves is a deep mystery):

Quote: "Verlinde's calculations fit the new study's observations without resorting to free parameters – essentially values that can be tweaked at will to make theory and observation match. By contrast, says Brouwer, conventional dark matter models need four free parameters to be adjusted to explain the data."

Pentcho Valev

 



