"Presently the Hubble’s law is explained by Doppler shift being larger from distant stars. This effectively supports the hypothesis of expanding universe. In the mass polariton theory of light this hypothesis is not needed since redshift becomes automatically proportional to the distance from the star to the observer”, explains Professor Jukka Tulkki."
If this proves to be true, is there a chance that our whole picture of the expanding cosmos needs a full revisiting?
Such an explanation for observed redshift apparently comes up again and again, but thus far hasn't held up to experiment... it even had a name: "tired light". https://en.wikipedia.org/wiki/Tired_light
It'll be interesting to see if this sparks new debate on the subject.
'Tired light' appears to be an umbrella term for various disparate theories that are only related in their attempt to provide an alternative explanation for red shift. If this particular hypothesis, which is not explicitly cosmological, is validated in terrestrial experiments, I think its cosmological implications would be unavoidable.
They didn't run any experiments. They only have a theoretical model and computer simulations.
The main part of the article, which says that the momentum of the light is split between the photon and the density wave, looks good. (It almost looks obvious, but it's not my area of expertise https://xkcd.com/793/ .)
The relation with the Hubble constant is totally unexpected. It is not in the abstract of the peer-reviewed article. (I can't read the full text.) So it's probably only a claim made for the press release. Take it with a grain of salt.
Not necessarily. It might be that this effect is real (this was a computational simulation, so experimental verification is needed) and needs to be accounted for when doing cosmology. But it might very well account for just a part of the redshift, and the remainder then needs to be explained by something else, the default explanation being the expanding universe.
The reckless child in me wants to jump for joy at the absurdity of having a long-held tenet of my understanding of the universe shattered. The quiet pragmatist is begging for a reserved "Well, let's wait and see". I didn't expect an article on recent developments in computational physics to provoke such an emotional reaction. Either way, a very thought-provoking article!
No, there are too many observations that are consistent with an expanding universe. For example, primordial nucleosynthesis [1] would either not have occurred in a non-expanding universe, or would not have stopped so soon. In fact, primordial nucleosynthesis can be used to show that observations are consistent with having only 3 families of light neutrino species.
But if we subtract this (presumably minor) effect from the observed redshift of distant galaxies, we're necessarily left with lower expansion speeds. In turn this means there's less need for gravity to hold them together in the various structures we observe from the galactic level up. This has the implication (I presume) of affecting inferred requirements of dark matter and dark energy.
However, what observations (besides assumptions about the source of background microwave radiation) are consistent with the pejoratively named Big Bang theory? We don't even know how big the universe is. It may be far larger than the observable universe. And all the pocket universes are technically part of the same multiverse -- what makes us think there was for sure a big bang?
All that the "Big Bang" refers to is 1) the fact that the (visible) universe was once much hotter and denser than it is now and 2) if we extrapolate back in time, using the theories that we have, there was a point where the density was infinite.
Of course, we know that our theoretical models break down before that (since we don't have an experimentally verified theory of gravity compatible with quantum mechanics). Physicists who work in the field (and I used to be one of them) are well aware that we cannot extrapolate past the regions of validity of our theories, and no one claims that there was indeed a singular point ("Big Bang") as mentioned in descriptions aimed at a non-expert audience.
But only if interstellar mass is evenly distributed... which it isn't. If this is to replace Hubble's law, we should see differences in redshift where the light passes through more matter. There should be differences between redshift distance measurements and other, non-redshift distance measurements. That shouldn't be too hard to detect (or not).
It could be extremely hard. We don't have many good ways to measure how far away objects are in space. In fact the main way we have been estimating distance so far is by measuring red shift. We also have very limited means for measuring the density of the intergalactic medium.
I just don't think we have any good ways to distinguish whether a high redshift is due to the object being very distant or the intergalactic medium in that direction being particularly dense, for most objects.
I guess the geometry of galactic filaments may be examined for distortions/textures relating to non-standard red shifting. We may see the filaments are bent from our viewpoint in accordance with the presence of voids or other filaments along the way.
There are some very good ways. That's how redshifting was proven in the first place. Cepheids, pulsating stars, were used to prove that galaxies were a thing. They give very accurate distances at ranges where redshift is detectable.
Wasn't it the other way around? A type of supernova ("standard candle") was discovered before the expansion and used to discover the expansion? And the formula for distance only uses absolute and apparent magnitudes to calculate distance?
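For reference, the magnitude-only distance formula referred to above is the standard distance modulus, m - M = 5 log10(d / 10 pc). A minimal sketch with made-up example values, not tied to this paper:

```python
# Distance modulus: m - M = 5*log10(d / 10 pc), so d = 10**((m - M + 5) / 5) parsecs.
# Ignores extinction and other corrections; numbers below are illustrative only.

def distance_parsecs(apparent_mag: float, absolute_mag: float) -> float:
    """Distance from apparent magnitude m and absolute magnitude M."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Example: a standard candle with M = -4 observed at m = 21
print(distance_parsecs(21.0, -4.0))  # 1.0e6 pc, i.e. about 1 Mpc
```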
Indeed, I recall that redshift measurement is performed by comparing the shift in emission spectra (which is absolute), not some bizarre "degree of red dimming" which would be hard to quantify.
I don't see how it is pertinent, as space is not "a transparent medium"; it is a void.
Pretty much all the photons emitted by a star that we can detect cross only the gas surrounding the star, then the emptiness of space, then our atmosphere.
So the medium crossed does not vary with the distance of the star, and thus the author's argument does not hold (anyway, he was being super-speculative here).
Space is not void, as there is a non-zero density of matter that we see in the absorption lines of quasars [1]. This implies that space has to be treated as a transparent medium with non-trivial optical properties.
I'm unconvinced by the paragraph beginning, "However, the above result leads to a striking contradiction with the covariance principle, which is a fundamental prerequisite of the special theory of relativity."
It is pretty well accepted that light can have mass, in the sense that the authors are using the term. For example, light trapped as a standing wave between two mirrors in an etalon must have zero momentum. (By symmetry: if it has momentum, which way does the vector point?) But that light has energy, so it has mass in the sense that m must be non-zero to satisfy E² = p²c² + m²c⁴. More precisely, the momentum-energy vector of the light is timelike instead of lightlike.
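To put a number on that point, here is a quick sketch (the 1 J energy is just an assumed example, not anything from the paper):

```python
# For light trapped as a standing wave with zero net momentum (p = 0),
# E^2 = p^2 c^2 + m^2 c^4 reduces to m = E / c^2.
c = 299_792_458.0  # speed of light, m/s

def invariant_mass(energy_joules: float, momentum: float = 0.0) -> float:
    """Invariant mass from E^2 = (pc)^2 + (mc^2)^2."""
    return ((energy_joules**2 - (momentum * c) ** 2) ** 0.5) / c**2

# Example: 1 J of light bouncing between two mirrors has a tiny but non-zero mass
print(invariant_mass(1.0))  # ~1.1e-17 kg
```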
This makes me suspect the whole paper, because it seems really plausible that a light wave in a medium is partly like a standing wave. In particular, there is a limit of a dense medium whose momentum is negligible. (Maybe the refractive index approaches 1 in a very dense medium, but I can't imagine why that would be so.) I'd have to spend a morning reading and thinking about it to be sure.
Also, the paper is suspiciously detailed; these questions should be resolvable at a much higher level of abstraction, and certainly without computer simulations.
I'm unsure about the commercial licensing aspect of it if that's what you're asking, but mayavi has both a GUI interface as well as a scripting interface depending on how you want to use it. In my experience it's essentially the library to use for 3D plotting when matplotlib isn't cutting it.
(I just checked Wikipedia and it seems it's released under a BSD license. Unsure if there are other products available which are built on top of it.)
Don't know why you were downvoted. Doing e.g. dataset -> .vtk file -> ParaView to add whatever vectors, color maps, surfaces etc. you want -> .obj file -> Blender to do final lighting, coloring and raytrace isn't unheard of. I've seen it done several times in CFD, and I've done it myself.
It becomes a bit of a pain for animating datasets, since you need to do Blender scripting with Python which is far from intuitive. But it's doable.
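For anyone curious about the dataset -> .vtk step mentioned above, here is a minimal sketch that writes a legacy ASCII .vtk file ParaView can open (grid size and field values are placeholders, not from any particular simulation):

```python
# Write a small 3D scalar field to a legacy ASCII .vtk (STRUCTURED_POINTS) file.
import numpy as np

nx, ny, nz = 16, 16, 16
field = np.random.rand(nx, ny, nz)  # placeholder data; replace with your dataset

with open("dataset.vtk", "w") as f:
    f.write("# vtk DataFile Version 3.0\n")
    f.write("example scalar field\n")
    f.write("ASCII\n")
    f.write("DATASET STRUCTURED_POINTS\n")
    f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
    f.write("ORIGIN 0 0 0\n")
    f.write("SPACING 1 1 1\n")
    f.write(f"POINT_DATA {nx * ny * nz}\n")
    f.write("SCALARS intensity float 1\n")
    f.write("LOOKUP_TABLE default\n")
    # Legacy VTK expects x to vary fastest, hence the Fortran-order flatten
    for value in field.flatten(order="F"):
        f.write(f"{value:.6e}\n")
```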
>Aalto University researchers show that in a transparent medium each photon is accompanied by an atomic mass density wave. The optical force of the photon sets the medium atoms in motion and makes them carry 92% of the total momentum of light, in the case of silicon.
Wait a min. Can this be used to build optical logic gates?
Isn't the problem with optical computers that we don't have a way to switch light using light only? I thought maybe this will enable something like that... because it says that light passing through a medium affects a physical property of it, and that too at the speed of light...
There is a summary statement that I found to be especially helpful in framing the question:
"To summarize... it's probably OK to "redo" the budget in such a way that a part of the momentum of the photon is attributed to the dielectric material when the photon enters it, and then it is returned back to the photon. In this way, one may justify the Abraham's form - and probably many other forms - but why should one really do it?"
That is explained in the second paragraph of the article:
In the literature, there has existed two different values for the momentum of light in the transparent medium. Typically, these values differ by a factor of ten and this discrepancy is known as the momentum paradox of light.
However, IMO none of these explanations are particularly satisfying because none of them (AFAICT) explain where these two different formulations come from. They all leave you with the impression that two physicists just pulled two different equations out of their ass and now they're arguing over which one is right as if it were some sort of theological discussion. (This is a common problem in physics pedagogy.) I would really love to hear from a physicist who understands and can explain the basis for the two different formulations.
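For context, the two competing values usually contrasted in this debate are the Minkowski and Abraham momenta. A short sketch of how they relate, using textbook expressions and my own reading of the press release (not verified against the paper itself):

```python
# Standard textbook expressions for the momentum of a photon of energy E in a
# medium of refractive index n:
#   Minkowski: p_M = n * E / c      (often read as the total momentum)
#   Abraham:   p_A = E / (n * c)    (often read as the field-only momentum)
# Their ratio is n**2. For silicon n is roughly 3.5 near 1.5 um (approximate,
# assumed here), so n**2 ~ 12 -- the "factor of ten" quoted above -- and
# 1 - 1/n**2 ~ 0.92 matches the 92% the press release attributes to the medium.
n = 3.5  # assumed refractive index of silicon (approximate)

ratio = n**2
medium_fraction = 1 - 1 / n**2
print(f"Minkowski/Abraham ratio: {ratio:.1f}")            # ~12.2
print(f"fraction carried by the medium: {medium_fraction:.1%}")  # ~91.8%
```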