Michael

Contributor
Site Supporter
Feb 5, 2002
25,145
1,721
Mt. Shasta, California
Visit site
✟320,648.00
Gender
Male
Faith
Christian
How the Universe Stopped Making Sense | Space

This is a pretty good news article about the growing tension between the Planck data sets and the SN1A-Cepheid data sets as it relates to estimating the Hubble constant, along with a nifty color graph to explain the problem. The different colored areas represent the "ranges" of values that are "constrained" by the various observations. The yellow region represents the SN1A-Cepheid constraints.




Essentially the white dotted circle in the image represents the "range" of the Hubble constant that is predicted/constrained by Planck, and the yellow rectangular area to the right of the circle represents the Cepheid range. They don't overlap anymore, and that's the problem. Eyeballing the other "constraints" would seem to favor a number at the center of the Planck range, but so far nobody has found any serious problems with the methodology behind the supernova-Cepheid variable constraints, and those results have been pretty consistent over time as well. It's actually quite a dilemma.
 
FYI, this is the link to the preprint version of the paper on arXiv that is being discussed in that article.

FYI, if you're interested in the various methods that are being used to measure the Hubble constant, and that were used to create the colorful image above, you might begin with the link below, which includes a short video explaining how lensing data is used to plot the medium-blue part of the graph from the "TDSL" (time-delay strong lensing) data sets.

H0LiCOW

There's a very informative seven-minute video explaining how "strong lensing" data is used to calculate the Hubble constant. Ok, admittedly they aren't necessarily the best "actors" on the planet, but hey, what do you expect from "scientists"?

There's also a nice graph on that website (shown below) listing their basic results. Their methodology is pretty consistent with the Cepheid-SN1A results in terms of calculating the Hubble constant, but there is pretty significant tension between the lensing results and the methods related to the cosmic microwave background. Essentially, their figure for the Hubble constant falls within the margins of error of the Cepheid-SN1A methods, but well outside the margins of error of the CMB-related methods.


 
FYI, here's a link to the 2019 Planck Collaboration results.

https://arxiv.org/pdf/1807.06209.pdf

 
I started reading through the Planck Collaboration paper this morning to see what types of assumptions it makes which might be worth revisiting. I've already found two assumptions that seem to warrant further scrutiny IMO.

1. Neutrino masses:

We assume three neutrino species, approximated as two massless states and a single massive neutrino of mass mν = 0.06 eV.

Neutrino oscillation experiments would tend to suggest that neutrinos do actually *have* (rest) mass. To date, the best direct "constraint" on neutrino mass limits that mass to less than 1.1 eV.

Neutrino Experiment Reveals (Again) That Something Is Missing from Our Universe

The first results from KATRIN have been announced, and the researchers came to an early conclusion: Neutrinos have a mass no higher than 1.1 electron volts (eV).

That's a relatively weak constraint compared to the number being used by Planck. It seems a little dubious, however, for Planck to assume that two of the three neutrinos are "massless." I'm not exactly sure how changes to these specific assumptions might change the Planck figures, but IMO it warrants further scrutiny.

Neutrino masses | All Things Neutrino


While two of the three masses would appear to be close to each other, and the third is larger (or smaller) than the other two, it's not likely that any of them has "zero" mass.
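As a back-of-the-envelope check (my own sketch, not anything from the Planck paper), the oscillation mass splittings imply a minimal sum of the three masses that lands right around Planck's 0.06 eV figure, which is presumably why they chose that approximation. The splitting values below are approximate numbers I'm assuming from the oscillation literature:

```python
import math

# Approximate mass-squared splittings from oscillation experiments
# (assumed figures, not taken from the Planck paper), in eV^2.
dm21_sq = 7.4e-5   # "solar" splitting, m2^2 - m1^2
dm31_sq = 2.5e-3   # "atmospheric" splitting, |m3^2 - m1^2|

# Minimal-mass "normal hierarchy": set the lightest state to zero.
m1 = 0.0
m2 = math.sqrt(dm21_sq)   # ~0.009 eV
m3 = math.sqrt(dm31_sq)   # ~0.05 eV

total = m1 + m2 + m3
print(f"minimal sum of neutrino masses ~ {total:.3f} eV")   # ~0.059 eV
```

So the "two massless states" assumption is really just the cheapest configuration consistent with the splittings, rather than a claim that two neutrinos actually have zero mass.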

2. Neutron decay times:

We now use a fixed fiducial neutron decay-constant value of τn=880.2 s, neglecting uncertainties.

Free neutron decay - Wikipedia

Outside the nucleus, free neutrons are unstable and have a mean lifetime of 881.5±1.5 s (about 14 minutes, 42 seconds).


The bottle measurements of the neutron lifetime average out to about 879 s, whereas the beam method produces a figure around 888 s. While Planck's figure of 880.2 s is within that range, it's at the lower end of the range. Again, I don't fully understand what the net effect of this assumption might be on Planck's final figures, but it's another area of uncertainty that seems worth exploring.
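For what it's worth, the bottle-beam disagreement is itself a multi-sigma tension. A quick sketch, using the rough central values quoted above and 1-sigma uncertainties that I'm assuming from the literature (they aren't given in the quotes above):

```python
import math

# Central values roughly as quoted above; the 1-sigma uncertainties
# are assumed figures from the literature, not from the quotes.
bottle, bottle_err = 879.4, 0.6   # "bottle" (trap) average, seconds
beam, beam_err = 888.0, 2.0       # "beam" average, seconds

diff = beam - bottle
err = math.hypot(bottle_err, beam_err)   # errors added in quadrature
print(f"bottle vs. beam: {diff:.1f} s apart (~{diff / err:.1f} sigma)")
```

With those assumed errors the two methods disagree by roughly four sigma, which is exactly why "neglecting uncertainties" on τn seems like an assumption worth flagging.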
 
From the reading I've done thus far, it looks to me like the baryon acoustic oscillations associated with the cosmic microwave background have become a bit of a noose around the neck of the ΛCDM model.


This image pretty much tells the story IMO. Whereas the Cepheid, SN1A, and lensing estimates are all compatible with one another, they simply don't seem to be compatible with estimates based on the BAO features of the CMB.

While it might be possible to change the percentages of dark matter and dark energy and still keep the nucleosynthesis predictions consistent with the SN1A, Cepheid, and lensing studies, it's almost impossible to change those percentages and still get a fit to the BAO curve.

So...

Either the CMB isn't actually related to a "surface of last scattering" as presumed, or redshift isn't actually caused by expansion, or both.

If you look at the graph, you'll notice that the line for the Planck 2019 data is very short compared to the other methods. It's more "tightly constrained," mostly because of the BAO curve. It doesn't appear to be possible to change the percentages of ordinary matter, dark matter, and dark energy very much and still get a decent fit to that BAO curve, hence the error bars associated with Planck 2019 are pretty small. There's just not much wiggle room.

If redshift is actually caused by expansion, then the lensing, Cepheid, and SN1A data all line up well, but then the CMB can't actually be related to a surface of last scattering. It may not seem like much of a difference, but in actuality it's pretty serious tension.
 
From the Planck 2019 paper:

The latest measurement from Riess et al. (2019) is discrepant with the Planck base-ΛCDM value for H0 at about the 4.4σ level. This large discrepancy, and its possible implications for cosmology, is discussed in Sect. 5.4.

From section 5.4:


It's very interesting IMO, and quite telling that this is the second time in the last two decades that the expansion interpretation of cosmological redshift has failed to correctly "predict" very important future observations.
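For reference, the 4.4σ figure in the quote above is easy to reproduce from the two H0 values, assuming independent Gaussian errors (the central values and 1-sigma errors below are the commonly quoted Riess et al. 2019 and Planck 2018 figures):

```python
import math

# H0 in km/s/Mpc: SH0ES (Riess et al. 2019) vs. Planck 2018 base-LCDM.
h0_shoes, err_shoes = 74.03, 1.42
h0_planck, err_planck = 67.36, 0.54

# Tension in units of the combined (quadrature) uncertainty.
tension = (h0_shoes - h0_planck) / math.hypot(err_shoes, err_planck)
print(f"tension ~ {tension:.1f} sigma")   # ~4.4 sigma
```

Note how little of the combined error budget comes from Planck; its tight BAO-driven error bars are most of the reason the discrepancy is so many sigma.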

Earlier expansion models failed to predict acceleration to begin with, and now they (apparently) fail to predict "jerk" too.

In the context of GR theory alone, it was mathematically possible and justifiable to resurrect what Einstein referred to as his "greatest blunder" and reintroduce a non-zero "constant" into the equations, but there's no inherent (to GR theory) mathematical justification for adding a new *variable*. That's actually quite a major mathematical and philosophical departure from GR theory.

It wouldn't actually be a "lambda" CDM expansion model anymore, because lambda would have to be something *other* than a constant; rather, it would be a variable that changes over time. Simply replacing a constant with a variable seems a little, well, "irreverent" in the context of GR theory, particularly if one is trying to justify one's model based on the validity of GR theory.

IMO the "logical" thing to do would be to reevaluate the original assumption that cosmological redshift is "caused" by "space expansion/acceleration/jerk," and to reconsider the *other* possible explanation for cosmological redshift that Edwin Hubble discussed and eventually embraced, namely the "tired light"/plasma-redshift options.

Instead, however, it looks like the "push" in terms of tightening up the error bars of various observations is to get the tension between the observations to the magic 5+ sigma level, with the express intent of claiming a "discovery" of yet another ad hoc element to be included in the expansion model.

The last failed "test" of the expansion model wasn't used to falsify the expansion interpretation of cosmological redshift, and this second (apparent) failure doesn't seem to be leading toward a reassessment of the expansion interpretation of redshift either. It only seems to demonstrate that there's no actual way to falsify the expansion interpretation of redshift, which would seem to move expansion-oriented cosmology models out of the realm of falsifiable science and squarely into the realm of "unfalsifiable dogma." That's scary.

Adios ΛCDM, and say hello to "Jerk-CDM"?
 
IMO this issue demonstrates two key points.

The actual "predictive" track record of the expansion interpretation of photon redshift has been very poor, and the assumption that photon redshift is caused by "space expansion" has become an unfalsifiable "dogma".

The expansion interpretation of redshift didn't accurately "predict" the SN1A data. Rather, the SN1A data came as quite a "shock," and it required the introduction of a never-observed substance called "dark energy," making up roughly 70 percent of the universe, to salvage the expansion interpretation of redshift.

Assuming the current tension is not resolved, the "new and improved" expansion interpretation of redshift didn't accurately predict the BAO data from the CMB either. The "fix" for that problem is unknown at this time, but suffice it to say there's no internal justification from GR theory itself for adding whole new variables to a GR formula. To do so amounts to an affirming-the-consequent fallacy, without even explaining what "dark energy" might be, let alone how its density could possibly increase throughout an expansion process that should cause it to *decrease*.

Assuming the current 4.4 sigma tension rises to 5+ sigma, one could make the argument that the second major failure of the expansion model to predict future observations in the past two decades is a falsification of the expansion interpretation of redshift.

To *not* allow the expansion interpretation to be called into question and falsified amounts to special pleading, and suggests that the expansion *concept* as the cause of redshift has morphed into a type of "sacred dogma" which cannot be falsified by any "test." The expansion interpretation of redshift has failed one major test already in the SN1A observations, and it's apparently failed a second "test" as well in the CMB data. It seems to me that the failures of the expansion model relate back to its core assumption about the cause of redshift, not to some hypothetical "fix" we might "slap on" to yet another "new and improved" expansion model.

Other types of observations at higher redshifts also suggest that the core assumption about the cause of redshift is the real problem. We're already finding massive quasars in the distant universe which were not "predicted" to exist based on the expansion model. We're also finding "mature" and massive galaxies at very distant redshifts which defy the evolutionary model of galaxy formation. And we're finding H-alpha lines from distant galaxies during a period when the universe is predicted, in the expansion model, to be "opaque" to such wavelengths.

The bottom line is that the actual "predictive" track record of the expansion interpretation of redshift has been quite poor, so there's no convincing evidence that it's worth salvaging by adding even more ad-hoc components to it.

The other problem with attempting to replace a "cosmological constant" with a cosmological "variable" in a GR formula is that such a concept has no mathematical precedent in GR. Einstein described even the addition of a non-zero constant to GR as his "greatest blunder," and he ultimately removed it. The constant was resurrected when the expansion model failed the SN1A tests and dark energy was added to the model. According to the Planck paper, the only workable solutions to the current problem would involve removing the constant again and replacing it with a whole new variable that isn't even directly related to GR theory to start with. That sure seems like a highly inelegant resolution to the new problem. Why not just allow the expansion model to die a natural scientific death and look for other possible explanations for redshift?
 
Well, just to add some more confusion to the estimates, a new study based on angular diameter distances to strong gravitational lenses was published last month:

A measurement of the Hubble constant from angular diameter distances to two gravitational lenses | Science

Using these absolute distances to calibrate 740 previously measured relative distances to SNe, we measure the Hubble constant to be H0=82.4(+8.4,−8.3) kilometers per second per megaparsec.

In relationship to this graph, this new angular-diameter study puts the constant somewhere between 74.1 and 90.8. That range overlaps with the SH0ES 2019 and H0LiCOW 2019 studies at the high end of their error bars, and runs off the graph to the right. In other words, it's compatible with the SH0ES and H0LiCOW studies, but not with the Planck estimates.

So why does any of this matter?

How Can a Star Be Older Than the Universe? | Space

A very old star nicknamed Methuselah is currently estimated to be 14.27 billion years old, plus or minus 800 million years. While that estimate (including its error bars) is ultimately compatible with the Planck expansion estimates, it's not compatible with the SN1A, Cepheid, and lensing study estimates.


The rate of expansion ultimately determines the estimated age of the universe, so a faster expansion rate equates to a younger universe. While the SN1A, Cepheid, and lensing estimates of the Hubble constant are all compatible with each other, they're not compatible with the estimated age of the oldest stars, or with the Planck estimates.
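The inverse relationship is easy to see from the Hubble time, t_H = 1/H0, which sets the overall age scale. (The actual ΛCDM age comes from integrating the Friedmann equation, so this is only a rough sketch of the scaling, not a real age calculation.)

```python
# Hubble time t_H = 1/H0: a rough age scale for the universe, not the
# full Friedmann-equation age, but it shows the inverse relationship.

KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one gigayear

def hubble_time_gyr(h0):
    """Convert H0 in km/s/Mpc to the Hubble time 1/H0 in Gyr."""
    return KM_PER_MPC / (h0 * SEC_PER_GYR)

# Planck-like, SH0ES-like, and the new angular-diameter figure.
for h0 in (67.4, 74.0, 82.4):
    print(f"H0 = {h0:4.1f} km/s/Mpc -> t_H ~ {hubble_time_gyr(h0):.1f} Gyr")
```

A Planck-like H0 gives a Hubble time around 14.5 Gyr, while the higher local values push it down toward 13 Gyr or less, which is where the tension with a 14.27-billion-year-old star comes from.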
 
Unsurprisingly, aside from the core assumption that redshift is caused by expansion, I can't really find any questionable assumptions or obvious methodology problems in either the latest SN1A study or the latest Planck methods of calculating H0.

If one begins with the premises that the universe is expanding and the CMB comes from a 'surface of last scattering', the methods used in each of the two studies seem alright to me. The fact that their H0 calculations are inconsistent with one another however doesn't bode well for an expansion interpretation of redshift. In theory, the SN1A estimates should have been able to predict and match up with the CMB estimates, but that's not what happened.

Assuming nobody else finds any serious methodology problems, this is quite a dilemma. In their shoes I'd be tempted to wait to see a deep field JWST image before deciding what to try to do about it. Adding a whole new variable to GR isn't a very attractive option IMO.

I think the JWST images are likely to put the final nails in the coffin of the belief that redshift is caused by expansion, which would in fact explain why the numbers don't match.
 