Then there's no point in talking about probabilities.
How do you know that?
Exactly. And for all we know, the nature of this universe is the only nature that can possibly be.
I am indeed using it to support the premise of actual design of the universe, but those who determined the fine tuning have not. That is the point of scientific evidence. Do you wish to throw out peer-reviewed evidence just to make sure I don't have the evidence to support my position? That seems rather illogical.

It matters because you're appealing to it as evidence of design. If it's impossible for a universe where things are unmeasurable to exist, that assertion is meaningless.
That's the whole problem. We have a sample set of exactly one. Talking about the probability of another universe working is pointless, since we (a) can't even be aware of all the possible variables and (b) can't test this to see how accurate the probability is.
I take the word of those more qualified than I that have determined that.
If the constants can't change, then the universe is inevitable.
Why? You could just read their stuff.
I believe that they took that into account.
That might be true of Smolin's equations; however, most of his peer group didn't take issue with those calculations. Alternate universes were ingrained within the mathematics. I am not as mathematically minded as our Professor here, so I really can't say personally how accurate that assessment was. We do, however, know the requirements for life in this universe, and what would have happened if even the smallest value had differed at its creation.
So? It is pure chance. What is your point?
I didn't say they could?
They are not mere accidents or coincidence.
I said it is supportive evidence.
How many combinations of those constants results in a universe that produces life?
The premise of the fine-tuned Universe assertion is that a small change in several of the dimensionless fundamental physical constants would make the Universe radically different. As Stephen Hawking has noted, "The laws of science, as we know them at present, contain many fundamental numbers, like the size of the electric charge of the electron and the ratio of the masses of the proton and the electron. ... The remarkable fact is that the values of these numbers seem to have been very finely adjusted to make possible the development of life."[9]
If, for example, the strong nuclear force were 2% stronger than it is (i.e., if the coupling constant representing its strength were 2% larger), while the other constants were left unchanged, diprotons would be stable and hydrogen would fuse into them instead of deuterium and helium.[10] This would drastically alter the physics of stars, and presumably preclude the existence of life similar to what we observe on Earth. The existence of the diproton would short-circuit the slow fusion of hydrogen into deuterium. Hydrogen would fuse so easily that it is likely that all of the Universe's hydrogen would be consumed in the first few minutes after the Big Bang.[10] However, some of the fundamental constants describe the properties of the unstable strange, charmed, bottom and top quarks and mu and tau leptons that seem to play little part in the Universe or the structure of matter.[citation needed]
The precise formulation of the idea is made difficult by the fact that physicists do not yet know how many independent physical constants there are. The current standard model of particle physics has 25 freely adjustable parameters with an additional parameter, the cosmological constant, for gravitation. However, because the standard model is not mathematically self-consistent under certain conditions (e.g., at very high energies, at which both quantum mechanics and general relativity are relevant), physicists believe that it is underlaid by some other theory, such as a grand unified theory, string theory, or loop quantum gravity. In some candidate theories, the actual number of independent physical constants may be as small as one. For example, the cosmological constant may be a fundamental constant, but attempts have also been made to calculate it from other constants, and according to the author of one such calculation, "the small value of the cosmological constant is telling us that a remarkably precise and totally unexpected relation exists among all the parameters of the Standard Model of particle physics, the bare cosmological constant and unknown physics."[11]
Martin Rees[12] formulates the fine-tuning of the Universe in terms of the following six dimensionless constants:

- N = ratio of the strength of electromagnetism to that of gravity;
- Epsilon (ε) = strength of the force binding nucleons into nuclei;
- Omega (ω) = relative importance of gravity and expansion energy in the Universe;
- Lambda (λ) = cosmological constant;
- Q = ratio of the gravitational energy required to pull a large galaxy apart to the energy equivalent of its mass;
- D = number of spatial dimensions in spacetime.

(Fine-tuned Universe - Wikipedia, the free encyclopedia)
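As a rough illustration of the first of these numbers: N is usually quoted as the ratio of the electric to the gravitational force between two protons, about 10^36. A minimal sketch, using standard published constant values (these values are assumptions brought in for illustration, not taken from the thread):

```python
import math

# Sketch: estimating Rees's N, the ratio of the electric force to the
# gravitational force between two protons, from standard constant values.
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
m_p = 1.67262192e-27     # proton mass, kg

coulomb_over_gravity = e**2 / (4 * math.pi * eps0) / (G * m_p**2)
print(f"N ~ {coulomb_over_gravity:.2e}")  # about 1.2e36
```

The distance between the protons cancels out of the ratio, which is why N is dimensionless.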
How do we know that these numbers are all independent of one another? If any two are dependent and we make a change in one and make anything but the corresponding change in the other, then of course the imagined universe will fail. It would be like defining a system of math which defines multiplication such that 2*a = a+a, but then assigning 2+2 = 4 while 2*2 = 8.
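The dependence worry can be made concrete: some familiar "constants" are combinations of others. For instance, the fine-structure constant α is built from e, ε₀, ħ and c, so one cannot vary c alone while holding α fixed. A sketch, using standard constant values (assumed for illustration, not from the thread):

```python
import math

# The fine-structure constant is not independent: it is defined as
# alpha = e^2 / (4*pi*eps0*hbar*c), so changing c alone forces alpha to change.
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 299792458.0          # speed of light, m/s

def alpha(c_value):
    return e**2 / (4 * math.pi * eps0 * hbar * c_value)

print(1 / alpha(c))         # about 137.0 -- the familiar value
print(1 / alpha(1.02 * c))  # a 2% change in c alone shifts alpha by 2%
```

So counting "independent knobs" requires knowing which constants are derived from which, which is exactly the open question the excerpt raises.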
Well, considering we have no evidence for any more than the one universe we have, what do you say?
The book has been reviewed by physicists who would most certainly have made it known if the numbers Smolin used were not accurate.
What experiments?
On all the constants. For example:
Sunlight takes about 8 minutes 17 seconds to travel the average distance from the surface of the Sun to the Earth.
Exact values:
- metres per second: 299,792,458
- Planck lengths per Planck time (i.e., Planck units): 1

Approximate values:
- kilometres per second: 300,000
- kilometres per hour: 1,080 million
- miles per second: 186,000
- miles per hour: 671 million
- astronomical units per day: 173

Approximate light signal travel times:
- one foot: 1.0 ns
- one metre: 3.3 ns
- from geostationary orbit to Earth: 119 ms
- the length of Earth's equator: 134 ms
- from Moon to Earth: 1.3 s
- from Sun to Earth (1 AU): 8.3 min
- from nearest star to Sun (1.3 pc): 4.2 years
- from the nearest galaxy (the Canis Major Dwarf Galaxy) to Earth: 25,000 years
- across the Milky Way: 100,000 years
- from the Andromeda Galaxy (the nearest spiral galaxy) to Earth: 2.5 million years

The speed of light in vacuum, commonly denoted c, is a universal physical constant important in many areas of physics. Its value is exactly 299,792,458 metres per second, a figure that is exact because the length of the metre is defined from this constant and the international standard for time.[1] This is approximately 186,282.4 miles per second, or about 671 million miles per hour. According to special relativity, c is the maximum speed at which all energy, matter, and information in the universe can travel. It is the speed at which all massless particles and associated fields (including electromagnetic radiation such as light) travel in vacuum. It is also the speed of gravity (i.e. of gravitational waves) predicted by current theories. Such particles and waves travel at c regardless of the motion of the source or the inertial frame of reference of the observer. In the theory of relativity, c interrelates space and time, and also appears in the famous equation of mass-energy equivalence E = mc².[2]
The speed at which light propagates through transparent materials, such as glass or air, is less than c. The ratio between c and the speed v at which light travels in a material is called the refractive index n of the material (n = c / v). For example, for visible light the refractive index of glass is typically around 1.5, meaning that light in glass travels at c / 1.5 ≈ 200,000 km/s; the refractive index of air for visible light is 1.000293, so the speed of light in air is 299,705 km/s or about 88 km/s slower than c.
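The refractive-index arithmetic in the excerpt is easy to verify. A quick sketch using the values quoted above:

```python
c = 299792.458       # speed of light in vacuum, km/s
n_air = 1.000293     # refractive index of air for visible light
n_glass = 1.5        # typical refractive index of glass

v_air = c / n_air    # speed of visible light in air, km/s
v_glass = c / n_glass  # speed of visible light in glass, km/s

print(round(v_air))      # ~299705 km/s
print(round(c - v_air))  # ~88 km/s slower than c, as the excerpt says
print(round(v_glass))    # roughly 200,000 km/s
```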
In most practical cases, light and other electromagnetic waves can be thought of as moving "instantaneously", but for long distances and very sensitive measurements their finite speed has noticeable effects. For example, in videos of an intense lightning storm on the Earth's surface taken from the International Space Station, the expansion of light wavefronts from individual flashes of lightning is clearly visible, and allows estimates of the speed of light to be made from frame-to-frame analysis of the position of the light wavefront. This is not surprising, as the time for light to propagate completely around the Earth is of the order of 140 milliseconds. This transit time is what causes the Schumann resonance. In communicating with distant space probes, it can take minutes to hours for a message to get from Earth to the spacecraft, or vice versa. The light we see from stars left them many years ago, allowing us to study the history of the universe by looking at distant objects. The finite speed of light also limits the theoretical maximum speed of computers, since information must be sent within the computer from chip to chip. Finally, the speed of light can be used with time of flight measurements to measure large distances to high precision.
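The travel times in the table above follow directly from dividing distance by c. A minimal check (the distance values are standard astronomical figures, assumed here for illustration):

```python
c = 299792458.0  # speed of light, m/s

distances_m = {
    "one metre": 1.0,
    "Moon to Earth (mean)": 3.844e8,
    "Sun to Earth (1 AU)": 1.495978707e11,
}

# Light travel time is simply distance divided by c.
for name, d in distances_m.items():
    print(f"{name}: {d / c:.3g} s")
# Moon to Earth works out to about 1.3 s, and 1 AU to about 499 s (8.3 min),
# matching the table.
```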
Ole Rømer first demonstrated in 1676 that light travelled at a finite speed (as opposed to instantaneously) by studying the apparent motion of Jupiter's moon Io. In 1865, James Clerk Maxwell proposed that light was an electromagnetic wave, and therefore travelled at the speed c appearing in his theory of electromagnetism.[3] In 1905, Albert Einstein postulated that the speed of light with respect to any inertial frame is independent of the motion of the light source,[4] and explored the consequences of that postulate by deriving the special theory of relativity and showing that the parameter c had relevance outside of the context of light and electromagnetism. After centuries of increasingly precise measurements, in 1975 the speed of light was known to be 299,792,458 m/s with a measurement uncertainty of 4 parts per billion. In 1983, the metre was redefined in the International System of Units (SI) as the distance travelled by light in vacuum in 1/299,792,458 of a second. As a result, the numerical value of c in metres per second is now fixed exactly by the definition of the metre.
Speed of light - Wikipedia, the free encyclopedia
Are you claiming that fine tuning is false?
Can a universe different than ours still have intelligent life?
Also, if the chance of a universe with life is one in a trillion and there have been 10 trillion universes, then our universe is not that improbable. It is the same as the lottery example.
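The lottery arithmetic can be checked directly: with a per-universe chance p of one in a trillion and 10 trillion independent trials, at least one life-permitting universe is nearly certain. A sketch (the numbers are the poster's hypothetical, not measured values):

```python
import math

p = 1e-12   # hypothetical chance that a given universe permits life
N = 1e13    # hypothetical number of universes

# Probability that at least one of N independent universes permits life:
# 1 - (1 - p)^N, computed via log1p to stay numerically stable for tiny p.
p_at_least_one = 1 - math.exp(N * math.log1p(-p))
print(p_at_least_one)  # ~0.99995: virtually certain, not improbable at all
```

The result depends entirely on the assumed p and N, which is the other posters' point: with a sample of one universe, neither number is known.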
I would say that we need evidence that other universes do not exist before we can make any probability calculations.
Where did Smolin show that God designed the Universe?
Are you claiming to have evidence that God fine tuned anything?